MrGeeK
03-10-2006, 05:55 AM
I've read lots of different opinions on duplicate content.
Some people say it's down to the paragraph, others say the page, and yet others say it's an entire site.
I'm trying to decide where I stand on duplicate content; so far, no one has been able to show me an example of anything less than an entire-site duplication causing them to be hit with duplicate content filters.
My own experiences would tend to back up the theory that the duplicate content filters only kick in when there are multiple exact copies of a site (i.e. mirrors). I have three sites that share a database and all display at least some of the content in the same "paragraph" format. All of these sites are indexed in Google.
Site 1 has 34K pages in Google; Site 2 has just under 40K. Both of these sites were created early/mid last year and are almost identical. They even use the same template, though there are some menu/title text differences on each page.
I just did some further checking on Google, and pages with the exact same content, in the same order, with the same title tags are cached and indexed (i.e. the only differences are small bits of text in the sidebar, like the name of the site).
Both of these sites have the same PR (3) and are hosted on the same IP address.
These sites are both feeders for the 3rd site, which is a bigger version of the other two: it has the same content as the other two, plus a lot more. This time the template is completely different. This was the original site, started in 2000.
So to me that would seem to indicate that at present the duplicate content filters are only effective at the entire-site level, or, if they're active all the time, they only target specific segments.
Of course, Big Daddy might just change all that.
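For what it's worth, the way near-duplicate detection is usually described in the literature is shingling: break each page into overlapping word windows and compare the overlap between the two sets. This is just an illustrative sketch of that technique, not a claim about how Google's filters actually work; the sample pages and the k=4 window size are made up for the example.

```python
# Illustrative sketch of near-duplicate detection via shingling and
# Jaccard similarity. This is a standard technique from the literature,
# NOT necessarily what Google's duplicate content filters do.

def shingles(text, k=4):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity between two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two pages identical except for a small sidebar difference, like the
# situation described above, score very high:
page1 = "same article body here with a sidebar naming site one"
page2 = "same article body here with a sidebar naming site two"
print(jaccard(shingles(page1), shingles(page2)))  # prints 0.75
```

Under a scheme like this, the threshold the engine picks decides whether two pages that differ only in sidebar text get treated as duplicates, which is exactly the kind of boundary I'm trying to pin down.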
I'd be interested in hearing from anyone who has had duplicate content filtering hit them (or not) with information similar to what I posted above.
Thanks