How to Avoid the Google Duplicate Content Filter


More and more webmasters are building websites with publicly available content (data feeds, news feeds, articles). The result is a growing amount of duplicate content on the Internet. In the case of websites built on news feeds or data feeds, you can even find sites that match each other 100% (except for the design). Several copies of the same content in a search engine do no one any good, so Google apparently decided to weed out some of this duplicate content to deliver cleaner and better search results.

Plain copies of websites were hit hardest. If a webmaster published the exact same content on more than one domain, all domains in question were eventually removed from Google's index. Many websites based on affiliate programs suddenly took a big hit in traffic from Google.com. Shortly after this started, the same complaints and stories appeared across webmaster forums, and once one and one were put together, a clear picture of the situation emerged: a duplicate content filter was being applied.

Duplicate content is not always bad, and it will always exist in one way or another. News websites are the best example; nobody expects those to be dropped from Google's index.

So, how can webmasters avoid the duplicate content filter? There are quite a few things webmasters can do to use duplicate content of any sort and still create unique pages from it. Let's look at some of these options.

1) Unique content on pages with duplicate content

On pages where duplicate content is being used, unique content should be added, and I do not mean just a few different words or a link/navigation menu. If you (the webmaster) can add 15%-30% unique content to pages that display duplicate content, the ratio of duplicate content to the overall content of each page goes down, which reduces the risk of having the page flagged as duplicate content.
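As a rough illustration (Google has never published how it measures this), you can estimate that ratio with a simple word count. The following Python sketch is hypothetical; the duplicate_ratio function and the numbers are made up for the example:

    def duplicate_ratio(duplicate_text: str, unique_text: str) -> float:
        """Rough word-count estimate of how much of a page is duplicate content."""
        dup_words = len(duplicate_text.split())
        unique_words = len(unique_text.split())
        total = dup_words + unique_words
        return dup_words / total if total else 0.0

    # Example: a 700-word syndicated article padded with 300 words of unique text.
    ratio = duplicate_ratio("word " * 700, "word " * 300)
    print(f"Duplicate share: {ratio:.0%}")  # Duplicate share: 70%

Adding 300 unique words to a 700-word feed article brings the duplicate share down from 100% to 70%, which matches the rule of thumb above.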

2) Randomization of content

Ever seen those "Quote of the Day" boxes on some websites? A script adds a random quote to the page every time it loads, so the page looks different on each visit. With just a few code changes, those scripts can be used for much more than displaying a quote of the day. With some creativity, a webmaster can use such a script to create the impression that pages are constantly updated and always different, which can be a great tool to keep Google from applying the duplicate content filter.
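Here is a minimal sketch of such a rotation script in Python; the quotes and the function name are made up for illustration, and the same approach works for rotating tips, related links, or product highlights:

    import random

    # Hypothetical content pool; a real site might load these from a file or database.
    QUOTES = [
        "The secret of getting ahead is getting started.",
        "Simplicity is the ultimate sophistication.",
        "Well done is better than well said.",
    ]

    def quote_of_the_day() -> str:
        """Pick a random quote to embed in the page each time it is generated."""
        return random.choice(QUOTES)

    print(quote_of_the_day())

Generated server-side on each request, this makes every rendering of the page slightly different from the last.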

3) Unique content

Yes, unique content is still king. But sometimes you simply cannot work around using duplicate content, and that is alright. How about adding unique content to your website, too? If the overall ratio of unique to duplicate content is well balanced, the chances that the duplicate content filter will be applied to your website are much lower. I personally recommend that a website offer at least 30% unique content (I admit, I sometimes have difficulty reaching that level myself, but I try).

Will this guarantee that your website stays in Google's index? I don't know. To be most successful, a website should be completely unique. Unique content is what draws visitors to a website. Everything else can be found elsewhere, and visitors have no reason to visit one particular website if they can get the same thing somewhere else.

About The Author

Christoph Puetz is a successful entrepreneur and international book author. Websites currently operated by Christoph are Credit Problems Help and Highlands Ranch Colorado. PPC and SEO services provided by the author can be found at Net Services USA LLC.

Note: This article can be published by anyone as long as the resource box (About the Author) is posted on the website, including the links, and the links are clickable. This last paragraph informing about the author resource box does not need to be published.
