New Year’s Resolution: Eliminate Duplicate Website Content

When it comes to web design and optimizing a site for search engines, content is King. However, search engines count negatively against sites that contain large amounts of identical or nearly identical content.

Oftentimes, closely matching content within a site isn't an attempt to game search results. Duplicate content more commonly appears because bulletin boards or blogs generate separate pages targeted at mobile devices, because online stores list the same items under a number of distinct URLs, or because printer-friendly versions of pages are published alongside the originals.

It is no longer recommended to block robot and crawler access to pages with duplicate content through robots.txt files or any other means. When you prevent a search engine from crawling a page containing duplicate content, you also prevent it from detecting that the URLs on that page lead to the same content, so the robots will treat each URL as a unique page.
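For illustration, this is the kind of robots.txt rule the advice above warns against (the /print/ directory is a hypothetical location for printer-friendly duplicates):

    User-agent: *
    Disallow: /print/

A rule like this hides the duplicate pages from every crawler, which also hides the fact that they duplicate the originals.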

The recommended method for directing search engine crawlers through duplicate content is to mark the duplicate pages with the rel="canonical" link element, which tells crawlers which URL is the preferred version of the content. 301 (permanent) redirects are another option, as is the URL parameter-handling tool in Webmaster Tools. Crawl rates may also be adjusted in Webmaster Tools if you find that your duplicate content is creating too much search engine crawling activity on your site.
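As a minimal sketch, assuming https://www.example.com/product is the preferred URL, a duplicate page (such as a printer-friendly variant) would declare the canonical version inside its <head>:

    <link rel="canonical" href="https://www.example.com/product">

Alternatively, if the site runs on Apache (an assumption; other servers have equivalent directives), a 301 redirect can send visitors and crawlers from the duplicate URL straight to the canonical one:

    # Permanently redirect the printer-friendly duplicate to the canonical page
    Redirect 301 /product/print https://www.example.com/product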

Unless it is apparent that the duplicate content is intended to deceive crawlers and manipulate search results, having duplicate content usually goes unpunished. However, if you don't follow the procedures above, the search engines will choose on their own which version of the content shows up in search results, and their choice may not be yours.

On the other side of the coin, if you are in fact engaging in black-hat, deceptive content practices, your site may be removed from search results completely. When that happens, the next step is to review your site's entire sitemap and content, make corrections, and resubmit the site to the search engines for reconsideration.

When another site takes and republishes your content, it won't necessarily hurt how your site ranks on search results pages. If it does, however, you may file a DMCA (Digital Millennium Copyright Act) request with the search engines to set the record straight over who owns the original content.

In deliberate cases, duplicate content is produced across a number of domains in a black-hat attempt to manipulate search engine rankings or to drive higher volumes of traffic to a domain. These deceptive practices result in high bounce rates: the search engines may find an abundance of the keywords they are looking for, but the humans reading the content find nothing engaging or even comprehensible.

One of the simplest ways to avoid duplicate content, and any penalties for using it, is to follow best practices in SEO copywriting. Chief among them is to plan a detailed sitemap for your website in which each page covers one topic only; doing so largely eliminates the potential for duplicating your own content. A minimal example follows below.
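If that sitemap also takes the form of an XML sitemap submitted to the search engines, a minimal sitemaps.org-style file listing one canonical URL per topic might look like this (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>https://www.example.com/blue-widgets</loc></url>
      <url><loc>https://www.example.com/red-widgets</loc></url>
    </urlset>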