Understanding how to deal with duplicate content

Saturday, November 14, 2009


Why does Google care about duplicate content?

Search engines, and Google in particular, generate revenue primarily through paid advertising. Since web searchers are free to use any of the major or minor search engines, each one tries to provide users with the best possible experience in order to increase its popularity and, in turn, its advertising revenue. When users are presented with search results that lead to almost identical content, they are understandably annoyed.

Here are some useful definitions that will help you understand the problem better:

Unique content: Unique content is written by humans and is different from any other content published on the web. When content is original, it differs from any other combination of letters, symbols and words that can be found online. Unique content is never an alteration of some other web site’s content produced by a text-processing algorithm.

Snippets: Snippets are small chunks of content, like quotes, that are copied and re-used. This usually does not cause any problems with the search engines. However, having pages that consist only of small snippets without any other substantial content is risky, since search engines might conclude that you are trying to inflate the visible content of your web site by scraping content from other web sites.

Duplicate Content Issues: If your web site has duplicate content issues, this does not necessarily mean that it will be penalized, but that there is duplicate content which forces the search engines to decide which version is the most appropriate for their search results page.

Duplicate Content Penalty: A “duplicate content penalty” means that your web site will be penalized or even removed from the index.

Dealing with duplicate content

Avoid empty pages: Users do not like to see empty pages when they click on a link in the search results, and search engines do not like serving empty pages because they create a bad user experience.

301 redirects: Use 301 redirects to point from the non-www version of your web site to the www version, so that only one version gets indexed.
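On an Apache server with mod_rewrite enabled, the redirect above can be sketched in an .htaccess file like this (example.com is a placeholder domain; adapt it to your own):

```apache
# Permanently (301) redirect non-www requests to the www version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag tells the search engines the move is permanent, so they transfer the old URL's ranking signals to the www version.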

Set the preferred version of your URL in the webmaster tools: Google and other major search engines give you the option of selecting which format of your URL (www or non-www) you want them to use.

Increase unique content: Increase the amount of unique content in the pages of your web site that do not contain enough of it, or block those pages from being indexed until they have some substantial content.

Rewrite page titles: Page titles are a very important factor in the search engines’ algorithms. By writing unique and descriptive page titles we can differentiate our content from similar content on the web. Page titles can also improve user experience and click-through rates, since they are displayed by the search engines in the search results pages.
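As a small sketch (the titles below are made-up examples), compare a vague title shared by many pages with unique, descriptive ones:

```html
<!-- Vague title duplicated across many pages: -->
<title>Products</title>

<!-- Unique, descriptive titles (hypothetical examples): -->
<title>Red Leather Office Chair - Acme Furniture</title>
<title>Ergonomic Mesh Desk Chair - Acme Furniture</title>
```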

Rewrite Meta Descriptions: Even though the meta description tag does not affect your web site’s rankings, it is still very important when dealing with duplicate content issues. If it is too difficult to generate a unique Meta Description for every page, then you should consider dropping the Meta Description tag completely and letting the search engines build the search results snippet from your page’s content.
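A unique meta description goes in the page's head section; the wording below is a hypothetical example:

```html
<head>
  <!-- A unique, page-specific description (made-up example text): -->
  <meta name="description"
        content="How to use 301 redirects and unique page titles to fix duplicate content issues on your site.">
</head>
```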

Lighten your code: Avoid using HTML for layout and styling (use CSS instead), and replace JavaScript-based menus with CSS-based ones.

Emphasize Unique Content: Another way to deal with duplicate content is to give emphasis to the unique content of your web site. The use of semantic HTML tags (headings, emphasis, blockquotes, etc.) can help convey more information to the search bots about the actual role of the content on each page of your web site.
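As an illustrative sketch (the page content here is made up), semantic markup tells the bots which parts of a page carry its unique content:

```html
<!-- Headings mark the page's main topic and subtopics: -->
<h1>Dealing with duplicate content</h1>
<h2>Why does Google care?</h2>

<!-- Emphasis highlights key phrases; blockquote marks a reused snippet: -->
<p>Use <strong>301 redirects</strong> to consolidate the <em>www</em>
   and non-<em>www</em> versions of a page.</p>
<blockquote>A short quoted snippet borrowed from another site.</blockquote>
```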
