Avoiding Duplicate Content
Most website owners know that duplicate content can hurt their rankings, but many don't know what to do about it. First, it helps to understand how duplicate content is generated.
For those who don’t know, duplicate content is any content that appears online at more than one URL. Search engines find it difficult to decide which version is more relevant, and they often have no way of determining which one is the original. So, to give users the best search experience, they show only one version instead of all of them.
All of this can result in the site owner being penalized with lower rankings and less traffic. Search engines also have trouble knowing which version of the content to rank for query results, and they cannot determine whether they should include the various versions at all. Finally, they have no way of deciding whether to consolidate link metrics on a single page or split them across several different pages.
Causes of Duplicate Content
There are several reasons why you may have duplicate content out there in cyberspace. One of the most common is “printer-friendly” versions. Printer-friendly versions of your content can be helpful for site visitors; no one likes printing out huge colorful banners just to get their favorite recipe. Unfortunately, when you create a printer-friendly version of a page, the search engines will index both versions and end up with two copies of the same content.
Facts about URL Parameters
Duplicate content also occurs when a website owner assigns several URLs that point to the same page. Search engines like Google strive to crawl websites as efficiently as possible, and finding several copies of the same content makes that process less effective. How do they know which page is best? Normally, they group the URLs into clusters and consolidate the signals from the duplicates into the one URL that will give users the best search results.
Sometimes, websites will give a different session ID to each visitor, which creates multiple URLs that all point to the same page. The session-ID parameters used to track visitors are a common source of this problem: a retail store, for example, might end up with several URLs that all point to the same furniture ad. To combat this, Google has created a parameter handling tool that lets site owners tell Google how to handle URLs containing specific parameters.
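As a rough illustration of the consolidation described above, here is a minimal Python sketch that strips session-ID and tracking parameters so duplicate URLs collapse into one canonical URL. The parameter names in TRACKING_PARAMS are assumptions for the example; a real site would list its own.

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Assumed parameter names for this sketch; substitute your site's own.
TRACKING_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url: str) -> str:
    """Drop session/tracking parameters so duplicate URLs map to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

For example, `canonicalize("https://example.com/sofa?color=red&sessionid=abc123")` keeps the meaningful `color` parameter but drops the session ID, yielding a single URL for every visitor.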
SEO Best Practice
Google recommends that anytime the same content can be found at multiple URLs, it should be canonicalized for search engines, whether through the parameter handling tool, a rel="canonical" link element, or a 301 redirect to the preferred URL.
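A 301 redirect is normally configured on the web server itself, but the logic can be sketched as a small Python function. The duplicate-to-canonical mapping below is purely hypothetical, just to show the idea: known duplicates get a permanent redirect to the preferred URL.

```python
# Hypothetical duplicate-to-canonical mapping; the paths are illustrative only.
CANONICAL = {
    "/recipes/print/42": "/recipes/42",   # printer-friendly duplicate
    "/recipes/42/": "/recipes/42",        # trailing-slash duplicate
}

def handle(path: str):
    """Return (status, headers): a 301 for known duplicates, else a 200."""
    target = CANONICAL.get(path)
    if target is not None:
        # 301 tells search engines to consolidate signals on the target URL.
        return 301, {"Location": target}
    return 200, {}
```

Because a 301 is a permanent redirect, search engines transfer the duplicate URL's ranking signals to the canonical page rather than splitting them.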
For pages that you do not want the search engines to index, use the meta robots tag “noindex, follow”. Pages marked this way will still be crawled, but they will not be included in the search engines’ indexes. This method is commonly used when a webmaster is dealing with pagination issues.
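To show what this looks like in practice, here is a small Python sketch, using only the standard library's html.parser, that checks whether a page carries a noindex directive in its meta robots tag. The sample markup is invented for the example.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            content = a.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_noindex(html: str) -> bool:
    """True if the page asks search engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

# Example page using the "noindex, follow" tag described above.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
```

Running `is_noindex(page)` on the example returns True: the page would still be crawled and its links followed, but it would be kept out of the index.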
Adaptivity Pro offers expert Utah SEO services to businesses throughout the U.S. No matter what issues you may be experiencing with your site, the professionals at Adaptivity Pro can fix them. Rank higher with excellent, well-written meta titles and descriptions! Adaptivity Pro of Utah also builds high-end websites and prides itself on being a number one UT web design firm. You can choose the level of service that best suits your business. Let our experts build and maintain your site for you. We can also provide professionally written content, as well as interesting blog posts that can be published weekly or as often as you like.
Adaptivity Pro partners with companies to help them increase their sales with a great web site that will consistently rank well. For professional Salt Lake City SEO, choose Adaptivity Pro.
Article Provided By:
Adaptivity Pro Web Design
Salt Lake City Utah
P.O. Box 951049
South Jordan, UT