September 8, 2009
Issue 576
 
InfoSpace Announces Strong Second Quarter Results - Sep 7, 2009
InfoSpace, Inc. (NASDAQ:INSP) today announced financial results for the second quarter ended June 30, 2009. "I am extremely pleased with our results in the second...
(Full article at Your Story)

Google Porn Filter Gained China's Thumbs-up - Sep 6, 2009
China approved of Google's efforts to filter porn from search results on its China portal following state-led criticism of the links, the former...
(Full article at PC World)

Social Search Engine Searchwiki Launches - Sep 5, 2009
The newest social search engine, Searchwiki, has launched and it gives users speedy results from both major search engines and social networking...
(Full article at Examiner.com)

Microsoft to add 'Ping' to Bing to share search results - Sep 4, 2009
Microsoft is testing a new feature called "Bing & Ping" for its Bing search engine that allows people to share search results on social networks Facebook and Twitter. According to a post on...
(Full article at San Francisco Chronicle)

 
 

SEO Effect Of Duplicate Content

There are a lot of ways to improve your site's page ranking in search engines; unfortunately, not all of them are good.

Some people chase a high page rank using methods that are outright deceitful, designed to trick the search engines rather than earn a ranking. One of these methods is duplicating web content.

What is duplicate content?

Duplicate content, in SEO terms, is any web content that is substantially similar to content found on another site. Search engines have implemented filters specifically to catch these deceitful attempts to improve a site's search engine page rankings.

A lot of people think that by creating multiple, near-identical copies of their web pages or content, they can improve their site's page rankings by earning multiple listings for their site.

Since search engines now monitor for this kind of trickery, sites using duplicate content can end up banned from search engine indexes instead of improving their ranking.

What is considered duplicate content?

Several types of duplicate content are in widespread use. Each is applied a bit differently, but all serve the same purpose: tricking search engines into granting better page rankings.

One form of duplicate content is maintaining very similar websites, or identical web pages, on different domains or sub-domains that offer essentially the same content. This includes landing or doorway pages as well as regular content, so avoid this practice if you don't want your site tripping the search engines' duplicate content filters.

Another method is taking content from another website or page and reorganizing it so that it appears different from the original, even though it is substantially the same.

Manufacturers' product descriptions are another common case: many eCommerce sites competing in the same market copy the same descriptions verbatim. Add the product name and the name of the artist, manufacturer, writer, or creator, and a significant amount of identical content ends up on each page. Although this is much harder to spot, it is still considered duplicate content, or spam.

Syndication of an article by sites other than the one that published the original can also be considered duplicate content.

Unfortunately, although some search engines still treat the site the original article came from as the relevant source, others do not.

How do search engines filter duplicate content?

Search engines filter for duplicate content using the same means they use to analyze and index pages for ranking: crawlers, or robots. These crawlers visit different websites and catalogue them, reading pages and saving the information to the engine's database.

The engine then analyzes and compares the information gathered from one website against everything else it has indexed, using algorithms to determine whether the site's content is original and relevant, or whether it should be treated as duplicate content or spam.
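
To illustrate the comparison step, here is a minimal sketch assuming a shingle-and-overlap approach; actual search engine algorithms are proprietary and far more sophisticated, so treat this only as an illustration of the idea:

```python
# Sketch: compare two pages by their word "shingles" (overlapping word
# groups) and measure overlap with Jaccard similarity. This is an
# illustrative assumption, not how any specific engine works.

def shingles(text, k=3):
    """Return the set of k-word shingles of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: 0.0 (disjoint) to 1.0 (identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page1 = "search engines filter duplicate content with crawlers and robots"
page2 = "search engines filter duplicate content using crawlers and robots"
score = jaccard(shingles(page1), shingles(page2))
print(round(score, 2))  # → 0.4
```

A higher score means more shared shingles; a filter would flag pairs above some threshold as likely duplicates.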

How to avoid duplicate content?

Even if you have no intention of deceiving search engines to improve your site's page ranking, your site might still get flagged as having duplicate content.

One way to avoid this is to check for duplicates of your pages yourself. Make sure your content doesn't overlap too heavily with another page's, because heavy similarity can still trip some duplicate content filters even when it isn't spam.
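
One practical way to run such a self-check is to compare page texts directly. The sketch below uses Python's standard difflib; the 0.8 threshold is an arbitrary illustration, not a figure any search engine publishes:

```python
# Flag two page texts that are suspiciously similar using difflib's
# sequence matching. The 0.8 cutoff is an assumed, illustrative value.
import difflib

def too_similar(text_a, text_b, threshold=0.8):
    """Return True when the two texts match above `threshold` (0.0-1.0)."""
    ratio = difflib.SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()
    return ratio >= threshold

original = "Our widget ships in three colors and includes a two-year warranty."
rewrite  = "Our widget ships in three colours and includes a two-year warranty."
print(too_similar(original, rewrite))  # prints True
```

Running a check like this across your own pages before publishing helps you catch accidental near-duplicates before a filter does.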

About the Author: Craig Edmonson - The Google Top 10 offers SEO Services to U.S. based companies. Stop by for a free web site analysis.

  Advertise Here

Submit Your Site
Express Inclusion lists your site throughout the Entireweb Network within two business days. The network receives over 100 million searches every month!
Advertise Here!
Advertise in this world-class newsletter about search engine optimization and website promotion!
SpeedyAds
A non-invasive, non-annoying, low-cost way of getting your site in front of thousands of people, to announce new projects or boost traffic to your sites.

About Entireweb   |   Our Services   |   Privacy Policy   |   Contact Us

© 2009 Entireweb.com. All Rights Reserved

To discontinue mailings, click here