This forum has been archived; no new posts or threads are accepted.
Copying the same website to many different subdomains: can this get me onto Google's blacklist?
04-28-2014, 11:31 AM
Post: #2
 
Even if it doesn't trigger a penalty, content duplicated across multiple pages will not rank well in the search engines. If the duplicated pages are just filler and not meant to rank, you can block search spiders from them with a robots.txt file.
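As a sketch of that approach: each subdomain serves its own robots.txt, so a duplicate subdomain could block all crawlers with something like the following (the subdomain and domain names here are hypothetical examples):

```
# robots.txt served at the root of a duplicate subdomain,
# e.g. http://city2.example.com/robots.txt (hypothetical host).
# Blocks all well-behaved crawlers from the whole subdomain.
User-agent: *
Disallow: /
```

Note that robots.txt is per-host, so the master site's own robots.txt is unaffected and keeps ranking normally.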

Normally one would use a canonical tag on each duplicate to point at the master copy of the content, so that only the master copy ranks.
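For example, each duplicate page would carry a tag like this in its `<head>` (the URL below is a hypothetical stand-in for the master copy):

```html
<!-- On every duplicate subdomain page; href points at the
     master copy (example URL, substitute your real one) -->
<link rel="canonical" href="http://www.example.com/page.html">
```

Unlike the robots.txt approach, this lets search engines crawl the duplicates but consolidates ranking signals onto the master URL.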

Some articles on that subject on an authority site:

http://moz.com/community/q/new-client-wa...ort=oldest

http://moz.com/community/q/multiple-citi...te-content

http://moz.com/community/q/multiple-site...te-content

http://www.smallbusinesssem.com/how-to-c...ties/6532/

By Matt Cutts, head of Google's webspam team:
http://www.mattcutts.com/blog/give-each-store-a-url/



Messages In This Thread
[] - Jake - 04-28-2014 11:31 AM
[] - Bigb 007 - 04-28-2014, 11:34 AM
