SEO problems: How do I use GWT to tell Google I no longer have 800 pages?
04-13-2014, 07:23 AM
Post: #1
I have a customer who had a company put an indexable IDX on his site. This IDX took all the listings from the local MLS and fooled Google into thinking that every listing or home for sale on the MLS was a page on his website. We discontinued the service a year ago, but SEOquake still shows him with 900 pages that Google indexed under his domain. These indexable IDX pages all seem to start with the property address. Here is a sample: http://homesaleaz.com/34714-n-27th-avenu...o--4745875

Is there something I can do to have Google disregard these pages? Some code I can add to one of the files to redirect any page that begins with a number?

Thanks
I thought maybe a wildcard in the .htaccess file? I am sure this indexable IDX is killing his site. I reposted one of his listings on a new site I built for an agent, to show them how to repost listings with keyword-rich content, and the new site with 1 article is on page three while this site with 100 articles is almost nonexistent. This happened twice when taking his listings and reposting them to new sites. Maybe I should not post 9 listings in the same subdivision, because Google may see that as black hat SEO? His domain is http://homesalesaz.com and the Listings tab is where we repost listings.
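[A minimal sketch of the kind of wildcard rule being asked about, assuming Apache with mod_rewrite enabled and that all the unwanted paths begin with a digit, as in the sample URL above; both assumptions, not confirmed in the thread:]

    # .htaccess sketch: 301-redirect any path that starts with a digit
    # (assumed pattern for the old IDX listing URLs) to the homepage
    RewriteEngine On
    RewriteRule ^[0-9] / [R=301,L]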

04-13-2014, 07:37 AM
Post: #2
 
Updating the sitemap should certainly help; give Google time to reindex. I believe there's a page for submitting individual URLs for update, but I'm not sure you can request an expedited re-index for a whole batch.
https://support.google.com/webmasters/an...4033?hl=en

You could serve a 410 (gone forever) response for each URL.
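[A sketch of that in .htaccess, assuming Apache's mod_alias and the same digit-prefix pattern as above:]

    # Answer every old listing URL (paths starting with a digit)
    # with 410 Gone instead of redirecting
    RedirectMatch gone ^/[0-9]

A 410 tells Google the pages are intentionally gone, which tends to drop them from the index faster than a plain 404.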

On Linux servers you can insert 301 redirects in the .htaccess file. You'd probably have to do each one individually; I'm not sure about wildcarding, and it's not a worthwhile job for soon-to-be-removed links.

04-13-2014, 07:52 AM
Post: #3
 
Two things I can recommend.

1). I'm sure you've already done this... Update the robots.txt file to tell Google which pages/directories to crawl and which not to crawl (see the robots.txt sketch after this list).

2). In Google Webmaster Tools, there's a menu called "Google Index", with a sub-menu for Remove URLs.
The downside is that you have to do it one URL at a time. It's time-consuming, but it'll work.

A 3rd option would be to start over with a new URL altogether.
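[Since robots.txt has no single wildcard for "starts with any digit", a minimal sketch of point 1 could list one Disallow prefix per digit; this assumes, as above, that all the unwanted listing URLs begin with a digit:]

    User-agent: *
    Disallow: /0
    Disallow: /1
    Disallow: /2
    Disallow: /3
    Disallow: /4
    Disallow: /5
    Disallow: /6
    Disallow: /7
    Disallow: /8
    Disallow: /9

Disallow rules are prefix matches, so /3 blocks every path that starts with 3. One caveat: robots.txt stops future crawling, but URLs that are already indexed may linger until they're removed via the Remove URLs tool or drop out on their own.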