How many mod_rewritten pages is too many?
04-08-2014, 06:38 PM
Post: #1
I'm creating a website which aggregates and displays areas where photos have been taken in a particular location. 99% of my human visitors will type a place into the search bar on my website. But I also want to provide a 'Browse Locations' section, mainly for search engines to crawl, where areas of the world can be drilled down into, e.g. United Kingdom > England > London > Trafalgar Square.

To do this I'm going to use the geonames API, and the aim is to target searches like 'photos in trafalgar square', for example, with a 'dedicated' mod_rewritten page for each place.
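For concreteness, the sort of rule I have in mind is below: a minimal .htaccess sketch, where browse.php is a placeholder front controller that would resolve the place path via geonames:

# Send /browse/united-kingdom/england/london/trafalgar-square
# to a single script, passing the place path as a query parameter.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^browse/(.+)$ browse.php?place=$1 [L,QSA]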

My question is really: does this sound dodgy? Obviously there could be hundreds of thousands or millions of seemingly static pages if I do this. Not to mention I'll blow my API usage limit when the Browse section gets crawled (I'm assuming there's some kind of 'crawl this section once' directive I can put on the section, but I haven't looked into this yet). Is there a point where the crawler gives up, or even penalises the site?
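On the API limit, one idea I'm toying with instead of a crawl directive: cache each geonames response the first time a page is generated, so a crawl only costs one API call per place, ever. A rough Python sketch, with the endpoint query and the 'demo' username as placeholders:

import json
import os
import time
import urllib.parse
import urllib.request

CACHE_DIR = "geonames_cache"
CACHE_TTL = 30 * 24 * 3600  # refresh a cached place at most once a month

def get_place(place_path):
    """Fetch geonames data for a place, hitting the API only on a cache miss."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    cache_file = os.path.join(CACHE_DIR, place_path.replace("/", "_") + ".json")
    fresh = (os.path.exists(cache_file)
             and time.time() - os.path.getmtime(cache_file) < CACHE_TTL)
    if fresh:
        with open(cache_file) as f:
            return json.load(f)
    # 'demo' is a placeholder account; searchJSON takes more parameters in practice
    url = ("http://api.geonames.org/searchJSON?q="
           + urllib.parse.quote(place_path) + "&maxRows=1&username=demo")
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    with open(cache_file, "w") as f:
        json.dump(data, f)
    return data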

I suppose I could limit it to a certain level of place, e.g. stop at London in the hierarchy, but even then it will probably be tens of thousands of pages.

I'd be interested to hear your thoughts.

04-08-2014, 06:48 PM
Post: #2
 
From what I could find, lots of pages isn't inherently bad. In addition to the content creation and social benefits, having many pages lets you target a lot more keywords with a lot more title tags.

When you get a lot of pages you need to pay more attention to your Googlebot crawl budget. There will be pages that Googlebot may crawl only once *ever* or maybe even not at all, depending on your pagerank. You need to make sure that your pages that get the most traffic and change most frequently get crawled more often. You can set priority and last-modified in your sitemap to make that happen. And you can "sculpt" your pagerank so that your important pages don't get PR 0. You also need to pay more attention to user experience. Google doesn't like it when a page ranks for something popular, but has little content: users end up unsatisfied. You have your navigational structure so that's good.
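For example, a sitemap entry carrying both hints looks like this (the url and values are just illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/browse/united-kingdom/england/london</loc>
    <lastmod>2014-04-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>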

Things to keep in mind:
- make sure you have a good information architecture in place
- the url structure will be critical at that scale, and you want to avoid serving the same content from multiple urls (you don't want 100,000 indexed urls pointing at duplicate content)
- don't publish any of the "stub" pages
- consider carefully what you do with your high-value pages: whether they are well-linked, good sources of traffic, important to the site structure, etc
- have a redirect plan in place (a minimal sketch follows this list)
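On the duplicate-url and redirect points, the usual combination is a 301 for any url you retire plus a canonical tag so only one version of each page gets indexed. A sketch, with the paths and domain as placeholders:

# .htaccess: permanently move a retired url to its canonical home
Redirect 301 /browse/london /browse/united-kingdom/england/london

<!-- and in the <head> of every location page: -->
<link rel="canonical" href="http://example.com/browse/united-kingdom/england/london" />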

Hal Smith
URLdreamer Consultant
