I'm creating a website that aggregates and displays the areas where photos have been taken in a particular location. 99% of my human visitors will simply type a place into the search bar on my website. But I also want to provide a 'Browse Locations' section on the site, mainly for SEO crawlers, where areas of the world can be drilled down into, e.g. United Kingdom > England > London > Trafalgar Square. To do this I'm going to use the GeoNames API, and the aim is to target search engine queries like 'photos in trafalgar square' with a 'dedicated', mod_rewritten page for each place.

My question is really: does this sound dodgy? Obviously there could be hundreds of thousands, or even millions, of seemingly static pages if I do this. Not to mention I'll blow my API usage limit when the Browse section gets crawled (I'm assuming there's some kind of 'crawl this section once' directive I can put on the section, but I haven't looked into this yet). Is there a point where the crawler 'gives up', or even penalises the site? I suppose I could limit it to a certain level of place, e.g. stop at London in the hierarchy, but even then it will probably be tens of thousands of pages.

Be interested to hear your thoughts.
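For reference, the mod_rewrite setup I have in mind is along these lines (a sketch only; the URL pattern and handler script name are placeholders):

```
# Map /photos/united-kingdom/england/london/trafalgar-square
# to a single handler script that looks the path up in the cache
RewriteEngine On
RewriteRule ^photos/(.+)$ location.php?path=$1 [L,QSA]
```

So every place page is really one script behind the scenes, but each place gets its own crawlable URL.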