So I've been working on a project for a client. As it stands, their site doesn't have many pages: about 150, all indexed. They also have about 1,500 backlinks pointing to the site, and they get very little traffic, roughly 4-10 hits a day from Google. What they wanted me to do is run a local search engine optimization campaign for them. What I did was create a database with all the city, state, and zip info, and load my keywords and the page content for each keyword into the database. So as it stands now, I have created over 360,000 highly locally optimized pages. E.g., a call to the database like "www.xyzsite.com/services.php?st=california&city=irvine&service=widget repair" — any call with a city or state plus a service returns a result for that service, with all the meta, h1, h2, and alt tags specific to that local search term. My question is this: I now have over 360,000 pages. Most of the local search terms return fewer than 5-10k results in Google, so I know it would be easy to reach the front page. But how do I get all those pages indexed without Google banning my site?
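To give you an idea of the setup, here is a stripped-down sketch of what my services.php does. The connection details and the local_pages(state, city, service, title, body) table here are just placeholders, not the real schema, but the idea is the same: one row per local term drives the title, meta, and h1.

```php
<?php
// Minimal sketch of services.php. The PDO DSN, credentials, and the
// local_pages(state, city, service, title, body) table are placeholders --
// substitute whatever schema the site actually uses.
$pdo = new PDO('mysql:host=localhost;dbname=xyzsite', 'user', 'pass');

$stmt = $pdo->prepare(
    'SELECT title, body FROM local_pages WHERE state = ? AND city = ? AND service = ?'
);
$stmt->execute([$_GET['st'] ?? '', $_GET['city'] ?? '', $_GET['service'] ?? '']);
$page = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$page) {
    http_response_code(404);
    exit('No such page');
}

// One local search term drives the title tag, meta description, and h1.
$term = htmlspecialchars(ucwords(
    ($_GET['service'] ?? '') . ' in ' . ($_GET['city'] ?? '') . ', ' . ($_GET['st'] ?? '')
));
?>
<html>
<head>
  <title><?= htmlspecialchars($page['title']) ?></title>
  <meta name="description" content="<?= $term ?>">
</head>
<body>
  <h1><?= $term ?></h1>
  <?= $page['body'] ?>
</body>
</html>
```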
If it already has 1,500 backlinks, it will likely get indexed on its own. Of course, it depends on the structure of the site, which pages the current backlinks point to, and how many clicks the new pages are away from the pages that currently have backlinks. If you don't trust the structure of the website, I would suggest either re-arranging the structure or getting backlinks to the new pages, or to pages within one click of them.
Hello, I agree with the above. If those 360k pages do have some content on them, and are not just empty pages meant to rank on poorly searched keywords, I would also suggest building a sitemap and linking to it from your home page. You can even submit it through Google Webmaster Tools and get the sitemap crawled in about 24-48 hours (see the sketch below). Even so, it will take a while to get them all into the index, and you can't really rush it without taking a risk. Also, those pages would need to pass the duplicate content filter in order to show up in regional results (I think). Hope this helps.
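One thing to keep in mind: the sitemap protocol caps each file at 50,000 URLs, so 360k pages means several sitemap files plus a sitemap index. A rough sketch of a generator, reusing the made-up local_pages table and URL pattern from the first post, might look something like this:

```php
<?php
// Rough sitemap generator sketch. Assumes the hypothetical
// local_pages(state, city, service, ...) table from the earlier example.
// The sitemap protocol allows at most 50,000 URLs per file, so the 360k
// pages are split across several files tied together by a sitemap index.
$pdo   = new PDO('mysql:host=localhost;dbname=xyzsite', 'user', 'pass');
$rows  = $pdo->query('SELECT state, city, service FROM local_pages');
$limit = 50000;
$count = 0;
$file  = 0;
$fh    = null;

foreach ($rows as $r) {
    if ($count % $limit === 0) {
        // Close the previous sitemap file and start a new one.
        if ($fh) {
            fwrite($fh, "</urlset>\n");
            fclose($fh);
        }
        $file++;
        $fh = fopen("sitemap-$file.xml", 'w');
        fwrite($fh, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        fwrite($fh, "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n");
    }
    // Ampersands must be escaped as &amp; inside the XML <loc> element.
    $url = 'http://www.xyzsite.com/services.php?st=' . urlencode($r['state'])
         . '&amp;city=' . urlencode($r['city'])
         . '&amp;service=' . urlencode($r['service']);
    fwrite($fh, "  <url><loc>$url</loc></url>\n");
    $count++;
}
if ($fh) {
    fwrite($fh, "</urlset>\n");
    fclose($fh);
}

// Sitemap index pointing at the individual sitemap files.
$idx = fopen('sitemap-index.xml', 'w');
fwrite($idx, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
fwrite($idx, "<sitemapindex xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n");
for ($i = 1; $i <= $file; $i++) {
    fwrite($idx, "  <sitemap><loc>http://www.xyzsite.com/sitemap-$i.xml</loc></sitemap>\n");
}
fwrite($idx, "</sitemapindex>\n");
fclose($idx);
```

You would then submit sitemap-index.xml in Google Webmaster Tools rather than each file individually.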
I assume they are all linked to from somewhere, so they will eventually get indexed. If I understood you correctly, you are afraid that since all the new pages have the same structure and textual content (except for the relevant city name, state name, service type, etc.), you might get banned for duplicate content (or in this case, a whole lot of nearly identical pages).
I don't really get your point about getting pages indexed without your site getting banned — what's the connection? Many websites have tens of thousands of pages, with hundreds more added every day, and get indexed automatically with no problems. Just make sure each page has at least one static text link from an already indexed page.
The only problem I can see is that you don't seem to have original content on each page. If that is the case, then you would probably end up with most, if not all, of the pages pushed into the supplemental listings.
Well, without some kind of insane linking structure, Google won't care much about those pages anyway with only 1,500 backlinks. Make sure to create a sitemap to manage all those pages.