Hi, I recently got hold of some data (shop listings), and I am planning to create web pages from it, i.e. business type -> county -> town -> list of shops in town -> shop. However, the number of pages on my website will jump from under 100 to thousands practically overnight. Will this cause any problems with Google, i.e. will it consider these pages spam and penalize my site? Thanks
How can that be? Think about all those big sites like the BBC, CNN, etc. They add thousands of pages every day; will Google consider them spam?
I've heard lots of debate on this. To be safe, rather than uploading content that takes a site from small to 30,000 pages all at once, we upload everything but only link to a few things, which keeps the rest inaccessible to crawlers. Then we gradually activate the rest.
If you have a trusted site with hundreds of thousands of backlinks like the BBC and CNN, then your pages will be indexed quickly. You cannot have more pages indexed than your backlinks will support. How many deep links will you have? How many deep links do you think the BBC and CNN have?
What you really have to think about in this case is the authority of the BBC and CNN. That is what makes it possible for them to add thousands of pages in an instant and get them indexed. Most sites aren't given the same "authority", so it's likely to take a very long time to get all (or even most) of the new pages indexed, let alone ranking for anything.
Add the pages a few at a time, maybe a couple hundred every day. Besides, Google is not going to spider and index them all at once anyway.
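Just to illustrate the batching idea, here's a rough Python sketch of releasing a fixed number of pages per day. The file names and the 200-a-day batch size are made up; how you actually link the published pages into your site is up to you:

```python
# Rough sketch: each day, move a batch of URLs from a "pending" list
# to a "published" list (the published ones get linked on the site).
# File names and the 200/day batch size are just examples.

BATCH_SIZE = 200

def release_daily_batch(pending_path="pending_urls.txt",
                        published_path="published_urls.txt"):
    # Read whatever is still waiting to be released
    with open(pending_path) as f:
        pending = [line.strip() for line in f if line.strip()]

    batch, remaining = pending[:BATCH_SIZE], pending[BATCH_SIZE:]

    # Append today's batch to the published list
    with open(published_path, "a") as f:
        for url in batch:
            f.write(url + "\n")

    # Write back the URLs that are still unreleased
    with open(pending_path, "w") as f:
        for url in remaining:
            f.write(url + "\n")

    return batch
```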
Sorry, I do not understand what you mean by the last sentence. Also, I thought that maintaining an updated sitemap.xml would help new pages get indexed quickly.
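On the sitemap point: a sitemap.xml is just an XML file listing your URLs, so you can regenerate it each time you publish a batch. A minimal sketch in Python, with made-up example URLs:

```python
# Minimal sketch: write a sitemap.xml for a list of page URLs.
# The domain and paths are invented for illustration.
import xml.etree.ElementTree as ET

def write_sitemap(urls, out_path="sitemap.xml"):
    # Root element with the standard sitemap namespace
    urlset = ET.Element(
        "urlset",
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    ET.ElementTree(urlset).write(
        out_path, encoding="utf-8", xml_declaration=True)

urls = [
    "http://www.example.com/restaurants/kent/dover/",
    "http://www.example.com/restaurants/kent/dover/some-shop/",
]
write_sitemap(urls)
```

Keep in mind a sitemap helps Google discover your pages; it doesn't guarantee they get indexed.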
The number of pages that gets indexed is determined by the PR of your site and the number of inbound links. If the content is useful for your visitors, then by all means add it. It should generate some natural backlinks, and your users can navigate through the content to find what interests them.