How does Google react to new sites that are fairly large, with many pages using breadcrumb and prev/next page navigation? Would it be better to limit what's open at first and slowly add sections, or to open everything up at once and risk getting flagged as spam?
Build sites for people, not search engines. This will only make indexing better, as people will love the site. Don't limit your growth and production just to ensure Google is happy.
I agree with timallard; you need to build for your audience, not for the search engines. Once you build a quality site that attracts visitors, you can fine-tune and optimize it for the search engines. As for whether Google likes large sites or not: they love them, as long as the connected pages are related to each other and you fill them with original, quality content. The whole point of a search engine is to find the best site for a given niche, and large, well-informed sites will often rank high.
Yes, both of them are showing you the right path. In short, if you build it for people, then ultimately the engines will be happy (if everything is in place) and you will benefit. Also, technically there is no issue with large sites. Best wishes on your site launch.
I'm not sure I agree with the comments above. If you build a site just for people, well, that's not SEO; the whole idea is to optimize for the search engines, right? That's why we are here. So keep the people happy, but also keep the search engines happy. The two are not the same, unfortunately; if they were, we wouldn't be here agonizing over how to rank in the SERPs. Anyway, a large site is better than a small site: more unique content means more honey for the ants to find. But there is a technical problem. You might have trouble getting a very large site indexed if it is new; if you suddenly debut a site with a gazillion pages, Google might index only a tenth of them. There is a trick for getting more pages indexed, and that is to separate content off into various subdomains (e.g., reviews.example.com instead of www.example.com/reviews). Google seems to treat each subdomain as a separate site, so if you split your content across subdomains, you can get more content indexed than you would otherwise.
Do it in sections, definitely. I've launched a few quarter-million-page sites recently, and the indexing can be very slow without solid deep links.
Thanks for the feedback, guys. I have sections split into subdomains and think I will open up a new one every week or so. A Google sitemap should help it find the deeper pages, hopefully.
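One practical note on sitemaps for a site this size: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so a large site submits a sitemap index that points at several smaller sitemap files. Here's a minimal sketch of how that chunking might work; the domain, URLs, and file names are hypothetical placeholders, not anything from this thread:

```python
# Minimal sketch: split a large URL list into 50,000-URL sitemap files
# plus a sitemap index, per the sitemaps.org protocol.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # protocol limit per sitemap file

def write_sitemaps(urls, base="https://www.example.com"):
    """Write sitemap-1.xml, sitemap-2.xml, ... plus sitemap-index.xml."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write(f'<urlset xmlns="{SITEMAP_NS}">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
    # The index file is what you actually submit to Google.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<sitemapindex xmlns="{SITEMAP_NS}">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# Example: a quarter-million pages -> five sitemap files plus one index.
write_sitemaps([f"https://www.example.com/page/{i}" for i in range(250_000)])
```

Also worth knowing: the protocol only lets a sitemap list URLs from the host it is served from, so if your sections live on separate subdomains, each subdomain needs its own sitemap (and index) rather than one sitemap covering everything.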
If you are going to start a new and large website, you will need TIME and many BACKLINKS to get traffic from Google.
Google ranks each page individually, so don't worry about making large sites. I've got an 80-page site that competes with 10k-page sites...
Oh yes, I agree. But just don't get lost in the SEO; follow standard practices: use H1s and H2s, alt text on images, and so on. I ran a test on a few websites, one where I got lost in SEO and did optimization after optimization, and it performed about the same as a site where I paid no attention to it. The results were very surprising.
Good, good... it's good to have sitemaps. However, the oft-misunderstood and unfortunate thing about sitemaps is that they do not guarantee that all of your site will get indexed.