I see posts on forums where people talk about launching websites that have 10k, 20k, and 50k pages. How are webmasters building websites with 50k web pages?
Let me answer - it all depends on the site's traffic and content. If the content is static, the site can be high-traffic and the server will still handle requests without problems. But if you have a dynamic website and want it to handle high traffic, a lot of work is needed. The main tasks are:
1. analyzing the site structure
2. database structure optimization
3. site script optimization
4. using cache technologies (see the sketch below)
5. clustering, as a last step
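For point 4, here's a minimal caching sketch in Python. The render_page() function is just a stand-in for whatever actually builds the page from the database (it's not any particular framework's API); the cache simply avoids redoing that work on every request.

```python
import time

CACHE_TTL = 300  # seconds a rendered page stays valid
_cache = {}      # url -> (expires_at, html)

def render_page(url):
    # Placeholder for the expensive part: DB queries + template rendering.
    return "<html><body>Content for %s</body></html>" % url

def get_page(url):
    entry = _cache.get(url)
    if entry and entry[0] > time.time():
        return entry[1]                       # cache hit: skip DB and templates
    html = render_page(url)                   # cache miss: do the real work
    _cache[url] = (time.time() + CACHE_TTL, html)
    return html

if __name__ == "__main__":
    print(get_page("/city/chicago/plumbers"))   # rendered fresh
    print(get_page("/city/chicago/plumbers"))   # served from cache
```

In practice you'd put this in memcached or on disk rather than in a dict, but the principle is the same: the database only gets hit once per page per TTL, not once per visitor.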
That can be done very easily if you fetch your content from dmoz.org, affiliate data feeds or free article sites.
I think scraping is stealing someone else's content... The feeds I was talking about are made for re-use. For example: http://www.magportal.com/help/free_feeds/what_is.html
I have a 300,000-page site, but the pages are generated from content in a database. So I never really created 300,000 pages by hand, but they do exist. Hope this helps.
bosco, where are you getting 300k pages of content for your database? Are you pulling info from other sites and putting it into your database? Is it an MFA site that has the same content over and over again with only a few words changed on each of the 300k pages? Or is each page completely different?
http://www.freelance-designers.net I only have about 20 static pages, but my Google sitemap has 300,000 pages for this site. It's a new site, so there aren't many pages indexed right now, but they should all get indexed. There are so many pages because there is a page for each city/state in each category. Lots of cities.
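To give a sense of where a number like 300,000 comes from, it's just categories multiplied by cities. Here's a rough Python sketch - the category list, city list, and URL pattern are made up for illustration, not the actual site's:

```python
import itertools

categories = ["web-design", "logo-design", "seo"]
cities = [("IL", "Chicago"), ("TX", "Austin"), ("CA", "San-Diego")]

# One URL per (category, state, city) combination.
urls = ["http://example.com/%s/%s/%s.html" % (cat, state.lower(), city.lower())
        for cat, (state, city) in itertools.product(categories, cities)]

# With ~20 categories and ~15,000 US cities that's 300,000 URLs from one template.
for u in urls:
    print(u)
```

Worth noting: the sitemap protocol caps each file at 50,000 URLs, so a 300,000-URL sitemap is really a sitemap index pointing at several files.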
One template - and lots of database content. Personally, I can't see the benefit of having 50k pages unless they contain relevant content.
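To make the "one template, lots of database content" idea concrete, here's a minimal Python sketch using an in-memory SQLite table. The table and column names are invented for the example; the point is that every page is the same skeleton filled in with a different row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE listings (city TEXT, state TEXT, category TEXT)")
conn.executemany("INSERT INTO listings VALUES (?, ?, ?)",
                 [("Chicago", "IL", "Web Design"),
                  ("Austin", "TX", "Web Design")])

TEMPLATE = """<html>
  <head><title>{category} in {city}, {state}</title></head>
  <body><h1>{category} freelancers in {city}, {state}</h1></body>
</html>"""

# One template, rendered once per row - 300,000 rows means 300,000 pages,
# but only one file of HTML you actually wrote.
for city, state, category in conn.execute("SELECT city, state, category FROM listings"):
    print(TEMPLATE.format(city=city, state=state, category=category))
```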