I have a site that's about 6 months old with 10,500 pages indexed on Google out of a possible 2 million. Spiders crawl over 30,000 pages per day, every day (I've never seen the spider leave my site), but only a few hundred pages get added per day, if any. What should I be working on to get more pages indexed? It's pretty much impossible to get deep links since I have too many pages, so when I do get links, they go to the main page. What would you do other than sit and wait?
2 million pages!!!? Wow, that will take them a while to crawl. I suppose they're all linked together in some way. Maybe a sitemap, but it would take a long time for the server to build that many pages/links.
1. Create a valid XML sitemap, add the sitemap auto-discovery directive to your robots.txt, and submit the sitemap to Google.
2. Make sure you are properly interlinking your web pages internally; most pages that don't get indexed are buried in the website's internal linking structure.
3. Build backlinks to your internal web pages, aiming for pages that contain many links to even deeper pages. Hell, build backlinks to all your pages if you can.
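For step 1, here's a minimal sketch of what the pieces look like (example.com and the file name sitemap.xml are placeholders, not anything specific to your site). The robots.txt directive is a single line pointing crawlers at the sitemap:

```text
# robots.txt at the site root — the Sitemap line is the auto-discovery directive
Sitemap: http://www.example.com/sitemap.xml
```

And the sitemap itself is just a list of URLs in the sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-deep-page.html</loc>
  </url>
  <!-- ...one <url> entry per page you want discovered... -->
</urlset>
```

You can also submit the sitemap URL directly in Google Webmaster Tools instead of (or in addition to) the robots.txt line.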
That's a lot of pages, man... I would be happy if spiders crawled my site on a daily basis. 30,000 is not bad. Try using a sitemap; it helped my site, and I only have 600 pages.
Thanks everyone for the replies. As for sitemaps, I did have a few but pretty much got rid of them, because I don't have a problem getting Google to spider my site; every single page is found very easily from the home page. Getting spidered is not a problem; getting indexed is. It's a database site, and all pages have unique titles, meta descriptions, and meta keywords. I just don't get why Google elects to list some and not others. Building backlinks to internal pages would probably work best from what I've learned, but that will be a lot of links to build. Starting on that now.
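Since it's a database site, the sitemaps could be regenerated straight from the URL table rather than crawled. One wrinkle at this scale: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, so 2 million pages means roughly 40 sitemap files tied together by a sitemap index file. A rough sketch of the splitting logic (the function name, URL list, and file names are made up for illustration):

```python
# Sketch: split a large URL list into sitemap files of at most 50,000 URLs
# each, plus a sitemap index that lists them, per the sitemaps.org protocol.
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000  # protocol limit per sitemap file

def build_sitemaps(urls, base_url):
    """Return (index_xml, [(filename, sitemap_xml), ...])."""
    files = []
    # Chunk the URL list into groups of 50,000
    for i in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        chunk = urls[i:i + MAX_URLS_PER_SITEMAP]
        body = "".join(
            f"  <url><loc>{escape(u)}</loc></url>\n" for u in chunk
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}</urlset>\n"
        )
        files.append((f"sitemap-{i // MAX_URLS_PER_SITEMAP + 1}.xml", xml))
    # The index file just points at each sitemap file
    index_body = "".join(
        f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>\n"
        for name, _ in files
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_body}</sitemapindex>\n"
    )
    return index, files
```

Then only the index file gets submitted to Google, and a nightly job can rewrite the files as the database grows.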