I just do not understand Google. Our competitors' sites are getting tens or even hundreds of thousands of pages indexed, and we do not even have 1,000. For about a week we had up to 13,000 pages listed and our Alexa traffic rating shot up; then all but ~600 pages were deindexed. DP members said we should remove the long repeated title, but doing so has had no effect. Even though so few pages are listed, Google is still crawling thousands of pages on our server every day.

What can we do to get our page count back up? The query is http://www.google.com/search?hl=en&q=site:bytemycode.com. Our competitors get virtually all of their traffic from Google, so this is a major disadvantage for us. What are we doing that is turning Google off our site?
You're moaning about only having 1,000 pages listed? I only have my homepage listed, so be happy and have a little more patience.
Google has been a mess the past few months. I stopped worrying about it and focused on improving my sites for the long haul.
Yep, Google is seeing your pages as duplicates. Every page has the same meta description. Just take a look at what Google is displaying for your pages and you'll see what the problem is.
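For example, view the source of any two pages and compare the head sections. The wording below is invented for illustration, but the pattern is what matters: right now every page presumably serves something like

<meta name="description" content="ByteMyCode - share and browse code snippets">

when each page should describe its own content, e.g.

<meta name="description" content="A PHP snippet for validating email addresses with a regular expression">

Dave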
I don't think having the same meta description tag should matter, as long as the page titles and content are different. I don't think Google indexes the description tag anyway - they just use it for snippets (sometimes, depending on the search term).
They don't use it for ranking, but I'm not so sure they ignore it when filtering. There's so much JavaScript on the pages that there's not much left to index.
Dave
Start off by 301 redirecting the non-www version to the www version, or vice versa. Then you need unique meta descriptions for each page. While minstrel may be right in saying that they don't index them, I believe they will help you in this case. Also, your site is too new and has too few links to have 10,000 pages in the index; 10,000 pages is a huge amount.
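If you're on Apache, the redirect is a few lines in .htaccess. This is a generic sketch, assuming mod_rewrite is available and that you want the www version to win:

# send any request for the bare domain to the www host, keeping the path
RewriteEngine On
RewriteCond %{HTTP_HOST} ^bytemycode\.com$ [NC]
RewriteRule ^(.*)$ http://www.bytemycode.com/$1 [R=301,L]

That stops Google from splitting your pages and their link credit across two hostnames.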
It can't hurt, anyway. And when it comes down to it, who knows what the hell Google is doing these days?
I do have a sitemap generator that spiders the entire site and produces a new file every day. It is not a question of site age or links: one of our competitors, which has lower PR and only started in June, has 27k+ pages in Google. We had 13,000 for a couple of days and our traffic spiked. I would just like to see the pages stay in the index.
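For what it's worth, the generator is nothing exotic. Stripped down to a sketch (simplified Python for illustration, not our actual code; START is just the crawl entry point), it does the equivalent of this:

# Simplified sketch of a daily sitemap generator: breadth-first crawl of
# internal links, then write the collected URLs out as sitemap.xml.
# Python standard library only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from xml.sax.saxutils import escape

START = "http://www.bytemycode.com/"
HOST = urlparse(START).netloc

class LinkParser(HTMLParser):
    # collects href values from anchor tags
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, limit=5000):
    # return up to `limit` internal URLs reachable from `start`
    seen, queue = {start}, deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        parser = LinkParser()
        parser.feed(page)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == HOST and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)

def write_sitemap(urls, path="sitemap.xml"):
    # standard sitemaps.org XML format, one <url> entry per page
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write("  <url><loc>%s</loc></url>\n" % escape(url))
        f.write("</urlset>\n")

if __name__ == "__main__":
    write_sitemap(crawl(START))

The point is just that every indexable URL ends up in a fresh sitemap.xml each day, so missing pages are not a sitemap problem.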