Although Google's bots visit all our pages regularly, we cannot get more than 206 pages listed when we ask Google (via the site: command) for the already indexed pages. Would you please take a look at what the reason could be? www(dot)s-p-h-i-n-x-c-o-n-n-e-c-t(dot)fr (please remove all the "-"s). We provide a Google sitemap, a static sitemap, descriptive title tags and so on, but no success so far. PLEASE HELP! Thx in advance, D.
Any backlinks?

edit: Hm, PR3, 15 backlinks. Oh my, man, look at http://www.sphinxconnect.fr/robots.txt:

    # robots.txt for http://www.sphinxconnect.fr/
    # file created: 01.10.2004
    User-agent: *
    Disallow: /produktdb/
    Disallow: /global/
    Disallow: /reddiXL/
    Disallow: /index.php
    Disallow: /index_fr.php
    Host: www.sphinxconnect.fr

edit3: Make it just these two lines (an empty Disallow blocks nothing, so crawlers may fetch everything):

    User-agent: *
    Disallow:

edit4: Ahhhh... and on the pages themselves:

    <meta name="robots" content="noarchive">
    <meta name="googlebot" content="nocache">
    <meta name="siteinfo" content="/robots.txt">

That's the worst case scenario I have ever seen! Remove these three metas from your pages INSTANTLY!
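If it helps anyone checking their own site for the same two problems, here is a minimal Python sketch (not from the thread) that tests whether robots.txt lets Googlebot fetch a page and lists any robots/googlebot meta tags found on it. The URLs are the ones quoted above; swap in your own.

    # Quick audit for the two issues flagged in the reply:
    # 1) robots.txt rules that block the crawler
    # 2) restrictive <meta name="robots"/"googlebot"> tags on the page
    import urllib.request
    import urllib.robotparser
    from html.parser import HTMLParser

    PAGE = "http://www.sphinxconnect.fr/"            # page quoted in the thread
    ROBOTS = "http://www.sphinxconnect.fr/robots.txt"

    # 1) Can Googlebot fetch the page at all under the current robots.txt?
    rp = urllib.robotparser.RobotFileParser(ROBOTS)
    rp.read()
    print("Googlebot may fetch the page:", rp.can_fetch("Googlebot", PAGE))

    # 2) Report any robots-related meta tags in the page's HTML.
    class MetaRobotsFinder(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            attrs = dict(attrs)
            name = (attrs.get("name") or "").lower()
            if name in ("robots", "googlebot"):
                print(f'found <meta name="{name}" content="{attrs.get("content")}">')

    html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")
    MetaRobotsFinder().feed(html)

Both checks use only the standard library, so the script runs as-is; for a real crawl you would also want error handling for unreachable hosts.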