Hi, I got the following robots.txt file from a robots.txt generator:

# All robots will spider the domain
User-agent: *
Disallow:

# Disallow directory /admanager/
User-agent: *
Disallow: /admanager

# Disallow directory /wp-admin/
User-agent: *
Disallow: /wp-admin

# Disallow directory /wp-includes/
User-agent: *
Disallow: /wp-includes

# Disallow directory /mag/
User-agent: *
Disallow: /mag

Google has thousands of pages indexed, and so does MSN, but Yahoo falls behind with only 133 pages in the last week. Could it have something to do with this robots.txt?
Have a look here: ROBOT.TXT VALIDATION

Worth a try. I don't think you need User-agent: * each time though; some crawlers only obey the first matching User-agent record, so the later blocks may simply be ignored.

Jamie
Here is all you need; see the file below. It allows search robots to index the entire site except for the directories listed.
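Assuming you want to block the same four directories as in your original file, a single record should do it, something like:

# Allow everything except the listed directories
User-agent: *
Disallow: /admanager
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /mag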
I would recommend what "just-for-teens" said, too. Why do you need to make this robots.txt so long?