I am doing a fresh install of Drupal v5.7 and noticed that a robots.txt file is included with the install package. Below is a listing of all of its "Disallow" rules. I understand that some of them make sense, like /database/, /includes/ and /cron.php, but is there anything below that I should delete to allow search engines to crawl?

User-agent: *
Crawl-delay: 10

# Directories
Disallow: /database/
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
Disallow: /sites/
Disallow: /themes/
Disallow: /scripts/
Disallow: /updates/
Disallow: /profiles/

# Files
Disallow: /xmlrpc.php
Disallow: /cron.php
Disallow: /update.php
Disallow: /install.php
Disallow: /INSTALL.txt
Disallow: /INSTALL.mysql.txt
Disallow: /INSTALL.pgsql.txt
Disallow: /CHANGELOG.txt
Disallow: /MAINTAINERS.txt
Disallow: /LICENSE.txt
Disallow: /UPGRADE.txt

# Paths (clean URLs)
Disallow: /admin/
Disallow: /aggregator/
Disallow: /comment/reply/
Disallow: /contact/
Disallow: /logout/
Disallow: /node/add/
Disallow: /search/
Disallow: /user/register/
Disallow: /user/password/
Disallow: /user/login/

# Paths (no clean URLs)
Disallow: /?q=admin/
Disallow: /?q=aggregator/
Disallow: /?q=comment/reply/
Disallow: /?q=contact/
Disallow: /?q=logout/
Disallow: /?q=node/add/
Disallow: /?q=search/
Disallow: /?q=user/password/
Disallow: /?q=user/register/
Disallow: /?q=user/login/
No, those are all areas with no content, or at least no content that would be useful for your SEO. I have a Digg-clone thingie running under Drupal; it is very well indexed and gets more than 70% of its traffic from search engines. It has been online for more than two years, btw. So, don't sweat it.
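That said, if you ever decide one of those paths should be crawlable, the change is just an edit to robots.txt: delete the matching Disallow line. Here is a minimal sketch assuming, purely as a hypothetical example, that you wanted /misc/ crawled:

User-agent: *
Crawl-delay: 10

# Directories
Disallow: /database/
Disallow: /includes/
# The "Disallow: /misc/" line has simply been removed, so /misc/ is now crawlable
Disallow: /modules/
Disallow: /sites/
Disallow: /themes/
# (remaining stock Disallow rules continue unchanged)

Some crawlers, Googlebot for instance, also honor an explicit "Allow:" directive, but removing the Disallow line is understood by everything.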