Hi, I have it this way:

User-Agent: *
Allow: /

After reading a wiki about robots.txt, I think this is the way to allow all crawlers. Kiss
By default, all links and images will be crawled and indexed automatically, so there is no need to add an Allow directive. You can also point all search engines to your XML sitemap through robots.txt. Just put the following in robots.txt:

User-Agent: *
Disallow:
Sitemap: https://www.your-domain.com/sitemap.xml
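For what it's worth, the two snippets above generally behave the same: an empty Disallow is the classic allow-all rule from the original robots.txt convention, while Allow: / is a widely supported extension (Google and Bing honor it) that states the same thing explicitly. A minimal sketch of a combined file, assuming your-domain.com is just a placeholder for your real domain:

# Let every crawler fetch everything on the site
User-Agent: *
Disallow:

# Optional: the same rule stated explicitly via the Allow extension
Allow: /

# Point crawlers at the XML sitemap (should be an absolute URL)
Sitemap: https://www.your-domain.com/sitemap.xml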