I currently have the following robots.txt at the root level of my site: User-agent: * Allow: / Will this allow the indexing of all directories below the root? Is there anything else I should add? Also (somewhat related), how does sitemap.xml impact the indexing?
Yes, it will allow indexing, or in other words, it will permit search engines to visit your site. However, it is left to the crawlers to decide whether they should index it and what they should index. Robots.txt applies only to the so-called "good" robots; it is a voluntary convention, not an enforcement mechanism.
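To illustrate, here is a sketch of a robots.txt that permits everything except one directory (the /private/ path is just a hypothetical example, not something from your site):

```
# Applies to all crawlers
User-agent: *
# Block this one directory (hypothetical example)
Disallow: /private/
# Everything else is crawlable
Allow: /
```

Note that an empty `Disallow:` line, or no robots.txt at all, also means "crawl everything" — the `Allow: /` in your file is effectively the default behavior stated explicitly.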
Here is a link to a robots.txt generator that will help you with your customization: http://www.seochat.com/seo-tools/robots-generator/
Thanks! I've noticed that blogger.com has "deny all" in the automatically generated robots.txt. Is there any way around that?
Most people also like to specify the location of their sitemap file in robots.txt. By default, an empty file or a missing robots.txt will also result in your site getting indexed.
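As a sketch, the sitemap location is declared with a `Sitemap:` directive, which can appear anywhere in the file (the URL below is a placeholder, not your actual sitemap):

```
User-agent: *
Allow: /

# Placeholder URL -- replace with the full URL of your own sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The sitemap itself does not grant or deny crawling permission; it just helps crawlers discover your URLs and can hint at priority and update frequency.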
I would want Google to pick up my blogger.com pages, but the default is "DENY ALL". Is there a way to change that?