Consider the following robots.txt file:

User-agent: *
Disallow: /

Here * means all user agents of all search engines, but what does Disallow: / mean? If / means the home page, does this block the whole site?

And if I write:

Disallow: /anyfolder/

does that block only that folder (i.e. anyfolder in the example above), or the folder together with all of its subfolders?

If I don't want to block the whole site, only some folders, are the statements below correct?

Allow: /
Disallow: /anyfolders/

Please tell me whether I am on the right path with the above. One more thing: if I have a WordPress site, is there a different method for robots.txt, or does it work the same as for any other domain?
Yes, you are on the right path.

User-agent: *
Disallow: /

means that crawlers should not fetch any page, so nothing gets indexed.

Allow: /
Disallow: /anyfolders/

means everything may be crawled except the listed folders and everything under them.
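If you want to sanity-check rules like these before deploying them, Python's standard-library urllib.robotparser can evaluate a robots.txt against sample URLs. A minimal sketch (example.com and the paths are placeholders; note that Python's parser applies the first matching rule, so the Disallow line is listed before the blanket Allow):

```python
from urllib import robotparser

# Hypothetical robots.txt: block one folder, allow everything else.
# Disallow comes first because urllib.robotparser uses first-match,
# unlike Google's longest-match rule.
rules = """\
User-agent: *
Disallow: /anyfolders/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Pages outside the blocked folder are fetchable.
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
# Anything under /anyfolders/ is blocked, subfolders included.
print(rp.can_fetch("*", "https://example.com/anyfolders/x.html"))  # False
```

Real crawlers such as Googlebot pick the most specific matching rule, so for them the order of Allow and Disallow lines does not matter; the reordering here is only to keep the local check accurate.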
Disallow: / matches any URL path that begins with /, e.g. /index.html, /services.php, and so on, so nothing will be indexed (effectively the whole site is blocked). Interestingly, Google also supports some wildcard patterns (* and $) in robots.txt, though you should be very careful when using them, since they are not part of the original standard and not all crawlers honor them.
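The prefix-matching behavior described above is easy to verify locally. A small sketch with urllib.robotparser (example.com is a placeholder) showing that Disallow: / matches every path, because every path starts with /:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks the entire site.
rules = """\
User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Every URL path begins with /, so every URL is disallowed.
print(rp.can_fetch("*", "https://example.com/index.html"))    # False
print(rp.can_fetch("*", "https://example.com/services.php"))  # False
print(rp.can_fetch("*", "https://example.com/"))              # False
```

Note that urllib.robotparser does plain prefix matching and does not implement Google's * and $ wildcard extensions, so it cannot be used to test those patterns.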
Yes, you are right, very good! But I would suggest not using robots.txt until you reach the last step of optimization. Why use it without a specific reason, right?