If you want to verify the syntax of the file, you can do that in Google Webmaster Tools: https://www.google.com/webmasters/tools
Just check www.domain.com/robots.txt. If the site has a robots.txt at all, that's where it can be accessed.
I like the Firefox SEO plugin. It lets you check SEO data and robots.txt for any site you want. Pretty handy to me...
I just tried typing domain.tld/robots.txt to check for a robots file on my websites and got nothing. Is there something I am doing wrong? http://nopolicestate.net , http://nopolicestategirl.net
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. It works like this: a robot wants to visit a Web site URL, say domain.com/welcome.html. Before it does so, it first checks for domain.com/robots.txt. The file isn't there by default; you have to create it. Example:

User-agent: *
Disallow: /myfolder/

Hope this helps. Cheers
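To see how a well-behaved crawler applies those rules, here's a minimal sketch using Python's standard urllib.robotparser. The rules are the example above; domain.com is just a placeholder, and a real crawler would fetch the live file with set_url() and read() instead of parsing a string:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Parse the example rules directly (a real crawler would do
# rp.set_url("http://domain.com/robots.txt") followed by rp.read()).
rp.parse("""User-agent: *
Disallow: /myfolder/""".splitlines())

# The welcome page is allowed; anything under /myfolder/ is blocked.
print(rp.can_fetch("*", "http://domain.com/welcome.html"))    # True
print(rp.can_fetch("*", "http://domain.com/myfolder/a.html")) # False
```

Note that robots.txt is purely advisory: polite crawlers check it before fetching, but nothing forces a robot to obey it.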
You can just go to submitexpress.com and click on the meta tags analyzer. Type in your domain name and it will check all your meta tags and also check whether you have a robots.txt file on your site. If you do, it will check that file for errors as well.
I can find the robots.txt on both domains... I couldn't post a direct link to it because my account is too young...
It goes directly after your domain, so it would be yourdomain.com/robots.txt, and that should show you what is in it. If it shows up blank, then you have nothing there. You can change it by editing the file under public_html. I hope I helped.
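Since robots.txt always lives at the site root no matter which page you start from, you can build its URL programmatically. A small sketch with Python's standard library (example.com stands in for your own domain; the commented-out fetch does live network I/O, so it's left for you to try):

```python
from urllib.parse import urljoin
from urllib.request import urlopen

def robots_url(site: str) -> str:
    # urljoin with an absolute path drops any page path and
    # resolves against the site root.
    return urljoin(site, "/robots.txt")

print(robots_url("http://example.com/some/page.html"))
# -> http://example.com/robots.txt

# Uncomment to print the live file's contents (raises an HTTPError
# if the site has no robots.txt):
# print(urlopen(robots_url("http://example.com")).read().decode())
```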