I've just recently learned about robots.txt and I'm interested in what, at the most basic level, I should have in mine. Or maybe you could share what you have in yours?
What are you trying to achieve? Do you want to block content? Block certain spiders? At the very minimum, you can simply not have such a file and everything will be indexed. Study other people's robots.txt files to see what they've done. DP's, for example:

User-agent: *
Disallow: /tools/suggestion/?
Disallow: /search.php
Disallow: /go.php
Disallow: /ads/

The first line says the rules apply to all user agents (spiders). The next lines block the dynamic pages (but not the main tool page, before a keyword is submitted), and the last one blocks an entire folder.
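If you want to sanity-check a rule set like the one above before uploading it, Python's standard-library parser can tell you what a well-behaved spider would do. A small sketch (example.com is just a placeholder domain):

```python
# Check robots.txt rules locally with the stdlib parser.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /tools/suggestion/?
Disallow: /search.php
Disallow: /go.php
Disallow: /ads/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Any spider may fetch the home page...
print(rp.can_fetch("*", "http://example.com/"))             # True
# ...but not the blocked scripts or anything under /ads/.
print(rp.can_fetch("*", "http://example.com/search.php"))   # False
print(rp.can_fetch("*", "http://example.com/ads/banner1"))  # False
```

This only tests the rules themselves; it doesn't guarantee every crawler honors them, since robots.txt is advisory.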
Or if you want everything indexed without error messages appearing in your logs, use:

User-agent: *
Disallow:
I was wondering the same thing. I got some errors in my Google AdSense spider account, and in the instructions they advised me to create a robots.txt file, but I don't know how.
From the Google help page, under "What must I do?". Is this a good idea or what? I have a WP blog with GoDaddy hosting.
If you already have a robots.txt file, cut and paste the following into it:

User-agent: Mediapartners-Google*
Disallow:

If you don't, create a text file called robots.txt, cut and paste those two lines into it, and upload it to your public_html folder.
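If you'd rather not build the file by hand, a few lines of Python will write exactly that content for you; just upload the resulting file to your web root afterwards. A minimal sketch (it writes robots.txt into the current directory):

```python
# Write the robots.txt described in Google's AdSense help page.
lines = [
    "User-agent: Mediapartners-Google*",
    "Disallow:",
]
with open("robots.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

# Show what was written so you can verify before uploading.
print(open("robots.txt").read())
```

An empty Disallow means "block nothing", so this file lets Google's ad crawler see everything while making the 404 errors in your logs go away.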
Depending on your hosting service, that home directory may have a different name: public_html, home, www, wwwroot, or htdocs.