Hi there, I've just been looking at some of the 'Error pages' stats on my website, and in one week there were 96 page views of my robots.txt file, shown in the 'error pages' section. Does this mean something is wrong? Thanks for any help. Matt
Sorry, but how can I check? I've got Google Webmaster Tools set up etc., but how do I check that my robots.txt file is okay? Thanks
In the tools section, the link is something like this: https://www.google.com/webmasters/tools/robots?siteUrl=http://www.yoursite.com/&hl=en
I'm at the dashboard - where do I go now, though? If I click the website with the robots.txt problem, where do I go from there? Thanks for the help so far.
If you don't have a robots.txt file, that's OK. It's a file that search engines and other web bots look for when they visit a site, and it tells them which URLs they can and cannot crawl (such as disallowing admin pages). If your 'error pages' report shows robots.txt as a 404 error, that means bots are requesting the file but it isn't being found. You can create a blank robots.txt file if you like to avoid seeing those 404 errors.
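As a minimal sketch (assuming a typical setup where robots.txt sits in the root of your site, right next to your home page), a robots.txt that allows everything and just stops those 404s would only need two lines:

User-agent: *
Disallow:

An empty Disallow line means nothing is blocked, so all well-behaved bots can crawl the whole site.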
Yes, just create the robots.txt file in the root of your web directory. You can leave it blank, or start blocking certain search engines and/or web bots in it - see the example below. If you don't know how to do that, I'm sure you'll find more guidance in the forums here.
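For example (the bot name 'BadBot' and the /admin/ folder here are just placeholders, not anything from your actual site), a robots.txt that shuts out one particular bot entirely and keeps all bots out of an admin area might look like:

User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /admin/

Save it as robots.txt in the same folder as your home page, and bots that respect the robots standard will follow those rules when they next visit.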