Hi, what is a robots.txt? How do I know if my site has one? How do I create one? And what do I need to do to ensure it is created correctly? Thanks, Jimbob
A robots.txt file gives a set of rules (which you set) to the bots that visit your website, e.g. Googlebot, Yahoo's bot, and so on. If you never added the file to your server, then your site doesn't have one, unless you bought the site with one already in place. If it exists, it will be in your main directory, the same place where your site's index page lives. Google for "robots.txt" help/tutorials and you will find plenty of instructions.
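The quickest way to check whether your site already has one is to request it directly in a browser, since it always sits at the root of the domain (the domain below is just a placeholder):

    http://www.example.com/robots.txt

If the file is shown, your site has one; if you get a 404 "not found" page, it doesn't.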
A robots.txt is a plain-text file that tells search engine crawlers which pages of a website they should not crawl. If you have private files or folders on your website and you don't want them showing up in search engine result pages, you can exclude them with rules like:

    User-agent: *
    Disallow: /your-private-file-or-folder
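To put it all together, here is a minimal sketch of a complete robots.txt (the folder names and domain are just placeholders for illustration). Save it as a plain-text file named robots.txt and upload it to your site's root directory:

    # Rules for all bots
    User-agent: *
    Disallow: /private/
    Disallow: /admin/

    # Optional: point bots to your sitemap
    Sitemap: http://www.example.com/sitemap.xml

Keep in mind robots.txt is only a request: well-behaved bots follow it, but it does not actually protect the files, so anything truly sensitive should be kept behind a password instead.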