Hello, I am a beginner in SEO and I want to know the step-by-step process for adding a robots.txt file. How can we add it to the website? Please explain. Thank you in advance.
Google was nice enough to make a page just for this, and it will probably answer this better than anyone else could: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
Just create a file named robots.txt and upload it to your server via FileZilla or another FTP client. Hosting companies like HostGator generate these files automatically.
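If you are making the file yourself, the simplest possible contents, which let every crawler in, are just these two lines (the empty Disallow means nothing is off limits):

User-agent: *
Disallow:

Save it as plain text, name it exactly robots.txt, and upload it.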
Well, the Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention for telling cooperating web crawlers and other web robots to keep out of all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.

If a site owner wishes to give instructions to web robots, they must place a text file called robots.txt in the root of the web site hierarchy (e.g. www.example.com/robots.txt). This text file should contain the instructions in a specific format (see the examples below). Robots that choose to follow the instructions try to fetch this file and read the instructions before fetching any other file from the web site. If this file doesn't exist, web robots assume that the site owner wishes to provide no specific instructions.

A robots.txt file on a website functions as a request that specified robots ignore specified files or directories when crawling the site. This might be, for example, out of a preference for privacy from search engine results, a belief that the content of the selected directories might be misleading or irrelevant to the categorization of the site as a whole, or a desire that an application only operate on certain data. Note that links to pages listed in robots.txt can still appear in search results if they are linked to from a page that is crawled.

At its simplest, it is just two lines of code:

User-agent: *
Disallow:
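Putting those pieces together, a slightly fuller robots.txt might look like the sketch below. The /private/ and /tmp/ paths and the sitemap URL are only placeholders for whatever actually applies to your site:

User-agent: *
Disallow: /private/
Disallow: /tmp/

Sitemap: http://www.example.com/sitemap.xml

The Sitemap line is how the robots.txt convention and the Sitemaps inclusion standard mentioned above can work together in one file.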
Simply create a robots.txt and upload it to your server's public_html directory, and you are good to go.
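One thing beginners trip over: the file has to sit at the top level of public_html, not inside a subfolder, so that it resolves at the root of the domain. Roughly like this, with example.com standing in for your own domain:

public_html/
    index.html
    robots.txt    <- here, so it is reachable at http://www.example.com/robots.txt

After uploading, open that URL in your browser; if you can see the file's contents, crawlers can too.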
Just create a simple file with the name robots.txt and write the names of the robots you don't want to allow access to your website's contents. E.g. if I want to give full access to all robots, I will write in that file:

User-agent: *
Disallow:

And if I want to disallow something like wp-admin, I will write:

User-agent: *
Disallow: /wp-admin/

That's it.
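And if you want to lock out one particular crawler by name instead of using the * wildcard, give it its own block. BadBot here is a made-up user-agent name standing in for whatever robot you want to exclude:

User-agent: BadBot
Disallow: /

Disallow: / bars that one robot from the entire site, while all other robots are unaffected by this block.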