I just want to know how to create a robots.txt file for my site. I have no idea how to do this. What is the first step?
Step 1. Generate a sitemap using Google Webmaster Tools.
Step 2. Visit the tool at http://www.mcanerin.com/EN/search-engine/robots-txt.asp and have it point to your sitemap.xml.gz file.

Here is a sample output it provided for my site:

    # robots.txt generated at http://www.mcanerin.com
    User-agent: *
    Disallow:
    Disallow: /cgi-bin/
    Sitemap: http://www.icsql.com/google_sitemap.xml.gz
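If you want to check that the generated file actually behaves the way you expect before uploading it, Python's standard urllib.robotparser module can read it and answer "is this URL crawlable?" questions. This is just a sketch; example.com and the test paths are placeholders, not part of the answer above.

    import urllib.robotparser

    # Point the parser at the live robots.txt (example.com is a placeholder domain).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()

    # With rules like the sample above, the home page is allowed for all crawlers...
    print(rp.can_fetch("*", "http://example.com/"))            # True
    # ...while anything under /cgi-bin/ is disallowed.
    print(rp.can_fetch("*", "http://example.com/cgi-bin/test")) # False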
A robots.txt file tells search engine crawlers which parts of your website they are allowed to crawl (well-behaved bots respect it, so it effectively controls what ends up in their index). If your website has private files or folders that you don't want to appear in search engine result pages, you can exclude them with rules like:

    User-agent: *
    Disallow: /your-private-file-or-folder
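For example, a complete robots.txt that blocks a couple of folders while leaving the rest of the site crawlable could look like this (the folder names /private/ and /admin/ and the sitemap URL are placeholders for your own):

    User-agent: *
    Disallow: /private/
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml

Save it as robots.txt in the root of your site so it is reachable at http://www.example.com/robots.txt.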