The structure of a robots.txt file is pretty simple (and barely flexible) – it is an endless list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:
Disallow:

"User-agent" names the search engine crawler a rule applies to, and "Disallow" lists the files and directories to be excluded from indexing. In addition to "User-agent:" and "Disallow:" entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/
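
A single file can also hold several records, one per user agent, separated by blank lines; a crawler follows the record that names it and falls back to the * record otherwise. Here is a minimal sketch of such a file – "BadBot" and the directory names /cgi-bin/ and /private/ are just placeholders, not real crawler or path names:

# Block one specific (hypothetical) crawler from the whole site.
User-agent: BadBot
Disallow: /

# All other crawlers may index everything except these two
# (placeholder) directories.
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/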