The robots.txt file has a few simple parameters that manage bots:

User-agent: defines which bots the directives that follow apply to. The wildcard * means all bots; a specific name such as Googlebot targets only Google's crawler.

Disallow: defines which folders or files are excluded from crawling. An empty value means nothing is excluded; / alone means everything is excluded. A path such as /foldername/ or /filename can be given to exclude specific content.

Folder name between slashes: Disallow: /foldername/ excludes everything inside the foldername folder (for example foldername/default.html).

Folder name with one slash: Disallow: /foldername excludes every path that starts with /foldername, both the folder's contents and files such as /foldername.html.
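For illustration, here is a minimal robots.txt sketch combining these parameters; the folder and file names (/private/, /temp.html) are hypothetical:

    # Applies to all bots: block the /private/ folder and one file
    User-agent: *
    Disallow: /private/
    Disallow: /temp.html

    # Applies only to Googlebot: an empty Disallow excludes nothing
    User-agent: Googlebot
    Disallow:

Note that a more specific User-agent group (like the Googlebot one above) replaces the * group for that bot rather than adding to it.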
Hi, I have a doubt regarding robots.txt: can we allow search engines to crawl member or user information? Is that good from the SEO side, or can we simply exclude the member page, user registration page, and password recovery page? Thank you.
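For example, would a rule set like the following sketch be the right approach? (The paths /members/, /register.html, and /password-recovery.html below are hypothetical placeholders for the actual URLs.)

    # Hypothetical paths: keep member and account pages out of search results
    User-agent: *
    Disallow: /members/
    Disallow: /register.html
    Disallow: /password-recovery.html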