Very important — it keeps crawlers from spending time on pages you don't want indexed. That way they don't waste crawl time looking at (and indexing) stuff you don't need them to, like the registration page, the new topic / thread page, and so forth.
I thought I'd share my robots.txt for a phpBB2 forum:

User-agent: *
Disallow: /phpBB2/admin/
Disallow: /phpBB2/faq.php
Disallow: /phpBB2/groupcp.php
Disallow: /phpBB2/login.php
Disallow: /phpBB2/memberlist.php
Disallow: /phpBB2/modcp.php
Disallow: /phpBB2/posting.php
Disallow: /phpBB2/privmsg.php
Disallow: /phpBB2/profile.php
Disallow: /phpBB2/search.php
Disallow: /phpBB2/viewonline.php
Yeah... you can always keep them from crawling the pages that aren't necessary. But you can also leave it as is — the size of those files is just a few MB, nothing a big deal for the crawlers... Peace!
Yes, it is very important so that the search bots are properly guided on what and what not to index.
If you have a phpBB forum with SEO mods installed, it is very important to have a robots.txt, because Google does not like duplicate content. If you have IPB, you have to disable the Lo-Fi version with robots.txt, because it's still duplicate content. I'm not sure about vBulletin — never used it.
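For anyone on IPB wondering what that looks like: on the older 2.x boards the Lo-Fi skin typically lived under a /lofiversion/ path, so blocking it would be something like the sketch below — the exact path is an assumption, so check your own board's Lo-Fi URLs before copying it:

```
User-agent: *
# Assumed Lo-Fi path on IPB 2.x — verify against your board's URLs
Disallow: /lofiversion/
```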
There are discussions about this issue at simplemachines.org/community. You'd hear it straight from the horse's mouth.
Ideal, so long as members' forums are installed in a /phpBB2/ subdirectory. I have my forum installed in / and, by clever use of the .htaccess file, I ensure that the home page is the portal and not index.php. I use the excellent phpbbSEO, which ensures I have zero dupes and the best free SEO for phpBB2 (and now 3) forums that one can possibly have.
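For anyone wanting to do the same, one common way to make the portal load as the home page instead of index.php is a DirectoryIndex line in .htaccess. This is a sketch, not the poster's actual file — the filename portal.php is an assumption that depends on which portal mod you run:

```
# Serve the portal page first if present, fall back to the forum index
# (portal.php is a hypothetical name — use your portal mod's actual file)
DirectoryIndex portal.php index.php
```

This needs the host to allow `Indexes` overrides in .htaccess; otherwise the directive is ignored.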