Google only indexes a limited number of pages from each site every day, depending on the strength of the site. A robots.txt will be useful for a new site so the important pages get indexed first. You can also use this if you don't want to prevent spiders from reaching anything on your pages:

# Allow all
User-agent: *
Disallow:
You're right, skionxb. If you really want to disallow something, then it matters; otherwise I don't think it's really important to include a robots.txt.
Yes, you can use a robots.txt for .php pages if you want to - to keep bots out of places that you don't want to appear in SE results. If your forum is in a folder called forum, you would use a robots.txt to exclude bots from pages like avatar.php and faq.php as follows:

User-agent: *
Disallow: /forum/avatar.php
Disallow: /forum/faq.php
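As a side note, those Disallow lines match individual files. If you wanted to keep bots out of the whole forum instead of just a couple of scripts (using the same hypothetical /forum/ layout from the example above), a single directory prefix rule would do it:

User-agent: *
# Blocks everything under /forum/ (Disallow is a prefix match)
Disallow: /forum/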
I don't know if anyone has posted it yet, but there is some nice information here: http://www.thesitewizard.com/archive/robotstxt.shtml