What is the use of a robots.txt file? A robots.txt file tells crawlers such as Googlebot which pages of your site they should not crawl, for example unused or low-value pages. By default, if there is no robots.txt file, all pages are allowed to be crawled. If you don't want a crawler to visit a page, you can add a Disallow rule for it in robots.txt.
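As a minimal sketch of what that looks like (the /temp/ path here is just a hypothetical example, swap in whatever directory you actually want to block):

---- robots.txt ----
# Applies to all crawlers
User-Agent: *
# Block crawling of the (hypothetical) /temp/ directory; everything else stays crawlable
Disallow: /temp/

Note that robots.txt controls crawling, not indexing. A blocked page can still show up in search results if other sites link to it.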
I recommend always using a robots.txt file, as Google itself recommends. For example, did you know you can list your sitemap in robots.txt as a directive? Take a look at this example:

---- robots.txt ----
User-Agent: *
Allow: /
Sitemap: http://www.yourdomain.com/sitemap.xml

Hope it helps.