You use robots.txt to tell search engine spiders which sections or types of content you'd like indexed and which you'd like them to ignore. These commands can also be issued manually on each page of your site with HTML meta tags, but that makes the HTML source code of your pages longer. The problem with longer source code is that some SEOs believe the search engines favor lean, efficient code and penalize sites with an unfavorable code-to-content ratio.
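For instance, the per-page version of these commands is the standard robots meta tag, placed in each page's head section:

    <meta name="robots" content="noindex, nofollow">

Repeating a line like this on every page is exactly the kind of extra weight a single robots.txt file avoids.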
Robots.txt lets you tell the spiders, from a single file at the root of your site, what should be indexed and what shouldn't be.
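A minimal robots.txt might look like this (the directory names here are just placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

The User-agent: * line applies the rules to all spiders, and each Disallow line names a path you'd like them to skip; anything not disallowed remains open to crawling.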