I honestly don't use one unless I have something special to tell Google. I don't want to block any directory or other content, so I skip it. Google does recommend it, though, in a way that almost sounds compulsory: "Make use of the robots.txt file on your web server". What is your opinion? Are there any downsides to not having one?
The upside of using a robots.txt file is that it's yet another document Google can consult when deciding what to index and what not to index on your site. Sometimes there are duplicate URL issues (strictly speaking, it's not the URLs that are duplicated but the content on them, but you get the idea). To an extent this problem is solved with an XML sitemap and/or canonical tags, but a robots.txt file can serve as extra confirmation. For me it's never a hassle, since the on-page optimisation audit tool I use comes with a robots.txt generator and editor.
Yes, you can use robots.txt as a formality, because Google prefers sites to have one. In the robots.txt file you can simply write: User-agent: * Allow: / There's no need to block any URL or anything else.
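To spell out the "allow everything" file described above: it goes at the root of the site (e.g. example.com/robots.txt) and would look like this:

```
User-agent: *
Allow: /
```

Note that `Allow` is an extension to the original robots exclusion protocol (supported by Google and most major crawlers); an equivalent form that all crawlers understand is an empty `Disallow:` line, which also blocks nothing.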
Yes! I also use it just as a formality, nothing more, because I don't have any URLs to block on my website.
So you are saying I should just use a "blank" one? On another page, Google says "no", it's not compulsory. But the quote in my first post makes it sound like they do recommend one. Ah well, I guess it's no major fuss either way...
Google just reminds you if you are missing something (a robots.txt file, etc.). It's not compulsory, and you can omit it entirely; some of my sites don't have one at all. If you are in doubt, create one (as others suggested) and don't block anything.
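If you do create a permissive robots.txt as suggested, you can sanity-check that it doesn't accidentally block crawlers. A quick sketch using Python's standard `urllib.robotparser`, parsing the file contents locally (no live site needed; the paths are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The permissive robots.txt discussed in this thread
robots_txt = """User-agent: *
Allow: /
"""

parser = RobotFileParser()
# parse() takes an iterable of lines, so we can check the
# rules locally instead of fetching from a live server
parser.parse(robots_txt.splitlines())

# Any agent should be allowed to fetch any path
print(parser.can_fetch("Googlebot", "/any/page.html"))  # True
print(parser.can_fetch("*", "/"))  # True
```

If either call printed `False`, you'd know a rule was blocking something unintentionally.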