My robots.txt file goes like this:

User-agent: *
Disallow:

Is that okay? Does it block anything?
That tells all bots that nothing is blocked. It is equivalent to not having a robots.txt file at all.
If you really want to disallow everything, use:

User-agent: *
Disallow: /

That blocks every page on the site from all search engine crawlers. If you want to disallow only a particular page, put its path in place of the bare /.
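For example, a sketch of a robots.txt that blocks one page and one directory while leaving the rest of the site crawlable (the paths /private.html and /admin/ here are hypothetical, just stand-ins for whatever you want to hide):

```
User-agent: *
Disallow: /private.html
Disallow: /admin/
```

Each Disallow line matches URLs that start with that path, so Disallow: /admin/ covers everything under that directory. Keep in mind robots.txt is only a request to well-behaved crawlers, not access control.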