It disallows access (to whatever directories/pages are listed after "Disallow:") to any bot/crawler that obeys the robots.txt file. For example, I could instruct all bots not to crawl the /forum directory on my site, or block just Google's crawler (Googlebot) from it. Read a lot more here: http://sitemaps.blogspot.com/2006/02/using-robotstxt-file.html
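The two rules just described would look like this in robots.txt (standard syntax; /forum is the example directory from above, and the two rules are alternatives, not one file):

```
# Block all crawlers from the /forum directory
User-agent: *
Disallow: /forum/

# Block only Google's crawler from the /forum directory
User-agent: Googlebot
Disallow: /forum/
```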
A bare "Disallow:" with nothing after it blocks nothing, and all the pages will be crawled, because one thing is missing: the forward slash (/). Had there been a forward slash after "Disallow:", no pages would have been crawled.
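A minimal sketch of that difference (two alternative robots.txt files, per the standard syntax):

```
# Empty Disallow: nothing is blocked, everything is crawled
User-agent: *
Disallow:

# Disallow with a slash: the entire site is blocked
User-agent: *
Disallow: /
```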