If a search engine spider doesn't find a robots.txt file, it assumes it can spider the whole site. First of all, it is simply good practice: the robot is going to look for the file anyway, and it only takes a minute to create, so you may as well do it. Secondly, if the robot finds a blank robots.txt file, it knows you have nothing to hide on your site. The spider knows it has your explicit permission to look at and index every page on your site, because you have left it that message with the robots.txt file. The spider knows you are not trying to hide or cloak any pages, and in our experience certain robots treat that as a good thing.
Don't really understand your question... Try searching Google and you will come up with a definition of the robots.txt file.
There is no such thing as a "blank" robots.txt. A robots.txt file lets you specify which pages should not be crawled. Keep in mind that pages that don't get crawled can still rank for keywords and show up in search results.
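To make this concrete, here is a minimal sketch of the two cases discussed above: a permissive robots.txt that allows everything, and one that blocks a directory (the /private/ path is just an example, not from the thread):

```
# Permissive robots.txt: all user agents may crawl everything
User-agent: *
Disallow:

# --- alternatively, blocking a directory (hypothetical /private/ path) ---
# User-agent: *
# Disallow: /private/
```

An empty Disallow line means nothing is blocked, which is the "blank" permissive file described earlier in the thread; a Disallow with a path tells compliant crawlers to skip that part of the site, though the URL itself may still appear in search results.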