We use a robots.txt file so that Google knows certain pages need not be crawled. But I don't understand: do we really need Google not to crawl our pages? I mean, everyone wants to be crawled, so what type of pages would anyone list in robots.txt? Can anybody help me with this?
Well, I use robots.txt to keep Google out of the places where I don't want it crawling. For example, on a forum you might not want it to keep crawling admincp.php, as that's just a waste of bandwidth. You can use it to allow or block any of the bots for different areas.
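To illustrate, a minimal robots.txt for that kind of forum setup might look like this (the paths here are just examples, adjust them to your own site):

```
# Block every crawler from the admin script and private areas
User-agent: *
Disallow: /admincp.php
Disallow: /private/

# Rules for a specific bot override the * group for that bot;
# an empty Disallow means Googlebot may crawl everything
User-agent: Googlebot
Disallow:
```

The file goes in your site root (e.g. example.com/robots.txt). Keep in mind it's only a polite request: well-behaved bots like Googlebot honour it, but it's not access control, so don't rely on it to hide sensitive pages.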