Suppose I want all robots to access and crawl every web page of my site. Do I still need to upload a robots.txt? And if I need to disallow some particular pages, what would I need to do?
You need to upload a robots.txt file to your site's root via FTP and then disallow the pages you don't want Google to index. Let me know if you need any more help.
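For example, a robots.txt in the site root could look like the lines below (the /private/ and /old-page.html paths are just placeholders, swap in your own URLs). Everything else stays crawlable because only the listed paths are disallowed:

User-agent: *
Disallow: /private/
Disallow: /old-page.html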
If you want to restrict some of your web pages from being indexed by Googlebot, you can list them in robots.txt with a Disallow: /path rule for each one.
Upload a robots.txt and Google will crawl all the pages of the website except the ones disallowed in the file.
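And if you really do want every robot to crawl everything, you can either skip robots.txt entirely or upload one with an empty Disallow line, which permits all crawling:

User-agent: *
Disallow: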