Use of robots.txt... need help?

Discussion in 'Search Engine Optimization' started by mathay, Sep 11, 2010.

  1. #1
    Suppose I want all robots to access or crawl every web page of my site — do I need to upload a robots.txt at all?

    And if I need to disallow some particular pages, what would I need to do?
     
    mathay, Sep 11, 2010 IP
  2. 3glogic

    #2
    You need to upload a robots.txt to your site's root via FTP and then disallow all the pages you don't want Google to index. Let me know if you need any more help.
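    For example, a minimal robots.txt blocking a couple of pages from all crawlers might look like this (the paths here are just placeholder examples — use your own):

    ```
    User-agent: *
    Disallow: /private/
    Disallow: /checkout.html
    ```

    The file must be named exactly robots.txt (not robot.txt) and must sit at the root of the domain, e.g. example.com/robots.txt.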
     
    3glogic, Sep 11, 2010 IP
  3. newlogo

    #3
    If you want to restrict some of your web pages from being indexed by the Google bot, you can list them in robots.txt with a Disallow directive.
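    The path after Disallow: decides the scope — "/" on its own would block the whole site, while a more specific path blocks just that section. A sketch, with a hypothetical folder name:

    ```
    User-agent: *
    # "Disallow: /" would block the entire site;
    # a path prefix blocks only that section:
    Disallow: /secret-folder/
    ```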
     
    newlogo, Sep 11, 2010 IP
  4. indiamart

    #4
    Upload a robots.txt, and Google will crawl all the pages of the website except the ones disallowed in the file.
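    To answer the first part of the original question: a robots.txt that allows every robot to crawl everything looks like this (an empty Disallow value means nothing is blocked, which is also the effective default when no robots.txt exists):

    ```
    User-agent: *
    Disallow:
    ```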
     
    indiamart, Sep 12, 2010 IP