Robots.txt for WordPress SEO

Discussion in 'WordPress' started by @SHFAQ, Nov 26, 2007.

  1. #1
    Recently I have read on many blogs that robots.txt is very important for WordPress: if search engines find duplicate content on our blog, they may penalize it, and the blog becomes a nightmare for us.
    The suggested solution was to use robots.txt to stop search engines from finding duplicate content, and also to tell them not to crawl some folders like "wp-admin". I didn't understand how to set up robots.txt for my blog, as I am using %postname% as the permalink, so please help me make a robots.txt. Which commands should I include in it?
     
    @SHFAQ, Nov 26, 2007 IP
  2. YoungMaster

    #2
    Just open up a text editor and create a plain text file named robots.txt. Then upload it so it's in the root directory of your blog, like this: www.example.com/robots.txt. You will need to add what you want to disallow to the text file.
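    For example, a minimal starting point for a typical WordPress install might look like the following. Treat this as a sketch rather than a ready-made file; the right paths depend on your theme, plugins, and permalink setup:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /trackback/
    Disallow: /feed/

    Anything not matched by a Disallow line stays crawlable, so your %postname% permalinks are unaffected.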
     
    YoungMaster, Nov 26, 2007 IP
  3. @SHFAQ

    #3
    Thanks for the reply, but I don't know what to put in the robots.txt file for my WordPress blog.
    I am using the %postname% OR %postname%.html permalink, and I don't want search engines to get duplicate content from my blog, so please tell me what I should put in robots.txt.
     
    @SHFAQ, Nov 27, 2007 IP
  4. newzone

    #4
    In the root of your server. Just upload it using an FTP client.
     
    newzone, Nov 27, 2007 IP
  5. @SHFAQ

    #5
    I know how to put robots.txt in the root of my blog's server, but I don't know what commands to put in it to stop search engines from getting duplicate content from my blog. Also, please remember I am using the %postname% permalink.
     
    @SHFAQ, Nov 27, 2007 IP
  6. lateuk

    #6
    lateuk, Nov 28, 2007 IP
  7. CypherHackz

    #7
    CypherHackz, Nov 28, 2007 IP
  8. bbrian017

    #8
    Overall, what's the point of a robots.txt? I mean, how does it help your blog?
     
    bbrian017, Nov 28, 2007 IP
  9. lateuk

    #9
    It stops search engines from listing useless pages like /wp-login.php. It can also be used to stop duplicate content, which affects your rank.

    Late
     
    lateuk, Nov 30, 2007 IP
    bbrian017 likes this.
  10. live-cms_com

    #10
    I use -

    User-agent: *
    Disallow: /wp-
    Allow: /wp-content/uploads/
    Disallow: /trackback/
    Disallow: /comments/
    Disallow: /page/
    Disallow: /date/
    Disallow: /2007/
    Disallow: /2008/
    Disallow: /2009/
    Disallow: /cgi-bin/
    Disallow: /category/*/*
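
    If you want to sanity-check rules like these before uploading, a quick sketch using Python's standard-library urllib.robotparser is below. Two caveats: this parser matches rules in file order (first match wins) and ignores * wildcards in paths, unlike major crawlers, which use longest-match precedence. So the snippet uses a simplified subset of the file above, with the Allow line placed first; the example.com URLs are placeholders for your own blog.

    ```python
    # Sketch: sanity-check simplified robots.txt rules with urllib.robotparser.
    # Caveats: Python's parser applies rules in file order (first match wins)
    # and does not support '*' wildcards, so the Allow line must come before
    # the broader Disallow line for uploads to stay crawlable here.
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-agent: *
    Allow: /wp-content/uploads/
    Disallow: /wp-
    Disallow: /trackback/
    Disallow: /cgi-bin/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Admin/login pages are blocked, uploads stay crawlable,
    # and ordinary %postname% permalinks are unaffected.
    print(rp.can_fetch("*", "http://www.example.com/wp-login.php"))              # False
    print(rp.can_fetch("*", "http://www.example.com/wp-content/uploads/a.jpg"))  # True
    print(rp.can_fetch("*", "http://www.example.com/my-post-name/"))             # True
    ```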
     
    live-cms_com, Nov 30, 2007 IP
  11. bbrian017

    #11
    Is this file preset, and if so, what would be some recommended changes?
     
    bbrian017, Nov 30, 2007 IP
  12. lateuk

    #12
    Have a look at the blog link earlier in the thread; that helped me.
     
    lateuk, Nov 30, 2007 IP
  13. Pixelrage

    #13
    There's an equally important one for vBulletin that blocks all of the duplicate pages (print view, archives, etc.).
     
    Pixelrage, Nov 30, 2007 IP
  14. Dan Schulz

    #14
    Dan Schulz, Nov 30, 2007 IP