Will Robots.txt File Restrictions Affect PageRank?

Discussion in 'HTML & Website Design' started by Vozzek, Jan 30, 2009.

  1. #1
    Hi all,

    I'm currently building a site for a friend. Whenever I build a site, I normally use the robots.txt file to block the search engines at the root level while the site is under construction... then I lift it when it's finished (or nearly finished).
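
    For reference, the root-level block I'm talking about is just a minimal robots.txt like this (the wildcard user-agent plus a bare "/" disallows the whole site to well-behaved crawlers):

    ```
    # robots.txt at the site root (e.g. example.com/robots.txt)
    # Blocks all compliant crawlers from every page while under construction
    User-agent: *
    Disallow: /
    ```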

    My friend's domain has a pre-existing PR3 (I don't know why). So my question is, while I build the site should I still use the robots.txt to restrict access? Or will that harm the search engine's current opinion of the site?

    Thanks in advance!
     
    Vozzek, Jan 30, 2009 IP
  2. mmerlinn

    mmerlinn Prominent Member

    Messages:
    3,197
    Likes Received:
    819
    Best Answers:
    7
    Trophy Points:
    320
    #2
    If you are building a site it should not be available online. No one likes to visit pages "under construction" or pages that look horrible or pages that do nothing for the viewer. People visiting pages like that leave and don't come back.

    Once pages are available online, you should not use restrictions on pages that you want indexed. Google typically indexes on a 3-6 month cycle so if there are restrictions on available pages, it may delay indexing for months.
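
    To lift the restriction at launch, either delete robots.txt entirely or replace it with an allow-everything version. Per the robots exclusion convention, an empty Disallow value blocks nothing:

    ```
    # robots.txt after launch: empty Disallow means nothing is blocked
    User-agent: *
    Disallow:
    ```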
     
    mmerlinn, Jan 30, 2009 IP
  3. crivion

    crivion Notable Member

    Messages:
    1,669
    Likes Received:
    45
    Best Answers:
    0
    Trophy Points:
    210
    Digital Goods:
    3
    #3
    Don't use robots.txt to restrict access.
    It would be harmful to have to wait for the crawlers to come back and pick up the new rules, if they come back at all.
     
    crivion, Jan 30, 2009 IP