Will robots.txt file affect PR

Discussion in 'Search Engine Optimization' started by gldean, Mar 10, 2009.

  1. #1
    Is it possible that adding robots.txt rules to block pages could lower your PR? I have added a few to one of my sites and am wondering if it may have had an effect. Anyone ever have this happen?
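
    For context, the blocking I'm talking about looks something like this (the paths here are just examples, not my actual URLs):

    ```
    # Hypothetical robots.txt - example paths only
    User-agent: *
    Disallow: /private-section/
    Disallow: /old-archive/
    ```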

    Thanks,
    GD
     
    gldean, Mar 10, 2009 IP
  2. shailendra

    #2
    The pages you have blocked in robots.txt will definitely be affected as far as PR is concerned. PR reflects the trust that Google attaches to a web page. If Google can't crawl the page, how is it going to allocate PR to it?
     
    shailendra, Mar 10, 2009 IP
  3. flnazrael

    #3
    Just because a page is blocked from crawling doesn't mean it won't be assigned PR - if other sites are linking to it, it can still accumulate PR.

    PR also has nothing to do with trust; it's a link popularity measure only.
     
    flnazrael, Mar 10, 2009 IP
  4. catanich

    #4
    Adding a Disallow rule to your robots.txt file will not affect the PR calculation for now. It may down the road.

    The only way would be to use Webmaster Tools' URL Removal Tool to remove the pages' cache. This would remove them from the PR calculations.
     
    catanich, Mar 10, 2009 IP