robots.txt is ok but bot still restricts urls

Discussion in 'Google Sitemaps' started by Mares, Mar 12, 2009.

  1. #1
    Hi,

    I have a problem in google webmasters tool.

When I set up my site I didn't make a robots.txt file. After some time, I saw URLs restricted by robots (in Google Webmaster Tools), so I decided to create a robots.txt file.

Even though these restricted URLs are not harming my site (they are just hidden affiliate .php links), I just want to keep my site clear of errors.

Ok, so I built the robots.txt file and uploaded it. After some time I saw the number of errors start to drop, but just when I expected it to reach zero, new errors (restricted URLs) started to appear.

    My robots.txt file is built properly with no mistakes.
    It goes like this:

    User-Agent: *
    Disallow: /example1.php
    Disallow: /example2.php

Can someone explain to me why Googlebot still restricts URLs that are specified in the robots.txt file, and how to fix this?

    Tnx...
     
    Mares, Mar 12, 2009 IP
  2. buldozerceto

    buldozerceto Active Member

    Messages:
    1,137
    Likes Received:
    43
    Best Answers:
    0
    Trophy Points:
    88
    #2
If you don't want Google to restrict your URLs, remove the robots.txt.
     
    buldozerceto, Mar 12, 2009 IP
  3. Mares

    Mares Peon

    Messages:
    58
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
No, you didn't understand.

These files are not important for my site, they are just affiliate links, but I want my statistics in Google Webmaster Tools to be without any errors and restrictions... so I decided to put all these affiliate links into the robots.txt file and make them disallowed to robots...

but Googlebot still restricts these files even though they are specified in the robots.txt file...

    how to make this thing work?
     
    Mares, Mar 12, 2009 IP
  4. 4chp

    4chp Peon

    Messages:
    163
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    0
    #4
I'm pretty sure G! will always report restricted URLs in the scenario you have in place.

You could create a redirect, something along the lines of

    http://www.mysite.com/visit/affiliatesite/

which would redirect to your affiliate site instead of the way you have it atm.

Create a folder called "visit" or something, then create a folder inside that folder for each affiliate, like "affiliatesite". Inside that folder, create an index.php file with the following code

    <?php
    // Permanently redirect this folder to the affiliate destination.
    header( "HTTP/1.1 301 Moved Permanently" );
    header( "Location: http://www.your-affiliate-link-to-affiliatesite.com" );
    exit;
    ?>
    Open up your robots.txt file and disallow the "visit" folder.
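Following that advice, the robots.txt entry would look something like this (assuming the folder is named "visit" as above):

```
User-agent: *
Disallow: /visit/
```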

Then whenever you link to your affiliate program on your site, use the following:

    http://www.mywebsite.com/visit/affiliatename/
There may be a better way to achieve the same effect, however this has worked fine for what I needed it to do :)
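One possible variation on the folder-per-affiliate setup above (a sketch only, not something from this thread): a single index.php inside /visit/ can handle every affiliate via a query parameter. The `$affiliates` map, the slug names, and the `?to=` parameter below are made-up for illustration; substitute your own destinations.

```php
<?php
// Sketch: one redirect script (e.g. /visit/index.php) for all affiliates,
// driven by a ?to=slug parameter. Slugs and URLs are placeholders.
$affiliates = array(
    'affiliatesite' => 'http://www.your-affiliate-link-to-affiliatesite.com',
);

// Look up the destination URL for a slug; null if the slug is unknown.
function resolve_affiliate($slug, $map) {
    return isset($map[$slug]) ? $map[$slug] : null;
}

// Only dispatch when running under a web server, not on the CLI.
if (php_sapi_name() !== 'cli') {
    $slug   = isset($_GET['to']) ? $_GET['to'] : '';
    $target = resolve_affiliate($slug, $affiliates);
    if ($target !== null) {
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: ' . $target);
    } else {
        header('HTTP/1.1 404 Not Found');
    }
    exit;
}
```

Links would then look like http://www.mysite.com/visit/?to=affiliatesite, and robots.txt still only needs to disallow the /visit/ folder, so adding a new affiliate means one new map entry instead of a new folder.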
     
    4chp, Mar 12, 2009 IP
  5. dickieknee

    dickieknee Active Member

    Messages:
    441
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    80
    #5
It takes Google a number of weeks to remove errors, even though you may have fixed the error or restricted it via robots.txt. Just be patient; it will be corrected EVENTUALLY.
     
    dickieknee, Mar 12, 2009 IP
  6. Mares

    Mares Peon

    Messages:
    58
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
@4chp ... tnx for the suggestion, but it is kind of hard to do that now. I would have to change all the links on every page of my site, and that would take some time, so I will skip this.

@dickie ... yes, I realized it takes some time for G to remove errors. But what confuses me is that new errors appeared after I uploaded the robots file.

e.g. Google had restricted 5 links before I uploaded the robots file (let's say it was before March).
On the 1st of March I uploaded the robots file. After one week the errors started to reduce and came down to only 1.
But about 7 days after I uploaded the robots file, new errors started to appear.

This is what confuses me.
     
    Mares, Mar 13, 2009 IP