Traffic/rank drop after robots.txt change

Discussion in 'Google' started by lopes, Jun 7, 2013.

  1. #1
On 30 May I disallowed a directory that contained only projects created by my site's users. I also set all users' folders to return a 404 error when accessed.

    The next day GWT reported a drop in tracked pages from about 800 to about 50.

    After 10 days I started noticing a moderate traffic/rank drop, and it's been getting worse since then.

    What was responsible for the drop? The robots.txt policy or the 404 responses?
     
    lopes, Jun 7, 2013 IP
  2. #2
My bad, I didn't explain it clearly. What I actually did was:

    I disallowed crawling via a robots.txt rule AND set .htaccess to return a 403 error when the directories were accessed, using `Options -Indexes`.
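    For reference, a setup like the one described above might look like this (the `/projects/` path is a hypothetical example, not taken from the thread):

    ```apache
    # robots.txt (site root) — block crawling of the user-projects directory
    User-agent: *
    Disallow: /projects/

    # .htaccess inside /projects/ — disable auto-generated directory listings;
    # Apache then returns 403 Forbidden for requests to the directory itself
    Options -Indexes
    ```

    Note that `Options -Indexes` only affects requests for a directory listing; individual files inside the directory are still served unless they are blocked separately.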
     
    lopes, Jun 7, 2013 IP
  3. #3
    The 404 errors may be a reason behind the traffic drop, as Google doesn't like them. You should try redirecting those links to pages on your site covering similar topics, and restore the robots.txt to its previous state.
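    The redirect suggested above could be done with a single mod_alias rule in .htaccess; here `/projects/` and `/portfolio/` are hypothetical paths, not taken from the thread:

    ```apache
    # .htaccess — 301-redirect removed user-project URLs to a related page,
    # so inbound links and any accumulated link equity are not lost to 404s
    RedirectMatch 301 ^/projects/.*$ /portfolio/
    ```

    A 301 (permanent) redirect signals to Google that the old URLs have moved for good, rather than simply disappearing.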
     
    Arup Ghosh, Jun 9, 2013 IP
  4. #4
    No mate, it's not a robots.txt issue. Google rolled out a Panda update about 5 days back, and that's the main reason many people lost traffic. That may be the cause here too.
     
    Expectation, Jun 9, 2013 IP