Too Many Crawler Errors - Can I Restrict All the URLs in Robots.txt?

Discussion in 'Search Engine Optimization' started by mnvamsi, Sep 19, 2009.

  1. #1
    Google Webmaster Tools shows too many crawl errors for my blog www.itomo.org, and the blog is under a penalty. I recently made a major shift from Blogger to WordPress, and I think that caused the issue.
    I don't know how to resolve this. What do I do now?

    I just want to restrict all the URLs on the crawl errors page using robots.txt. Will that solve the issue, or make things worse?

    Do you think redirecting the 404 pages to the home page would help?

    Somebody please let me know.

    Thanks.
     
    mnvamsi, Sep 19, 2009 IP
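    If you do go the robots.txt route, blocking the broken paths would look something like the sketch below. The path patterns are hypothetical examples (the thread doesn't list the actual error URLs), and note that robots.txt only stops crawling; it doesn't remove URLs already in the index, and 301-redirecting the old Blogger permalinks to their new WordPress equivalents is usually the cleaner fix after a platform move.

    ```
    # Hypothetical robots.txt sketch -- replace these with the
    # actual paths shown on your crawl errors page.
    User-agent: *
    Disallow: /2009/09/
    Disallow: /feeds/
    ```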
  2. lanta99

    #2
    Well, you can sign up for Google Webmaster Tools and submit a request to remove the 404 URLs. That would be much easier than using robots.txt.
     
    lanta99, Sep 19, 2009 IP
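    Whichever route you take, it's worth checking a draft robots.txt before uploading it. Here's a short sketch using Python's standard urllib.robotparser; the rules and URLs below are hypothetical stand-ins, not paths from this thread.

    ```python
    import urllib.robotparser

    # Hypothetical draft rules -- the real error URLs from the thread are unknown.
    draft_rules = """\
    User-agent: *
    Disallow: /2009/09/
    Disallow: /feeds/
    """

    # Parse the draft locally instead of fetching a live robots.txt.
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(draft_rules.splitlines())

    # Check sample URLs against the draft before publishing it.
    blocked = rp.can_fetch("*", "http://www.itomo.org/2009/09/old-post.html")
    allowed = rp.can_fetch("*", "http://www.itomo.org/about/")
    print(blocked, allowed)  # blocked path -> False, unmatched path -> True
    ```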
  3. mnvamsi

    #3
    Oh, thanks mate! Well, I expected many more answers for this one.
     
    mnvamsi, Sep 19, 2009 IP