How to create robots.txt for a bookmarking website. Is it necessary?

Discussion in 'robots.txt' started by wayne_hoton, Jan 11, 2015.

  1. #1
    Hi,

    I am a little confused about robots.txt for a bookmarking website. I created a bookmarking site and noticed that after I deleted some posts (as intended), Google kept showing those posts as 404 errors, which is expected.

    So how can I stop Google from crawling these pages or categories? How should I write robots.txt to prevent these 404s?
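
    For example, would something like the following work? The paths below are just placeholders I am guessing at; my real category and post URLs may be different.

    Code:
    # Ask all crawlers to skip the category listing pages
    User-agent: *
    Disallow: /category/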

    Thanks
     
    wayne_hoton, Jan 11, 2015 IP
  2. #2
    Yes, I know about this, but look at digg.com: Digg posts don't appear in Google search, except videos. Suppose I have a news category with 100 posts and after some time I delete all of them; then Google will show 404 errors in Google Webmaster Tools. I want to prevent this issue from the start.
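
    For that news example, I guess the robots.txt entry would be something like this (assuming the category URLs live under /news/, which is only my assumption):

    Code:
    # Ask all crawlers to skip everything under the news category
    User-agent: *
    Disallow: /news/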
     
    wayne_hoton, Jan 14, 2015 IP