Robots.txt

Discussion in 'Google' started by ashish123, Feb 26, 2009.

  1. #1
    If I disallow some pages of my website using robots.txt, is it still worth optimizing those pages?
     
    ashish123, Feb 26, 2009 IP
  2. seork

    seork Well-Known Member

    #2
    Hey,

    It's just a waste of time to block pages with robots.txt and still try to optimize them; you can't do both. It doesn't matter how much SEO effort you put into such a page, it will come to nothing because search engines won't be allowed to crawl and index it.

    So be careful with your robots.txt file.

    OK buddy
     
    seork, Feb 26, 2009 IP
  3. Strontium Dog

    Strontium Dog Guest

    #3
    Please forgive me, but to what end?

    Normally we use Disallow rules in robots.txt to keep crawlers out of certain parts of a website, like directories you'd rather not have turning up on Google or Yahoo, or pages which need authorisation.
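    For example, a couple of lines like these in robots.txt are enough to keep compliant crawlers out of particular directories (the directory names are just an illustration):

    Code:
    User-agent: *
    Disallow: /private/
    Disallow: /admin/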

    So I don't see the point of carrying out SEO on pages which are never going to be seen by anyone other than yourself.

    If I've got the bull by the wrong ball and misunderstood you, please accept my apology. It's been a long night without sleep. :rolleyes:

    Dog
     
    Strontium Dog, Feb 26, 2009 IP
  4. ashish123

    ashish123 Peon

    #4
    Then should I put the meta tag on my other website, which has the same content and which I've blocked with robots.txt so it isn't indexed, in order to avoid being blacklisted by Google?
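    (By the meta tag I mean something like this in the <head> of each page, telling search engines not to index it:)

    Code:
    <meta name="robots" content="noindex">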
     
    ashish123, Feb 26, 2009 IP
  5. TheImtiaz

    TheImtiaz Peon

    #5
    Are you disallowing all the search engine bots, or just a few?

    If you've blocked all of them, then there's no point in optimizing those pages.
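    For reference, the two cases look roughly like this in robots.txt (either snippet on its own, not both together):

    Code:
    # Block every compliant bot from the whole site
    User-agent: *
    Disallow: /

    # Or block just one named bot, e.g. Googlebot
    User-agent: Googlebot
    Disallow: /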
     
    TheImtiaz, Feb 26, 2009 IP
  6. ashish123

    ashish123 Peon

    #6
    I have to disallow all the search engine bots to avoid being blacklisted because of duplicate content.
     
    ashish123, Feb 26, 2009 IP
  7. ezinequeen

    ezinequeen Peon

    #7
    If you've disallowed those pages but they are useful to you and you want to optimize them, then you need to optimize their content and remove them from robots.txt.
     
    ezinequeen, Feb 26, 2009 IP
  8. ashish123

    ashish123 Peon

    #8
    But if I remove robots.txt from my second website, which has content entirely duplicated from the first one (and the first one is already indexed in search engines), it's possible that Google will blacklist both of my websites.
     
    ashish123, Feb 27, 2009 IP