Should I use robots text file for SEO?

Discussion in 'Search Engine Optimization' started by jasonvan, May 11, 2009.

  1. #1
    I've read everything I can on robots.txt files. I still don't know the answer to this question.

    If I have a dating website, should I use a robots.txt file to block the majority of the pages? There are tens of thousands of pages for Google to index. Should I use the robots.txt file to narrow this down to, say, 20?

    I was afraid that thousands of pages might dilute my PageRank, especially since most of these pages don't contain my keywords, so I blocked everything except for about 20 pages that were important and fully optimized (onsite optimization). However, since doing this, I've actually dropped in the ranks (and I can't find any other explanation, though I know there could be a million reasons).

    I've looked at all the top dating and chat sites, and I've noticed that most don't use a robots.txt file at all. However, a few of the top sites do.

    It would make sense to have Google only look at what is fully optimized. However, Google may not like being blocked from thousands and thousands of pages.

    Does anyone know what the answer is? :D
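    For reference, the setup described above — block everything, then whitelist a handful of pages — would look roughly like this (the paths here are made up, and note that Allow is a Google extension, not part of the original robots.txt spec):

    ```text
    # Block the whole site by default, then open up a few optimized pages.
    User-agent: *
    Disallow: /
    Allow: /index.html
    Allow: /dating-tips.html
    Allow: /singles-chat.html
    ```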
     
    jasonvan, May 11, 2009 IP
  2. Canonical

    Canonical Well-Known Member

    Messages:
    2,223
    Likes Received:
    141
    Best Answers:
    0
    Trophy Points:
    110
    #2
    If you block pages with robots.txt, those pages will be dropped from the index. And if those blocked pages contain links to any of the 20 pages that are NOT blocked, then those 20 pages will not get credit for those inbound links and their link text, and the blocked pages cannot pass PageRank to your 20 indexed pages.

    IMO it is a very bad idea to keep the engines from indexing the majority of your site... You could also pick up long-tail traffic from some of the other pages.
     
    Canonical, May 11, 2009 IP
  3. djingel

    djingel Active Member

    Messages:
    310
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    66
    #3
    Google knows what a dating site looks like; by blocking almost all of the pages, you only make it harder for Google to recognize your site as a dating site... Don't block the pages would be my natural advice.
     
    djingel, May 11, 2009 IP
  4. Kyleh86

    Kyleh86 Peon

    Messages:
    31
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    I would advise against it as well... Build SEO-friendly directories if you can, e.g. www.yourdomain/longtailkeyword. Chances are a lot of those keywords will have little or no competition. Don't forget internal linking either.

    -Kyle
     
    Kyleh86, May 11, 2009 IP
  5. dcristo

    dcristo Illustrious Member

    Messages:
    19,797
    Likes Received:
    1,201
    Best Answers:
    7
    Trophy Points:
    470
    Articles:
    5
    #5
    You don't use robots.txt for SEO. It's used to block parts of your website from being crawled.
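    To make that concrete, here is a minimal sketch using Python's standard urllib.robotparser, which applies the same matching rules a polite crawler does (the domain and paths are made up for illustration):

    ```python
    from urllib.robotparser import RobotFileParser

    # Rules equivalent to a simple robots.txt that blocks one directory.
    rules = """
    User-agent: *
    Disallow: /private/
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    # A path under /private/ is blocked: a polite crawler skips it entirely.
    print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
    # Everything else remains crawlable.
    print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
    ```

    The point being: robots.txt only controls *crawling*; it does not boost the pages that remain.
    
    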
     
    dcristo, May 11, 2009 IP
  6. vinnu

    vinnu Peon

    Messages:
    36
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    If you use robots.txt to block a page, it stops the crawler from crawling that page.
     
    vinnu, May 11, 2009 IP
  7. jasonvan

    jasonvan Member

    Messages:
    363
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    35
    #7
    Thanks so much for the replies. I have removed the robots.txt file. I think you guys are totally right. I will see if my site goes back up.

    As one person said, I'm sure that Google knows what a dating site looks like. I'll just open up the coat and let it all flap in the wind...and hope for the best.
     
    jasonvan, May 12, 2009 IP
  8. catanich

    catanich Peon

    Messages:
    1,921
    Likes Received:
    40
    Best Answers:
    0
    Trophy Points:
    0
    #8
    A robots.txt file is not used for SEO except to help the site get indexed by the 2nd tier search engines.

    Robots.txt File Coding Example
    # Robots.txt file created 10/03/2008
    # For domain: http://www.catanich.com
    #
    # All other robots will spider the domain
    User-agent: *
    Disallow: /Tex-Mex-Recipes%5C_vti_cnf/
    Disallow: /Tex-Mex-Recipes\_vti_cnf/
    Disallow: /_common/
    Disallow: /_private/
    Disallow: /_ScriptLibrary/
    Disallow: /aspnet_client/
    Disallow: /_client/
    Sitemap: http://www.catanich.com/site-map.xml
    Sitemap: http://www.catanich.com/site-map.xml.gz
    Sitemap: http://www.catanich.com/urllist.txt
    Sitemap: http://www.catanich.com/urllist.txt.gz

    This will tell the SEs where the sitemaps are.
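    Those Sitemap lines can be read back the same way crawlers discover them; a quick sketch using Python's standard urllib.robotparser (the site_maps() method requires Python 3.8+, and the URLs just mirror the example above):

    ```python
    from urllib.robotparser import RobotFileParser

    # A trimmed-down version of the robots.txt shown above.
    rules = """
    User-agent: *
    Disallow: /_private/
    Sitemap: http://www.catanich.com/site-map.xml
    Sitemap: http://www.catanich.com/urllist.txt
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    # Sitemap directives are independent of any User-agent block.
    print(rp.site_maps())
    # ['http://www.catanich.com/site-map.xml', 'http://www.catanich.com/urllist.txt']
    ```
    
    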

    As for blocking pages: DUMB! DUMB! The more pages you have, the higher your PageRank calculation can be. Just make sure that every page on the site has a link back to the home page [PR feedback].

    If you obtain a PR5 or above, then you can start selling "ad space" like Match and make money.
     
    catanich, May 12, 2009 IP