I've read everything I can find on robots.txt files, and I still don't know the answer to this question. If I have a dating website, should I use a robots.txt file to block the majority of the pages? There are tens of thousands of pages for Google to index. Should I use the robots.txt file to narrow this down to, say, 20?

I was afraid that thousands of pages might dilute my PageRank, especially since most of these pages don't contain my keywords, so I blocked everything except for about 20 pages that were important and fully optimized (on-site optimization). However, since doing this, I've actually dropped in the rankings (and I can't find any other explanation, though I know there could be a million reasons).

I've looked at all the top dating and chat sites, and I've noticed that most don't use a robots.txt file at all. However, a few of the top sites do. It would make sense to have Google only look at what is fully optimized, but Google may not like being blocked from thousands and thousands of pages. Does anyone know what the answer is?
If you block pages with robots.txt, those pages will be dropped from the index. And if those pages are not in the index, any links they carry to the 20 pages that are NOT blocked won't count: those 20 pages will not get credit for the inbound links and link text, and the blocked pages cannot pass PageRank to your 20 indexed pages. IMO it is a very bad idea to keep the engines from indexing the majority of your site. You could also pick up long-tail traffic from some of the other pages.
Google knows what a dating site looks like; by blocking almost all the pages, you make it harder for Google to recognize your site as a dating site. Don't block the pages would be my natural advice.
I would advise against it as well. Build SEO-friendly directories if you can, e.g. www.yourdomain/longtailkeyword. Chances are a lot of those keywords will have little or no competition. Don't forget internal linking either. -Kyle
Thanks so much for the replies. I have removed the robots.txt file. I think you guys are totally right. I will see if my site goes back up. As one person said, I'm sure that Google knows what a dating site looks like. I'll just open up the coat and let it all flap in the wind...and hope for the best.
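For what it's worth, removing the file entirely and serving a minimal allow-all robots.txt behave the same way for well-behaved crawlers. A sketch of such a file (the domain and sitemap URL are placeholders, not the poster's real site) would be:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means "block nothing," so every crawler may fetch every page, while the Sitemap line still points engines at your URL list.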
A robots.txt file is not used for SEO except to help the site get indexed by the 2nd-tier search engines.

Robots.txt file coding example:

# Robots.txt file created by 10/03/2008
# For domain: http://www.catanich.com
#
# All other robots will spider the domain
User-agent: *
Disallow: /Tex-Mex-Recipes%5C_vti_cnf/
Disallow: /Tex-Mex-Recipes\_vti_cnf/
Disallow: /_common/
Disallow: /_private/
Disallow: /_ScriptLibrary/
Disallow: /aspnet_client/
Disallow: /_client/

Sitemap: http://www.catanich.com/site-map.xml
Sitemap: http://www.catanich.com/site-map.xml.gz
Sitemap: http://www.catanich.com/urllist.txt
Sitemap: http://www.catanich.com/urllist.txt.gz

This tells the SEs where the sitemaps are. As for blocking pages: dumb, dumb! The more pages you have, the higher your PageRank calculation will be. Just make sure that every page on the site has a link back to the home page (PR feedback). If you obtain a PR5 or above, then you can start selling "ad space" like Match and make money.
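If you do ship a robots.txt, it's worth sanity-checking which URLs it actually blocks before you deploy it and accidentally de-index pages you care about. Here's a small sketch using Python's standard-library robotparser; the rules, domain, and example paths are invented for illustration, not taken from anyone's real site:

```python
# Check which URLs a robots.txt actually blocks, using only the stdlib.
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body to test.
ROBOTS_TXT = """\
User-agent: *
Disallow: /_private/
Disallow: /aspnet_client/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Hypothetical paths: the home page, a profile page, and a private file.
for path in ["/", "/profiles/jane", "/_private/notes.html"]:
    allowed = parser.can_fetch("*", "http://www.example.com" + path)
    print(path, "allowed" if allowed else "BLOCKED")
```

Running this prints `allowed` for the first two paths and `BLOCKED` for the private one, which is a quick way to confirm a new rule doesn't block more than you intended.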