I'm just curious. I run a free classifieds & networking site, http://www.jihoy.com, and I've noticed some search engine traffic for certain user names on my site. However, I currently disallow my user profile pages in robots.txt, because people tend to stuff their profiles with tons of links to sites that are less than reputable, or even outright spam.

I'm now thinking of removing this disallow so those profile pages get indexed by Google, and I pick up some search traffic from friends searching for those user names. What I want to know is: if I structure the disallows in robots.txt so that my main page doesn't link directly to the profile pages, will I lose much PageRank? Inner pages shouldn't affect the main landing page if the landing page doesn't link directly to them, right? I'm not sure how disallows affect the flow of PageRank juice within a site. Does anyone have any idea?
My robots.txt is fairly simple:

User-agent: *
Disallow: /landing/google_translate.rhtml
Disallow: /post/index
Disallow: /users/profile
Disallow: /profile
Disallow: /friend
Disallow: /message
Disallow: /browse
Disallow: /contacts
Disallow: /admin

(Note: I had a typo in the last line, "Diaallow" instead of "Disallow", which means crawlers were ignoring that rule and /admin was actually crawlable.)
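One way to sanity-check which URLs a robots.txt actually blocks is Python's standard-library `urllib.robotparser`. The sketch below parses a trimmed copy of the rules above; the profile URLs are just illustrative examples, not real pages on the site:

```python
from urllib import robotparser

# A trimmed, typo-corrected copy of the robots.txt above.
rules = """\
User-agent: *
Disallow: /landing/google_translate.rhtml
Disallow: /users/profile
Disallow: /profile
Disallow: /admin
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallow lines are prefix matches, so /profile blocks /profile/alice too.
print(rp.can_fetch("*", "http://www.jihoy.com/profile/alice"))  # False
print(rp.can_fetch("*", "http://www.jihoy.com/"))               # True
```

Deleting the `Disallow: /profile` and `Disallow: /users/profile` lines (and re-running a check like this) is all it would take to open the profile pages up to crawlers.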