Hi guys, I'm currently working on an e-commerce website. The site has a dedicated "websites directory" section, which means a large number of outbound links on every page. Through these outbound links, the link value of my pages may be leaking away. If I use robots.txt to block those pages from being crawled, will my site get a better PageRank? Please give me some more ideas on this. Thanks.
Yeah, you might be right, but have you ever thought about how Google learns of any page or backlink in the first place? I'm not sure, but I relate it to Google's crawling: Google discovers new links and pages by crawling. If a link has never been crawled by Google, that means Google is not aware of it, and if Google is not aware of a link, how can it count it? Is it possible for link value to pass through a link Google has never seen? My theory may be wrong and you may even laugh at it, but I really need some help to understand this better.
Then who will submit sites to your directory? If you restrict these pages, you will probably be saved from being penalized by Google for hosting all those links, and that could help your PR.
That's a different issue; let's set that point aside. But tell me: do you think that restricting the low-value pages of your site will in turn enhance your PageRank? Is that valid or not?
It will not enhance your PageRank; it will only save you from being penalized. PageRank depends on backlinks, specifically backlinks from pages and sites that have PR themselves.
I get what you're saying, mikelorentz. If Google doesn't crawl pages with lots of outbound links, then it won't know about them, and if it doesn't know about them, it can't act against you. So you can give reciprocal links to other sites: they will have their reciprocal, which they will, but Google won't know about it. Therefore it won't cancel out your inbound links, hence increasing your PageRank? It makes sense to me, though I'm not sure whether it actually works. Josh
It's quite simply rubbish that robots.txt helps improve the PR of any web page or website. There is no direct relation between PR and robots.txt, so don't waste time on it.
robots.txt only tells search engine crawlers whether a web page may be crawled or not. If we stop crawlers from crawling a page, there is no point expecting that page's PR to increase: when the page is not even being indexed, how can any rise in PR be expected?
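For reference, this is what blocking a section from crawlers looks like. A minimal robots.txt sketch, assuming the directory pages live under a path like /directory/ (the path is just an illustration, not from the original posts):

```
# robots.txt, placed at the site root
User-agent: *           # rule applies to all crawlers
Disallow: /directory/   # do not crawl anything under /directory/
```

Note that a Disallow rule only stops crawling; it does not, by itself, do anything for the PageRank of the rest of the site.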
Channel your PR into specific pages by making the rest of the links "nofollow"; that is the technique I would use.
I think robots.txt is not helpful for PR. The main purpose of this file is only to tell crawlers what to crawl; it contains permission rules for crawling the site's pages.
Set the links' rel attribute to nofollow so your PR doesn't flow through them, and if possible use a script for the links; that also stops your PR from being shared.
If you use the robots.txt file to block crawling in order to retain PageRank, you will also block the pages from being indexed (no SERPs). It would be better to add the nofollow attribute to the outbound links. That keeps the pages indexed while limiting the PR bleed.
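A minimal sketch of the markup being discussed (example.com stands in for any directory listing; it is not from the original posts):

```
<!-- a normal link can pass PageRank to the target -->
<a href="http://example.com/">Example Site</a>

<!-- rel="nofollow" asks search engines not to pass PR through this link -->
<a href="http://example.com/" rel="nofollow">Example Site</a>
```

The page itself stays crawlable and indexable; only the individual link is marked as not endorsed.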