If I disallow some pages of my website using robots.txt, is there any point in optimizing those pages?
Hey, it's just a waste of time. If you block pages with robots.txt and still try to optimize them, both can't be done. It doesn't matter how much SEO effort you put into such a page, it will amount to zero because search engines won't index it. So be careful with your robots.txt file, ok buddy.
Please forgive me but, to what end? Normally we use Disallow rules in robots.txt to stop crawlers reaching certain parts of the website, like directories you may not want turning up on Google or Yahoo, or pages which need authorisation. So I don't see the point of carrying out SEO on pages which are never going to be seen by anyone other than yourself. If I've got the bull by the wrong ball and misunderstood you, please accept my apology. It's been a long night without sleep. Dog
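For reference, a minimal robots.txt along those lines might look like this (the directory names are just placeholders, not anything from the poster's site):

```
# Ask all crawlers to skip these example directories
User-agent: *
Disallow: /admin/
Disallow: /members-only/
```

Bots that honour the file will never crawl those paths, which is why any on-page SEO work there goes unseen.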
Then should I put the noindex meta tag on my other website, which has the same content? I have already used robots.txt to stop search engines indexing it, in order to escape being blacklisted by Google.
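If the aim is to keep a duplicate page out of the index, the usual meta tag (shown here as a generic sketch, not something specific to your site) goes in the page's head:

```html
<head>
  <!-- Ask search engines not to index this page -->
  <meta name="robots" content="noindex">
</head>
```

One caveat: if robots.txt already blocks the page, crawlers may never fetch it at all, so they would never see this tag in the first place.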
Are you disallowing all the search engine bots and robots, or just a few? If you've restricted all of them, then there is no use in optimizing those pages.
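To illustrate the difference: robots.txt can target bots by name, so you can restrict only some of them. A sketch (the bot name "BadBot" is hypothetical):

```
# Block only one named crawler
User-agent: BadBot
Disallow: /

# Everyone else may crawl everything
User-agent: *
Disallow:
```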
If you have disallowed those pages, but they are useful to you and you want to optimize them, then optimize their content and remove them from robots.txt.
But if I remove the robots.txt block from my second website, which has entirely duplicate content from the first one that is already indexed in search engines, is it possible that Google blacklists both of my websites?