Hello guys, I made the mistake of not using a robots.txt file for certain pages of my sites. The pages are now in Google. If I use a robots.txt file to disallow the robots, will Google de-index them, or do I need to do this manually using Google Webmaster Tools? Cheers, Kes
In Webmaster Tools there is an option to block URLs from being spidered. Just add the URL there and Google will stop spidering it and hopefully remove it.
I would do both: Webmaster Tools to make sure they get removed, and robots.txt to keep them from being indexed again.
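For reference, a minimal robots.txt for this would look something like the following sketch (the paths here are just placeholders; swap in the actual pages or directories you want blocked):

User-agent: *
Disallow: /private/
Disallow: /old-page.html

Keep in mind the file has to sit at the root of the domain (e.g. http://www.example.com/robots.txt) or the crawlers won't find it.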
This is exactly what I wrote about two days ago here: http://forums.digitalpoint.com/showthread.php?t=418772
If you add a page that is already indexed by Google to your robots.txt file, Google will eventually drop that page from the index, though it can take a while; the removal tool in Webmaster Tools is usually faster.