I've made a silly mistake. Google indexed some pages that shouldn't be seen by the general public or by search engines. What I've done now is add a robots.txt and disallow the path. Will those pages be de-indexed on the next crawl, or do I have to do anything else?
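
For reference, this is roughly what I added (the path here is just an example, not the real one):

    User-agent: *
    Disallow: /private-stuff/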
Just submit those URLs in Google Webmaster Tools; there's a URL removal option in there. Regards, Alex
The robots.txt disallow should get them dropped, but it won't happen until Google re-crawls the site, and yes, there is the option to remove URLs via Google Webmaster Tools, which is faster. BUT THIS IS NOT ENOUGH - other search engines and anyone with the link can still find your pages. If you have pages not meant for the public, put a password on them.
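
For example, on Apache you could put something like this in an .htaccess file in that directory (the file path and realm name are placeholders; you'd create the .htpasswd file with the htpasswd utility):

    AuthType Basic
    AuthName "Private area"
    AuthUserFile /home/youruser/.htpasswd
    Require valid-user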
Thanks, that's all I needed to hear. I have a lot of duplicate-content issues on the site (multiple URLs pointing to the same content), which is dragging down my SERPs. It's not the kind of thing that needs password protection. About Webmaster Tools: Google only allows 100 URL removals at a time, and I have more than that. Rep given.
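
For the duplicates I'm going to look at the canonical link element as well, i.e. putting something like this in the head of each duplicate page so Google knows which URL is the original (the URL below is just an example):

    <link rel="canonical" href="http://www.example.com/original-page" />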