In order to request that a site be removed from Google, I applied the robot.txt that Webmaster Tools provided to my page:

    User-agent: *
    Disallow: /

But after requesting the removal, I get this message: "Your request has been denied because the webmaster of the site hasn't applied the appropriate robots.txt file or meta tags to block us from indexing or archiving this page." This site is a Blogger template, btw. Did I place the robot.txt in the wrong place or format?
You might read this good source: How to delete website from Google search results | SeoblogR, or see the similar discussion at How to remove my site from Google Index.
I already did that, but I keep getting denied with this message: "Your request has been denied because the webmaster of the site hasn't applied the appropriate robots.txt file or meta tags to block us from indexing or archiving this page. Please work with the webmaster of this site or select an alternate removal option from the webpage removal request tool."
The steps explained in that blog are exactly what I did, and they work fine for my site. I think there is some error you made while creating your robots.txt.
All I did was paste the robot.txt exactly the way it was shown in Webmaster Tools:

    User-agent: *
    Disallow: /

So if I made an error, I'm hoping someone here can tell me what it is.
The file name must be robots.txt, not robot.txt, in order to stop Google from indexing. Sign up for Google Webmaster Tools and request removal of the indexed pages after updating the robots.txt file.
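Once the file is named robots.txt, a quick sanity check is to parse the live file and ask whether Googlebot is actually blocked. A minimal sketch using Python's standard urllib.robotparser, assuming www.yoursite.com stands in for your own domain:

    import urllib.robotparser

    # Point the parser at the live robots.txt (www.yoursite.com is a placeholder).
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.yoursite.com/robots.txt")
    rp.read()

    # Prints False when "User-agent: *" / "Disallow: /" is being served,
    # i.e. Googlebot is blocked from the whole site.
    print(rp.can_fetch("Googlebot", "http://www.yoursite.com/"))

If this prints True, Google is seeing a file that doesn't block it, and the removal request will keep being denied.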
I just said I did exactly what the webmaster tool instructed. How else am I supposed to do it manually?
Is your robots.txt file in the root folder of your website? It must be. If it is done correctly, I could browse to www.yoursite.com/robots.txt and the file would open for me (e.g. www.comtec-ars.com/robots.txt). Is that the case for you?
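If you'd rather check from code than a browser, here is a small sketch of the same root-folder test, again with www.yoursite.com as a placeholder:

    import urllib.request, urllib.error

    url = "http://www.yoursite.com/robots.txt"  # placeholder domain
    try:
        with urllib.request.urlopen(url) as resp:
            print(resp.status)           # expect 200 when the file sits at the root
            print(resp.read().decode())  # should show your User-agent/Disallow rules
    except urllib.error.HTTPError as err:
        # A 404 here means the file is missing or not at the site root.
        print("robots.txt not reachable:", err.code)

A 200 with your rules printed back means the file is in the right place; anything else means Google can't see it either.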