Three weeks ago I added a robots.txt file to my site to disallow some pages, but it seems some of those pages are still showing in Google search. How can I get those pages removed fast? Thanks in advance
Thanks for the reply! But adding meta tags won't help; the pages I'm trying to block are test pages. Actually, I want all pages to be disallowed, which is why my robots.txt uses * and / to disallow everything, but I still see some pages, especially the home page. Is there any other way to fast-track this?
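For reference, and assuming that "* /" means the standard disallow-all rules, a minimal robots.txt that blocks everything looks like this:

    # Block all compliant crawlers from every path on this host
    User-agent: *
    Disallow: /

Keep in mind this only asks crawlers to stop fetching pages; by itself it does not remove URLs that are already in Google's index.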
I use a subdomain for making test pages that Google never finds, no pinging, etc. Have you used the URL removal tool in Google?
If your targeted page or home page is already indexed in Google, then 'Disallow' won't work the way you expect. You should disallow pages before they get indexed in Google or other search engines. You can test in Google Search Console whether your Disallow directive is working properly. Thanks
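Worth spelling out why: to get an already-indexed page dropped, the usual route is a noindex robots meta tag on the page itself, and the page has to stay crawlable (not blocked in robots.txt) so Google can actually see the tag. A minimal sketch of what goes in the page's head section:

    <!-- Ask compliant crawlers to drop this page from the index and not follow its links -->
    <meta name="robots" content="noindex, nofollow">

Once the page has fallen out of the index, you can block it in robots.txt if you want to stop crawling too.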
Are you trying to remove already-indexed pages from Google search results? You can try this: log in to Google Webmaster Tools >> Google Index >> Remove URLs >> Temporarily hide, then enter the page URL you want removed from Google search results. It should take a few days to be effective, possibly sooner. After that, add a "Disallow" rule to your robots.txt to prevent search engine bots from crawling and indexing the page again. Hope this helps!
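For the robots.txt step, a per-page rule looks like this (the path /test-page/ is just a placeholder for whatever URL you hid with the removal tool):

    User-agent: *
    # Hypothetical path; replace with the page you removed
    Disallow: /test-page/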
If you go into Google Webmaster Tools, there is a link on the right-hand side where you can disavow sites that are linking to you. I'm not sure if that's what you mean, but if you are getting backlinks that are hurting your rankings, you can disavow them in Google.
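In case it's useful, the disavow tool takes a plain text file you upload; a minimal sketch (the domains here are made up for illustration):

    # Lines starting with # are ignored
    # Disavow every link from this domain
    domain:spammy-links.example
    # Disavow a single linking page
    http://another-spam-site.example/bad-page.html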
Thanks for the response! Appreciate it... I think the page I'm trying to disallow needs a different method. The page is on a subdomain, like test.domain.com; yes, it's a test page, and I don't want it to be seen in the Google SERPs. In my ongoing research I found it's handled differently: some say you need to create a separate directory and place a robots.txt there, others say to add this to .htaccess (note the dots in the hostname should be escaped in the regex):

    # Serve a separate robots.txt when the request comes in on the subdomain
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$
    RewriteRule ^robots\.txt$ robots-subdomain.txt

Then add the following to /robots-subdomain.txt:

    User-agent: *
    Disallow: /
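One more thought, since the goal is getting the subdomain out of the SERPs and not just stopping crawls: an alternative is to send a noindex X-Robots-Tag header for that host, because Google has to be able to crawl a page to see the noindex. A sketch for the subdomain's .htaccess, assuming Apache with mod_headers and mod_setenvif enabled (the hostname test.domain.com comes from the post above; the variable name is arbitrary):

    # Flag requests arriving on the test subdomain
    SetEnvIf Host "^test\.domain\.com$" IS_TEST_HOST
    # Ask compliant crawlers to drop these pages from the index
    Header set X-Robots-Tag "noindex, nofollow" env=IS_TEST_HOST

Once the pages have dropped out of the index, the disallow-all robots.txt can go back in place to keep bots off the test site.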