How to disallow pages fast?

Discussion in 'Search Engine Optimization' started by DarwinJones, Sep 19, 2016.

  1. #1
    Three weeks ago I placed a robots.txt file on my site to disallow some pages, but it seems some of those pages are still showing up in Google search. How can I disallow those pages fast?

    Thanks in advance :)
     
    DarwinJones, Sep 19, 2016 IP
  2. wordplucker

    wordplucker Well-Known Member

    Messages:
    205
    Likes Received:
    38
    Best Answers:
    1
    Trophy Points:
    105
  3. DarwinJones

    DarwinJones Member

    Messages:
    75
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    36
    #3
    Thanks for the reply! But adding meta tags won't help; the pages I'm trying to block are test pages. Actually, I want all pages disallowed, which is why I placed a robots.txt with a wildcard user-agent and Disallow: / to block everything, but I still see some pages in the results, especially the home page.
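
    For reference, that block-everything robots.txt is just the standard two lines at the site root:

    User-agent: *
    Disallow: /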

    Is there any other way to fast track this?
     
    DarwinJones, Sep 19, 2016 IP
  4. wordplucker

    wordplucker Well-Known Member

    Messages:
    205
    Likes Received:
    38
    Best Answers:
    1
    Trophy Points:
    105
    #4
    I use a subdomain for my test pages that Google never finds; no pinging, etc. for it.

    Have you used the URL removal tool in Google?
     
    wordplucker, Sep 19, 2016 IP
  5. DarwinJones

    DarwinJones Member

    Messages:
    75
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    36
    #5
    not yet
     
    DarwinJones, Sep 19, 2016 IP
  6. Md. Faruk Khan

    Md. Faruk Khan Active Member

    Messages:
    45
    Likes Received:
    1
    Best Answers:
    3
    Trophy Points:
    83
    #6
    If your targeted page or home page is already indexed in Google, then 'Disallow' won't work the way you expect.

    You should disallow pages before they get indexed by Google or other search engines.

    You can test in Google Search Console whether your Disallow rules are working properly.
    Thanks
     
    Md. Faruk Khan, Sep 22, 2016 IP
  7. Mike Lee

    Mike Lee Active Member

    Messages:
    58
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    53
    #7
    Are you trying to remove the indexed pages from the Google search results?

    You can try this.

    Log in to Google Webmaster Tools (Search Console) >> Google Index >> Remove URLs >> Temporarily hide

    Enter the page URL you want to remove from the Google search results.

    It should take a few days to take effect, sometimes sooner.

    After that, add a "Disallow" rule to your robots.txt to prevent search engine bots from crawling and indexing the page again.
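
    For example, to block just one test page, the rule would look something like this (the path here is only a placeholder):

    User-agent: *
    Disallow: /test-page/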

    Hope this helps!
     
    Mike Lee, Sep 23, 2016 IP
  8. Lindsey Walters

    Lindsey Walters Greenhorn

    Messages:
    47
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    8
    #8
    If you go into Google Webmaster Tools, there is a link on the right-hand side where you can disavow sites that are linking to you. I'm not sure if that's what you mean, but if you are getting backlinks that are hurting your rankings, you can disavow them in Google.
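
    If that is what you need, the disavow file you upload is just a plain text list, roughly like this (the domain and URL below are only placeholder examples):

    # low-quality links pointing at my site
    domain:spammy-links-example.com
    http://another-example.com/bad-page.html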
     
    Lindsey Walters, Sep 23, 2016 IP
  9. DarwinJones

    DarwinJones Member

    Messages:
    75
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    36
    #9
    Thanks for the response! Appreciate it...

    I think the page I'm trying to disallow needs a different method.
    It's a subdomain page, like test.domain.com; yes, it's a test page. I don't want the test page to show up in the Google SERPs.

    In my ongoing research I've found different approaches: some say to create a separate directory and place a robots.txt in it, while others suggest adding this to the main site's .htaccess:

    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$
    RewriteRule ^robots\.txt$ robots-subdomain.txt [L]

    Then add the following to /robots-subdomain.txt:

    User-agent: *
    Disallow: /
     
    DarwinJones, Sep 27, 2016 IP