Blocking particular subdomain URLs

Discussion in 'robots.txt' started by tanha_dil, Feb 27, 2010.

  1. #1
    Hi guys,

    I run a small ad network site for mobile ad publishing. Since the ad links are placed across many different sites, it's no surprise that various bots crawl them. Over the last few days Google's crawling has been heavy, and Googlebot has been clicking our ad links, which were being counted as valid clicks for the publishers. I want to stop Google from accessing the ad domain, which is something like x.x.com, while leaving it full authority to crawl the root site. For now I have banned Googlebot with .htaccess, but I'm not sure how to do the same thing with robots.txt. Note that I don't just want to prevent my links from being clicked by Google; I also want to stop those links from being indexed in search results. Some guidance from you all could point me in the right direction.
     
    tanha_dil, Feb 27, 2010 IP
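    Since robots.txt is scoped per hostname, one way to do this (a minimal sketch, assuming the ad links live on a subdomain such as ads.example.com, a hypothetical name standing in for x.x.com) is to serve a separate robots.txt at the root of that subdomain that shuts Googlebot out entirely, while the root domain keeps its own permissive robots.txt:

        # robots.txt served at http://ads.example.com/robots.txt (hypothetical subdomain)
        # Blocks Googlebot from every URL on this host only
        User-agent: Googlebot
        Disallow: /

        # robots.txt served at http://example.com/robots.txt
        # Leaves the root site fully crawlable
        User-agent: *
        Disallow:

    Bear in mind that a Disallow only stops crawling; URLs that are already indexed, or linked from other sites, can still show up in search results, which is where the suggestion below comes in.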
  2. strid3r

    strid3r Well-Known Member

    #2
    If you want Google to deindex your site, go to Webmaster Tools and remove the URL.
     
    strid3r, Feb 27, 2010 IP
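    Removing the URLs in Webmaster Tools clears out what is already indexed. To keep the subdomain out of the index going forward, a noindex signal has to be served with the responses, for example via an X-Robots-Tag header. A minimal sketch, assuming Apache with mod_headers enabled, added to the .htaccess already in place on the ad subdomain:

        # .htaccess on the ad subdomain (assumes Apache's mod_headers module is loaded)
        # Asks crawlers not to index or follow anything served from this host
        <IfModule mod_headers.c>
            Header set X-Robots-Tag "noindex, nofollow"
        </IfModule>

    Note that Googlebot only sees this header if it is allowed to fetch the pages, so it works as an alternative to a blanket robots.txt Disallow rather than on top of one.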