Robots.txt for Virtual Subdomains?

Discussion in 'robots.txt' started by Avalanche, Nov 14, 2007.

  1. #1
    A company I'm working with has set up various subdomains for their DNN site (yeah, yeah, I know...), and each subdomain is its own DNN installation.

    We don't want search engines crawling some of those subdomains, so we wanted to use robots.txt. Normally that would mean placing a robots.txt file in the folder for each subdomain, but since these are virtual subdomains, there are no separate folders to put the files in. I'm not sure about the ins and outs of how DNN works, but I assume it constructs the URLs on the fly. Does anyone have an idea of how to work around this and block virtual subdomains with robots.txt?
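
    One idea I've been toying with, since DNN sits on ASP.NET: could we intercept requests for robots.txt with an HTTP handler and write out different rules depending on the Host header? Here's a rough, untested sketch of what I mean (the class name, the web.config registration, and the subdomain list are all just placeholders, not anything from a real DNN install):

        using System;
        using System.Web;

        // Rough sketch (untested): an IHttpHandler that serves a per-host
        // robots.txt. It would need to be registered in web.config, e.g.:
        //   <httpHandlers>
        //     <add verb="GET" path="robots.txt" type="RobotsHandler" />
        //   </httpHandlers>
        public class RobotsHandler : IHttpHandler
        {
            // Placeholder list of the subdomains we want crawlers to skip.
            private static readonly string[] BlockedHosts =
            {
                "staging.example.com",
                "intranet.example.com"
            };

            public bool IsReusable
            {
                get { return true; }
            }

            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/plain";

                string host = context.Request.Url.Host;
                foreach (string blocked in BlockedHosts)
                {
                    if (string.Equals(host, blocked, StringComparison.OrdinalIgnoreCase))
                    {
                        // Block the whole subdomain for every crawler.
                        context.Response.Write("User-agent: *\nDisallow: /\n");
                        return;
                    }
                }

                // Everything else stays crawlable.
                context.Response.Write("User-agent: *\nDisallow:\n");
            }
        }

    One thing I'm not sure about: I believe IIS 6 only routes extensions like .aspx to ASP.NET by default, so a request for robots.txt would be served as a static file unless a script mapping for .txt (or a wildcard mapping) is added. Can anyone confirm whether that's needed here?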
     
    Avalanche, Nov 14, 2007 IP
  2. #2
    No, sorry brother, I don't have any idea about this. Try asking in other forums if you don't get an answer here, or try searching Google for expert solutions.
     
    energy.fs, Nov 21, 2007 IP