A company I'm working with has set up various subdomains for their DNN site (yeah, yeah, I know...), and each subdomain is its own DNN installation. We don't want search engines crawling some of those subdomains, so we wanted to use a robots.txt. However, that would mean placing a robots.txt file in the folder for each subdomain, and since these are virtual subdomains, there are no separate folders to place the files in. I'm not too familiar with the ins and outs of DNN, but I assume it constructs the URLs on the fly. Does anyone have an idea of how to work around this and block virtual subdomains with robots.txt?
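One common workaround when several hosts are served from the same physical root is to let the web server rewrite requests for /robots.txt based on the Host header, so certain subdomains get a "disallow all" file while the rest get the normal one. Below is a minimal sketch using the IIS URL Rewrite module in web.config, assuming that module is installed; the hostnames (staging.example.com, test.example.com) and the file name robots.disallow.txt are placeholders for illustration, not anything from your setup.

```xml
<!-- Sketch only: serve a separate robots file for subdomains that should not be crawled. -->
<!-- Assumes the IIS URL Rewrite module is installed and a robots.disallow.txt file
     (containing "User-agent: *" / "Disallow: /") exists in the site root. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Robots for blocked subdomains" stopProcessing="true">
        <!-- Match requests for robots.txt only -->
        <match url="^robots\.txt$" />
        <conditions>
          <!-- Hypothetical subdomains that should not be indexed -->
          <add input="{HTTP_HOST}" pattern="^(staging|test)\.example\.com$" />
        </conditions>
        <!-- Internally rewrite to the "disallow all" file; the URL the crawler sees stays /robots.txt -->
        <action type="Rewrite" url="robots.disallow.txt" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Requests for robots.txt on any other host fall through to the regular robots.txt in the root, so the main site keeps its normal crawl rules.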
Sorry, I don't have any idea about this one. If you don't get an answer here, try asking in other forums or searching Google for expert solutions.