Robots.txt for http and https on Linux server

Discussion in 'robots.txt' started by hoshheid, Feb 23, 2009.

  1. #1
    Hi.

    I want to block all robots from crawling the https version of my site, but give them full access on http.


    I use a Linux VPS. Can I simply place a standard 'block all access' style robots.txt file in httpsdocs on the server, or is this the wrong place? Do I need to set up .htaccess rules to send https traffic to a separate robots-ssl.txt?
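
    For reference, this is roughly the .htaccess approach I had in mind (just a sketch, assuming Apache with mod_rewrite enabled; the robots-ssl.txt name is my own):

    ```apache
    # Serve robots-ssl.txt instead of robots.txt for HTTPS requests only
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ /robots-ssl.txt [L]
    ```

    where robots-ssl.txt would contain a blanket block:

    ```
    User-agent: *
    Disallow: /
    ```

    Not sure if this is the right way to do it, which is why I'm asking.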

    Thanks for your input
     
    hoshheid, Feb 23, 2009 IP