Hi. I want to block all robots from crawling the https version of my site, but give them full access to the http version. I'm on a Linux VPS. Can I simply place a standard 'block all access' style robots.txt in httpsdocs on the server, or is that the wrong place? Or do I need to set up .htaccess rules to serve https requests a separate robots-ssl.txt? Thanks for your input.
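
In case it's not clear, the 'block all' robots.txt I have in mind is just the standard two-liner:

    User-agent: *
    Disallow: /

And the .htaccess approach I was imagining would be something along these lines (mod_rewrite, with robots-ssl.txt just being my name for the second file). I'm not sure this is the right way to do it, which is partly why I'm asking:

    RewriteEngine On
    # only rewrite the request when it arrives over SSL
    RewriteCond %{HTTPS} on
    RewriteRule ^robots\.txt$ robots-ssl.txt [L]

where robots-ssl.txt would contain the block-all rules above and the normal robots.txt would stay open to crawlers.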