Hello! I have an extra server (a web server) filled with files for my website (the site and the files are on different servers), and I would like to block anyone who is not coming from my website from downloading files from that server. What is the best way of doing this? I have read about .htaccess and mod_access, and blocking referrers other than my website's would do it. Is this the best approach? Would it be easy to fake referers so that people could still use my bandwidth? Also, what would be the correct way of setting this up? Thank you for your time.
Referer/user agent checks would stop most hotlinkers - enough that bandwidth shouldn't be much of a concern - but yes, they can be spoofed, and spoofing can be hard to detect. If this is really important, you can use a script that only delivers downloads after the user has filled in a CAPTCHA, or alternatively you can require users to register/log in.
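If you go the script route, the idea is roughly this (just a sketch, assuming a Python/Flask download endpoint - the directory, route name and session key below are made-up examples; the same approach works in whatever language your site runs on):

# Minimal sketch of a login-gated download script (Python/Flask).
# DOWNLOAD_DIR, the route and the "logged_in" session key are hypothetical.
from flask import Flask, abort, session, send_from_directory

app = Flask(__name__)
app.secret_key = "change-me"  # required for session support

# Files live outside the web root, so they cannot be linked to directly.
DOWNLOAD_DIR = "/srv/protected-files"

@app.route("/download/<path:filename>")
def download(filename):
    # Only logged-in users get the file; everyone else gets a 403.
    if not session.get("logged_in"):
        abort(403)
    # send_from_directory rejects paths that try to escape DOWNLOAD_DIR.
    return send_from_directory(DOWNLOAD_DIR, filename, as_attachment=True)

The point is that the web server never exposes the files directly, so a spoofed referer doesn't help - every request has to pass your own check first.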
Hello buddy, I hope the free software below can help you: HTAccessible - generate .htaccess files. Unless you work with .htaccess on a daily basis, the chances are you haven't got the exact syntax rolling around in your head when you need it. HTAccessible provides a simple interface for putting together some of the most commonly used Apache directives, without the need for yet another Google search. Price: free. OS: Win95/98/ME/2000/NT4.0/XP. Welcome, Dany
Wow, that is a great program. Thanks a lot for the recommendation, it will come in handy a lot in the future. I have been playing around with its settings for 20 minutes now, and have run into a problem. I hope someone here at Digital Point can help me. Here are the contents of the .htaccess file:

# Linking Control - allow only blank or listed referers
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?http://www.website.com/*/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://(www\.)?http://www.website.com/*/*/.*$ [NC]
RewriteRule \.(bmp|gif|jpg|png|rar|zip)$ - [F]

# Redirects (from site path to URL)
Redirect / http://google.com/

# Prevent viewing of htaccess
<Files ~ "^\.ht">
order allow,deny
deny from all
satisfy all
</Files>

# Disable directory listing from this point
Options -Indexes

# Omit all files from the directory listing
IndexIgnore *

Code (markup):

When I then try to download a file from "website.com", I get this:

This is what it says in the error log:

I would be very grateful if someone could help me fix this.
Try this instead:

RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com [NC]
RewriteRule \.(gif|jpe?g|png|rar|zip)$ - [F,NC]

Code (markup):

This says that if the referer is not from www.yourdomain.com or yourdomain.com (case insensitive), then return a 403 Forbidden for those file types (gif, jpg/jpeg, png, rar, zip). This isn't 100% though. Some proxies and ISPs strip the referer, so those visitors will be out of luck. Your best bet would be to put those files outside of your document root and serve them up via a download script. Hope this helps.
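For reference, with that fix in place the linking-control block from the earlier post would look something like this (yourdomain.com stands in for the real domain; the doubled http:// is gone):

Options +FollowSymLinks
RewriteEngine on
# Allow requests with an empty referer (direct downloads, some proxies)
RewriteCond %{HTTP_REFERER} !^$
# Allow your own site, with or without www
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com [NC]
# Everything else gets a 403 Forbidden for these file types
RewriteRule \.(bmp|gif|jpe?g|png|rar|zip)$ - [F,NC]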
Try checking the permissions on the file itself. Make sure it's owned by the right user/group and that it's readable by that user/group.
Do you mean on Windows? So you can understand the setup better: I have web space for the main website (running vBulletin with a download script). I have then set up a Windows server which will hopefully be the "file server" if I can get this to work. Would it work to have an extra download script on that server, given there is already one on the actual website? Thank you.
If it's a Windows server, then make sure the web server has read permission on the file. If you already have a download script, I would go ahead and use that.
Yes, the web server has read permissions. The download script I use only protects files that are local (on the web server), not files on another server. Can you suggest any other way for me to solve this so that people are unable to steal my bandwidth, since the .htaccess file is not working as it should?
Can anyone help, please? I didn't think this would be as big a problem as it has turned out to be. Is there possibly an error in the .htaccess file?