I'm particularly interested in software or a script that simply protects images on a website from being downloaded by mass-downloader apps... Any ideas? Thanks.
You can create your own anti-leech protection by simply editing your .htaccess file, or by downloading a script that does it for you. Further information on both options is available here: www.xentrik.net/htaccess/antileech.php http://blog.evisio.no/?view=plink&id=122
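For reference, here is a minimal sketch of the .htaccess approach those pages describe, assuming Apache with mod_rewrite enabled and with example.com standing in for your own domain:

    RewriteEngine On
    # Only block when a referer IS present and it is not one of your own pages;
    # requests with an empty referer (direct visits, some proxies) still get through
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    # Refuse image requests coming from anywhere else
    RewriteRule \.(gif|jpe?g|png)$ - [NC,F]

If you want images to stay reachable for visitors arriving from an image search, you could add another RewriteCond along the same lines whitelisting the search engine's referer, but that's a trade-off you have to decide on yourself.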
Thanks, but it seems that using .htaccess will prevent people from viewing my images if they find them directly in a Google search, for example... Am I correct? I'll try evisio though, and see what happens. Thanks!
Also, I wanted to ask: does a downloader app connect to a website in some way that differs from how a browser connects, and is there a way to allow only browsers to connect to a website?
Well, each browser or app sends its own "User-Agent" string, and you can check it in your .htaccess or in your server-side programs (PHP, Perl, ...). But that User-Agent can also be changed in some downloader apps; wget, for example, has a --user-agent option for exactly that.
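As a sketch, blocking by User-Agent in .htaccess looks like this (again assuming Apache with mod_rewrite; "Wget" here is just an example string to match):

    RewriteEngine On
    # Match clients whose User-Agent header starts with "Wget", case-insensitively
    RewriteCond %{HTTP_USER_AGENT} ^Wget [NC]
    # Return 403 Forbidden for anything they request
    RewriteRule .* - [F,L]

Of course, a user who runs wget with --user-agent="Mozilla/5.0" sails straight past this, which is the caveat above.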
The idea of examining the user-agent string is still of value. It will stop the casual downloader who does not know about user-agent strings. Just be aware that someone who is determined to rip your site will still manage to do it.
That's assuming people are skilled and knowledgeable enough to write their own leeching software, but not enough to set the user-agent string. I would guess 90% of people who grab images en masse are using off-the-shelf software to do it. Some of those programs identify themselves; others mimic a browser. I don't have a list of them, but Google can find you one pretty quickly, and you can at least make some headway: banning some of the mass downloaders is better than banning none.
Currently I'm using a small list in my .htaccess (rules sketched below). For apps: "Wget", "HTTrack", "WebCopier", "WebSauger", "WebReaper", "WebStripper", "Web Downloader". For unfriendly crawlers: "aipbot", "RufusBot", "voyager".
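In case it helps anyone, here's roughly what that list looks like as mod_rewrite rules. This is a sketch under the same assumptions as above (Apache, mod_rewrite enabled); the patterns are case-insensitive, and "Web Downloader" is quoted because of the space:

    RewriteEngine On
    # Known mass-downloader apps, matched anywhere in the User-Agent header
    RewriteCond %{HTTP_USER_AGENT} "(Wget|HTTrack|WebCopier|WebSauger|WebReaper|WebStripper|Web Downloader)" [NC,OR]
    # Unfriendly crawlers
    RewriteCond %{HTTP_USER_AGENT} (aipbot|RufusBot|voyager) [NC]
    # Refuse all requests from any of them
    RewriteRule .* - [F,L]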