Well, I'm writing a lot of content and want to make sure people don't just download my site's pages with various software and then browse them offline. Is there any way I can stop this? How can I use robots.txt effectively?
http://www.javascriptkit.com/howto/htaccess13.shtml should pretty much cover it, if I understood the question right.
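The gist of that article is blocking known download tools by their User-Agent string with mod_rewrite. A minimal sketch of the idea (these four agent strings are just examples, not the article's full list, and note that user agents can be spoofed):

```apache
# .htaccess sketch: deny requests from common offline-download tools.
# Requires mod_rewrite to be enabled on the server.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^HTTrack  [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP   [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget     [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport [NC]
# Any matching agent gets a 403 Forbidden for every URL.
RewriteRule .* - [F,L]
```

Keep in mind this only stops tools that identify themselves honestly; anything that fakes a browser user agent will sail right through.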
Excellent list, but one question - won't putting so much code in my .htaccess cause issues, like my server getting overloaded? If not, then I'll put the full code in my .htaccess right away.
Thanks for the link, I've been looking for something like that for a long time!
Your server will read through that faster than you can read the first letter - don't worry about overload =P.
Are you saying that you don't want people to look at your source code and copy-paste it into their WYSIWYG editors? In that case, as far as I know there is no way to protect your source code. Try scaring them off with a copyright notice. A big, fat copyright.