Does anyone know a good and easy tool that computes a CRC of every file in a given directory, so I can compare the CRCs with the previous day's? I'd like to use something like this on my shared hosting space, but I forgot the name of the tool. I could probably write a script with 'cksum'. Has anyone done that?
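A minimal sketch of the cksum idea, assuming the files to watch live under a directory called site/ (a hypothetical name) and yesterday's snapshot is kept as a plain text file:

```shell
# Build a small demo directory to checksum (site/ is an assumed name)
mkdir -p site
echo "hello" > site/index.html

# Snapshot: CRC and size of every regular file, sorted by path
find site -type f -exec cksum {} + | sort -k3 > checksums.yesterday.txt

# Simulate a change the next day, then take a fresh snapshot
echo "tampered" >> site/index.html
find site -type f -exec cksum {} + | sort -k3 > checksums.today.txt

# diff exits nonzero when any file's CRC changed; it lists the differing lines
diff checksums.yesterday.txt checksums.today.txt || true
```

Running the two find/cksum lines from cron and diffing the snapshots is essentially the whole scheme; the snapshots should be stored somewhere an intruder can't rewrite them.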
mtree is the standard tool for this on BSD boxes. I am sure Loonix has a port too. Run

mtree -c -K sha256digest

on the filesystem (or at the very least on the kernel and system binaries) and store the resulting specification offline. (You need -c to create the spec; a later plain mtree run compares the filesystem against it.)
Unfortunately it's not installed on my host; I'll send a support request about it. Right now I am using

find . -type f | xargs md5sum > results050307.txt

but it hashes even files I don't want to check, so it's hard to go through the results files by hand or with diff. Anyway, I'm not sure I should be running it on binaries at all; that should be the hosting company's job, since I'm not using my own server.
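One way to keep the unwanted files out of the listing is to filter with find's -path and -name tests before hashing. Here logs/ and *.log are assumed stand-ins for whatever you want to skip, and -exec replaces xargs so filenames with spaces don't break the pipeline:

```shell
# Set up a small demo tree (demo/, src/ and logs/ are assumed names)
mkdir -p demo/src demo/logs
echo 'int main(){return 0;}' > demo/src/main.cpp
echo "noise" > demo/logs/access.log

# Hash everything except the logs/ directory and *.log files;
# -exec ... + passes files to md5sum directly, avoiding xargs word-splitting
find demo -type f ! -path "demo/logs/*" ! -name "*.log" \
    -exec md5sum {} + | sort -k2 > results.txt

cat results.txt
```

With the noise excluded, a plain diff between two dated results files becomes readable again.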
Well, if you are looking to protect only C++ source files (for example), then why not make it:

find . -type f -name "*.cpp" | xargs md5sum > results050307.txt

The other thing I notice you missed is that you will need a hash of the results file as well; otherwise, how will you know that it hasn't also been tampered with?
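To cover the results file itself, you can hash it too and later verify it with md5sum's check mode; the filenames here just follow the ones used above:

```shell
# Pretend this is the snapshot produced by the find | md5sum run
echo "fake snapshot contents" > results050307.txt

# Hash the results file; keep this hash somewhere an attacker can't reach
md5sum results050307.txt > results050307.txt.md5

# Later: verify the results file has not been altered (exits nonzero on mismatch)
md5sum -c results050307.txt.md5
```

Of course this only pushes the problem up one level: the .md5 file has to live off the server (or at least somewhere write-protected) for the check to mean anything.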
That could be a good example. But then an attacker could just change the .o files instead, which is even better for him, because you never open them, so you won't notice if he adds malicious code. Of course, he'd have to change them again every time I recompile. As for your last question:

scp resultsfile.txt localhost:/