Hi. I used to use AWStats to analyze my server logs, but when the logs grew large it used too many resources. I then switched to Urchin, which seems to be more efficient than AWStats, but now Urchin is using too many resources too. I rotate the logs daily and each log is about 200-300 MB. I didn't think a log of that size should be a problem, but the server load gets very high (5-150) while Urchin is analyzing it... How do you analyze your server logs?
I use WebLog Expert. It's excellent. It sounds like you're probably on a *nix server, but for Windows servers you can run it on the server as a service, which works great for me. I used to use LiveStats, but it has a major shortcoming that WebLog Expert doesn't. You can also use WebLog Expert with *nix servers, but then you have to run it locally on your PC.
Analog is fast: www.analog.cx. I don't think it uses a lot of resources (I could be wrong); it has a LOWMEM setting and similar options to keep memory use down. You can also nice processes on Linux to turn down their priority; see man nice.
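For example, kicking off an analyzer run at the lowest priority looks something like this (the analyzer path and log path here are just placeholders; substitute whatever you actually run):

```shell
# Start the analyzer at the lowest scheduling priority (nice 19)
# so the web server keeps getting the CPU first.
# "/usr/local/bin/analog" and the log path are placeholder paths.
nice -n 19 /usr/local/bin/analog /var/log/apache/access.log

# If an analysis is already running and hogging the box, you can
# lower its priority after the fact by PID (12345 is a placeholder):
# renice -n 19 -p 12345
```

Note that renice on an already-running process usually needs root (or ownership of the process) to work.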
I still use AWStats, but I configured it to roll the logs over and set it up to run once an hour to distribute the processing load.
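In case it helps, an hourly cron entry for that looks roughly like this (the awstats.pl path and the "mysite" config name are guesses; adjust them to your install):

```
# crontab entry: update AWStats stats at minute 5 of every hour,
# at low priority so it doesn't compete with the web server.
# Path and config name below are placeholders.
5 * * * * nice -n 19 perl /usr/lib/cgi-bin/awstats.pl -config=mysite -update >/dev/null
```

Running the update frequently means each run only has a small slice of new log lines to chew through, which is what spreads the load out.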
logrotate (it should come with most distros) will also rotate logs and compress them, at whatever size or interval you want. You can then open the compressed logs later if you want to look at them again. Forgot to add that before.
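A minimal logrotate config for daily rotation with compression might look like this (the log path and rotation count are placeholders; tune them to your setup):

```
# /etc/logrotate.d/apache-access  (example; adjust path and counts)
/var/log/apache/access.log {
    daily           # rotate once per day
    rotate 14       # keep 14 old logs, delete older ones
    compress        # gzip rotated logs
    delaycompress   # leave the most recent rotation uncompressed
    missingok       # don't error if the log is absent
    notifempty      # skip rotation when the log is empty
}
```

delaycompress is handy when a daemon may still be writing to the just-rotated file for a moment; the file gets compressed on the next rotation instead.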