I've been wondering for a while if there are any web stats analysers that go into more detail than awstats, which I currently use. Awstats covers most of what I want, but I often find myself wanting more detail on 'search and spider visitors' and 'search keywords'. I keep wanting to know which pages the spiders fetched (not just how many), and with keywords, which engine the visitor searched, when, and who searched for it. I don't mind paying for software - especially stuff I can't write myself - but the only software I've seen that goes into this sort of detail is WebTrends and NetTracker, and the prices of those are simply obscene. What does everyone else use? Does anyone have suggestions for software that does go into the detail I want?
Damn, didn't even know this forum was here -- thought it strange I couldn't find any posts on stats software.
You MUST check out www.statcounter.com. They are amazing... and free! The only catch is that their log file is limited to the last 100 visitors, so the history of your stats only goes back so far; that's when you have to start paying. Not sure what the prices are like since I still just use the free version, but I'm always amazed at how extensive their information is. They even have an IP tracker that shows (on a map of the world) where your visitors are located. -Brian Renner entrepreneur7
It's fairly new. Try these threads:
http://forums.digitalpoint.com/showthread.php?t=26424
http://forums.digitalpoint.com/showthread.php?t=29782
http://forums.digitalpoint.com/showthread.php?t=14474
http://forums.digitalpoint.com/showthread.php?t=21144
You can see me posting the exact same thing in all of them.
OK, thanks for that Rob - looks like you're a fan of Urchin. As far as I can see (and you confirm in one of your posts) it doesn't deal with spiders. Entrepreneur7 - StatCounter seems to do the biz on keywords, but again it seems to have nothing on spiders/bots.

As I see it, my requirements are:
1) The standard stats - uniques per month/day/hour, pages visited, etc.
2) Spider stats - number of visits, when they visited, and more importantly exactly what they fetched. Ideally I'd like to be able to click on a link and see a list of pages, then click on a page within that list and see which bot fetched it and when - i.e. to be able to go in via spider and also via page.
3) Keyword stats - numbers for each keyword, plus the ability to see which SE sent that visitor and when (the sketch after this post shows the sort of thing I mean).

I would have thought this was a pretty standard requirement for most SEOs, and the free stuff can really only provide an overview. Looks like I'm going to have to shell out some serious money. I don't like outsourcing stats unless I can FTP old logs across, which many seem not to offer - I'm sure that some will. I don't like adding bits to my code in order to send stats to external servers; this also normally negates importing old logs. I need to analyse about 25 sites at present and imagine this will grow considerably. Ouch - something tells me this is not going to be cheap. I was hoping for some $200 bit of software that would do the job, but it looks more likely I'm looking at >$1K for what I want. I'm going to take a very close look at Sawmill, and also Urchin 6 if/when it comes out.
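For what it's worth, requirements 2) and 3) boil down to a log-parsing exercise. Here's a rough Python sketch of the idea, assuming Apache 'combined' format logs - the bot user-agent substrings and the per-engine query parameters below are illustrative guesses, not a definitive list:

```python
import re
from urllib.parse import urlparse, parse_qs

# Apache 'combined' format: host ident user [time] "request" status bytes "referrer" "agent"
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

SPIDERS = ("googlebot", "slurp", "msnbot")                 # UA substrings treated as bots
ENGINE_PARAMS = {"google": "q", "yahoo": "p", "msn": "q"}  # query param carrying the search term

def analyse(logfile):
    spider_hits = []   # (when, which bot, what it fetched)
    keyword_hits = []  # (when, engine, keyword, landing page)
    with open(logfile) as fh:
        for line in fh:
            m = LOG_RE.match(line)
            if not m:
                continue
            agent = m.group("agent").lower()
            for bot in SPIDERS:
                if bot in agent:
                    spider_hits.append((m.group("time"), bot, m.group("path")))
                    break
            ref = urlparse(m.group("referrer"))
            for engine, param in ENGINE_PARAMS.items():
                if engine in ref.netloc:
                    terms = parse_qs(ref.query).get(param)
                    if terms:
                        keyword_hits.append((m.group("time"), engine, terms[0], m.group("path")))
                    break
    return spider_hits, keyword_hits
```

Obviously a real analyser does far more (sessions, uniques, a reporting UI), but that's the core of the spider and keyword data I'm after.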
I personally see very little value in reporting spiders, and even consider spider activity a kind of "noise" that should be filtered out. But you place spider reporting as a major requirement. Would you please explain why? Maybe I'm missing something.
I want to know what pages are getting spidered and how often. Are the spiders ignoring certain pages? If so, I can look into why. Spiders are what provide the information that allows the SEs to rank a page, so it is (imho) important to know what the spiders are fetching - or rather, it's more important to be able to work out what the spiders aren't fetching (see the quick sketch after this post).

So far I'm impressed with Sawmill. It seems to do everything I want, and the ability to write notes about what I do and get a discount on the price appeals. I know I'm going to need more than 100 profiles, so I may well have some to rent out at some point if anyone's interested.
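To make the 'what aren't they fetching' point concrete: if you have a list of every page on the site (from a sitemap, say) and the set of paths the spiders fetched (e.g. from the log-parsing sketch above), the interesting report is just the difference. A trivial sketch with made-up data:

```python
# Every page the site actually has (e.g. pulled from a sitemap) - hypothetical.
all_pages = {"/", "/products.html", "/contact.html", "/deep/orphan.html"}

# Paths the spiders fetched, e.g. the spider_hits from the earlier sketch.
spidered = {"/", "/products.html", "/contact.html"}

# Pages the spiders never touched - these are the ones worth investigating.
for page in sorted(all_pages - spidered):
    print("never spidered:", page)   # -> /deep/orphan.html
```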
Interesting idea. Never thought about a desktop-based analyser. Immediate drawbacks that spring to mind: a) it runs on Windows; b) it means I would have to take a desktop with me when I went away. I already take my laptop if I'm away for more than a week, but if the net access is via modem (GPRS or whatever) or via internet cafes then it's just not workable. RDP across GPRS simply isn't usable, so VPN is not a viable cure. Personally I can't see how this could be considered a serious option for multiple-site analysis; there are imho simply too many drawbacks. For small sites it might be an option, but for tens or hundreds of sites I seriously doubt it cuts it.
I'm seriously impressed with Sawmill. Even more happy with the way the pricing works. It is priced per 'profile', and I had initially thought that a profile was a 'domain'. This is not so. A profile can be a collection of log files of the same type. Since all my servers use Apache, I can have all my domains sitting in the same profile. It took a slight adjustment to the log format for Apache to make this usable (something along the lines of the snippet below), but that wasn't exactly a great hassle. This fact alone makes this software actually pretty cheap in my mind. So it looks like I'll be going with Sawmill.
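For anyone wanting to do the same, the idea is to prepend the virtual host (%v) to Apache's standard 'combined' format so the analyser can tell the domains apart within one profile. A sketch - the exact format Sawmill wants may differ, so check its docs:

```apache
# httpd.conf - the 'combined' format with the virtual host name stuck on the front
LogFormat "%v %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
CustomLog logs/access_log vhost_combined
```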