You can use a combination of web log analyzers and web trackers. The web trackers will identify the human traffic, while the web logs will contain both bots and humans. You can even use special bot analysis tools to refine that further.
Sorry for my naivete but what tools can I use to do all those? Google Analytics? Awstats? Can you please tell me what I should do?
Hey, I have the same question about bots. Thanks for asking... I also have a query: roughly how often does Googlebot visit a site with a PageRank of 4?
A web log analyzer like AWStats will work for tracking the bots. There are also other tools and more programmable ways to detect bots. As for regular traffic, GoStats is a good option (you get your data immediately). @DR: There are likely many factors besides a site's PR that determine when Googlebot will crawl it. Your best bet is to track Googlebot's return visits in your logs and see for yourself.
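To illustrate the "more programmable" route mentioned above, here is a minimal sketch of classifying log entries as bot or human by matching the user-agent field of Apache combined-format log lines against known crawler signatures. The sample lines and the signature list are assumptions for illustration; in practice you would read your real access.log and maintain a fuller signature list.

```python
import re

# Hypothetical sample lines in Apache "combined" log format; in practice
# you would read these from your server's access.log.
SAMPLE_LOG = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 2326 '
    '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.10 - - [10/Oct/2023:13:56:01 +0000] "GET /about HTTP/1.1" 200 5120 '
    '"http://example.com/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
    '40.77.167.5 - - [10/Oct/2023:13:57:12 +0000] "GET /robots.txt HTTP/1.1" 200 68 '
    '"-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

# A few well-known crawler user-agent substrings; extend as needed.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "baiduspider", "crawler", "spider")

# The user agent is the last quoted field in the combined log format.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def classify(line):
    """Return 'bot' or 'human' based on the line's user-agent string."""
    match = UA_PATTERN.search(line)
    ua = match.group(1).lower() if match else ""
    return "bot" if any(sig in ua for sig in BOT_SIGNATURES) else "human"

counts = {"bot": 0, "human": 0}
for line in SAMPLE_LOG:
    counts[classify(line)] += 1

print(counts)  # {'bot': 2, 'human': 1}
```

Keep in mind that user-agent strings can be spoofed, so for anything serious you would also verify crawler IPs (e.g. by reverse DNS) rather than trusting the header alone.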