a) What's a good number of links for a sitemap file? I created a sitemap.xml using GSitecrawler and have just under 6K links. b) How does it help Googlebot traverse these pages? At present I have articles on my site which display irrelevant ads; I remember reading somewhere that relevant ads only appear after Googlebot has visited the page.
The maximum number of URLs per sitemap file is 50K, so 6K is well within the limit. A sitemap tells Googlebot about all the URLs on your site, which helps it crawl even pages that have no incoming links. You can set the priority of each page relative to the others; since you know better than the crawler which pages are important and which are not, this tells it the relative importance of pages within your site. You can also set the expected change frequency of each page, giving the bot a hint about when to visit it next.
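As a minimal sketch of what those per-URL hints look like in the XML, here is a small Python snippet that writes a sitemap with loc, changefreq, and priority entries. The URLs, frequencies, and priority values are placeholders, not recommendations:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

pages = [
    # (loc, changefreq, priority) -- hypothetical values for illustration
    ("https://www.example.com/",           "daily",   "1.0"),
    ("https://www.example.com/articles/",  "weekly",  "0.8"),
    ("https://www.example.com/about.html", "monthly", "0.3"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, changefreq, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority

# Writes sitemap.xml with an XML declaration; submit this file via GSC or robots.txt.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```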
It helps Google know when to re-index your pages, which saves you bandwidth, gets updates into the SERPs faster, and helps ensure all of your pages get indexed. If your sitemap grows past 50,000 URLs (or the 10 MB size limit), just split it across multiple files and use a sitemap index file. You can also compress XML sitemaps to save bandwidth. Some XML sitemap builder programs can do this, including mine.
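A rough sketch of the split-and-compress approach, assuming Python and placeholder example.com URLs and file names:

```python
import gzip
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol

def build_sitemaps(urls, base="https://www.example.com/"):
    """Write gzipped sitemap files of at most MAX_URLS each, plus a sitemap index."""
    index = ET.Element("sitemapindex", xmlns=NS)
    for i in range(0, len(urls), MAX_URLS):
        name = f"sitemap-{i // MAX_URLS + 1}.xml.gz"
        urlset = ET.Element("urlset", xmlns=NS)
        for loc in urls[i:i + MAX_URLS]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
        with gzip.open(name, "wb") as f:  # gzip compression to save bandwidth
            ET.ElementTree(urlset).write(f, encoding="utf-8", xml_declaration=True)
        ET.SubElement(ET.SubElement(index, "sitemap"), "loc").text = base + name
    # The index file is the one you submit to the search engines.
    ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)

# e.g. 120,000 placeholder URLs -> sitemap-1.xml.gz, sitemap-2.xml.gz, sitemap-3.xml.gz
build_sitemaps([f"https://www.example.com/page-{n}" for n in range(120_000)])
```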
Even though I stated that my homepage changes multiple times a day, I still see Googlebot visiting just three times a week. Could it be because the site is only PR3, or because it doesn't have a backlink from any authority site (well, I get backlinks from ezinearticles - I imagine that is an authority site!)?
What a sitemap does is tell the crawler that the page is supposed to change every day, but it's the crawler that decides whether or not to actually crawl it daily. Even if you set the frequency to "never", Googlebot may still crawl your page. The information provided in the sitemap is treated as a hint, not a directive.