I think you'd better use robots.txt to help Google find the sitemap.xml on your site, because if your site is active, Googlebot will visit it often to crawl the content.
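For reference, pointing crawlers at the sitemap takes a single `Sitemap:` line in robots.txt at the site root; the domain below is just a placeholder:

```
User-agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

Any crawler that supports the sitemaps protocol will pick this up without you having to submit the sitemap manually.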
How often is often, though? I have a bunch of blogs and websites. My blogs get crawled within a couple of days max, but my websites take a week on average, and it's a real pain because I concentrate more on my sites and update them more (adding 3 to 5 pages a day on average).
There are last-modification and change-frequency tags (`<lastmod>` and `<changefreq>`) that help Google determine how often your site should be crawled. That works for blogs and content sites, but it doesn't seem to work for ecommerce sites.
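To be concrete, those tags sit inside each `<url>` entry of sitemap.xml; the URL and date here are made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-article</loc>
    <lastmod>2009-05-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

Note that per the sitemaps protocol these are hints, not commands, which may be why some sites see no effect from them.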
If you are only going to change content and not add new pages daily, then you don't need to resubmit your sitemap. Google automatically determines how often you are updating the content and will crawl accordingly. You can check the crawl frequency in Google Webmaster Tools. Thanks.
What if you're going to be adding new pages but not changing the content of existing pages? For example, I have a site and every day I add three to five articles to it. How often should I be regenerating and resubmitting my sitemap?
There's no problem doing it every day, or even every hour if you have that particular need! In fact, you are helping Google's crawlers by telling them where to look. However, changing the frequency of submission won't change the frequency of Google's visits to index your pages. In my case, they come once every week or two and index around 20 new pages, but I have many more new pages waiting to be indexed than the ones they actually index!
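If you're adding pages every day, regenerating the sitemap can just be part of your publishing script. A minimal sketch in Python (the page list, dates, and frequencies are made up for illustration):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(pages):
    """Build sitemap.xml text from (url, lastmod, changefreq) tuples."""
    entries = []
    for url, lastmod, changefreq in pages:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{lastmod.isoformat()}</lastmod>\n"
            f"    <changefreq>{changefreq}</changefreq>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )

# Hypothetical pages, e.g. pulled from your CMS database:
pages = [
    ("http://www.example.com/", date.today(), "daily"),
    ("http://www.example.com/article-1", date.today(), "monthly"),
]
print(build_sitemap(pages))
```

Write the output over the old sitemap.xml and the crawlers pick up the new file on their next visit; as noted above, resubmitting more often doesn't make them visit more often.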
Exactly, I don't understand that. If I add approximately 30 pages between visits, they only index 5-10. Why is that? What can I do to get them to index all my new pages?