I have read up a bit on this, and it seems you basically just put the URLs you want crawled into a .txt or .xml file? But doesn't the Googlebot spider all your pages anyway as long as they are linked from the main page? So if all my pages are linked from the main page anyway, do I really need a Google Sitemap, since any SE bots spidering the site would go through them anyway?
Google Sitemaps are for getting more pages indexed by Google. If your site is already 100% indexed, then a sitemap will not really help you. There are some semi-useful stats in the Sitemaps account, though.
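If you just want something basic, as far as I know the simplest sitemap is a plain text file with one URL per line, something like this (example.com is just a placeholder):

    http://www.example.com/
    http://www.example.com/products.html
    http://www.example.com/contact.html

The XML format gives you more options, but a text file like that is enough to get started.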
Thanks. How do I know whether I am 100% indexed? I know I get spidered; I thought if I get spidered then my pages will definitely be indexed? I might still do a sitemap anyway, as it won't do any harm, I just wondered about my query, that's all.
A simple (but not certain) test: count the number of pages on your site and compare it with the number of results from a site: query (e.g. site:example.com); better still, check the actual URLs returned. Google Sitemaps also help when you change things on your site. If you use the XML format and specify the last-changed date, Google will crawl those changed pages faster than if it had to "find" the links to those pages again.
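For example, a minimal XML sitemap entry with a last-changed date looks roughly like this (the URL, date, and change frequency are just placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/page.html</loc>
        <lastmod>2006-05-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
    </urlset>

The <lastmod> date is the part that tells Google the page has changed since its last visit.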