I have a coupon website (http://couponeasy.com/). Being a coupon website, my content keeps changing automatically (new coupons are added and expired deals are removed). I wish to create a sitemap, but I realised there is not much point in including all pages, as they will be removed sooner or later and/or are canonicalised elsewhere. I have about 8-9 pages which are static, so I can include those in a sitemap. Now the question is: if I create a sitemap for just these 9 pages and submit it to Google Webmaster Tools, will Google's crawlers stop indexing the other pages? NOTE: I need to create the sitemap for getting expanded sitelinks.
Note that a sitemap does not automatically give you expanded sitelinks; whether they appear is entirely at Google's discretion.
I would recommend https://www.xml-sitemaps.com/. You can tell the software which pages to include in the sitemap, as well as how often to regenerate it. That said, I do see value in indexing the entire website.
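If you would rather generate the sitemap yourself than rely on a third-party tool, here is a minimal sketch in Python. The page URLs are placeholders, not your actual static pages, and `changefreq` values are just a guess; swap in whatever fits your site:

```python
# Minimal sketch: generate a sitemap.xml (sitemaps.org 0.9 schema)
# for a small set of static pages.
from xml.sax.saxutils import escape

# Placeholder URLs - replace with your real static pages.
STATIC_PAGES = [
    "http://couponeasy.com/",
    "http://couponeasy.com/about",
    "http://couponeasy.com/contact",
]

def build_sitemap(urls):
    """Return sitemap XML listing the given URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc><changefreq>monthly</changefreq></url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    # Write the sitemap where your web server can serve it,
    # e.g. the document root, then submit its URL in Search Console.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(STATIC_PAGES))
```

You could regenerate the file from a cron job whenever your static page list changes.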
Why not keep the pages alive? For example, if you have a page with coupons for a specific product, just show past coupons or a "coming soon" notice and keep the page up (in particular if each product's coupon page has some content of its own as well).