I recently started using sitemap index files for my website. There are 4 index files that together link to roughly 3,000 individual sitemap XML files, and each sitemap XML file contains roughly 110 links. With the exception of 5 links repeated in each XML file, all of the content is unique. Is this a good or a bad way of using sitemaps? Before switching to this format, I had grown to a little over 1,200 pages indexed; afterwards, I'm at around 600 pages. I don't know whether the change had a direct effect, but if it did, I'll happily undo it.
By the way, was this to save bandwidth, or...? I only generate a sitemap index file once I pass 50,000 URLs (the limit per XML sitemap).
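For what it's worth, here is roughly how that 50,000-URL trigger plays out in code. This is a minimal sketch, not anyone's actual implementation: the write_sitemaps function, the file names, the output directory, and the example.com base URL are all invented for illustration. It just splits a flat URL list into child sitemaps at the per-sitemap limit and writes an index file that lists them.

```python
# Minimal sketch: split a flat list of URLs into child sitemaps of at most
# 50,000 URLs each, then write a sitemap index that points at them.
# File names, output directory, and base URL below are hypothetical.
from pathlib import Path
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50_000  # per-sitemap URL limit from the sitemaps protocol

def write_sitemaps(urls, out_dir="sitemaps", base="https://example.com/sitemaps"):
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    children = []
    # Write one <urlset> file per chunk of up to LIMIT URLs.
    for i in range(0, len(urls), LIMIT):
        name = f"sitemap-{i // LIMIT + 1}.xml"
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in urls[i:i + LIMIT]
        )
        (out / name).write_text(
            f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>\n',
            encoding="utf-8",
        )
        children.append(f"{base}/{name}")
    # The index file simply lists the child sitemap URLs.
    index_entries = "\n".join(
        f"  <sitemap><loc>{escape(c)}</loc></sitemap>" for c in children
    )
    (out / "sitemap-index.xml").write_text(
        f'<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{index_entries}\n</sitemapindex>\n',
        encoding="utf-8",
    )
```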
No. It was so that over 250,000 pages that change hourly can be fully crawled. In a 24-hour period, I could theoretically have around 10,000 unique news pages stored and accessible forever. Each stock symbol has 110 links: 10 subpages (options, quotes, analyst, and headlines pages, among others) and 100 news article pages. I'm currently tracking close to 3,000 stock symbols.
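To make that layout concrete, here is a hedged sketch of how the ~110 URLs for one symbol might be assembled: the subpage names, URL patterns, and the symbol_urls helper are hypothetical, and only the structure (10 subpages plus up to 100 recent news articles, one sitemap file per symbol) reflects what's described above.

```python
# Hypothetical sketch: collect the ~110 URLs for one stock symbol.
# URL patterns, subpage names, and news-article IDs are invented for illustration.
SUBPAGES = ["options", "quotes", "analyst", "headlines"]  # ...plus the remaining subpages

def symbol_urls(symbol, news_ids, base="https://example.com"):
    # The fixed subpages for this symbol.
    urls = [f"{base}/{symbol}/{page}" for page in SUBPAGES]
    # Only the 100 most recent articles go into this symbol's sitemap file.
    urls += [f"{base}/{symbol}/news/{nid}" for nid in news_ids[:100]]
    return urls

# Writing each symbol's list as its own <urlset> file keeps every file near
# 110 URLs, and the four index files then list those ~3,000 per-symbol files.
```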