Should I try to limit the number of URLs in a sitemap at all? Is it bad for googlebot if the sitemap lists thousands of pages on the site?
The only reason I'm asking is that if the number of URLs gets that high, I'm concerned the bot won't actually go through them all, so I wanted to know whether it's a good idea to remove archived pages from the sitemap.
The maximum number of URLs an XML sitemap can contain is 50,000. It also must not exceed 50MB uncompressed (older versions of the protocol allowed only 10MB). If you go over either limit, split the sitemap into multiple files and list them in a sitemap index file.
Yes, you should split your sitemap into separate files. Keeping each file small (even around 100 URLs per file) can help the pages get crawled.
Google's guidelines clearly state a limit on sitemap size (I'm not sure of the exact figure, but I'm sure there is one); check them yourself and you'll see that you need to break your sitemap into multiple parts.
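To make the splitting concrete, here is a minimal sketch of how you might chunk a list of URLs into multiple sitemap files and generate a sitemap index that points at them. The `base_url` and `sitemap-N.xml` file names are assumptions for illustration; the 50,000-per-file figure comes from the sitemap protocol limit mentioned above.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Protocol limit per sitemap file (sitemaps.org).
MAX_URLS_PER_SITEMAP = 50_000

def build_sitemaps(urls, base_url="https://example.com/sitemaps/",
                   chunk_size=MAX_URLS_PER_SITEMAP):
    """Split `urls` into sitemap chunks and build a sitemap index.

    Returns (index_xml, [sitemap_xml, ...]) as byte strings.
    `base_url` is a hypothetical location where the chunk files
    would be hosted once written to disk.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    chunks = [urls[i:i + chunk_size]
              for i in range(0, len(urls), chunk_size)]

    sitemaps = []
    index = Element("sitemapindex", xmlns=ns)
    for n, chunk in enumerate(chunks, start=1):
        # One <urlset> file per chunk of URLs.
        urlset = Element("urlset", xmlns=ns)
        for u in chunk:
            SubElement(SubElement(urlset, "url"), "loc").text = u
        sitemaps.append(tostring(urlset, encoding="utf-8"))
        # The index lists where each chunk file will live.
        entry = SubElement(index, "sitemap")
        SubElement(entry, "loc").text = f"{base_url}sitemap-{n}.xml"

    return tostring(index, encoding="utf-8"), sitemaps
```

You would write each returned chunk to its own file, upload them alongside the index, and submit only the index file to Search Console; Googlebot discovers the individual sitemaps from there.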