I have a directory site. I added around 50,000 pages two months ago, submitted a sitemap with all 50,000 pages, and verified the site in Google. Only about 350 pages are in the index. Looking at the crawl stats, the maximum pages crawled per day is around 1,200. How can I accelerate crawling of a large site?
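One practical step with a sitemap that size: 50,000 URLs is the sitemap protocol's per-file cap, so it's common to split the list into smaller files under a sitemap index. Search Console then reports coverage per child sitemap, which makes it easier to see which sections of the directory Google is actually picking up. Below is a minimal Python sketch of that split; the 10,000-URL chunk size, file names, and example.com domain are placeholders, not anything from this site.

```python
# Minimal sketch: split one large URL list into several sitemap files
# plus a sitemap index. Chunk size, file names, and the domain are
# assumptions; the 50,000-URL / 50 MB per-file cap comes from the
# sitemap protocol itself.
from xml.sax.saxutils import escape

CHUNK = 10_000  # assumed chunk size; anything under 50,000 is valid
BASE = "https://example.com"  # placeholder domain

def write_sitemap(path: str, urls: list[str]) -> None:
    """Write one sitemap file containing the given URLs."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

def write_index(path: str, sitemap_urls: list[str]) -> None:
    """Write a sitemap index pointing at the per-chunk sitemap files."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in sitemap_urls:
            f.write(f"  <sitemap><loc>{escape(url)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    # Placeholder URL list standing in for the real 50,000 pages.
    all_urls = [f"{BASE}/page/{i}" for i in range(50_000)]
    children = []
    for n, start in enumerate(range(0, len(all_urls), CHUNK), start=1):
        name = f"sitemap-{n}.xml"
        write_sitemap(name, all_urls[start:start + CHUNK])
        children.append(f"{BASE}/{name}")
    # Submit only the index file in Search Console.
    write_index("sitemap-index.xml", children)
```

This won't make Googlebot crawl faster on its own, but it gives you per-chunk indexing data instead of one opaque 350-of-50,000 number.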
I think I know why. 50k pages? I'm guessing they're scraped? They'll be seen as content that isn't unique. You could add 100k more and I don't think it would make any difference.
I'm surprised you even got 350 indexed already. Getting crawled simply takes time; there's no VIP lane with Google, and all it takes is patience. Don't let this drive you crazy.
Google has upper limits on how many pages it will index per day... it looks like you've reached yours for now. I wouldn't expect all 50,000 to show up unless they're very unique.