Hypothetically, suppose I build links to a page that contains links to all of my pages. This page would be over 20MB and contain a directory of over 20,000 links, all on that single page. Can Googlebot handle this and index all those pages?
Google still recommends 100 links per page or fewer. A 20MB page would contain so many links that, even if Google were to index it from start to finish, the ranking power passed on by any one of those links would be insufficient to get Google to follow it and index the target page. While you certainly want to accommodate search engines when you create web pages, a page that is this unfriendly to users would very likely be ignored by Google to one extent or another. I can't think of any good reason to create a giant page of links. If you're trying to automatically generate a sitemap page, this is not a good way to do it. You would be better off manually creating a sitemap page that is limited to the most important pages (such as major category and sub-category pages).
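If the goal really is a sitemap page, something like the following sketch is closer to what I'm suggesting: a short, human-readable page listing only the major sections. The page titles and URLs are placeholders, not anything from your site.

```python
# Minimal sketch of a hand-curated HTML sitemap page limited to the most
# important pages (major categories and sub-categories), instead of one
# giant page of 20,000 links. Titles and URLs below are placeholders.
from html import escape

IMPORTANT_PAGES = [
    ("Home", "https://example.com/"),
    ("Widgets (category)", "https://example.com/widgets/"),
    ("Blue widgets (sub-category)", "https://example.com/widgets/blue/"),
    ("Red widgets (sub-category)", "https://example.com/widgets/red/"),
    ("About", "https://example.com/about/"),
]

def render_sitemap(pages):
    """Return a simple, human-readable list of links."""
    items = "\n".join(
        f'  <li><a href="{escape(url)}">{escape(title)}</a></li>'
        for title, url in pages
    )
    return f"<h1>Site map</h1>\n<ul>\n{items}\n</ul>"

if __name__ == "__main__":
    print(render_sitemap(IMPORTANT_PAGES))
```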
I vaguely remember there used to be a 100KB limit on indexable content, though that information might be outdated. Anyway, I would not do it, because it smells like spamming and could raise some red flags. And what happens when a poor user actually clicks through to that page...
I have a page with more than 500 links and it seems to be OK. It's the one time that PR matters: from what I've seen, pages with higher PR can have more links on them that will get indexed. I assume it's because Google 'trusts' pages with PR and is thus more likely to follow and index them. If you're doing it to get the pages indexed, then yes, go for it: put the links on a page that has high PR (in your case you'll want one with very high PR, because that's a lot of links). If you're doing it for the PR and want those linked pages to gain PR, no, it won't work, because the PR passed on gets divided across the links, and PR divided 20,000 ways means very little distributed to each of them. You may want to think about nofollow on the least important ones, or about breaking it up into, say, 1,000 links per page, paginated across 20 pages or more.
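To put rough numbers on the dilution point, here is a minimal sketch using the classic simplified PageRank model, in which a page passes roughly damping × PR(page) / outlink count to each page it links to. The PR value of 5.0 and the 0.85 damping factor are illustrative assumptions, not real Google internals.

```python
# Back-of-the-envelope illustration of PR dilution, using the classic
# simplified PageRank model: a page passes roughly
# damping * PR(page) / outlink_count to each page it links to.
# The PR value and damping factor are made-up inputs for illustration only.
DAMPING = 0.85

def pr_passed_per_link(page_pr: float, outlink_count: int) -> float:
    """Approximate PR passed to each linked page from a single hub page."""
    return DAMPING * page_pr / outlink_count

# A single hub page with 20,000 outlinks spreads its PR very thin:
print(pr_passed_per_link(page_pr=5.0, outlink_count=20_000))  # 0.0002125

# The same hub page with only 100 outlinks passes 200x more per link:
print(pr_passed_per_link(page_pr=5.0, outlink_count=100))     # 0.0425
```

Note that in this simplified model, paginating the list doesn't create more PR overall; each of the 20 smaller hub pages has its own (usually lower) PR to divide among its 1,000 links.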
Yes, Googlebot will index all your pages, but it needs some time. BTW, if you don't build some backlinks to the subpages, those subpages will end up in the SUPPLEMENTAL RESULTS index. You can check it by typing into Google search: site:yourdomain --> all indexed pages; site:yourdomain/* --> quality pages.
Yes, all your pages will be indexed by Googlebot, but it will take some time. For best results: you have already created an XML sitemap, right? Now also create HTML sitemap pages; since you have so many links, split them up and link to them as sitemap 1, sitemap 2, and so on, which is friendly for both visitors and Google's robots. Per HTML sitemap page you can add up to about 500 links, and an XML sitemap file can hold up to 50,000 URLs.
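Here is a rough sketch of that splitting approach. The URLs, filenames, and chunk sizes are hypothetical; the 50,000-URL cap per XML sitemap file comes from the sitemaps.org protocol.

```python
# Minimal sketch: split a large URL list into paginated HTML sitemap pages
# (about 500 links each) and XML sitemap files (well under the protocol's
# 50,000-URL limit), plus a sitemap index. All URLs/filenames are placeholders.
from html import escape
from xml.sax.saxutils import escape as xml_escape

urls = [f"https://example.com/page/{i}" for i in range(20_000)]  # placeholder URLs

def chunk(seq, size):
    """Yield consecutive slices of `seq` with at most `size` items each."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

# Paginated HTML sitemaps: sitemap-1.html, sitemap-2.html, ...
for n, group in enumerate(chunk(urls, 500), start=1):
    items = "\n".join(f'  <li><a href="{escape(u)}">{escape(u)}</a></li>' for u in group)
    with open(f"sitemap-{n}.html", "w", encoding="utf-8") as f:
        f.write(f"<h1>Site map (page {n})</h1>\n<ul>\n{items}\n</ul>\n")

# XML sitemaps plus an index file pointing at them.
sitemap_files = []
for n, group in enumerate(chunk(urls, 10_000), start=1):
    entries = "\n".join(f"  <url><loc>{xml_escape(u)}</loc></url>" for u in group)
    name = f"sitemap-{n}.xml"
    with open(name, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n</urlset>\n")
    sitemap_files.append(name)

index_entries = "\n".join(
    f"  <sitemap><loc>https://example.com/{xml_escape(name)}</loc></sitemap>"
    for name in sitemap_files
)
with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{index_entries}\n</sitemapindex>\n")
```

You can then point Google at the single sitemap-index.xml file (for example via a Sitemap: line in robots.txt) instead of submitting each file separately.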