Would it be easier to get a large website (1,000-ish pages) indexed faster if I made a subdomain for each page rather than having 1,000 individual pages? For example, instead of www.domain.com/page1.html, www.domain.com/page2.html, www.domain.com/page3.html and so on, I would have www.sub1.domain.com, www.sub2.domain.com, www.sub3.domain.com. From my previous SEO experience I know it's easy to get a couple of pages per site indexed fast in Google, but it can take months to get a whole site indexed properly. If Google actually treats subdomains as individual domains, as they claim, then I should be able to get all my pages indexed faster, shouldn't I?
No. Just make sure your XML sitemap is thorough, and get some deep links to the subpages you need indexed. That should be just as quick and effective as subdomains.
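If it helps, here's a rough sketch of what I mean by a thorough sitemap. The domain and the flat page1.html … page1000.html layout are taken straight from your post; the script itself and the file name are just illustrative, not any official tool. Note the sitemap protocol allows up to 50,000 URLs per file, so 1,000 pages fit comfortably in one:

```python
"""Minimal sketch: generate a sitemap.xml covering ~1,000 pages on one domain.
Domain and URL pattern come from the question; everything else is assumed."""
from xml.sax.saxutils import escape

BASE = "https://www.domain.com"  # example domain from the question
urls = [f"{BASE}/page{i}.html" for i in range(1, 1001)]

# One <url> entry per page, XML-escaped in case URLs contain & etc.
entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```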
Administering and optimizing all those subdomains can be a nightmare, and a single large site tends to do better anyway. How are you going to get links to all of those subdomains?
Keep all the pages under the one domain name. A large site with well-written XML sitemaps should index quickly enough.
SEORanter is right: building links to all of those subdomains would be a nightmare. Use subdomains to create an additional website within the domain, for example website.com and entertainment.website.com, not page1.website.com and page2.website.com.
Go with the same domain name. Proper linking between the pages plus an XML sitemap should solve your problem, I think. Submit that XML sitemap to Google and wait.
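Besides submitting it directly, you can also point crawlers at the sitemap from robots.txt with the standard Sitemap directive, which helps discovery. A quick sketch (the sitemap URL is a placeholder based on the question's domain, not anything you've posted):

```python
# Append a Sitemap directive to robots.txt so crawlers can find the sitemap
# on their own. SITEMAP_URL is an assumed location, adjust to your setup.
SITEMAP_URL = "https://www.domain.com/sitemap.xml"

with open("robots.txt", "a", encoding="utf-8") as f:
    f.write(f"\nSitemap: {SITEMAP_URL}\n")
```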