I ran a test where I interconnected 60 pages (59 inner pages and one root page), all linked to each other. There were no external links to the inner pages, only to the root domain. The root had a PR4, and it passed so much juice to the inner pages that they all received a PR3. That's pretty good, IMO.
The spiders need to be able to see that your site links together (obviously it's difficult to know exactly what criteria they're using). In most cases where navigation has been built with code or buttons, it pays to include standard text links on the page too; that way, even bots that can't follow your clever JavaScript navigation can still reach everything they need to. If you can't do that, you can add a sitemap so there is at least something linking your pages together (i.e. a sitemap as an HTML page).

I also plan to use a glossary of terms and a FAQ that link throughout my site. This not only builds up the internal links, but also lets you include search terms in your FAQ section that may directly mirror what a visitor would type into a search engine, giving you close to a 100% match for a specific search term. The glossary can be used to expand on your keywords and themes and to give your site more depth of content.

Did I over-answer?
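For anyone wondering what that looks like in practice, here's a bare-bones sketch of an HTML sitemap page; the page names and URLs are made-up examples, not from any real site:

```html
<!-- Minimal HTML sitemap: plain <a> links that any crawler can follow,
     even when the main navigation is script- or image-based.
     All URLs below are placeholders. -->
<!DOCTYPE html>
<html>
<head><title>Site Map</title></head>
<body>
  <h1>Site Map</h1>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/faq.html">FAQ</a></li>
    <li><a href="/glossary.html">Glossary of Terms</a></li>
    <li><a href="/widgets/blue-widgets.html">Blue Widgets</a></li>
    <li><a href="/widgets/red-widgets.html">Red Widgets</a></li>
  </ul>
</body>
</html>
```

The same idea works for the text-link fallback: put the plain links in the page footer under your fancy navigation, and the bots get both.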
That is a pretty interesting test, johnson. Could you expand on it? How were the links structured? Did you have 60 links on each page connecting the group?
This is like asking "Does breathing air make you healthier?" because the simple answer is yes. Internal linking is how search engine crawlers find the pages on your website, aside from sitemaps and backlinks. The anchor text also adds relevancy for the keywords and phrases it contains.
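To illustrate the anchor text point, a descriptive internal link beats a generic one; the URL and phrasing here are just made-up examples:

```html
<!-- Descriptive anchor text tells crawlers what the target page is about. -->
<a href="/guides/internal-linking.html">internal linking best practices</a>

<!-- A generic label still passes juice, but carries no keyword relevancy. -->
<a href="/guides/internal-linking.html">click here</a>
```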
Well, that's not the "only" method search engine crawlers can use. Self-built HTML sitemaps, XML sitemaps, and backlinks are other ways in. But the theory behind what you're saying is sound.
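For reference, an XML sitemap in the standard sitemaps.org format is just a flat list of URLs you submit to the engines; something like this, with the domain and dates as placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal XML sitemap per the sitemaps.org protocol.
     URLs and dates below are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/faq.html</loc>
  </url>
</urlset>
```

Unlike the HTML sitemap, this one is for the bots only; it won't build internal links or pass any juice, which is why you'd still want the on-page linking on top of it.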