A competing development firm in town uses a bizarre content management system that puts all page HTML and content in a long series of JavaScript files. The source code of each page is nothing but scripts. My guess is that these pages are effectively invisible to search engine spiders, but I'm not certain. Opinions?
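To illustrate the concern (hypothetical markup, not their actual CMS): a spider that doesn't execute JavaScript only sees the literal page source, so a script-only page yields no indexable text at all:

```python
# Sketch: what a non-JS crawler "sees" in a page whose content
# lives entirely in scripts. The markup below is made up for
# illustration; content1.js and the document.write call stand in
# for the CMS's generated files.
from html.parser import HTMLParser

page = """<html>
<head><script src="content1.js"></script></head>
<body><script>document.write('Welcome! All our copy is injected here.');</script></body>
</html>"""

class TextExtractor(HTMLParser):
    """Collects visible text the way a non-JS-executing spider would."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        # Script bodies aren't indexable text, so skip them.
        if not self.in_script and data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(page)
print(parser.text)  # [] -- nothing for the spider to index
```

Unless the crawler actually runs the JavaScript, there's simply no text left over to rank.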
Check out this thread. It talks about how Googlebot 2.0 will be able to look at .js files: http://forums.digitalpoint.com/showthread.php?t=60427&highlight=javascript