Hello, I have heard many times, and also read in the webmaster guidelines, that Googlebot can't crawl Flash and JavaScript. But Google employs high-level programmers and search engineers, so how is that possible? Please share your thoughts and research!
There is a solution for getting your JavaScript-generated content indexed, but it is not trivial at all. It is called "crawlable AJAX", and there is a pseudo-standard for it developed by Google. The idea is to make an HTML snapshot, i.e. a rendered version of your JavaScript-generated site/DOM, and serve that to the search engine crawlers instead of the original version of the site. I'm not sure whether other search engines support this technique, though.
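For reference, the pseudo-standard works by mapping "hash-bang" (`#!`) URLs to a special query parameter: when the crawler sees `#!` in a URL, it requests the same URL with the fragment moved into `_escaped_fragment_`, and your server answers that request with the HTML snapshot. A minimal sketch of the URL mapping (the function name and example URL are my own; the `#!` and `_escaped_fragment_` conventions come from Google's proposal):

```javascript
// Map a hash-bang URL to the URL a crawler would request
// under Google's "crawlable AJAX" scheme.
function escapedFragmentUrl(url) {
  const i = url.indexOf("#!");
  if (i === -1) return url; // nothing to map
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  // Append to an existing query string if there is one.
  const sep = base.includes("?") ? "&" : "?";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}

console.log(escapedFragmentUrl("http://example.com/page#!state=photos"));
// → http://example.com/page?_escaped_fragment_=state%3Dphotos
```

Your server then has to recognize requests containing `_escaped_fragment_` and return the pre-rendered snapshot for that state instead of the JavaScript app.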
A finished .SWF Flash file is fairly opaque: decompiling it to extract the text, searchable content, and any links is a task in itself that usually can't be automated, and often can't be done at all without the original .FLA source file. Bandwidth is another issue for Google's spiders: Flash files can be quite large, and when a spider crawls your site it basically reads only the text (HTML), not images or other assets that require large file transfers. This has always been a problem with Flash, and it's part of the reason I never use it.
I think Google's bots can execute JavaScript to some extent. For example, if you use a JavaScript redirect, Googlebot can detect it and may penalize your site.