Google finds it hard to crawl content in rich media files like Flash and Ajax. It cannot read the content of images, which is why we use alt attributes to describe them. And Google will not crawl content that is blocked by robots.txt.
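To illustrate that last point, here's a minimal Python sketch of how a well-behaved bot treats robots.txt (the example.com URLs are placeholders): it reads the rules first and only fetches what they allow.

```python
# Minimal sketch of honoring robots.txt before crawling a URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# A polite crawler skips anything the site has disallowed.
if rp.can_fetch("Googlebot", "https://example.com/private/page.html"):
    print("allowed to crawl")
else:
    print("blocked by robots.txt")
```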
Good question. We've been wrestling with this one for the past two years, and it has a simple answer and a more complex one.

Simple answer: UNIQUE, ORIGINAL content is good.

More complex answer: it depends heavily on your page structure and content. If you have a lot of duplicate content, or a page riddled with HTML errors or blackhat tricks, your chances of getting crawled and indexed well are slim. As a general guideline for a manually created site, aim for a lot of original text with no duplicates elsewhere on the web, written for human visitors (i.e. with a low bounce rate). Googlebot is keen on indexing content that retains people and attracts backlinks. So, as a blogger, your best bet is to write provocative, interesting, even controversial content and invite comments and discussion.

If you aim for more traffic and a larger site, the strategy is a bit different. Here's what we do (see the sketch after this list):

1) We crawl the web for little snippets of information in a specific niche (example: mp3 filenames, or text-only summaries of pages containing the words "accident attorney").
2) We build a database out of that info plus the place (site, deep URL) and time where we found it, along with extras like the page's meta keywords.
3) We "fatten up" that information by running additional scripts on it: grabbing a thumbnail screenshot of the source page, auto-translating the snippet into several languages, adding randomly selected on-topic RSS feed extracts, and adding "what is similar" and "recently viewed" modules to each deep page.
4) We add dynamic, user-influenced elements such as last searches, a shoutbox, and a geolocation Google Maps widget.
5) We repeat this for 1 million+ pages.

Once the deep pages reach a certain complexity and are interesting enough for human visitors, it turns out Googlebot is eager to crawl and index them. We've had a million pages indexed within four weeks.
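A hypothetical sketch of what steps 1-4 could look like for a single record (every helper name here is made up, standing in for a script you would write yourself):

```python
# Hypothetical sketch of the enrichment pipeline described above.
# capture_screenshot, translate and pick_on_topic_rss are NOT real
# libraries; each stands in for one of the custom scripts mentioned.
def build_deep_page(snippet, source_url, found_at):
    record = {
        "snippet": snippet,      # step 1: niche text snippet
        "source": source_url,    # step 2: where we found it
        "found_at": found_at,    # step 2: when we found it
    }
    # step 3: "fatten up" the record with extra unique content
    record["thumbnail"] = capture_screenshot(source_url)
    record["translations"] = [translate(snippet, lang)
                              for lang in ("de", "fr", "es")]
    record["related_feeds"] = pick_on_topic_rss(snippet)
    # step 4: dynamic, user-influenced modules rendered per request
    record["modules"] = ["last_searches", "shoutbox", "geo_map"]
    return record
```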
To make content easy to crawl: clean code structure, quality content, keyword-interlinked pages, and a proper sitemap that is kept up to date and submitted (a sketch of generating one is below).
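For the sitemap part, a minimal Python sketch using only the standard library (the example.com URLs are placeholders; you would submit the resulting file to Google yourself):

```python
# Minimal sketch of generating a sitemap.xml with placeholder URLs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for page in ["https://example.com/", "https://example.com/about"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "changefreq").text = "weekly"

ET.ElementTree(urlset).write("sitemap.xml",
                             encoding="utf-8", xml_declaration=True)
```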
Fresh, unique, quality content. But you still need to work on your on-page and off-page SEO to get crawled faster.
Google crawls most website content easily, except Flash- and Ajax-based websites. But Google always gives preference to unique content.
The answer is simple: the spider likes TEXT. The simpler the better. Just open View Source in your browser; that is what it "sees".
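To see what that means in practice, here's a rough Python sketch that strips a page down to the plain text a simple spider would extract (a real crawler does far more than this):

```python
# Rough sketch: reduce HTML to the visible text a spider indexes,
# using only the standard library.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = False  # ignore script/style contents

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

parser = TextExtractor()
parser.feed("<html><body><h1>Hello</h1><script>x()</script></body></html>")
print(" ".join(parser.chunks))  # -> "Hello"
```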
Google can crawl text, image, and video content. But Flash banners cannot be crawled by Google (I have read this on some SEO blog).
Post fresh and interesting content so that it can be crawled by the search engines. Using more text than images is a good approach when designing a page.