As I was reading IBM's Ajax tutorial this morning, one thought kept going through my mind: "this new Web 2.0 technology is bad for SEO!" I mean, customers want it on their sites, but how do I explain to them that search engines just can't index Ajax-powered sites? On one hand I want to dive into the new technology; on the other I don't want to drop out of the SERPs!
It's the same as JS in that respect: you wouldn't use JS to do everything, and neither should you use AJAX for all sites. Everything has its place, and you should use stuff like this when you need it, not just because it's there.
Depends what you're using AJAX in your pages for. It does break a lot of the "functionality" of a user's experience, in the same way frames did - but that's not SEO related. There's nothing to stop you SEOing your home page and documentation/FAQ pages, or even special sales pages, so you shouldn't miss out on any natural traffic. After all, it's only the transactional stuff; a search engine lands on a page the same way a user does if they type in the URL. ...and if you've got stuff you think will really bugger up the spider, then cloak it out - or don't let those pages be spidered. ShoutWire and Digg use AJAXy kinds of things in their pages and they rank highly.
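The "don't let those pages be spidered" approach above can be done with a plain robots.txt rule. A minimal sketch, with made-up paths for illustration (your transaction and AJAX-endpoint paths will differ):

```
User-agent: *
Disallow: /checkout/
Disallow: /ajax/
```

This keeps well-behaved spiders on the static, SEO-worthy pages and away from the script-heavy transactional ones.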
AJAX has nothing to do with "ranking highly"; it can only affect indexation of a site, Blackbug. For example, my site using Google Maps is a catalog of restaurants, gas stations, etc. in my city. The index of points is AJAX-generated. A search engine will never know that I have the "La Boheme" restaurant described, or that I even have a separate subpage dedicated to this restaurant - unless, of course, I give it this information somehow. But then... the subpage about La Boheme is AJAX-generated as well... so how will the search engine know that this subpage and my home page are different? It will only see some JavaScript, because everything else is generated on the fly. So... Google is encouraging webmasters to use AJAX when it can't really index it. A paradox? Maybe not really a paradox, but an unnatural situation where you have to either surrender functionality for good search rankings... or put in much, much more work just to make your site readable to search spiders... Ehhhh... I'm struggling with this right now... Hard task... and I have to combine PHP with JavaScript, which makes me schizophrenic, because I have to switch from "string".$var (PHP) to "string"+var (JS) every two minutes!
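The indexation problem above can be sketched in a few lines. This is a hypothetical illustration (the data, paths, and function names are invented, not from the poster's actual site): a spider reading the raw HTML of an AJAX-only index sees nothing but a script hook, while a server-rendered fallback gives it real links it can follow.

```javascript
// Invented sample data standing in for the catalog of points.
const points = [
  { name: 'La Boheme', slug: 'la-boheme' },
  { name: 'City Gas Station', slug: 'city-gas-station' },
];

// What the spider sees on an AJAX-only page: an empty container and a
// script tag. None of the point names or subpage URLs appear in the source.
function renderAjaxOnly() {
  return '<div id="index"></div><script src="/index.js"></script>';
}

// Server-rendered fallback: plain <a> links a spider can follow, which
// client-side script can later enhance with the AJAX behaviour.
function renderCrawlable(items) {
  const links = items
    .map(p => `<li><a href="/place/${p.slug}">${p.name}</a></li>`)
    .join('');
  return `<ul id="index">${links}</ul>`;
}

console.log(renderAjaxOnly().includes('La Boheme'));                      // false
console.log(renderCrawlable(points).includes('href="/place/la-boheme"')); // true
```

The fallback means writing each view twice (once in PHP, once in JS), which is exactly the extra work being complained about, but it is what makes the subpages visible to a spider at all.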
I never said anywhere that it has anything to do with ranking highly. Programmers are always faced with obstacles. You need to devise some way of defining entry points via static or dynamic URLs. If you generate a page the same way each time, then you'll be able to define a direct-access URL to it, surely? Other sites do this. Other than that, most instances of AJAX pages don't represent anything in a recreatable state - i.e. in application development, where you wouldn't want to link directly to a page state. My point in comparing frames to AJAX is that they can both be equally crap at entering a page in the state you want from an external link - search engine or otherwise (i.e. with everything in place, not just looking at a menu frame or something). But people got around that. Ah, that's the nature of web dev. Switching personality every five seconds...
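The "entry points via static or dynamic URLs" idea above can be sketched with a hash-based scheme, e.g. http://example.com/#/place/la-boheme. Everything here is illustrative (the URL shape, state fields, and function names are assumptions, not any real site's API): two small pure functions turn an application state into a linkable URL fragment and back again, so an external link can land the visitor with everything in place.

```javascript
// Turn an application state into a linkable URL fragment,
// e.g. { type: 'place', slug: 'la-boheme' } -> "#/place/la-boheme".
function stateToHash(state) {
  return '#/' + state.type + '/' + state.slug;
}

// On page load, recover the state from location.hash so a deep link
// restores the same view the AJAX navigation would have built.
function hashToState(hash) {
  const m = /^#\/(\w+)\/([\w-]+)$/.exec(hash);
  return m ? { type: m[1], slug: m[2] } : null;
}

console.log(stateToHash({ type: 'place', slug: 'la-boheme' })); // "#/place/la-boheme"
console.log(hashToState('#/place/la-boheme'));
```

In a browser you would read location.hash on load (and listen for it changing) and fetch the matching content; the round-trippable mapping is what makes a page state "recreatable" from a plain link.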
Web 2.0 bad for SEO? I don't think so. I actually posted an article about the whole 2.0 culture (it was on Digg's front page) and think it's definitely worth adapting your site to the "new" design part of the whole buzz. No need to be fully 2.0 updated, but go for the design. Google likes it (I guess). The full article is here: http://www.vipedio.com/roman/blog/20culture_full.html Enjoy it.