Official Google: "tl;dr: We are no longer recommending the AJAX crawling proposal we made back in 2009. In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. Because "crawlers … [were] not able to see any content … created dynamically," we proposed a set of practices that webmasters can follow in order to ensure that their AJAX-based applications are indexed by search engines. Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site's CSS or JS files." Thoughts? Am I correct to assume that nothing changed but some obsolete method descriptions? Or did the methods change as well? Share your best practices on the matter.
I think the #! (escaped fragment) scheme for getting AJAX URLs indexed still works, and nothing will change for existing sites.
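For context, the old scheme maps each #! URL to an "ugly" URL that the crawler fetches instead, with the server returning an HTML snapshot of the page there (example.com is just a placeholder):

    http://example.com/#!key=value  ->  http://example.com/?_escaped_fragment_=key=value

As far as I recall, Google's announcement does say that sites already using the scheme will continue to be indexed, though the recommendation for new sites is to serve crawlable JavaScript directly rather than rely on it.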