Need a suggestion. As far as I know, when you build your website with AJAX you will run into issues with search engine robots. But when you don't use AJAX on your website, you lose some very needed features, and that's really bad. We have a business website, templatesflow.com, and due to search engine difficulties we have cut all the AJAX parts from our website. Any opinion on a way out, so we can keep the needed features and still deal with the search engines?
I faced that problem three years ago. Now I have a lot of search-engine-friendly pages that get a lot of traffic from search engines, and on each page there is a link to open the AJAX page with all the features.
All I can say is choose one or the other; if sacrificing the AJAX is too much, leave it in there. If the AJAX is just there to look snazzy, it isn't worth it. Just my two cents.
I have the same issue. I'm developing a site with a lot of unique data that I do not want people stealing, so I intend to write a check in my PHP on the user-agent: if it's Google, Yahoo, MSN, Ask, etc., the site will show the version that is not built with AJAX. If it's not a known user-agent I will use AJAX, so I display the page and content but I don't put the content in the source file. Hopefully that will stop automatic spiders from grabbing my data.
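Something roughly along these lines, sketched in TypeScript rather than PHP, with an illustrative (not exhaustive) list of bot signatures and made-up template names, just to show the idea:

```typescript
// Illustrative (and incomplete) list of crawler signatures to match against.
const KNOWN_BOTS = ["googlebot", "slurp", "msnbot", "teoma", "ask jeeves"];

// Returns true when the User-Agent header looks like a known search engine bot.
function isKnownCrawler(userAgent: string | undefined): boolean {
  if (!userAgent) {
    return false;
  }
  const ua = userAgent.toLowerCase();
  return KNOWN_BOTS.some((bot) => ua.includes(bot));
}

// Hypothetical usage inside a request handler: serve the plain HTML page to
// known crawlers and the AJAX-driven page to everyone else.
function chooseTemplate(userAgent: string | undefined): string {
  return isKnownCrawler(userAgent) ? "page-plain.html" : "page-ajax.html";
}
```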
Just to give some ideas: it's like the vBulletin forums, which have a "lo-fi" edition that is search-engine friendly, while the main forum sometimes uses AJAX. I think it's better to build two versions. AJAX is great because it's eye-catching, but not in a search engine's eyes.

To solve this problem, you need to build the website so it can detect whether the visitor is a search engine or not (you can use the IP, e.g. Googlebot's IPs). If the visitor is a search engine bot, display the search-engine-friendly page; if the visitor is human, display the AJAX version. Another point: use the same URL for both the AJAX and the search-engine-friendly versions, and just put some IP detection at the top of the page to check the visitor's IP. I think this solution is great, but it does mean a kind of double development for your website. Have a great day.

Edit: one more thing: to save your precious bandwidth, make the search-engine-friendly version very, very simple (maybe text only?).
That will not stop content-stealing bots, because they usually spoof a known user-agent; it is very easy to change the user-agent in web copiers or in tools like wget. IMHO you shouldn't go that way. Googlebot and the other search engines crawl sites in both guises: usually identifying themselves as bots, but sometimes identifying themselves as humans (using a common user-agent and different IPs). If the bot detects different content in the search-engine version and the human version, your site will get some kind of penalty for sure. You can go with the two-versions solution, but you should use different URLs.
All that's needed is to provide value to your users. Use AJAX where it's most useful. For example, when users use the search box on your site, you can use AJAX to preload content so that they can go to the next page of results, or come back to the previous one, very quickly (see the sketch below).
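A minimal sketch of that preloading idea, assuming a hypothetical `/search` endpoint that returns the results markup; the endpoint, parameter names and query are made up:

```typescript
// Fetch the next page of results in the background so the markup is already
// in hand when the user clicks "next". The /search endpoint is a placeholder.
async function preloadNextResultsPage(query: string, currentPage: number): Promise<string> {
  const url = `/search?q=${encodeURIComponent(query)}&page=${currentPage + 1}`;
  const response = await fetch(url);
  return response.text();
}

// Kick off the preload after the current results render, and keep the markup
// around to swap into the results container on the next click.
let nextPageHtml = "";
preloadNextResultsPage("business templates", 1).then((html) => {
  nextPageHtml = html;
});
```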
Does Google publish any information on how their crawler handles AJAX pages? Besides, we have kept AJAX only for the login fields, cart manipulation, etc.
I'm curious to know what Google thinks about this. It's very similar to black hat SEO in which different content is provided to search engines while normal users see standard content. Of course the intentions in this case are all good, but I wonder if Google considers it to be black hat? Anyone have any thoughts on that? I personally feel that AJAX should only be used to increase usability, not provide content itself, but it's still a topic of interest to me.
We are not going to use "black hat" content. But the topic is still relevant; any comments? Is anyone informed about this?
As far as I know, the usual web crawlers don't make XMLHttpRequests, so sitemaps won't help spiders find the AJAX content.
Well, AJAX essentially means XML data fetched by JavaScript code running in the visitor's browser. But that XML data is not included in the HTML pages returned by your server. The usual crawlers, Googlebot, Slurp (the Yahoo bot), Ask and so on, get only that HTML data and not the XML data (mainly because the request that fetches the XML data is usually user-generated).
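To illustrate the point, this is the kind of client-side request involved (the URL and element id are just examples): the data only arrives after the browser runs this script, so it is never part of the HTML source a crawler downloads.

```typescript
// The browser requests this data after the page loads; a crawler that only
// reads the HTML source never sees it. URL and element id are placeholders.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/data/products.xml");
xhr.onload = () => {
  const container = document.getElementById("product-list");
  if (container && xhr.responseXML) {
    // Dump the fetched XML's text into the page, purely for illustration.
    container.textContent = xhr.responseXML.documentElement.textContent ?? "";
  }
};
xhr.send();
```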
Yes, I know that (and thank you for putting it in plain terms for me), but the point is: if we made two separate versions of the site, could we get banned by the search engines for that?