Ajax or not ajax?

Discussion in 'All Other Search Engines' started by happy_2b_kot, Sep 24, 2007.

  1. #1
    Need suggestion.
    As far as I know, when you build your website using ajax technology you will run into issues with search engine robots.

    But when you do not use ajax on your website, you lose some very needed features, and that's really bad.

    We have a business website, templatesflow.com, and due to search engine difficulties we have cut all the ajax parts from it.

    Any opinion on a way out - to keep the needed features and still deal with the SEs?
     
    happy_2b_kot, Sep 24, 2007 IP
  2. ajsa52

    ajsa52 Well-Known Member

    Messages:
    3,426
    Likes Received:
    125
    Best Answers:
    0
    Trophy Points:
    160
    #2
    I faced that problem three years ago.
    Now I have a lot of SE-friendly pages that get a lot of traffic from search engines, and on each page there is a link that opens the AJAX page with all the features.
     
    ajsa52, Sep 24, 2007 IP
  3. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Hmm - nice idea, but this may be confusing for visitors.

    Am I right?
     
    happy_2b_kot, Sep 24, 2007 IP
  4. WebTalkVB

    WebTalkVB Peon

    Messages:
    329
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    0
    #4
    All I can say is choose one or the other. If sacrificing the AJAX is too much, leave it in there; if the AJAX is just there to look snazzy, it isn't worth it. Just my two cents.
     
    WebTalkVB, Sep 24, 2007 IP
  5. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I was told to read this manual:
    AJAX and SEO

    What do you think - is it possible to combine them?
     
    happy_2b_kot, Sep 24, 2007 IP
  6. darrens

    darrens Peon

    Messages:
    808
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #6
    I have the same issue.

    I am developing a site with a lot of unique data that I do not want people stealing.
    So I intend to write my PHP to check the user-agent, and if it's google, yahoo, msn, ask etc., the site will show the version that is not ajax-built.

    If it's not a known user-agent I will use ajax, so I display the page and content but I don't put the content in the source file.

    Hopefully it will stop any automatic spiders trying to grab my data.
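    Something like this rough sketch (written in JavaScript just to illustrate the idea - the bot list is only an example, and note a determined scraper can fake its user-agent):

```javascript
// Rough sketch: pick which version of a page to serve based on the
// visitor's user-agent string. The bot pattern is only an example.
const BOT_PATTERN = /googlebot|slurp|msnbot|teoma|ask jeeves/i;

function isKnownBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

function versionToServe(userAgent) {
  // Known bots get the plain HTML version; everyone else gets AJAX.
  return isKnownBot(userAgent) ? 'static' : 'ajax';
}
```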
     
    darrens, Sep 24, 2007 IP
  7. izwanmad

    izwanmad Banned

    Messages:
    1,064
    Likes Received:
    14
    Best Answers:
    0
    Trophy Points:
    0
    #7
    Just to give some idea..... like the vBulletin forums: there is a lo-fi edition that is search engine friendly, while the main forum sometimes uses ajax things.

    I think it's better to build 2 versions.....

    - ajax is great because it's eye-catching, but not in the search engine's eye

    To solve this problem, you need to build the website so it can detect whether the visitor is a search engine or not (you can use the IP... etc... googlebot's IP)..

    If the visitor is a search engine bot, display the search engine friendly page.

    If the visitor is human, display the ajax version.....

    Another point: use the same URL for both the ajax and the search engine friendly version.... just put some IP detection at the header to detect the visitor's IP...

    I think this solution is great, but it needs some kind of double development for your website.

    Have a great day... :)

    edit: another thing: to save your precious bandwidth, use a very, very simple version as the search engine friendly version (maybe text only???).
     
    izwanmad, Sep 24, 2007 IP
  8. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #8
    Very nice idea - I will try to discuss this with our programming department.
     
    happy_2b_kot, Sep 24, 2007 IP
  9. ajsa52

    ajsa52 Well-Known Member

    Messages:
    3,426
    Likes Received:
    125
    Best Answers:
    0
    Trophy Points:
    160
    #9
    That will not stop content stealers, because usually they use a known user-agent. It's very easy to change the user-agent in web copiers or in tools like wget.

    IMHO you shouldn't go that way. Googlebot and the other search engines crawl sites in both formats, usually identifying themselves as bots, but sometimes identifying themselves as human (using a common user-agent and different IPs). If the bot detects different content in the search engine version and the human version, your site will get some kind of penalty for sure.
    You can go with the 2-versions solution, but you should use different URLs.
     
    ajsa52, Sep 24, 2007 IP
  10. Western

    Western Well-Known Member

    Messages:
    1,751
    Likes Received:
    40
    Best Answers:
    0
    Trophy Points:
    135
    #10
    I think 100% Ajax websites are not good -
    they're not good for Google indexing.
     
    Western, Sep 24, 2007 IP
  11. sarathy

    sarathy Peon

    Messages:
    1,613
    Likes Received:
    76
    Best Answers:
    0
    Trophy Points:
    0
    #11
    All that's needed is to provide value to your users.
    Use ajax where it's most needed, like:
    1) When users use the search box on your site, you can use ajax to preload content so that users can go to the next results page, or come back to the previous one, very quickly.
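    For instance, a tiny preload cache for paged results could look like this (just a sketch - `fetchPage` stands in for whatever your site actually uses to load a results page, e.g. an XMLHttpRequest call):

```javascript
// Toy preload cache for paged search results. `fetchPage` is injected,
// standing in for the real AJAX call that loads one results page.
function makeResultCache(fetchPage) {
  const cache = {};
  return {
    // Call this for page n+1 while the user is still reading page n.
    preload: function (page) {
      if (!(page in cache)) cache[page] = fetchPage(page);
    },
    // Instant if preloaded; otherwise falls back to a normal fetch.
    get: function (page) {
      if (!(page in cache)) cache[page] = fetchPage(page);
      return cache[page];
    }
  };
}
```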
     
    sarathy, Sep 25, 2007 IP
  12. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #12
    Does Google publish any information on how they (as a SE) handle ajax pages?

    Besides, we have kept ajax for the login fields, cart manipulation etc.
     
    happy_2b_kot, Sep 25, 2007 IP
  13. scriptman

    scriptman Peon

    Messages:
    175
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #13
    I'm curious to know what Google thinks about this. It's very similar to black hat SEO in which different content is provided to search engines while normal users see standard content. Of course the intentions in this case are all good, but I wonder if Google considers it to be black hat?

    Anyone have any thoughts on that?

    I personally feel that AJAX should only be used to increase usability, not provide content itself, but it's still a topic of interest to me.
     
    scriptman, Sep 27, 2007 IP
  14. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #14
    We are not going to use "black hat" content. But the topic is still relevant - any comments? Is anyone informed?
     
    happy_2b_kot, Sep 28, 2007 IP
  15. eli03

    eli03 Well-Known Member

    Messages:
    2,887
    Likes Received:
    98
    Best Answers:
    0
    Trophy Points:
    175
    #15
    Use a sitemap if you have a lot of ajax on your site.
     
    eli03, Sep 28, 2007 IP
  16. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #16
    We have had one for a long time.
     
    happy_2b_kot, Sep 28, 2007 IP
  17. ajsa52

    ajsa52 Well-Known Member

    Messages:
    3,426
    Likes Received:
    125
    Best Answers:
    0
    Trophy Points:
    160
    #17
    As far as I know, the usual web crawlers don't make XMLHttpRequests, so sitemaps won't help spiders find the AJAX content.
     
    ajsa52, Sep 28, 2007 IP
  18. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #18
    Ghm...
    Sorry, I am not a cool coder - could you explain
    in simpler terms?
     
    happy_2b_kot, Sep 28, 2007 IP
  19. ajsa52

    ajsa52 Well-Known Member

    Messages:
    3,426
    Likes Received:
    125
    Best Answers:
    0
    Trophy Points:
    160
    #19
    Well, AJAX is basically XML data fetched by javascript code running in the client's browser. But that XML data is not included in the HTML pages returned by your server.
    The usual crawlers - Googlebot, Slurp (Yahoo's bot), Ask, ... - get only that HTML data and not the XML data (mainly because the request that fetches the XML data is usually user-generated).
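    A toy example of the difference (all names made up):

```javascript
// What your server sends - and therefore all that a crawler indexes:
const initialHtml = '<div id="list">Loading...</div>';

// What the browser does afterwards with the text it fetched via
// XMLHttpRequest (sketched as a pure function to keep it runnable):
function renderAjaxContent(html, fetchedText) {
  return html.replace('Loading...', fetchedText);
}

// A human's browser ends up showing the real content; the crawler
// saw only the "Loading..." placeholder in initialHtml.
const humanView = renderAjaxContent(initialHtml, 'Business Template #1');
```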
     
    ajsa52, Sep 28, 2007 IP
  20. happy_2b_kot

    happy_2b_kot Peon

    Messages:
    234
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #20
    Yes, I know that (and thank you for translating it into simple terms for me), but the point is: if we made 2 separate websites, could we get banned by the search engines for that?
     
    happy_2b_kot, Sep 28, 2007 IP