Canonical vs. robots disallow?

Discussion in 'Search Engine Optimization' started by atxsurf, Jul 3, 2012.

  1. #1
Why bother with the canonical tag instead of just disallowing robots access to all non-canonical pages (presuming that is possible, i.e. the non-canonical pages have distinct URL patterns)? What is the difference?
     
    atxsurf, Jul 3, 2012 IP
  2. deeptigarg

    #2
Basically, the canonical tag is used to resolve duplicate content. For example: you have a website without www. and you submit it to Google, and it gets indexed. Then you add www. to your domain name and that version gets indexed too. Google now counts two websites, one with www. and one without, and in Google's eyes that is duplicate content. Robots.txt, on the other hand, is used where we don't want particular pages of the website to be crawled by Google.
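For reference, a minimal sketch of what that looks like: the non-www pages would carry a canonical tag in their <head> pointing at the preferred www version (example.com is a placeholder domain):

```html
<!-- Served on http://example.com/page.html; tells Google the www URL is canonical -->
<link rel="canonical" href="http://www.example.com/page.html">
```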
     
    deeptigarg, Jul 4, 2012 IP
  3. dcristo

    #3
They serve different purposes. The canonical tag tells Google which URL is the preferred one, to prevent duplicate content issues.

On the other hand, robots.txt tells the crawler what parts of your site you don't want crawled. (It blocks crawling, not indexing; a disallowed URL can still appear in the index if other sites link to it.)
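As a quick sketch (the directory name here is hypothetical), blocking all crawlers from a section of duplicate pages would look like this in robots.txt:

```txt
# Applies to all crawlers; blocks everything under /duplicates/
User-agent: *
Disallow: /duplicates/
```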
     
    dcristo, Jul 4, 2012 IP
  4. atxsurf

    #4
You don't need to tell me that the difference is that canonical is a tag and robots.txt is a file, nor what they are for, as that is obvious and it was not my question.
The question was: what will the SEO difference be if I use a robots.txt disallow to prevent bot access to the duplicate content pages, instead of putting a canonical tag in each of them?

    I guess the answer could be:
If any of the duplicate content pages have backlinks, then using the canonical tag will allow those pages to be crawled and their link juice to be passed on to other pages (or to the canonical page?), while if a duplicate page is disallowed, no link juice can be passed at all, because the page can't be crawled.
     
    atxsurf, Jul 4, 2012 IP
  5. dcristo

    #5
Well, your question really wasn't that well asked.

If you have canonical issues, use the canonical tag. This is done automatically by the All in One SEO Pack plugin on WordPress sites.

The robots.txt file is really only used to block certain crawlers from all or parts of your site. It's not going to provide any additional SEO benefit.
     
    dcristo, Jul 4, 2012 IP
  6. atxsurf

    #6
I don't think there is such a thing as "canonical issues"; there might be "duplicate content issues", which can be addressed in numerous ways depending on the situation:
    1) canonical tag
    2) 301 redirect
    3) robots.txt disallow
    4) making duplicate pages more unique
    5) deleting duplicate pages
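For instance, option 2 is usually handled at the server level. A rough sketch for Apache (assuming mod_rewrite is enabled and example.com is a placeholder domain), 301-redirecting the non-www host to the www host in .htaccess:

```apache
RewriteEngine On
# If the request came in on the bare domain...
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# ...permanently redirect to the same path on the www host
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```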

My question was about the SEO effect of 1 vs. 3.
     
    atxsurf, Jul 4, 2012 IP