external links ... no follow

Discussion in 'Link Development' started by darrens, Jul 18, 2006.

  1. #1
    Hi Guys,

I have an area on my site that is very popular, so I don't want to remove it, but it keeps getting spammed.
    The area allows people to post comments/links onto a page.

How can I tell Yahoo/MSN/Google to ignore the links?

I remember reading something about a 'nofollow' attribute? Also something about being able to block them using an .htaccess file or a robots.txt file?

Any suggestions? What's the best way?
    Will it stop all spiders from following the links?

    Thanks in advance.
     
    darrens, Jul 18, 2006 IP
  2. Monty

    Monty Peon

    Messages:
    1,363
    Likes Received:
    132
    Best Answers:
    0
    Trophy Points:
    0
    #2
If it's spammed by humans, you can use the "nofollow" attribute (and 'advertise' that you use it, because if spammers don't know about it, it's kind of pointless).

If the spam comes from bots, you can try a CAPTCHA (a verification form).
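To illustrate the nofollow idea above, here is a minimal sketch in Python of rewriting user-posted links so search engines won't pass PageRank through them. The function name is made up, and the regex approach is purely illustrative, not a robust HTML parser:

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that doesn't already have a rel attribute."""
    def rewrite(match):
        tag = match.group(0)
        if 'rel=' in tag.lower():
            return tag  # leave tags that already carry a rel attribute alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*>', rewrite, html, flags=re.IGNORECASE)

print(add_nofollow('<a href="http://example.com">spam</a>'))
# <a href="http://example.com" rel="nofollow">spam</a>
```

You would run every user-submitted comment through a filter like this before saving or displaying it.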
     
    Monty, Jul 18, 2006 IP
  3. Cristian Mezei

    Cristian Mezei Notable Member

    Messages:
    3,332
    Likes Received:
    355
    Best Answers:
    0
    Trophy Points:
    213
    #3
    Cristian Mezei, Jul 18, 2006 IP
  4. Bondat

    Bondat Peon

    Messages:
    2,397
    Likes Received:
    217
    Best Answers:
    0
    Trophy Points:
    0
    #4
Try requiring a registration form before they can post comments/links.
    This won't eliminate spam, but it will at least cut it down.
     
    Bondat, Jul 18, 2006 IP
  5. mad4

    mad4 Peon

    Messages:
    6,986
    Likes Received:
    493
    Best Answers:
    0
    Trophy Points:
    0
    #5
If you allow people to post links on your site, the only thing you will get posted is spam. I suggest blocking any posts with HTML in them.
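A simple sketch of that check (in Python, with a made-up function name; a tag-shaped regex is a rough heuristic, not full HTML detection) might look like:

```python
import re

# Heuristic: treat anything that looks like an HTML tag as disallowed markup.
HTML_TAG = re.compile(r'<[^>]+>')

def contains_html(comment: str) -> bool:
    """Return True if the comment appears to contain HTML tags."""
    return bool(HTML_TAG.search(comment))

print(contains_html('Nice post!'))                     # False
print(contains_html('Visit <a href="spam">here</a>'))  # True
```

A comment that fails the check could be rejected outright or held for moderation.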
     
    mad4, Jul 19, 2006 IP
  6. Warkot

    Warkot Peon

    Messages:
    376
    Likes Received:
    31
    Best Answers:
    0
    Trophy Points:
    0
    #6
Hmmm. I take it you are using Apache... You can try putting a simple script in place that parses the posts and removes all the URLs from them automatically, or converts them to plain text, or replaces slashes with underscores, or whatever.
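One of those options (removing URLs outright) can be sketched in a few lines of Python; this is illustrative only, and the function name is made up:

```python
import re

# Match anything that looks like an http/https URL, up to the next whitespace.
URL = re.compile(r'https?://\S+', re.IGNORECASE)

def strip_urls(text: str) -> str:
    """Replace every URL in the text with a plain-text placeholder."""
    return URL.sub('[link removed]', text)

print(strip_urls('see http://spam.example/page now'))
# see [link removed] now
```

Replacing slashes with underscores instead would just mean swapping the `sub` replacement for a small function that rewrites the matched URL.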

The rel=nofollow attribute won't prevent spiders from crawling the links, but it will result in no PR or link popularity being transferred.

    Warkot
     
    Warkot, Jul 20, 2006 IP
  7. enposte

    enposte Active Member

    Messages:
    133
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    53
    #7
If you want genuine comments to continue, then, like mad4 said, disallowing HTML will remove the incentive for spammers.

If you don't want any comments, spam or not, then your system may allow you to disable comments entirely for that section.

    If the spam you're getting is automated, then it may be worth implementing a CAPTCHA (type what you see).
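As a stand-in for an image CAPTCHA, even a plain-text challenge question can stop naive bots. A minimal sketch in Python (the function name and flow are assumptions, not any particular library's API):

```python
import random

def make_challenge():
    """Generate a simple question/answer pair as a text-based CAPTCHA stand-in."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", str(a + b)

question, answer = make_challenge()
# Show `question` on the comment form, keep `answer` in the server-side
# session, and compare it with the user's input when the form is submitted.
print(question)
```

A real image CAPTCHA is harder for bots to defeat, but this illustrates the basic challenge/verify flow.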
     
    enposte, Jul 20, 2006 IP
  8. Warkot

    Warkot Peon

    Messages:
    376
    Likes Received:
    31
    Best Answers:
    0
    Trophy Points:
    0
    #8
My mistake: this will prevent Googlebot from crawling, according to Cutts.

    Warkot
     
    Warkot, Jul 21, 2006 IP
  9. darrens

    darrens Peon

    Messages:
    808
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #9
    Thanks guys ...
     
    darrens, Jul 23, 2006 IP