Many of us have been complaining about the spam appearing in search results lately. Well, there is something we can all do to help ensure we are not enabling the spammers: make sure the server log reports automatically generated by our websites are not indexable by the search engines.

I've been finding that a tremendous amount of bogus traffic to my sites is generated by bots spoofing legitimate user agents for the sole purpose of leaving the URL of a website they are trying to promote in the referrer field of server logs, in the hope that those entries will show up in search-engine-indexable log reports and so become back links to the promoted site. To shut this down, people need to either password protect those log reports, deny search engines access to them via the robots.txt file, or disable the generation of those reports on their web servers entirely.

Personally, I am finding dozens of incidents of referrer spam in my server logs each day. Getting listed well in the SERPs is hard enough as it is; let's not help out the spammers by giving them back links via our server log reports.
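For the robots.txt option, a minimal sketch might look like the following. The directory names are assumptions, since they vary by stats package (these are common defaults for tools like AWStats and Webalizer; adjust to wherever your server actually publishes its reports):

```
# robots.txt (served from the site root)
# Ask crawlers not to index auto-generated log/stats reports.
User-agent: *
Disallow: /awstats/
Disallow: /webalizer/
Disallow: /stats/
Disallow: /logs/
```

Keep in mind robots.txt only stops well-behaved crawlers; password protecting the report directory is the more reliable of the options above.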
Yes, that is true. I prefer to use robots.txt rules to stop this. These days a lot of spammers are doing this kind of thing: they create affiliate links, spam them everywhere, and earn money from the traffic.
From what I have been pulling out of my logs, a lot of what I'm seeing appears to be SEO firms creating back links for their clients' "legitimate" web sites. A single IP address will spam my logs with a half dozen or more different websites in the referrer field.