ia_archiver does not listen to robots.txt — can I block its IP? All my domains have blocked ia_archiver, but it still comes back even after a few months. It simply does not listen to my request not to crawl my sites. What can I do?
"disallow" is just a directive. It is not necessary that robots will follow it. This is even for Google. There are lotsa incidence that ia_archiver denies to follow the robots.txt directive. It is really hard to keep cool although it is hogging your server time and bandwidth but doesn't give you any referral.