Hi all, I've seen a lot of sites that won't allow Baiduspider to access/crawl them. Can you describe what the reason for this might be? Even the Digital Point site (TLD) doesn't allow Baiduspider to crawl it, i.e.:

User-agent: Baiduspider
Disallow: /
If I'm not mistaken, it's an aggressive spider. If you're on shared hosting but have a rather large site, you may want to restrict it to save bandwidth and server load; otherwise you don't have to.
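Since crawlers that ignore robots.txt can't be stopped that way, some admins also block by user agent at the server level. A minimal sketch for Apache, assuming mod_rewrite is enabled and rules can go in an .htaccess file (the pattern and placement are illustrative, not an official recipe):

```
# .htaccess — return 403 for any request whose User-Agent
# header contains "Baiduspider" (case-insensitive match)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baiduspider [NC]
RewriteRule .* - [F,L]
```

Note that robots.txt is only a polite request; a server-side rule like this actually refuses the requests, at the cost of still serving the 403 responses.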