Hi! Just yesterday I noticed that Google Webmaster Tools reported some strange errors on my site. For example, this broken link: http://www.devart.com/ibdac/ibdac15.exe/194.file I think it is somehow connected with multi-threaded downloads. Do you know whether such errors are harmful to the site, and is there any way to avoid them?
A multi-threaded download works differently from a normal download, so the error might be a reflection of that. Since no harm was done to your site, chances are this is just a script error. Such errors happen sometimes. Cheers.
I've just had the idea of disallowing indexing of .exe files in robots.txt. So I want to add these rules: Disallow: *.file Disallow: *.exe What do you think about it?
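If you go that route, note that robots.txt paths normally start with `/`, and the `*` and `$` wildcards are extensions honored by Google (and most major crawlers) rather than part of the original robots.txt standard. A sketch of what those rules could look like, assuming you want to block all crawlers:

```
User-agent: *
# Block crawling of any URL ending in .exe or .file
# ($ anchors the pattern to the end of the URL; * matches any sequence)
Disallow: /*.exe$
Disallow: /*.file$
```

Keep in mind this only stops crawling of those URLs; it won't remove errors already reported in Webmaster Tools, and it will also block legitimate downloads of your .exe files from being crawled.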
I do not know what your website URLs look like, but this may be caused by a backlink-building campaign. To be more precise, here is what can happen: if the webmaster of a site where you post your link decides to restrict the number of characters in the link field, your link may get trimmed, so the full URL is never published, and when Google comes and tries to follow that link it gets an error. There are also websites that modify your link, which will likewise cause an error on your server when Google tries to fetch that URL.