I have a crawler that downloads web pages from different domains, usually about 100 domains at any one moment. It runs on two different servers (both with Fedora Core 5) with very good links to the internet (100 Mbit). The problem is that after running for about two hours, the download speed falls from 2-5 Mbit to 10 Kbit! After the crawler has been stopped for about an hour, the speed picks up again. The server isn't loaded and there is no visible reason for the speed to drop. The channel itself is stable: if I download files manually at the same time, the speed is normal. There is no such problem on Windows servers, which suggests the problem is most likely with Linux. I have tried turning the firewall off, but it doesn't help. Any ideas as to what the reason or cure may be?
I think I addressed this problem in another forum where you posted. You need to look at resource usage in your code. Are resources handled the same way on Linux as they are on Windows? As I noted there, you can always run on what is already known to work: Windows.
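One way to check that suspicion from inside the crawler is to watch how many file descriptors the process holds over time: leaked sockets (responses that are never closed) accumulate until new connections slow down or fail. Here is a minimal sketch, assuming the crawler is written in Python (the posts never say) and runs on Linux, where `/proc/self/fd` lists the process's open descriptors. The `fetch` helper is hypothetical; the point is the `with` block, which guarantees the socket is closed even on errors:

```python
import os
import urllib.request


def open_fd_count():
    """Count file descriptors currently held by this process (Linux /proc)."""
    return len(os.listdir("/proc/self/fd"))


def fetch(url):
    """Fetch one page, closing the connection deterministically.

    Without the context manager, the response object keeps its socket
    open until garbage collection happens to run, and a long-running
    crawler can exhaust descriptors or local ports that way.
    """
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read()


if __name__ == "__main__":
    # Log this periodically from the crawler's main loop; a count that
    # climbs steadily across batches points to leaked connections.
    print("open descriptors:", open_fd_count())
```

If the count grows without bound while the crawler runs (but stays flat on Windows), that would explain why stopping for an hour "cures" it: the kernel eventually reclaims the sockets.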