This is one of many ways to do it, everything in one line of code:

fwrite(fopen($local_filename, 'wb'), file_get_contents($remote_url));
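Written out a bit more, the same idea looks roughly like this (a sketch only; $remote_url and $local_filename are assumed to already hold your values, and the one-liner above never closes its file handle):

<?php
// Sketch: same approach as the one-liner, with basic checks and an explicit fclose().
$data = file_get_contents($remote_url);      // whole remote file buffered in memory
if ($data === false) {
    die("Could not read $remote_url");
}
$fp = fopen($local_filename, 'wb');          // open local target for binary writing
if ($fp === false) {
    die("Could not open $local_filename for writing");
}
fwrite($fp, $data);                          // dump the buffer to disk
fclose($fp);                                 // close the handle the one-liner leaks
?>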
It wouldn't be able to handle very large files, because the data is temporarily stored as a string in memory. So it depends on your php.ini settings, or you can override them with ini_set:

ini_set('memory_limit', '700M');

but this is not recommended! Sorry, I don't have a GUI for this. You could try this one: http://www.hotscripts.com/Detailed/64402.html
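If memory is the real concern, streaming the download in chunks avoids buffering the whole file at once. A rough sketch (the 8 KB chunk size and variable names are my own assumptions, and allow_url_fopen must be enabled):

<?php
// Sketch: copy the remote file to disk in small chunks instead of one big string.
$in  = fopen($remote_url, 'rb');
$out = fopen($local_filename, 'wb');
if ($in && $out) {
    while (!feof($in)) {
        fwrite($out, fread($in, 8192));   // 8 KB at a time, constant memory use
    }
}
if ($in)  fclose($in);
if ($out) fclose($out);
?>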
You could look at this: http://us2.php.net/manual/en/ref.ftp.php But I don't know whether PHP is well suited for this type of functionality.
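If you do go the FTP route, the basic flow from that manual page is roughly this (a sketch; the host, credentials, and file names below are placeholders, not anything from this thread):

<?php
// Sketch using PHP's FTP extension; all connection details are invented.
$conn = ftp_connect('ftp.example.com');
if ($conn && ftp_login($conn, 'username', 'password')) {
    ftp_pasv($conn, true);                                    // passive mode often needed behind firewalls
    // download remote.zip from the server into local.zip
    if (ftp_get($conn, 'local.zip', 'remote.zip', FTP_BINARY)) {
        echo "Downloaded OK";
    }
    ftp_close($conn);
}
?>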
Why not use rapidleech, @keyaa? By the way, file_get_contents just returns a string, right? Can it really download a file onto our server? Sorry, I'm a newbie.
fsockopen(), copy() and fwrite(): I use all of them in my script with no problem. copy() works great for me on my site, a rapidshare premium link generator. It handles any amount of data too.
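For what it's worth, the copy() approach can be as short as this (a sketch, assuming allow_url_fopen is enabled and $remote_url / $local_filename are already set):

<?php
// copy() streams the remote URL straight to disk when allow_url_fopen is on.
if (copy($remote_url, $local_filename)) {
    echo "Saved to $local_filename";
} else {
    echo "Download failed";
}
?>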
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out  = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
Depends on whether the site uses GET or POST. The server won't reject the request unless it doesn't accept that method at all!
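If the target expects POST rather than GET, the same fsockopen approach works with a different request; here is a rough sketch (the host, path, and form fields are invented for illustration):

<?php
// Sketch: send a POST request over a raw socket instead of the GET above.
$post = http_build_query(array('user' => 'demo', 'file' => 'test.zip'));
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if ($fp) {
    $out  = "POST /download.php HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post;
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);     // prints headers and body of the response
    }
    fclose($fp);
}
?>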
As administrator, I have full access to the servers, so I would do it with the Linux command wget. For example:

<?php shell_exec("wget " . $url); ?>
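A slightly safer version of the same idea (a sketch; it assumes wget is on the PATH and the web server user can write to the current directory), escaping the URL and checking the exit code:

<?php
// Sketch: shell out to wget with the URL escaped, then check whether it succeeded.
$cmd = "wget " . escapeshellarg($url) . " 2>&1";
exec($cmd, $output, $status);
if ($status === 0) {
    echo "Download finished";
} else {
    echo "wget failed: " . implode("\n", $output);
}
?>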