PHP code for server to server?

Discussion in 'PHP' started by wvccboy, Apr 15, 2007.

  1. #1
    Hey

    I got an arcade script, and it comes with an almost 2GB game pack.

    I need a PHP server-to-server script to transfer the file to my server so I can extract it.

    Does anyone have any code for this?

    Thanks
     
    wvccboy, Apr 15, 2007 IP
  2. bobby9101

    bobby9101 Peon

    #2
    You can use the copy() function, and set_time_limit() to something like 6 hours so the script doesn't time out.
     
    bobby9101, Apr 15, 2007 IP
  3. wvccboy

    wvccboy Notable Member

    #3
    Thanks; any example of that code?

    All I know is

    But how do I add the URL?

    Thanks
     
    wvccboy, Apr 15, 2007 IP
  4. bobby9101

    bobby9101 Peon

    #4
    
    <?php
    set_time_limit(0); // 0 = no time limit
    copy('http://site.com/file.zip', 'file.zip') or die("error");
    ?>
    
    PHP:
     
    bobby9101, Apr 15, 2007 IP
  5. yourihost

    yourihost Well-Known Member

    #5
    Use shell if you have it on your servers.
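    For example, a quick (hypothetical) check for whether PHP is allowed to shell out on your host:

    <?php
    // Sketch: see whether shell_exec() is usable here; many shared hosts
    // block it via disable_functions in php.ini.
    $disabled = array_map('trim', explode(',', ini_get('disable_functions')));

    if (function_exists('shell_exec') && !in_array('shell_exec', $disabled)) {
        echo "shell_exec() is available";
    } else {
        echo "No shell access from PHP - stick with copy() or cURL";
    }
    ?>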
     
    yourihost, Apr 15, 2007 IP
  6. wvccboy

    wvccboy Notable Member

    #6
    Ok so in

    copy('http://site.com/file.zip', 'file.zip') or die("error");

    http://site.com/file.zip is the source, file.zip is the destination?

    Thanks
     
    wvccboy, Apr 15, 2007 IP
  7. bobby9101

    bobby9101 Peon

    #7
    correct, test a smaller file first ;)
     
    bobby9101, Apr 15, 2007 IP
  8. wvccboy

    wvccboy Notable Member

    #8
    Alright thanks.

    I'm having a lil problem:


    Parse error: syntax error, unexpected T_STRING in /home/lquid/public_html/lld/mygamescriptgames/copy.php on line 4
     
    wvccboy, Apr 15, 2007 IP
  9. bobby9101

    bobby9101 Peon

    #9
    <?php
    set_time_limit(0); // 0 = no time limit
    copy('http://site.com/file.zip', 'file.zip');
    ?>
    try that
     
    bobby9101, Apr 15, 2007 IP
  10. Barti1987

    Barti1987 Well-Known Member

    #10
    Use wget; it's easy, fast and reliable.
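    If you want to drive it from PHP rather than the shell directly, something along these lines might work (URL and directory are placeholders):

    <?php
    // Sketch: launch wget from PHP; -c lets a broken transfer resume
    // instead of re-downloading the whole 2GB, -P sets the target directory.
    set_time_limit(0);

    $url     = 'http://site.com/file.zip';
    $destDir = '/home/user/public_html';

    $output = shell_exec('wget -c -P ' . escapeshellarg($destDir) . ' '
            . escapeshellarg($url) . ' 2>&1');

    echo '<pre>' . htmlspecialchars($output) . '</pre>';
    ?>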

    Peace,
     
    Barti1987, Apr 15, 2007 IP
  11. lionheart008

    lionheart008 Guest

    #11
    I believe for copy() to work with a remote URL, allow_url_fopen must be turned on in php.ini. Some hosts keep it turned off because it can allow remote code execution in poorly written scripts.
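    A rough sketch of that check, with a cURL fallback for hosts that have it turned off (URL and filename are placeholders):

    <?php
    // Sketch: use copy() when allow_url_fopen is on, otherwise fall back
    // to cURL and stream the response straight to disk.
    set_time_limit(0);

    $url  = 'http://site.com/file.zip';
    $dest = 'file.zip';

    if (ini_get('allow_url_fopen')) {
        copy($url, $dest) or die('copy() failed');
    } elseif (function_exists('curl_init')) {
        $fp = fopen($dest, 'wb');
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_FILE, $fp);            // write body to the file handle
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_exec($ch) or die('cURL transfer failed');
        curl_close($ch);
        fclose($fp);
    } else {
        die('Neither allow_url_fopen nor cURL is available');
    }
    ?>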
     
    lionheart008, Apr 15, 2007 IP
  12. Subikar

    Subikar Active Member

    #12
    I think you should use wget instead of a PHP script, since the file is that large.
     
    Subikar, Apr 15, 2007 IP
  13. krakjoe

    krakjoe Well-Known Member

    #13
    2 gigs is a lot of data; it might be worth spending an hour writing something decent so the transfer itself takes less time.

    Say server A is the original and B is the new machine. If it were me, I would write a script for server A that descends into the directory with your files, reads each small file into memory and compresses it with gzcompress(file_get_contents()), echoes some identifying data (like the file's original location on server A relative to the starting path), and then echoes out the compressed data. On server B, a fairly simple cURL-based class receives the stream and writes it to disk as it arrives: every time an identifier line comes along, it creates the file at that location, uncompresses the data and writes it out a meg at a time. You could get that 2GB down to under 500MB this way. (See the sketch below.)
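    A rough sketch of the sender side of that idea (the base path is hypothetical, and the receiving script would need to parse the FILE: headers and gzuncompress() each chunk):

    <?php
    // Sketch: walk the game directory, compress each file and stream it out
    // with a header line the receiver can use to recreate the path.
    set_time_limit(0);

    $base = '/home/user/games'; // hypothetical source directory

    $iterator = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($base));

    foreach ($iterator as $file) {
        if (!$file->isFile()) {
            continue;
        }
        $relative = substr($file->getPathname(), strlen($base) + 1);
        $packed   = gzcompress(file_get_contents($file->getPathname()));

        // Header: relative path and compressed length, so the receiver knows
        // exactly how many bytes to read before the next header.
        echo "FILE:" . $relative . ":" . strlen($packed) . "\n";
        echo $packed;
    }
    ?>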
     
    krakjoe, Apr 16, 2007 IP
  14. bluesaga

    bluesaga Peon

    #14
    Why not gzip it in the shell, then wget it from the new server and unzip it there?
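    Something like this, driven from PHP, might do it; step 1 would run on the old server and step 2 on the new one (all paths and the URL are made up):

    <?php
    set_time_limit(0);

    // Step 1 (old server): pack the whole game directory into one archive.
    shell_exec('tar -czf /home/user/public_html/games.tar.gz -C /home/user games 2>&1');

    // Step 2 (new server): pull the archive across and unpack it.
    shell_exec('wget -c http://oldsite.com/games.tar.gz -P /home/user 2>&1');
    shell_exec('tar -xzf /home/user/games.tar.gz -C /home/user 2>&1');
    ?>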
     
    bluesaga, Apr 16, 2007 IP
  15. wvccboy

    wvccboy Notable Member

    #15
    Alright thanks :D

    Will try all the ideas out in a sec.
     
    wvccboy, Apr 16, 2007 IP
  16. stevecrozz

    stevecrozz Peon

    #16
    I would try to do this with scp; check the man pages for the correct syntax.
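    For what it's worth, a sketch of how that might look from PHP, assuming key-based SSH auth is already set up between the boxes (host, user and paths are placeholders):

    <?php
    // Sketch: pull the file from the old server with scp and report the result.
    set_time_limit(0);

    exec('scp user@oldserver.com:/home/user/file.zip /home/user/public_html/file.zip 2>&1',
         $output, $status);

    echo ($status === 0) ? 'Copied OK' : implode("\n", $output);
    ?>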
     
    stevecrozz, Apr 16, 2007 IP