Hi, I wanted to know what you guys do to back up websites with 1000+ files. I've been using my FTP client for the last 4 weeks, but SmartFTP is so slow it takes 2 hours to download all these separate little files. Ideally I would like one of the following:
- An online service provider where I supply my FTP access information and they do the rest: back up all files on a scheduled basis and put them in a separate folder or so...
- A little script or software tool that I install either on my server or on my PC, that runs scheduled backups and is faster and easier to use than a normal FTP client...
Whatever the solution is, it should be affordable for a smallish website. Any ideas? Cheers, Kai
What type of hosting do you have? Are you using cPanel? Full site backups would be easy assuming you have cPanel.
Assuming there are no databases involved, and that your server runs *nix, I suggest simply logging in via SSH and making a tarball of your entire root folder, then downloading that tar file via regular HTTP (it will be faster than pulling 1000+ files one by one).
Or move public_html.tgz inside the public_html folder and download it via HTTP:

    cp public_html.tgz public_html/
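If you want this to run on a schedule rather than by hand, a small shell function plus a cron entry covers it. This is only a sketch under the same assumptions (a *nix host with SSH access); the directory names and the keep-7-archives retention are examples, not anything your host requires:

```shell
#!/bin/sh
# backup_site SITE_DIR DEST_DIR
# Creates a date-stamped .tgz of SITE_DIR in DEST_DIR and keeps
# only the 7 newest archives. Paths below are illustrative.
backup_site() {
    site_dir="$1"
    dest_dir="$2"
    stamp=$(date +%Y-%m-%d)

    mkdir -p "$dest_dir"

    # -C changes into the parent dir first, so the archive contains
    # a clean "public_html/..." prefix instead of absolute paths.
    tar -czf "$dest_dir/site-$stamp.tgz" \
        -C "$(dirname "$site_dir")" "$(basename "$site_dir")"

    # Prune: delete everything past the 7 most recent archives.
    ls -1t "$dest_dir"/site-*.tgz 2>/dev/null | tail -n +8 |
    while read -r old; do
        rm -f "$old"
    done
}
```

Save it as something like backup.sh on the server, add a call such as `backup_site /home/you/public_html /home/you/backups` at the bottom, and schedule it via `crontab -e` (e.g. `0 3 * * * /home/you/backup.sh` for a nightly 3 a.m. run). Then you only ever download the single latest .tgz instead of thousands of small files.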