Hello,

My friend is a collector of old papers, stamps, etc. He asked me to download a set of web pages, including pictures, for him. It was quite a lot of work: parsing the HTML of the pages, following the links. (PHP5 has good routines for this, though.) The program can now use curl to download many pages, parse the HTML, cut away unnecessary information and display the result as one big page. My friend then just presses "save as web page, complete" in his browser and Firefox saves the page along with the included pictures.

However, there seems to be a limit on the size of the pages that can be saved. If the size of the entire page, including pictures, exceeds about 1-1.5 MB, the latter part won't load in Firefox. (The same happens on his computer and on mine.) The limitation seems to appear already when PHP is curling the source down to the server: if I write the HTML source of the page to the server's disk, not all of it gets written for larger pages. I want the program to be able to handle bigger batches than this.

As the program works now, it only saves the HTML, and the pictures are linked from the HTML back to the original site. The pictures are only saved when "save as web page, complete" is pressed in Firefox, which puts them in a folder called mypage-files.

I guess the problem could be solved by also using curl to download the pictures and modifying the downloaded source of the original page accordingly (roughly along the lines of the sketch at the bottom of this post). However, this is a hobby project and I think that would be too much extra work. (I already have to change the base address of the links in many cases.)

Can anyone think of a reasonably quick and simple way to save these web pages, including a large number of pictures, to disk so they can be easily accessed? Or maybe suggest another approach? My friend has tried many of those "download entire site" programs, but none of them worked well in his situation.

Thanks
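
For what it's worth, this is roughly the curl-and-rewrite approach I meant, in case it clarifies the question. It is only an untested sketch; the URL, the file names and the "pictures" folder are made up, and the relative-link handling is deliberately naive:

<?php
// Sketch: fetch a page with curl, save its images locally,
// and rewrite the <img> src attributes to point at the local copies.

function fetch($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}

$pageUrl = 'http://www.example.com/collection/page1.html'; // example URL
$html    = fetch($pageUrl);

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from sloppy real-world HTML

if (!is_dir('pictures')) {
    mkdir('pictures');
}

foreach ($doc->getElementsByTagName('img') as $img) {
    $src = $img->getAttribute('src');

    // Resolve relative links against the page URL (naive: assumes the
    // images live under the same directory as the page).
    $absolute = parse_url($src, PHP_URL_SCHEME)
        ? $src
        : dirname($pageUrl) . '/' . ltrim($src, '/');

    // Download the picture and store it locally.
    $local = 'pictures/' . basename(parse_url($absolute, PHP_URL_PATH));
    file_put_contents($local, fetch($absolute));

    // Point the page at the local copy instead of the original site.
    $img->setAttribute('src', $local);
}

// Write the rewritten page next to the pictures folder.
file_put_contents('page1.html', $doc->saveHTML());
?>

That would make the result independent of Firefox's "save as web page, complete" size limit, but as said, it feels like more work than this hobby project deserves, which is why I am hoping someone knows a simpler way.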