Hello, I would like to parse some external web files. I found out that cURL is better for me, because I can bind to different IPs and different IP C classes (which I bought for my server). Now, what does fopen have that cURL does not? With cURL, can I really only download the whole website at once? Is there a way to make cURL read it string by string? You know, like $line = fgets($handler, 1024);?
cURL opens an HTTP connection, so you don't actually know the size of the file; it doesn't work the same way the file system does. One way to do what you want is to pull the whole file with cURL, then explode the contents on the \n character. That way you can iterate through each line as desired, do your processing, and delete the downloaded copy afterwards if you saved it to disk. Hope this helps.
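A minimal sketch of that fetch-then-explode approach; the URL and the commented-out IP are placeholders for your own values:

<?php
// Fetch the whole page with cURL, then split it into lines.
$url = 'http://example.com/page.html';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // hand the body back instead of printing it
// curl_setopt($ch, CURLOPT_INTERFACE, '203.0.113.10'); // bind to one of your IPs, if needed

$body = curl_exec($ch);
if ($body === false) {
    die('cURL error: ' . curl_error($ch));
}
curl_close($ch);

// Walk the downloaded page line by line.
foreach (explode("\n", $body) as $line) {
    // ... process $line here ...
}
?>

Note that the explode only happens after the transfer has finished, so the full page is always downloaded first.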
No, you didn't get what I want to do... I want to parse the file while I read it, so that once I've found what I need, I can stop without downloading the full website; that will save me time!
For what you're doing it's best to get the full source, and if you do that you can use cURL. Otherwise, if you really want to read only the minimum (realize that this isn't advisable, since the markup and formatting can change; parsing the full source is more robust than cutting off where you think your data is), you can use fopen and read line by line with a while (!feof($fp)) loop, as sketched below.
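A minimal sketch of that line-by-line read, assuming allow_url_fopen is enabled in php.ini; the URL and the 'needle' search string are placeholders:

<?php
// Open the remote page as a stream and read it in chunks with fgets.
$fp = fopen('http://example.com/page.html', 'r');
if ($fp === false) {
    die('could not open URL');
}

while (!feof($fp)) {
    $line = fgets($fp, 1024);
    if ($line === false) {
        break;                                    // read error or end of stream
    }
    if (strpos($line, 'needle') !== false) {
        break;                                    // found what we wanted: stop reading
    }
}
fclose($fp);
?>

PHP's HTTP stream wrapper pulls data from the socket as fgets asks for it, so breaking out of the loop early should mean the remainder of the page is never downloaded.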