I'm using cURL and simplexml_load_string to pull in an RSS feed (eBay), reformat it a bit, and then display it. It works well, but it's slow. The eBay RSS feed URL is hard-set to return 100 items, but I only want to display 10. The way I do it at the moment is to fetch the entire page (100 items), then display just the first 10. This is SLOW and clumsy. Is there a way I can use a counter to read the RSS as it's being downloaded, and exit after 10 items?

PHP:
```php
function my_simplexml_load_file($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    $xml = simplexml_load_string(curl_exec($ch));
    curl_close($ch);
    return $xml;
}
```
Sure. Look in the PHP source code for SimpleXML and rewrite it to take a "lines" parameter, then convince your host to use your version. SimpleXML parses the whole document into memory, so you'd have to hook into the internals of the extension to see a "current line count". Practical answer: no.

There might be another way, depending on which part of the code is actually slow. (If it's the download from eBay itself, there's not much you can do unless you redesign the whole thing to scrape the feed as it arrives, piece by piece.) Instead of using simplexml_load_string, write your own streaming XML parser and stop it after 10 items. You have to weigh the time wasted by your users waiting for the data against the time you'd spend learning to write an XML parser.
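For what it's worth, cURL does let you bail out of a download mid-transfer via a write callback, so you can get most of the effect without writing a full parser. Here's a rough sketch, not a drop-in solution: it assumes a standard RSS 2.0 feed wrapped in `<rss><channel>` with `<item>` elements, and the function name `fetch_first_items` is made up for this example.

```php
<?php
// Sketch: abort the cURL transfer once we've seen $limit closing </item>
// tags, then truncate the buffer and hand a repaired fragment to SimpleXML.
function fetch_first_items($url, $limit = 10) {
    $buffer = '';
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION,
        function ($ch, $chunk) use (&$buffer, $limit) {
            $buffer .= $chunk;
            // Returning fewer bytes than we received makes cURL abort
            // the transfer (curl_exec() will report a write error —
            // that's expected here, not a failure).
            if (substr_count($buffer, '</item>') >= $limit) {
                return 0;
            }
            return strlen($chunk);
        });
    curl_exec($ch);
    curl_close($ch);

    // Cut the buffer just after the Nth </item> and close the document
    // so simplexml_load_string() gets well-formed XML.
    $pos = 0;
    for ($i = 0; $i < $limit; $i++) {
        $next = strpos($buffer, '</item>', $pos);
        if ($next === false) {
            break; // feed had fewer than $limit items; keep what we have
        }
        $pos = $next + strlen('</item>');
    }
    $partial = substr($buffer, 0, $pos) . '</channel></rss>';
    return simplexml_load_string($partial);
}
```

Note the caveat: if the server sends the whole 100-item response in one or two TCP bursts, aborting early saves you little — the real win is when the feed arrives in many chunks and you stop after the first few. Counting `</item>` with `substr_count` is crude string matching, not real XML parsing, so this only works for feeds where `</item>` never appears inside CDATA or comments.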