Hey fellow DP members,

Recently I wanted to start a new project to advance my PHP learning. The project I'll be working on is an online file scanner, like VirusTotal, NoVirusThanks, or Jotti's malware scan. Now, I don't have the money or time to purchase a server and install 20-40 Unix antivirus products, or the knowledge to configure the back-end PHP plumbing between the scanners and the web page. I've been googling for the things I need to know, but it's a lot of work.

So I figured I would use PHP to submit the file for scanning to another service such as http://jotti.org, then grab the results and display them. I know that if I wanted to save the results I would need to integrate MySQL, but for now I'm just going to stick to a single PHP page that displays the results once.

Is this possible with screen scraping? If so, I would appreciate it if someone could explain how I could do this and provide any code needed. This is what the script should do, in plain English:

1. Submit the file to Jotti's upload form.
2. Grab the scan result URL, e.g. http://virusscan.jotti.org/en/scanresult/817ea1803d477d89596148e7971e308da64060f3, and check whether the status is "scanning" or "finished".
3. While the status is "scanning", the script should just display text saying the file is still being scanned, along with the filename, file size, and file type.
4. When the status says the scan is complete, the script should screen-scrape the results and display them.

Thanks a lot for the help in advance.

Regards,
Alex
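Edit: to make it clearer what I'm after, here is the rough, untested sketch I have in mind using cURL. I haven't looked at Jotti's real page source yet, so the upload endpoint, the `scanfile` field name, the "Scan finished" marker text, and the result-table markup below are all guesses that would need to be replaced with whatever the actual HTML uses:

```php
<?php
// Rough sketch only -- endpoint, field name, status strings and result markup
// are assumptions; inspect Jotti's actual form and result page first.

// Step 1: submit the file to the upload form and capture the redirect.
function submitFile($path) {
    $ch = curl_init('http://virusscan.jotti.org/en/submit-file'); // assumed endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER         => true,   // keep headers so we can read Location
        CURLOPT_FOLLOWLOCATION => false,  // don't follow; we want the result URL itself
        CURLOPT_POSTFIELDS     => ['scanfile' => new CURLFile($path)], // assumed field name
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    // Step 2: the scan result URL usually arrives as a redirect header.
    if (preg_match('/^Location:\s*(\S+)/mi', $response, $m)) {
        return $m[1]; // e.g. http://virusscan.jotti.org/en/scanresult/...
    }
    return null;
}

// Steps 3 and 4: fetch the result page and decide whether the scan is done.
function showResult($resultUrl) {
    $html = file_get_contents($resultUrl);

    // Assumed marker text -- check the real page for the exact string.
    if (strpos($html, 'Scan finished') === false) {
        echo 'File is still being scanned, please refresh in a moment.';
        return;
    }

    // Scrape each scanner's verdict row (assumed markup).
    preg_match_all(
        '/<td class="scanner">(.*?)<\/td>\s*<td class="result">(.*?)<\/td>/s',
        $html, $rows, PREG_SET_ORDER
    );
    foreach ($rows as $row) {
        echo strip_tags($row[1]) . ': ' . strip_tags($row[2]) . "<br>\n";
    }
}
```

The idea would be to call `submitFile()` once from the form handler, stash the returned URL in the query string or session, and have the results page call `showResult()` each time it's refreshed. Does that approach sound workable?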