Hi again. Firstly, am I even posting these questions in the right forum? Sorry if I'm not, but I couldn't see a forum for Perl topics. Secondly, now that I have Perl scripts working, I've made a simple web crawler, but I want it to check the robots.txt file first, and there seem to be a few different classes in LWP for this. Does anyone know where I can find a decent tutorial on setting up a crawler that checks the robots.txt file and then fetches the given URL if it's allowed to? Thanks in advance.
So in case anyone else is curious, I found the answer: if you use LWP::RobotUA rather than LWP::UserAgent, it automatically fetches and checks the robots.txt file if one exists, and refuses requests to URLs the file disallows.
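Here's a rough sketch of what this looks like in practice. The bot name, contact address, and URL are just placeholders, so swap in your own:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::RobotUA;

    # LWP::RobotUA is a drop-in subclass of LWP::UserAgent.
    # It requires an agent name and a contact address ('from').
    my $ua = LWP::RobotUA->new(
        agent => 'MyCrawler/0.1',       # placeholder bot name
        from  => 'me@example.com',      # placeholder contact email
    );
    $ua->delay(1/60);   # pause between requests to the same host (value is in minutes)

    my $url      = 'http://www.example.com/some/page.html';   # placeholder URL
    my $response = $ua->get($url);     # robots.txt is fetched and checked for you

    if ($response->is_success) {
        print $response->decoded_content;
    }
    else {
        # a URL blocked by robots.txt comes back as "403 Forbidden by robots.txt"
        print "Couldn't fetch $url: ", $response->status_line, "\n";
    }

One thing that caught me out: delay() takes minutes, not seconds, and the default is a full minute between requests to the same server, so set it explicitly if your crawler seems slow.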