I'm looking for software to go to a site, follow all the links, and make a list of all the links it finds on that website. (I want to submit all the pages to a search engine.) Thanks, RonMo
What I'm really looking for is a program that strips out just the URLs and puts them in a text file, one URL per line. -Ron
Xenu's Link Sleuth can do that. You need to download and install it. When it finishes, it generates a report listing any errors along with all the internal and external links on the site you checked. If you prefer an online tool, try this Dead and Broken Links Checker; it takes about 30 minutes to run on a large site and lists the internal and external pages at the end. It may not cope with very large sites, though, so Xenu is the better choice for those.
http://www.xml-sitemaps.com will do that. It gives you options for a text file containing a list of URLs as well as HTML, XML, and ROR sitemaps.
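If you'd rather script it yourself, here is a minimal sketch of a same-site crawler in Python that follows internal links and writes every URL it finds to a text file, one per line. It assumes the requests and beautifulsoup4 packages are installed; START_URL and OUTPUT_FILE are placeholders you would change for your own site.

```python
# Minimal same-site crawler sketch: follows internal links and writes
# every discovered URL to a text file, one URL per line.
# Assumes requests and beautifulsoup4 are installed.
# START_URL and OUTPUT_FILE are placeholders, not from the original thread.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"
OUTPUT_FILE = "urls.txt"

def crawl(start_url):
    domain = urlparse(start_url).netloc
    seen = set()
    queue = [start_url]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that fail to load
        if "text/html" not in response.headers.get("Content-Type", ""):
            continue  # only parse HTML pages for further links
        soup = BeautifulSoup(response.text, "html.parser")
        for tag in soup.find_all("a", href=True):
            link = urljoin(url, tag["href"]).split("#")[0]
            # stay on the same site so the crawl eventually stops
            if urlparse(link).netloc == domain and link not in seen:
                queue.append(link)
    return sorted(seen)

if __name__ == "__main__":
    urls = crawl(START_URL)
    with open(OUTPUT_FILE, "w") as f:
        f.write("\n".join(urls) + "\n")
    print(f"Wrote {len(urls)} URLs to {OUTPUT_FILE}")
```

This is only a rough starting point; for a very large site you would want to add politeness delays and respect robots.txt, which the dedicated tools above already handle.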