I am talking about websites with 1-5 million pages or more. I found a good tool at AuditMyPc.com, but it keeps hanging. At this point I run it for about five minutes, it adds some rows of data, then I have to save the work to a file and restart the program (load the most recent data, start it again, and so on). In general, it looks like I won't be able to use it. Otherwise it would be a good fit, because it is easy to pull title tags from its output.

Another promising one I am testing is Xenu's Link Sleuth. I am running it on the same domain, but it looks like it will take a long time to crawl the entire site. I know Scrapebox can collect title tags from a list of URLs, something like a sitemap, but it is paid. I also know about DeepCrawl.com, but that one is expensive. Screaming Frog's tool will not handle sites of this size.

My question is: what is the best and most time-efficient way to scrape title tags from big sites, and possibly also create sitemaps, though that part is not necessary? Thanks.
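For what it's worth, if a URL list is already available (e.g. from an existing sitemap), a DIY script may be cheaper than any of these tools: fetch each page and pull out the `<title>` element. A minimal sketch using only the Python standard library, where the `urls.txt` filename is my own placeholder, not anything from a real tool:

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text of the first <title> element in a page."""

    def __init__(self):
        super().__init__()
        self._in_title = False   # currently inside <title>...</title>?
        self._done = False       # stop after the first title
        self._parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self._done:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
            self._done = True

    def handle_data(self, data):
        if self._in_title:
            self._parts.append(data)

    @property
    def title(self):
        text = "".join(self._parts).strip()
        return text or None


def extract_title(html):
    """Return the page title as a string, or None if there is none."""
    parser = TitleParser()
    parser.feed(html)
    return parser.title


# Fetching side (sketch): read URLs from a file and fetch them with a
# thread pool -- at millions of pages you would also want retries,
# rate limiting, and checkpointing so a crash doesn't lose progress.
#
# from concurrent.futures import ThreadPoolExecutor
# from urllib.request import urlopen
#
# def fetch_title(url):
#     try:
#         # read only the first 64 KB; the title is in <head> anyway
#         html = urlopen(url, timeout=10).read(65536).decode("utf-8", "replace")
#         return url, extract_title(html)
#     except Exception:
#         return url, None
#
# with open("urls.txt") as f:
#     urls = [line.strip() for line in f if line.strip()]
# with ThreadPoolExecutor(max_workers=50) as pool:
#     for url, title in pool.map(fetch_title, urls):
#         print(url, "\t", title)
```

Reading only the first chunk of each response keeps bandwidth manageable at this scale, since `<title>` lives in the document head.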