Hello, I'm going to share one of my own tools, which I think many people could use. Ever wanted to do a duplicate domain check between two .txt files without mixing them together?

I make a lot of auto-approve blog lists with unique domains, but once my first list was done, how could I make sure I didn't re-use the domains already in that first list? I quickly found that checking for duplicate domains in Scrapebox didn't work well with very large lists, or at least not as easily as it does now.

So I had this tool made for me. It checks two different .txt files for duplicate domains and then exports the duplicates and the uniques to their own .txt files. It can handle over 20 million URLs and finishes in mere seconds, whereas Scrapebox can get a bit troublesome because of its 1 million line limit and slow loading.

I hope some of you can make use of this tool, as I have.

Download the tool and instructions here: http://www.auto-approve-blogs.com/scrapebox-tools/unique-domain-filter-tool.html

Kaspersky Online Virus Check:
Known viruses: 4445056
Updated: 15-12-2010
File size (KB): 117
Virus bodies: 0
Files: 2
Warnings: 0
Archives: 1
Suspicious: 0
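For anyone curious how this kind of check works under the hood, here is a minimal Python sketch of the same idea: compare the domains in a second list against the domains already used in a first list, then write the duplicates and uniques to separate files. This is just an illustration, not the actual tool; the output file names (duplicates.txt, uniques.txt) and the www-stripping are my own assumptions.

import sys
from urllib.parse import urlparse

def load_domains(path):
    """Read a text file of URLs and return the set of their domains."""
    domains = set()
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            # urlparse only finds the host if a scheme is present
            if "://" not in url:
                url = "http://" + url
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return domains

def main(first_list, second_list):
    used = load_domains(first_list)          # domains already used (list 1)
    candidates = load_domains(second_list)   # new list to filter (list 2)

    duplicates = candidates & used           # domains that appear in both lists
    uniques = candidates - used              # domains only in the second list

    with open("duplicates.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(duplicates)))
    with open("uniques.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(uniques)))

    print(f"{len(duplicates)} duplicate domains, {len(uniques)} unique domains")

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])

Because both lists are held as sets, the comparison itself is near-instant even with millions of domains, which is why this kind of tool can get through 20+ million URLs in seconds.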