I have been building a list of directory and article submission URLs for a couple of hours now, and I know I have duplicate URLs in the lists, but going through the list manually looking for the URLs is quite tedious to say the least. Isn't there some sort of software I can use on a .txt file that would give me the duplicate URLs, so it would be easier for me to run the lists?
I would import the .txt file into Microsoft Excel, sort the data into alphabetical order, and then spot the duplicates much faster. I'm not sure if this is the solution for your case, though.
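If you're comfortable running a small script, the same "sort it, then look at neighbouring lines" idea can be automated. Here's a rough sketch in Python, assuming the file has one URL per line (the function name and sample URLs are just made up for illustration):

```python
def find_duplicates(urls):
    """Sort the URLs alphabetically and report any that appear more than once,
    mirroring the manual Excel approach of scanning adjacent rows."""
    duplicates = []
    previous = None
    for url in sorted(u.strip() for u in urls if u.strip()):
        # After sorting, duplicates sit next to each other.
        if url == previous and url not in duplicates:
            duplicates.append(url)
        previous = url
    return duplicates

# Example usage with a text file:
# with open("urls.txt") as f:
#     print(find_duplicates(f.readlines()))
```

This only flags exact duplicates; URLs that differ by a trailing slash or "www." would need extra normalization.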
That approach sounds very good, actually. I'll have to look into this Microsoft Excel program a little further to see how I need to do this, but it's a very good start for me. Thanks, man.
I downloaded Excel from the MS site, but I can't seem to get it to put everything into alphabetical order; there is no option available for this. I found another program called URL Alphabetizer, but it doesn't work that well: I get a \apr statement behind every URL and then have to edit this again, which really doesn't make things easier. I also found a website, alphabetizer.com, but it seems to be offline; I'm wondering when it's coming back, hopefully soon, because it looks promising. If anyone knows any other online tools or software for this, please let me know so I can make an effective article submission directory and directory submission list, with all the PageRank indications as well. If you help me out with this dilemma of mine, I won't forget you and will share the lists as well if you're interested. Thank you.
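If the online tools keep letting you down, a few lines of Python can strip the duplicates directly without any sorting or extra markup behind the URLs. This is only a sketch under the assumption of one URL per line; the function name and file names are placeholders:

```python
def dedupe_urls(lines):
    """Keep the first occurrence of each URL and drop repeats,
    preserving the original order of the list."""
    seen = set()
    unique = []
    for line in lines:
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            unique.append(url)
    return unique

# Example usage: read the messy list, write a cleaned copy.
# with open("urls.txt") as src:
#     cleaned = dedupe_urls(src.readlines())
# with open("urls_clean.txt", "w") as dst:
#     dst.write("\n".join(cleaned))
```

Because it keeps the first occurrence in place, your list stays in the order you built it instead of being alphabetized.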