I have a website with 150,000 pages, so sitemap generators take too much time. I already have the links listed one by one, for example:

http://www.site.com/come.html
http://www.site.com/go.html
http://www.site.com/come.html
http://www.site.com/re.html

How can I create a sitemap from a txt file full of URLs? I think it could be done with PHP, but I don't know how.
It will probably take around a day at most to scan that (hard to say; it depends on how fast the server is). Anyway, if you have a text file with all the URLs, you actually already have a valid sitemap you can submit to Google. XML is a bit better, but Google also accepts plain-text sitemaps.
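If you go the plain-text route, the file just needs one absolute URL per line, UTF-8 encoded, with no duplicates. A quick PHP sketch to tidy up a raw list (urls.txt and sitemap.txt are placeholder names):

```php
<?php
// Read the raw URL list, trim whitespace, drop blank lines and duplicates.
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$urls = array_unique(array_map('trim', $urls));

// A plain-text sitemap is simply one URL per line.
file_put_contents('sitemap.txt', implode("\n", $urls) . "\n");
```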
ThomasSchulz, thank you, I will try that. But I think XML would be better. I sent a PM to jazz7620 but haven't received an answer for a long time. Isn't there any tool to do it?
There are many sitemap generators, check my signature for instance. I am unsure if there is a sitemap text-to-XML tool... However, if you are not going to add more information to your XML sitemap files, you might just as well (almost, anyway) use a sitemap text file.
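If no ready-made text-to-XML tool turns up, the conversion is simple enough to sketch in PHP. One caveat: the sitemap protocol allows at most 50,000 URLs per file, so with 150,000 pages you would split the list into chunks and tie them together with a sitemap index. A minimal sketch along those lines (file names and the base URL are assumptions, not anything from this thread):

```php
<?php
// Load and de-duplicate the URL list (urls.txt is a placeholder name).
$urls = array_unique(array_map('trim',
    file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)));

$base  = 'http://www.site.com/'; // where the sitemap files will be hosted
$files = [];

// The protocol caps each sitemap at 50,000 URLs, so write them in chunks.
foreach (array_chunk($urls, 50000) as $i => $chunk) {
    $name = 'sitemap-' . ($i + 1) . '.xml';
    $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($chunk as $url) {
        $xml .= '  <url><loc>' . htmlspecialchars($url, ENT_XML1) . "</loc></url>\n";
    }
    $xml .= "</urlset>\n";
    file_put_contents($name, $xml);
    $files[] = $name;
}

// A sitemap index file points Google at all the chunks in one place.
$index  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$index .= '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($files as $name) {
    $index .= "  <sitemap><loc>{$base}{$name}</loc></sitemap>\n";
}
$index .= "</sitemapindex>\n";
file_put_contents('sitemap-index.xml', $index);
```

You would then submit only sitemap-index.xml to Google, and it discovers the chunk files from there.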
Why don't you add an RSS feed so new posts get picked up directly? You would then only need to create the sitemap of past posts once.
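For what it's worth, a feed is just another small XML file; Google accepts RSS feeds as sitemaps too. A rough sketch of the idea in PHP (the $posts array is purely hypothetical and stands in for however you would track newly added pages):

```php
<?php
// Hypothetical list of recently added pages: URL => title.
$posts = [
    'http://www.site.com/come.html' => 'Come',
    'http://www.site.com/go.html'   => 'Go',
];

$rss  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$rss .= "<rss version=\"2.0\"><channel>\n";
$rss .= "  <title>New pages</title>\n";
$rss .= "  <link>http://www.site.com/</link>\n";
$rss .= "  <description>Recently added pages</description>\n";
foreach ($posts as $url => $title) {
    $rss .= '  <item><title>' . htmlspecialchars($title, ENT_XML1) . '</title>'
          . '<link>' . htmlspecialchars($url, ENT_XML1) . "</link></item>\n";
}
$rss .= "</channel></rss>\n";
file_put_contents('feed.xml', $rss);
```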
infonote, there are 50,000 pages. How would I add RSS support? It's not an automated system; it's 50,000 static files, so that would be hard to do.