
any sitemap generators for large sites (100,000+ pages)?

Discussion in 'Google Sitemaps' started by greenflag, Jul 22, 2006.

  1. #1
    I can't find a solution for indexing large sites - are there any industrial-strength XML sitemap generators out there?
     
    greenflag, Jul 22, 2006 IP
  2. jalex

    jalex Active Member

    Messages:
    184
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    51
    #2
    You can generate one yourself if you know PHP. Read up on the sitemap file structure, or search Google for a PHP sitemap generator (a minimal sketch follows after this post).
     
    jalex, Jul 23, 2006 IP
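    A minimal sketch of the do-it-yourself approach jalex describes, assuming PHP, the standard sitemaps.org namespace, and a flat list of URLs (the example.com entries are placeholders):

        <?php
        // Minimal sitemap writer: stream a sitemap.xml for a flat list of URLs.
        // $urls is placeholder data; in practice it would come from your own
        // page list or database.
        $urls = array(
            'http://www.example.com/',
            'http://www.example.com/page-1.html',
            'http://www.example.com/page-2.html',
        );

        $fh = fopen('sitemap.xml', 'w');
        fwrite($fh, '<?xml version="1.0" encoding="UTF-8"?>' . "\n");
        fwrite($fh, '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n");

        foreach ($urls as $url) {
            // Escape each URL so characters like & stay valid XML.
            fwrite($fh, "  <url>\n");
            fwrite($fh, '    <loc>' . htmlspecialchars($url, ENT_QUOTES) . "</loc>\n");
            fwrite($fh, "  </url>\n");
        }

        fwrite($fh, "</urlset>\n");
        fclose($fh);
        ?>

    Writing each entry with fwrite() as you go, rather than building one big string, keeps memory use flat even when the URL list runs into the hundreds of thousands.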
  3. greenflag

    greenflag Active Member

    Messages:
    369
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    51
    #3
    I am looking for specific product recommendations...
     
    greenflag, Jul 23, 2006 IP
  4. Nick_Mayhem

    Nick_Mayhem Notable Member

    Messages:
    3,486
    Likes Received:
    338
    Best Answers:
    0
    Trophy Points:
    290
    #4
    Hmm... I have had the same problem with my hotappz.com.

    So I made a script in PHP that just writes out a sitemap.xml file.

    Let me know if I can be of any help.
     
    Nick_Mayhem, Jul 24, 2006 IP
  5. onelife

    onelife Guest

    Messages:
    11
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Hi there,

    Most of my sites are 500,000+ pages, so I have had similar problems. Two tools stand out as being pretty decent: GSiteCrawler at http://gsitecrawler.com/ (although a little slow) and Brian Pautsch's crawler at http://www.brianpautsch.com/ (damned fast).

    Worth noting that in a week or so Brian will have a new version out based on some of my requirements. On the whole it's probably the fastest crawler and sitemap maker; it converts a 500,000-URL Google sitemap to Yahoo's format in under a minute (a rough sketch of that kind of conversion follows after this post).

    Hope this is of help

    Dave
     
    onelife, Jul 25, 2006 IP
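    The Google-to-Yahoo conversion onelife mentions presumably targets Yahoo's plain-text URL-list format (one URL per line, commonly submitted as urllist.txt); assuming that, a rough PHP sketch using a streaming parser, so a 500,000-URL sitemap never has to fit in memory:

        <?php
        // Walk an existing sitemap.xml and emit each <loc> value as a plain line.
        $reader = new XMLReader();
        $reader->open('sitemap.xml');

        $out = fopen('urllist.txt', 'w');
        while ($reader->read()) {
            // Only element nodes named <loc> carry the URLs we want.
            if ($reader->nodeType == XMLReader::ELEMENT && $reader->localName === 'loc') {
                fwrite($out, trim($reader->readString()) . "\n");
            }
        }
        fclose($out);
        $reader->close();
        ?>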
  6. Boby

    Boby Peon

    Messages:
    207
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Boby, Jul 25, 2006 IP
  7. cellularnews

    cellularnews Peon

    Messages:
    246
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    0
    #7
    If you have a site with 100,000+ pages, can I presume that most of them are database-generated?

    If so, why not write a short app to create a sitemap based on the database? (A sketch of that approach follows after this post.)
     
    cellularnews, Jul 25, 2006 IP
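    A sketch of the database-driven approach cellularnews suggests, assuming PHP with PDO; the connection details, the pages table, and its slug/updated_at columns are hypothetical stand-ins for whatever schema actually backs the site:

        <?php
        // Build sitemap.xml straight from the database that generates the pages.
        $db = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

        $fh = fopen('sitemap.xml', 'w');
        fwrite($fh, '<?xml version="1.0" encoding="UTF-8"?>' . "\n");
        fwrite($fh, '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n");

        // Walk the result set row by row and write each URL as we go.
        $stmt = $db->query('SELECT slug, updated_at FROM pages');
        while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
            $loc     = 'http://www.example.com/' . $row['slug'];
            $lastmod = date('Y-m-d', strtotime($row['updated_at']));
            fwrite($fh, "  <url>\n");
            fwrite($fh, '    <loc>' . htmlspecialchars($loc, ENT_QUOTES) . "</loc>\n");
            fwrite($fh, '    <lastmod>' . $lastmod . "</lastmod>\n");
            fwrite($fh, "  </url>\n");
        }

        fwrite($fh, "</urlset>\n");
        fclose($fh);
        ?>

    Because the sitemap mirrors the database anyway, a script like this can simply be re-run from cron whenever content changes, with no crawling involved.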
  8. websitetools

    websitetools Well-Known Member

    Messages:
    1,513
    Likes Received:
    25
    Best Answers:
    4
    Trophy Points:
    170
    #8
    Well, I just released a new version of my sitemap generator software. 100,000 pages should be no problem at all (for the newest version, anyway). The only thing that could jinx it is that I decided to build and upload the software past midnight ;)
     
    websitetools, Jul 26, 2006 IP
  9. catanich

    catanich Peon

    Messages:
    1,921
    Likes Received:
    40
    Best Answers:
    0
    Trophy Points:
    0
    #9
    I also use the http://gsitecrawler.com/ generator. It is slow but does a very good job, though I have not tested its upper limits. I do 22k pages all the time without any problems.

    It has most of the output formats as well.

    Jim Catanich
     
    catanich, Jul 28, 2006 IP
  10. MohAdnan

    MohAdnan Peon

    Messages:
    13
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #10
    A PHP sitemap generator is best for 100,000+ pages.
     
    MohAdnan, May 12, 2011 IP
  11. Adrian kiwee

    Adrian kiwee Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #11
    I use sitemapGenerator.jnlp; it works for any site, but if you have more than 50,000 URLs you'd better break the sitemap into pieces for indexing purposes, since each sitemap file is limited to 50,000 URLs (see the sketch after this post).
     
    Adrian kiwee, May 31, 2011 IP
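    A sketch of the splitting Adrian kiwee describes, assuming PHP: the sitemap protocol caps each file at 50,000 URLs, so the URL set is chunked into separate files tied together by a sitemap index. getAllUrls() and the example.com paths are placeholders for however the site actually enumerates its pages:

        <?php
        // Hypothetical stand-in: return the full list of page URLs.
        function getAllUrls() {
            return array('http://www.example.com/');
        }

        // Split into chunks of at most 50,000 URLs, the per-file limit.
        $chunks = array_chunk(getAllUrls(), 50000);

        $index = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
               . '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

        foreach ($chunks as $i => $chunk) {
            $file = 'sitemap-' . ($i + 1) . '.xml';
            $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
                  . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
            foreach ($chunk as $url) {
                $xml .= '  <url><loc>' . htmlspecialchars($url, ENT_QUOTES) . '</loc></url>' . "\n";
            }
            $xml .= "</urlset>\n";
            file_put_contents($file, $xml);

            // Each chunk file gets one entry in the index.
            $index .= '  <sitemap><loc>http://www.example.com/' . $file . '</loc></sitemap>' . "\n";
        }

        $index .= "</sitemapindex>\n";
        file_put_contents('sitemap-index.xml', $index);
        ?>

    The index file can then be submitted on its own; the search engine follows it to each chunk.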