
Sitemap tool for extremely large pages?

Discussion in 'Google Sitemaps' started by DSR, Feb 14, 2006.

  1. #1
    Hi everyone,

    I set up a single webpage listing all ~40,000 items in my database, which uses mod_rewrite to be SEO friendly. GSiteCrawler says it can't index it because the page is too large. Are there any other tools that don't care about page size?

    Thanks! :)
     
    DSR, Feb 14, 2006 IP
  2. iconv

    iconv Well-Known Member

    #2
    If your site is very large, crawling will take a long time. In these cases I usually generate the sitemap with a script that reads directly from the database (or recurses through the directories) and emits the properly rewritten URLs for the sitemap.
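    A minimal sketch of that approach in Python, purely for illustration - the "items" table, "slug" column, example.com domain and /item/<slug>.html rewrite pattern are all assumptions, and sqlite3 stands in for whatever driver the real site uses:
    Code:
    # Build sitemap.xml straight from the database instead of crawling.
    import sqlite3
    from xml.sax.saxutils import escape

    BASE_URL = "http://www.example.com"  # placeholder domain

    conn = sqlite3.connect("site.db")  # stand-in for the real database
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for (slug,) in conn.execute("SELECT slug FROM items"):
            # Emit each row in the same URL form that mod_rewrite serves.
            f.write("  <url><loc>%s</loc></url>\n"
                    % escape("%s/item/%s.html" % (BASE_URL, slug)))
        f.write("</urlset>\n")
    conn.close()
    At ~40,000 items that still fits in one file; the sitemap protocol allows up to 50,000 URLs per sitemap file.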
     
    iconv, Feb 14, 2006 IP
  3. capebretoner

    capebretoner Well-Known Member

    #3
    I think this is the only way to go. I had to write something similar for my directory.
     
    capebretoner, Feb 15, 2006 IP
  4. FireStorM

    FireStorM Well-Known Member

    #4
    Try CoffeeCup Sitemapper
     
    FireStorM, Feb 17, 2006 IP
  6. globalmarketing

    globalmarketing Guest

    #6
    I have also run into this situation.
     
    globalmarketing, Feb 18, 2006 IP
  7. Lordo

    Lordo Well-Known Member

    #7
    A script that you create for your site will be the fastest option, because it won't crawl at all; it will read from the database directly.
     
    Lordo, Feb 18, 2006 IP
  8. tkluge

    tkluge Peon

    #8
    Or write a custom input plugin for phpSitemapNG - just extract the links and feed them into the system to let it create the sitemap file(s) automatically.
    If you have any questions regarding that, drop me a message.
    Tobias
     
    tkluge, Feb 23, 2006 IP
  9. Design1

    Design1 Active Member

    #9
    I recently beta tested a program that does exactly what you need. Check it out here:
    http://www.templatesresource.com/seo/site-map-equalizer.asp

    This program does it all: it generates your entire website as an XML or HTML sitemap, and even alerts you to 404s and other key things that may affect how well your website does. I've used it on a few websites and have been extremely impressed with how well it functions.
    Best of luck!
     
    Design1, Feb 23, 2006 IP
  10. softplus

    softplus Peon

    #10
    DSR - send me the URL (by mail or PM) if you want me to check it. The GSiteCrawler should have no problem with that many URLs (it might take some time, though :D).

    However, if you have it all in the database, it would be MILES easier to just query the database and generate the sitemap file from there. You have all the URLs, so why go the long way around by crawling your site? You could even take the query output, wrap the proper URL structure around it, save it as a text file (a plain-text listing of URLs), and submit that to Google. No magic involved :D and it saves you a lot of time, bandwidth, and CPU resources.
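    For reference, a sketch of that plain-text variant - Google accepts a UTF-8 text file with one full URL per line, and this reuses the same hypothetical "items" table and placeholder domain as the earlier sketch:
    Code:
    # Plain-text sitemap: one fully qualified URL per line, nothing else.
    import sqlite3

    BASE_URL = "http://www.example.com"  # placeholder domain

    conn = sqlite3.connect("site.db")  # stand-in for the real database
    with open("urllist.txt", "w", encoding="utf-8") as f:
        for (slug,) in conn.execute("SELECT slug FROM items"):
            f.write("%s/item/%s.html\n" % (BASE_URL, slug))
    conn.close()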
     
    softplus, Mar 14, 2006 IP
  11. Sarangan

    Sarangan Well-Known Member

    #11
    A simple and effective way to get a big site's links into sitemap.xml is to crawl the links folder by folder.

    For example, if you have a domain with news & articles, you can crawl the pages under xxx.domain.xxx/news and save that sitemap in one file, then crawl xxx.domain.xxx/articles and save those in a second sitemap file.

    Two sitemap files, tied together as shown below.
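    The two files can then be referenced from a single sitemap index, so only one URL needs to be submitted - a sketch using the sitemaps.org 0.9 schema, with hypothetical file names:
    Code:
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://xxx.domain.xxx/sitemap-news.xml</loc>
      </sitemap>
      <sitemap>
        <loc>http://xxx.domain.xxx/sitemap-articles.xml</loc>
      </sitemap>
    </sitemapindex>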
     
    Sarangan, Mar 18, 2006 IP
  12. aprilforshee

    aprilforshee Peon

    #12
    I suppose I'm late, but for everyone else: Sitemap Writer Pro can create sitemaps with nearly 6 million URLs.
    Good luck.
     
    aprilforshee, Jun 11, 2010 IP