
GSiteCrawler, who uses it?

Discussion in 'Google Sitemaps' started by fatabbot, Feb 20, 2007.

  1. #1
    I'm creating a sitemap with that free program, but it has been running for about 5 hours now and still isn't finished.
    Is this normal?
    I can see that it's still active, but could something else be causing the slowness?
     
    fatabbot, Feb 20, 2007 IP
  2. Openg

    Openg Peon

    Messages:
    47
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    I used it for a while, until I realised that the data file had grown to 1 GB and it stopped functioning; rather than mess about, I just removed it from my computer. It is free for now because we are basically beta testing it for the developers... and I must say that once the glitches are sorted out it will be a very useful little program.
     
    Openg, Feb 20, 2007 IP
  3. cormac

    cormac Peon

    Messages:
    3,662
    Likes Received:
    222
    Best Answers:
    0
    Trophy Points:
    0
    #3
    I found it very useful on smaller sites. Really good for finding duplicate content and the like. I have also seen it run very slowly on bigger sites, which stopped me from using it full time.

    Openg I see you were in my neck of the woods a couple of weeks back. ;)
     
    cormac, Feb 20, 2007 IP
  4. fatabbot

    fatabbot Well-Known Member

    Messages:
    559
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    138
    #4
    What are you using instead now?
     
    fatabbot, Feb 20, 2007 IP
  5. cormac

    cormac Peon

    Messages:
    3,662
    Likes Received:
    222
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I use Xenu to check for broken links and such, but also to grab a list of valid URLs on the site.
     
    cormac, Feb 21, 2007 IP
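
    What cormac describes is essentially a link check: walk the site, keep the URLs that respond, and note the ones that don't. The sketch below is not Xenu itself, just a minimal stand-in for that idea; the start URL is a placeholder and the crawl is capped so it stays small.

        from html.parser import HTMLParser
        from urllib.parse import urljoin, urlparse
        from urllib.request import urlopen
        from urllib.error import URLError, HTTPError

        START_URL = "http://localhost/"  # placeholder: point this at your own site

        class LinkCollector(HTMLParser):
            """Collects href values from <a> tags on a page."""
            def __init__(self):
                super().__init__()
                self.links = []

            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    for name, value in attrs:
                        if name == "href" and value:
                            self.links.append(value)

        def crawl(start_url, limit=200):
            host = urlparse(start_url).netloc
            queue, seen, valid, broken = [start_url], {start_url}, [], []
            while queue and len(seen) <= limit:
                url = queue.pop(0)
                try:
                    with urlopen(url, timeout=10) as resp:
                        body = resp.read().decode("utf-8", errors="replace")
                    valid.append(url)
                except (HTTPError, URLError, OSError):
                    broken.append(url)
                    continue
                parser = LinkCollector()
                parser.feed(body)
                for href in parser.links:
                    absolute = urljoin(url, href).split("#")[0]
                    # stay on the same host and skip pages already queued
                    if urlparse(absolute).netloc == host and absolute not in seen:
                        seen.add(absolute)
                        queue.append(absolute)
            return valid, broken

        if __name__ == "__main__":
            ok, bad = crawl(START_URL)
            print(f"{len(ok)} valid URLs, {len(bad)} broken URLs")

    The valid list is the kind of URL set you would feed into a sitemap; a real tool like Xenu or GSiteCrawler adds politeness delays, parallel requests and handling for redirects and non-HTML content.
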
  6. trichnosis

    trichnosis Prominent Member

    Messages:
    13,785
    Likes Received:
    333
    Best Answers:
    0
    Trophy Points:
    300
    #6
    If you have a huge web site, it's normal. It must crawl all of your pages.
     
    trichnosis, Feb 21, 2007 IP
  7. Aragorn

    Aragorn Peon

    Messages:
    1,491
    Likes Received:
    72
    Best Answers:
    1
    Trophy Points:
    0
    #7
    I use it to index my local site. Then I replace localhost with my site's domain name and upload the sitemap to the server. This saves my bandwidth and my site's bandwidth limit :)
     
    Aragorn, Feb 22, 2007 IP
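
    A minimal sketch of the localhost trick Aragorn mentions: run the crawl against a local copy of the site, then rewrite the generated sitemap so every URL points at the live domain before uploading it. The file names and domain below are placeholders, not anything GSiteCrawler itself produces or requires.

        LOCAL_PREFIX = "http://localhost"        # what the local crawl produced
        LIVE_PREFIX = "http://www.example.com"   # placeholder: your real domain

        with open("sitemap_local.xml", encoding="utf-8") as src:
            sitemap = src.read()

        # Every <loc> entry that starts with the local prefix now points at the live site.
        sitemap = sitemap.replace(LOCAL_PREFIX, LIVE_PREFIX)

        with open("sitemap.xml", "w", encoding="utf-8") as dst:
            dst.write(sitemap)

        print("sitemap.xml ready to upload")
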
  8. websitetools

    websitetools Well-Known Member

    Messages:
    1,513
    Likes Received:
    25
    Best Answers:
    4
    Trophy Points:
    170
    #8
    That is quite a good idea, as long as your website isn't CMS / forum / blog based :)

    (Although, for some strange reason, my Apache runs very slowly on my own local computer.
    I am not quite sure why; it started some months ago. Probably configuration related.)
     
    websitetools, Feb 22, 2007 IP
  9. Aragorn

    Aragorn Peon

    Messages:
    1,491
    Likes Received:
    72
    Best Answers:
    1
    Trophy Points:
    0
    #9
    First I download the database and update my local copy. Then I run the program. So it doesn't matter whether it's a forum or a blog, provided you have access to the database.
     
    Aragorn, Feb 23, 2007 IP
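
    The sync step Aragorn is describing might look like the sketch below, assuming the site runs on MySQL (the thread doesn't say which database is used). It dumps the live database and loads it into the local one, so the local copy of the forum or blog can be crawled instead of the live server. Host, user and database names are placeholders.

        import subprocess

        REMOTE = {"host": "www.example.com", "user": "dbuser", "db": "sitedb"}
        LOCAL = {"user": "root", "db": "sitedb_local"}

        # 1. Dump the remote database to a local file (mysqldump prompts for the password).
        with open("sitedb.sql", "w") as dump:
            subprocess.run(
                ["mysqldump", "-h", REMOTE["host"], "-u", REMOTE["user"], "-p", REMOTE["db"]],
                stdout=dump, check=True,
            )

        # 2. Load the dump into the local database that the local web server reads from.
        with open("sitedb.sql") as dump:
            subprocess.run(
                ["mysql", "-u", LOCAL["user"], "-p", LOCAL["db"]],
                stdin=dump, check=True,
            )

        print("Local database refreshed; crawl the localhost copy now.")
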
  10. swapnil90

    swapnil90 Well-Known Member

    Messages:
    1,528
    Likes Received:
    80
    Best Answers:
    0
    Trophy Points:
    115
    #10
    Hey, thanks... I have been searching for this software for a pretty long time!!
     
    swapnil90, Feb 24, 2007 IP
  11. cormac

    cormac Peon

    Messages:
    3,662
    Likes Received:
    222
    Best Answers:
    0
    Trophy Points:
    0
    #11
    I just remembered something about GSiteCrawler. As far as I know, it checks the valid URLs on your site against Google results, which can slow down the whole process. You can disable that option when going through the new sitemap wizard.
     
    cormac, Feb 26, 2007 IP
  12. uppaluri

    uppaluri Peon

    Messages:
    33
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #12
    On my site, GSiteCrawler takes nearly 2½ hours for about 2,500 links.

    Thank you, turfsniffer. Xenu's Link Sleuth will go into the FOSS directory I am building.
     
    uppaluri, Feb 26, 2007 IP
  13. Litho

    Litho Peon

    Messages:
    105
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    0
    #13
    Try this webmaster tool for finding broken links, plus building sitemaps.

    2500 links should only take minutes!
     
    Litho, Feb 26, 2007 IP
  14. eski009

    eski009 Peon

    Messages:
    48
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #14
    interesting!
     
    eski009, Mar 1, 2007 IP
  15. codeber

    codeber Peon

    Messages:
    578
    Likes Received:
    11
    Best Answers:
    0
    Trophy Points:
    0
    #15
    Yep, it should take minutes for 5,000 links.

    But I suppose it depends on how many links you have on each page, the length of the content, and how many duplicate links you have.
     
    codeber, Mar 4, 2007 IP
  16. Openg

    Openg Peon

    Messages:
    47
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #16
    OK, after trying out a few other services I am back with GSiteCrawler.

    It doesn't seem to be creating the huge databases with this update, but I am a bit confused as to how the site I am crawling (the one in my sig linked to ecommerce templates, which has about 240 products) returns almost 3,000 pages; it has been crawling for around 6 hours so far and says it has 10 hours to go.

    I am using the osCommerce 'Ultimate SEO URLs' plugin and hope it is not replicating loads of pages that exist as both SEO URLs and the old unfriendly URLs, which don't seem to want to be removed from the cache.

    Give me strength ;)

    Dom
     
    Openg, Mar 31, 2007 IP
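
    Dom's worry is the classic duplicate-URL problem: when an osCommerce shop answers on both a friendly SEO URL and the original dynamic URL, a crawler counts every product twice or more. One pragmatic fix is to filter the dynamic form out of the crawled URL list before building the sitemap. The query-string pattern below is an assumption about the URL scheme, not something taken from GSiteCrawler or Ultimate SEO URLs themselves.

        from urllib.parse import urlparse, parse_qs

        def keep_url(url):
            """Keep a URL unless it is the dynamic duplicate of an SEO page."""
            parsed = urlparse(url)
            query = parse_qs(parsed.query)
            # Assumed duplicate pattern: product_info.php?products_id=123
            if parsed.path.endswith("product_info.php") and "products_id" in query:
                return False
            return True

        crawled = [
            "http://www.example.com/blue-widget-p-123.html",             # SEO URL: keep
            "http://www.example.com/product_info.php?products_id=123",   # duplicate: drop
            "http://www.example.com/contact_us.php",                     # ordinary page: keep
        ]

        sitemap_urls = [u for u in crawled if keep_url(u)]
        print(sitemap_urls)
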
  17. ultimatehandyman

    ultimatehandyman Peon

    Messages:
    246
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    0
    #17
    I use GSiteCrawler and it takes about 14 hours to spider my site.

    I've used it for months and it has been extremely useful :cool:
     
    ultimatehandyman, Apr 1, 2007 IP
  18. njoker555

    njoker555 Notable Member

    Messages:
    4,392
    Likes Received:
    139
    Best Answers:
    0
    Trophy Points:
    240
    #18
    It eats up a lot of bandwidth, lol. I tried to use it on a forum with 10k+ posts and it wasn't even halfway done after 6 hours; there are about 20k links in there.
     
    njoker555, Apr 2, 2007 IP
  19. groverinri

    groverinri Member

    Messages:
    34
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    41
    #19
    I had heard of GSiteCrawler but never used it before. It's that good, huh?
     
    groverinri, Apr 3, 2007 IP
  20. pwhite

    pwhite Peon

    Messages:
    193
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #20
    Cheers for the Xenu link, I've been looking for something like that for ages!
     
    pwhite, Apr 10, 2007 IP