How often should I run the vBSEO sitemap generator on my site? I get about 400-600 new threads daily on my forum. It would be helpful if Shawn could tell us how often he runs the sitemap generator for the DP forums.
Daily, or as needed. Many engines spider all the time, so it never hurts to keep that thing as fresh as it can be.
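If you want to automate a daily run, a cron job is the usual way. This is just a sketch; the script path is hypothetical and depends on where your vBSEO sitemap script actually lives:

```
# Run the sitemap generator every night at 3:00 AM.
# The path below is an example only -- adjust it to your install.
0 3 * * * php /home/forum/public_html/vbseo_sitemap/vbseo_sitemap.php
```

Scheduling it off-peak (like 3 AM) keeps the generation run from competing with your daytime traffic for server resources.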
Search engines can find your pages... that's what they *do*. I really haven't (yet) seen a point for sitemaps at all. As it is right now, Googlebot spiders ~500,000 pages per day on this forum... without a sitemap.
Sitemaps can be good for new forums wanting a helping hand with SEO. For big sites there should be no need; on DP it takes within a minute or two for a thread to show up in the SERPs. You should run it once every 24 hours.
I used to think that a while ago. Now I think sitemaps are necessary for new sites, but for sites like DP they're not so important. As Google says, "submit a sitemap to tell us about pages that our bot may not find"; if a site's internal linking is well built, there's no need as far as Google is concerned, I think. However, I think search engines like Bing and the new Microsoft/Yahoo engine will depend heavily on sitemaps in the coming days, beginning of 2010, so I recommend DP have a sitemap at this time to submit to Bing and the new SE. Great respect to you, Shawn!
To be honest, I'd be more inclined to block Bing's spider than waste my time generating a sitemap for them. They are an AWFUL search engine as far as relevance goes... Check out some of their stats (they think this site is porn)... it makes me laugh how bad they are. Not only that, they haven't yet figured out how to control their own search engine spider, much less give users relevant results. I'm already throttling them way back (see robots.txt), and maybe someday, if they can figure out how to return relevant results (or at least something that sort of resembles them), I'll unthrottle them. But at this point, they consume more server resources than they are worth in terms of driving traffic.
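For anyone wondering what that kind of throttling looks like: msnbot (Bing's crawler) honors the non-standard Crawl-delay directive in robots.txt. A minimal sketch, with an example value; the actual directives and numbers in DP's robots.txt may differ:

```
# Hypothetical example: limit Bing's crawler to one request
# every 120 seconds, without affecting other bots.
User-agent: msnbot
Crawl-delay: 120
```

Googlebot ignores Crawl-delay (Google expects crawl-rate settings via Webmaster Tools instead), so a rule like this only slows the crawlers that respect it.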
Oh! You've got me thinking of doing the same! I have only 4,000 URLs indexed in Bing versus 134k in Google. The spider really sucks; somehow they still managed to get a large number of searchers, though.
It's your own mistake, Shawn... why did you name it Digital Point in the first place? What do you mean by "digital"? ... a digital orgasm? And what's the "point", is it the g-spot? It's a porn site then! Just kidding...
Sitemaps are intended to list the main pages of a website, not all pages. For dynamic content sites like forums, it's crazy to add every page to the sitemap. vBulletin, in my opinion, has a good internal linking structure, and that should let the bots get to every public page of the forum. I just think generating a sitemap every day is a waste of time.
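The "main pages only" approach above maps to a very small sitemap file. A minimal sketch following the sitemaps.org protocol (the URLs are made-up placeholders, not DP's real structure):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Forum index: the main entry point, crawled often -->
  <url>
    <loc>http://example.com/forum/</loc>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- One listing page per forum section; individual threads are
       left out and found through internal links instead -->
  <url>
    <loc>http://example.com/forum/forumdisplay.php?f=1</loc>
    <changefreq>hourly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<changefreq>` and `<priority>` are optional hints that engines are free to ignore.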
My sitemap is generated daily, and Google grabs the new sitemap every 24 hours. As for Bing, this is straight from my Google Analytics account:

Traffic Source ------ Bounce rate
google / organic ---- 58.96%
yahoo / organic ----- 25.26%
bing / organic ------ 23.39%
aol / organic ------- 60.32%
ask / organic ------- 66.90%

Bing's bounce rate is about 35 percentage points lower than Google's (on average). I also submit a sitemap to Google, Bing and Yahoo. The bounce rate of AOL traffic and Google traffic is about the same, and that is not saying a lot. To me, it shows the sad state that the Google search engine has turned into.
I tend to run the sitemap generator once a month, and out of nearly 80,000 URLs, Google has gotten 56,700 within 2 months. Instead of generating a sitemap daily on a larger site, I would set it to make one every 48-72 hours to help cut down on the server resources the sitemap chews up every time it's run.
How do you know the number of indexed pages wouldn't double if you submitted a sitemap? I don't think there's any harm in at least trying it once.
I'm trying to get pages out of Google, not add more. 88,200,000 pages from forums.digitalpoint.com in Google right now... [search=google]site:forums.digitalpoint.com inurl:forums.digitalpoint.com[/search]