I realize there has already been a thread that tangentially covered this topic, but I wanted to post a thread addressing just this question. This is a poll of sorts, I guess: how often do you update your sitemaps, on average? You might also want to state the number of URLs in your sitemap and your PageRank, so this 'poll' can yield some useful reference points. Here's my data: sitemap freshly uploaded, planning on weekly updates but that depends on what you all post. 260,000 URLs, PR 5.
I do it constantly and automatically. All my sites are WordPress blogs, so on the large blogs that I actually put time into, I use the Google Sitemap plugin to generate a huge sitemap, and the sitemap is updated with each new post. With my smaller, automated blogs, I use the feed as the sitemap, and it too is updated every time there's a new post. The reason I don't use the plugin on those is that my RSS scraper doesn't trigger it to regenerate when new posts come in. As for efficiency, this works like a charm: new pages are indexed in a day or two, and new blogs never take more than a week to get indexed.
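For anyone wanting to wire this up themselves, the key moving part is just telling Google that the sitemap (or feed) has changed whenever a post goes out. Here's a rough sketch of that ping in Python (not the plugin's actual code; the example.com URL is only a placeholder):

```python
import urllib.parse
import urllib.request

# Placeholder sitemap/feed URL -- replace with your own.
SITEMAP_URL = "http://example.com/feed/"

def ping_google(sitemap_url):
    """Notify Google that the sitemap (or feed) changed so it gets re-fetched."""
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)
    with urllib.request.urlopen(ping) as resp:
        return resp.status  # 200 means the ping was accepted

if __name__ == "__main__":
    print(ping_google(SITEMAP_URL))
```

Hook something like that into the publish step and there's nothing left to resubmit by hand.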
• Nonprofit site. PR4 home page. Most inner pages are PR4 or PR3.
• About 125 indexed pages, all personally written, not generated. Unique content on each page.
I submit a new sitemap.xml each time I add a new page.
I'm using the WordPress sitemap generator for my sitemap, and I update it after I add content to my site.
Thank you all for your input. I think what svarog says is a big clue: sitemaps need to be automated and in sync with content. Every time I resubmit my sitemap to Google, it says something like "sitemap last downloaded 17 minutes ago" or "5 minutes ago" or something similarly recent. At first I kept thinking "oh damn, just missed it!" but now I suspect that Google comes looking for the sitemap very, very often, basically all day long. Consequently I'm updating daily now. One of these days I'll figure out a way to do it automatically on the server.
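For the "do it automatically on the server" part, something along these lines would work, run from a cron job. This is only a rough sketch assuming plain .html pages under a document root (the paths and domain are placeholders, not anyone's actual setup); it sets lastmod from each file's modification time, so only pages that actually changed get a new date.

```python
import os
import time

# Placeholders -- adjust to your own document root and domain.
DOC_ROOT = "/var/www/html"
BASE_URL = "http://example.com"

def build_sitemap(doc_root, base_url):
    """Walk the document root and emit sitemap XML, with lastmod taken from file mtimes."""
    entries = []
    for dirpath, _dirs, files in os.walk(doc_root):
        for name in files:
            if not name.endswith(".html"):
                continue
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, doc_root).replace(os.sep, "/")
            lastmod = time.strftime("%Y-%m-%d", time.gmtime(os.path.getmtime(path)))
            entries.append(
                "  <url>\n"
                f"    <loc>{base_url}/{rel}</loc>\n"
                f"    <lastmod>{lastmod}</lastmod>\n"
                "  </url>"
            )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries) + "\n</urlset>\n"
    )

if __name__ == "__main__":
    with open(os.path.join(DOC_ROOT, "sitemap.xml"), "w") as f:
        f.write(build_sitemap(DOC_ROOT, BASE_URL))
```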
On one of our sites (PR6, 250k pages) we do it once a week, as about 10% of the site changes in that time. I believe it's a waste of time, but we've got nothing to hide, so we do it anyway. As much as anything, the sitemap generator ensures the site is crawlable.
Is it better to set the "lastmod" date to today even for pages that didn't change? Or should it reflect actual changes, so that only the pages that were added or changed show the current date?
I have a site that is very dynamic and large, with constantly updated user content. I took the sitemap down because I couldn't find a tool that could generate a sitemap covering all the pages. What does everybody here use?
Looks like you have a major headache. I guess your main issue is how to add new URLs to the sitemap without too much delay. The scripts you want to use for updating your sitemaps depend on what you are using to drive your site...
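For a big database-driven site, one rough possibility (the sqlite table and column names below are assumptions, not anyone's real schema): pull the URLs from the database, split them into files of at most 50,000 URLs each, which is the sitemap protocol's per-file limit, and submit a sitemap index that points at the pieces.

```python
import sqlite3

BASE_URL = "http://example.com"   # placeholder domain
DB_PATH = "site.db"               # hypothetical DB with a `pages(slug, updated)` table
MAX_PER_FILE = 50000              # sitemap protocol limit per file

def write_sitemaps():
    # `updated` is assumed to already be an ISO date string (YYYY-MM-DD).
    rows = sqlite3.connect(DB_PATH).execute(
        "SELECT slug, updated FROM pages ORDER BY updated DESC"
    ).fetchall()
    parts = [rows[i:i + MAX_PER_FILE] for i in range(0, len(rows), MAX_PER_FILE)]
    for n, chunk in enumerate(parts, 1):
        with open(f"sitemap-{n}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for slug, updated in chunk:
                f.write(f"  <url><loc>{BASE_URL}/{slug}</loc>"
                        f"<lastmod>{updated}</lastmod></url>\n")
            f.write("</urlset>\n")
    # Sitemap index pointing at each piece, so only one URL needs to be submitted.
    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(parts) + 1):
            f.write(f"  <sitemap><loc>{BASE_URL}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    write_sitemaps()
```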
I've made the sitemap dynamic, so every time Google tries to fetch the sitemap.asp file, it generates the sitemap in real time from all the URLs. I guess Google fetches the sitemap two or three times a week, or maybe that's because my sitemap.asp returns a last-modified date equal to the date it was requested on.
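The same build-it-on-request idea, sketched in Python rather than ASP purely for illustration (the database table and domain are assumptions): a tiny handler that regenerates the XML from the database every time the sitemap URL is fetched.

```python
import sqlite3
from wsgiref.simple_server import make_server

BASE_URL = "http://example.com"   # placeholder domain
DB_PATH = "site.db"               # hypothetical DB with a `pages(slug)` table

def sitemap_app(environ, start_response):
    """Rebuild the sitemap from the database on every request, like the sitemap.asp above."""
    rows = sqlite3.connect(DB_PATH).execute("SELECT slug FROM pages").fetchall()
    urls = "".join(f"  <url><loc>{BASE_URL}/{slug}</loc></url>\n" for (slug,) in rows)
    body = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}</urlset>\n"
    ).encode("utf-8")
    start_response("200 OK", [("Content-Type", "application/xml")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, sitemap_app).serve_forever()
```

Whether a handler like this should report "now" as the last-modified date or the date of the newest page is the same lastmod question raised earlier in the thread.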
I used to not have a sitemap, and Google was on my sites, crawling them every day. I added a sitemap and Google disappeared from the sites. I removed the sitemaps and now Google crawls the sites every day again. The sitemaps hurt me; I do much better without them.
I created a sitemap a few months back. I resubmit it occasionally and remove it after a few days. Believe it or not, I have experienced an increase in traffic whenever I remove the sitemap.
Sitemaps are completely useless. Remove them and you will see your indexing improve in the search results...
I have my sitemap dynamically set to change daily. It seems to help to have data feeds if you don't actually update your content daily.
I'm not sure whether vBulletin handles this automatically or not, but:
* If it does, then the sitemap updates automatically.
* If it doesn't, then it has never been updated, and I won't get around to it for ages.
Being vBulletin, I'm guessing that it does.