Google didn't crawl my site anymore and I don't know why. It would still crawl the index, leave, and come back to crawl the index again, even though there are plenty of links on the index it could have followed. Try creating a sitemap and Google will be in love. This proggy works well to create one: http://johannesmueller.com/gs/
Google Sitemaps aren't magic - but I sure love that program. Remember, Google Sitemaps only adds your URLs to the Google crawler queue; Google will still use its own criteria to figure out which URLs it really wants to crawl. So don't think it's a magic bullet to get indexed, and don't think you can forget about getting links, because you can't, ever. The better your site stands with Google (i.e. the better your inbound links), the better Google Sitemaps makes sure that your site is fully indexed. So first get the links, then add the Sitemap, and hope for the best.
The same might be happening to my site http://www.best-personal-loan-rate.co.uk, which has an XML sitemap. Still, only my index page is crawled and not my inner pages. My XML sitemap is at http://www.best-personal-loan-rate.co.uk/sitemap.xml. Can anybody figure out a solution?
Google has definitely crawled my sites more since I started using sitemaps. I really think sitemaps can be quite beneficial, and I will definitely use them on new sites in the future.
Greetings, I've looked at your XML file and it seems like it hasn't been updated since 2005-10-26, so I think that's the reason for the googly guy to ignore it... All I can suggest is to insert more recent times in the XML file. If you have VB6 you can use the source code of a small program I wrote for one of my sites - all it does is update the sitemap.xml file with new timestamps whenever you run it. I use Task Scheduler on my server for that purpose. Works quite well. The prog is too simple to recurse subdirs, but my sites are tiny, so I didn't care much. Ah, also - I suggest using the changefreq element with the 'hourly' value. VB6 source code - http://www.vbrocks.net/downloads/dir2xmlSRC.zip binary - http://www.vbrocks.net/downloads/dir2xml.zip sample xml - http://www.hiddencoupons.com/sitemap.xml -- vbrocks.us
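For anyone without VB6, here is a minimal sketch of the same idea in Python - the folder, base URL, and output path below are my own placeholder assumptions, not settings from the program above. It lists the pages in a single directory (no subdir recursion, same as the original) and rewrites sitemap.xml with a fresh lastmod timestamp on every run:

import os
from datetime import datetime, timezone

# Placeholder values - substitute your own site root and domain.
SITE_ROOT = "/var/www/html"
BASE_URL = "http://www.example.com"
OUTPUT = os.path.join(SITE_ROOT, "sitemap.xml")

def build_sitemap():
    # Stamp every URL with the current time, mirroring the
    # "fresh timestamps on every run" behaviour described above.
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")
    entries = []
    for name in sorted(os.listdir(SITE_ROOT)):  # no subdir recursion
        if name.endswith((".html", ".htm")):
            entries.append(
                "  <url>\n"
                f"    <loc>{BASE_URL}/{name}</loc>\n"
                f"    <lastmod>{now}</lastmod>\n"
                "    <changefreq>hourly</changefreq>\n"  # per the suggestion above
                "  </url>"
            )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )
    with open(OUTPUT, "w", encoding="utf-8") as f:
        f.write(xml)

if __name__ == "__main__":
    build_sitemap()

Point cron or Task Scheduler at it, just like the VB6 binary.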
That's the absolute worst thing you can do. Most webmasters assume that just because changefreq is set to "hourly", Google will visit the site more often. That's far from the case. If you have a static page, set the changefreq to "weekly". Google will crawl the page more often because it doesn't want to serve a stale page in its search results, so it will try to crawl at least once a week to pick up changes to the page.

Forums are a bad place for Google for this reason. The content changes so often that Google knows it will never be able to keep up (it can't crawl your site 24/7), so it delays crawling the site as much as possible to make sure it gets as much content as possible at a later stage. (There's no real proof of this, at least not that I have seen.)

An RSS feed for your site helps as well, since it's much easier for Google to crawl an RSS feed (without the pesky HTML tags). Ever since I put an RSS feed on a client's forum, Google has been there almost the entire day. The feed exports every 15 minutes with the last 100 topics/posts. Try it, it might give you better results.
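If you want to try that, here's a rough sketch in Python of generating such a feed - the get_latest_topics() helper and its field names are hypothetical stand-ins for however your forum actually stores posts, and the output is plain RSS 2.0:

from datetime import datetime, timezone
from email.utils import format_datetime
from xml.sax.saxutils import escape

def get_latest_topics(limit=100):
    # Hypothetical helper - replace with a query against your forum's
    # database. Each topic needs a title, a URL, and a timestamp.
    return [
        {"title": "Example topic", "url": "http://www.example.com/topic/1",
         "posted": datetime.now(timezone.utc)},
    ]

def build_feed():
    items = []
    for t in get_latest_topics(100):  # last 100 topics, as described above
        items.append(
            "    <item>\n"
            f"      <title>{escape(t['title'])}</title>\n"
            f"      <link>{escape(t['url'])}</link>\n"
            f"      <pubDate>{format_datetime(t['posted'])}</pubDate>\n"
            "    </item>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<rss version="2.0">\n'
        "  <channel>\n"
        "    <title>Forum - latest topics</title>\n"
        "    <link>http://www.example.com/</link>\n"
        "    <description>Last 100 topics/posts</description>\n"
        + "\n".join(items)
        + "\n  </channel>\n</rss>\n"
    )

if __name__ == "__main__":
    # Regenerate every 15 minutes via cron/Task Scheduler, as above.
    with open("feed.xml", "w", encoding="utf-8") as f:
        f.write(build_feed())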
That's quite an interesting suggestion - don't update it hourly, update it every 15 minutes instead. It could be true for some busy sites; mine is a slow one. My point was, actually, not to put 'hourly' in the sitemap.xml just for fun - I think the real site updates should happen with a comparable frequency. Regards, -- vbrocks
Not only is there no proof FOR this belief - there's lots of proof AGAINST it. My forums certainly do get crawled daily, seven days a week, and usually more than once a day. Last week was not an atypical week: my psychology forum received 1,852 hits from Googlebot in 7 days. That forum launched March 25, 2004, and except for perhaps the first few weeks there has not been a day since when Googlebot did not crawl the site - even when I added a MOD which inadvertently delivered error pages to Googlebot, it still kept coming daily to index the error pages.
Hi all, I'm new to your forum. I have added a Google sitemap to our site. The frequency is set to monthly, but the pages are seldom, if ever, changed. Does this present a problem? Des
According to the information in my Google Sitemaps account, Google downloads my sitemap every day. I would assume that helps my site get indexed.
If you do not have a program to build a Google-friendly sitemap, you can go to www.auditmypc.com/free-sitemap-generator.asp. Nothing to install there - just run their proggy. It's JavaScript-based and works very well: it spiders your site and produces a fine sitemap. All you have to do then is upload the new sitemap with an FTP program (like FileZilla) and include a link to it on your start page. That's it. Not the most automated way, but a good one if you do not change your pages very often. AND it is one of those great free tools.
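If you'd rather script that upload step instead of doing it by hand in FileZilla, here's a small sketch using Python's standard ftplib - the host, login, and remote directory are placeholders you'd swap for your own:

from ftplib import FTP

# Placeholder credentials and paths - substitute your own.
HOST = "ftp.example.com"
USER = "username"
PASSWORD = "password"
REMOTE_DIR = "/public_html"

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    ftp.cwd(REMOTE_DIR)
    with open("sitemap.xml", "rb") as f:
        # STOR uploads the local file under the given remote name.
        ftp.storbinary("STOR sitemap.xml", f)
    print("sitemap.xml uploaded")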
Remember, guys, it's still a beta product, and don't forget that Google is doing this to help normalize its database and query data. If they can get web sites to start providing standard reporting data, that makes the bots and systems behind the scenes work better and faster. Just look at how many times the robots queue is cleared - they're obviously still tweaking.
Just a quick note for all those discussing the crawl-frequency settings: they're ignored by Google at the moment; Google is only using them to collect statistics. The same goes for the priority setting. Personally, I'd remove those tags from the XML file.
I knew the priority tag was placed there for future use. But over the last month or so it looked like the freq tag was getting a tune-up. I don't have anything other than visual information for this, but the frequency of downloads seemed to have gone up - I noticed it after the last robots.txt reload. Of course this is all from just paying attention to the data on the Sitemaps site, nothing more scientific than that.