There's no need to generate your sitemap again and again. You can add your new pages manually. It's very easy.
What is the '*.gx' extension? Did you mean '*.gz'? If so, you should unarchive it first. As far as I know, Google only accepts 'sitemap.xml' files.
Fantastic info on sitemaps! You did an excellent job of explaining how they work. It is some of the best I have seen.
Google Sitemaps and robots.txt are both used only for search engines. An XML sitemap is used for getting all the pages of a website indexed, while the robots.txt file tells search engines which pages to crawl and which not to crawl.
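To make the distinction concrete, here is a minimal robots.txt (the Disallow path and sitemap URL are hypothetical examples, not anyone's real site):

```
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```

The robots.txt controls crawling; the Sitemap line simply points crawlers at the XML file that lists the pages you want indexed.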
My site thegioitructuyen.com is still in development, so the sitemap file changes often. Is that a problem? Will Google stop indexing the site because of it?
Googlebot visits your website more frequently if you add new content regularly. If you notice that Google doesn't index your website, the reason may be missing backlinks pointing to it. Your website needs backlinks to be indexed.
Do I need backlinks to all individual pages for them to be indexed on Google? I get lots of backlinks to my main page, but very few links to the content pages.
Yeah! Google Sitemaps are a fairly new concept, and they are very helpful for getting all of our new pages found and indexed.
It is very helpful for SEO purposes. There are two reasons why links to individual pages are good: 1. It looks more natural. 2. Your subpages will rank better for the keywords they are optimized for. However, to be indexed there is no need to have an external link to every subpage. A link to any page of your website is enough: Googlebot follows that link to find your website and then follows the links on your website to find all the individual pages. Of course, this requires correct internal linking on your website.
Yeah, I'm also in need of creating an automated sitemap for my website. I have not used XML until now. A reply to this would be highly appreciated. Thanks!
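If you want to script it yourself rather than use a generator tool, here is a minimal sketch in Python that writes a basic sitemap.xml following the sitemaps.org format (the URLs below are hypothetical examples):

```python
# Minimal sitemap.xml generator sketch using only the standard library.
from xml.etree import ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for page in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        loc = ET.SubElement(url, "{%s}loc" % NS)
        loc.text = page
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)


if __name__ == "__main__":
    # Hypothetical example pages; replace with your own site's URLs.
    pages = ["https://www.example.com/", "https://www.example.com/about"]
    print(build_sitemap(pages))
```

For a real site you would collect the URL list from your database or filesystem and re-run this whenever pages change; the sitemap protocol also allows optional tags like lastmod and changefreq per URL.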
Yes, friend, really good information. I've read your FAQ about sitemaps and the robots.txt file. An XML sitemap is a must for any site: it makes it easy to crawl all of the site's URLs. We can generate it simply with a tool, then submit it in Google's webmaster tools and upload it to the website's root directory.
Friends, what do you guys recommend on using the ROR and gz formats for sitemaps? I have read somewhere that these formats are also considered highly important for sites with a dynamic structure and sites with user-generated content. Any ideas?
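On the gz question: as far as I know, a .gz sitemap is just an ordinary sitemap.xml compressed with gzip, which the sitemaps.org protocol allows to save bandwidth. A sketch of producing one in Python (the filenames are hypothetical):

```python
# Compress an existing sitemap.xml into sitemap.xml.gz (filenames are examples).
import gzip
import shutil


def gzip_sitemap(src="sitemap.xml", dst="sitemap.xml.gz"):
    """Write a gzip-compressed copy of the sitemap at `src` to `dst`."""
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
```

You would then upload sitemap.xml.gz to the site root and submit that URL instead of the plain XML file.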