Google Sitemaps FAQ

I decided to put together these frequently asked questions about sitemaps to help cut down on the number of duplicate threads. Hopefully this can answer any questions you have related to Google Sitemaps. You'll find this FAQ answers most questions about sitemaps; if it doesn't, you can always post your question in the Google Sitemaps sub-forum on digitalpoint.

Q: What is a sitemap?
A: A sitemap is an XML file that lists all available URLs for a website, along with additional information about each URL in the file. You should always create a sitemap for your website. This will help search engines crawl your site faster and index more of your content. Right now the standard sitemap format is XML; all major search engines use this format for sitemaps. (A sample sitemap.xml is included at the end of this post.)

--------------------------------------------------------------------

Q: How do I create an XML sitemap?
A: This is one task you will not want to do in Notepad; it can be done, but it would be very time consuming. There are many websites & freeware programs that can build an XML sitemap for you. I don't recommend purchasing software to create XML sitemaps; many freeware programs do the exact same thing. Below are a few links to free XML sitemap generators & websites.

http://www.vigos.com/products/gsitemap/ - Free XML Sitemap Program
http://www.auditmypc.com/free-sitemap-generator.asp - Online Sitemap Generator (no limits)
https://www.google.com/webmasters/tools/docs/en/sitemap-generator.html - Official Google Sitemap Generator

--------------------------------------------------------------------

Q: How do I submit my XML sitemap to Google?
A: This is a very simple thing to do; I also recommend placing a direct link to your sitemap on your homepage. You can use Google Webmaster Central to submit your sitemap to Google. There are many other great features and tools in Google Webmaster Central as well; you'll find it very useful. Below is the link.

http://www.google.com/webmasters - Google Webmaster Central

--------------------------------------------------------------------

Q: What is the robots.txt file?
A: The robots.txt file is part of the Robots Exclusion Protocol. You can use this file to allow or disallow search engines from crawling and indexing certain areas of your website. This is a great way to reduce bandwidth usage by crawlers. You can also point to your XML sitemap from this file, which is useful since it lets search engines find your sitemap.

http://en.wikipedia.org/wiki/Robots.txt - More Detailed Information

--------------------------------------------------------------------

Q: How do I create a robots.txt file?
A: This is very simple to do. You can open Notepad or any text editor you like and create the file robots.txt (plain text format only). Below is a basic sample that allows crawling and indexing of all your pages and points to your sitemap (replace the Sitemap URL with your own). Just save this file and upload it to your top web directory.

User-agent: *
Disallow:
Sitemap: http://www.yoursite.com/sitemap.xml

http://www.robotstxt.org/wc/robots.html - More Advanced Robots.txt Usage

--------------------------------------------------------------------

Q: I've submitted my sitemap, why haven't I been crawled?
A: Just because you submit a sitemap doesn't mean you will get crawled any faster. New sites can take a while to get fully indexed in Google, or any other search engine for that matter. The best way to speed up indexing is to get quality backlinks to your site.
Remember, the sitemap is just a map of your site for the search engines; it doesn't increase your crawl rate, and if it does, it isn't by much.

--------------------------------------------------------------------

Sitemap Resources

Below is a list of resources that may also help answer your questions. Remember, reading is knowledge, and the more knowledge you have, the better your site will perform.

https://www.google.com/webmasters/tools/docs/en/protocol.html - Google Explanation of the XML Sitemap Protocol
http://www.vigos.com/products/gsitemap/ - Free XML Sitemap Program
http://www.auditmypc.com/free-sitemap-generator.asp - Online Sitemap Generator (no limits)
https://www.google.com/webmasters/tools/docs/en/sitemap-generator.html - Official Google Sitemap Generator
http://en.wikipedia.org/wiki/Robots.txt - Robots.txt Detailed Information
http://www.robotstxt.org/wc/robots.html - Advanced Robots.txt Usage
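To give you an idea of the format, here is a minimal sample sitemap.xml with a single URL entry. The URL and date are just placeholders; your generator will fill in your real pages, and only the <loc> tag is actually required for each URL.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>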
Friends, I really want to know more about sitemaps. I have read the above FAQ and I'm starting to understand, but I still have many questions. Hope you can help me. First question: where should I put the sitemap XML file? Only in the root directory, or somewhere else? Thanks
Place the sitemap in your top directory; this is where you place files to be accessed via your root website address... e.g. www.yoursite.com/sitemap.xml
Thanks for the info. I will try Google Sitemaps then. Um, what should we actually write inside the .xml file? Can you give some examples? Thanks
I recommend you use http://www.vigos.com/products/gsitemap/ - it is a FREE XML sitemap creator you can download. It will do everything for you; just set the tags you want or don't want.
Ya, it's nice. I personally like the ones I can run from my computer. Don't the online ones have a limit on the max number of pages anyway?
xml-sitemaps is a good one, but it has a 500-page limit. I just tried Vigos on my site and it got more than 1000 URLs. I like the Vigos tool. Thanks!
Glad you like it. Not many webmasters know it exists or how great it is for a FREEWARE sitemap generator.
This is one of the best, easiest-to-read guides on this subject that I have read; you've summarised in a few paragraphs what I've been trying to figure out for weeks!
It is great and helpful, but it still leaves me slightly puzzled...

1. I made a sitemap with auditmypc and submitted it to Google, but I notice that the XML file has no timestamp. If I add content to an existing page in the future, how is Google going to understand that the content has been updated?
2. How can Google's bots be persuaded to come more often?
3. What is the deal with being crawled by Googlebot every day, but NOT having anything change in Google's SERPs? It's like they're going out of their way to provide outdated results to searchers.
Every time you update the site, you should change the modification date in your sitemap to the current day. This means when you create a new sitemap, just mark ALL the pages with that day's date. The more you update, the more Googlebot will come to your site. If it's available to you, set the crawl rate to "faster" in Google Webmaster Central. If you run AdSense ads you'll also get crawled more often, since the AdSense bot and Googlebot work together. I personally believe SERPs change based on long-term data, not short-term. Web pages need to mature, their backlinks need to mature, and this takes time.
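For example, if you updated a page today, the entry for that page inside the <urlset> element of your sitemap.xml would look something like this (the URL and date here are just placeholders):

<url>
  <loc>http://www.yoursite.com/updated-page.html</loc>
  <lastmod>2007-06-15</lastmod>
</url>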
Thanks for your reply! I understand this, but I mean there is NO timestamp anywhere in the XML file. Not for the individual pages, the whole file, or anything. That's what seemed strange to me.
It doesn't need a separate timestamp; the modification date is the timestamp of when the page was updated. As a rule of thumb, put a link to your sitemap in your main index file; this helps the search engines find it more easily.
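If it helps, that link can be as simple as something like this anywhere in your index page (this assumes your sitemap is named sitemap.xml and sits in your root directory):

<a href="/sitemap.xml">Sitemap</a>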
Hey dude, that was indeed excellent. Actually, I needed to know about sitemaps in detail and was about to create a thread, but I took some time to look through the old threads and found a damn good one. Yours was very useful. Now I know what's needed and I will try to develop my sitemap.
You mean the modification date of the XML file itself? Can the search engines see that? I put Sitemap: http://zzz.com/sitemap.xml in my robots.txt; I'll also add a link in the index file. Thanks!
Hi, I have tried to create a sitemap using http://www.vigos.com/products/gsitemap but it is showing an error: http://www.pokerdeal.org/sitemap.xml Hmmm.
Can anyone provide a short brief about using a sitemap, from experience of course? Can a sitemap get a site a better ranking spot, or is the only advantage better crawling of the site? Has anyone had any bad experiences with using a sitemap?
I don't have any bad experiences. The good thing about them is that you can control how efficiently the spiders crawl and which pages they see. It means more pages can be crawled, and it helps to prevent irrelevant pages from being indexed.
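For the "irrelevant pages" part, you can combine the sitemap with a robots.txt rule. As a rough sketch, assuming you had a /temp/ directory you didn't want crawled, your robots.txt might look something like this (the directory and sitemap URL are just placeholders):

User-agent: *
Disallow: /temp/
Sitemap: http://www.yoursite.com/sitemap.xml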
Do you guys know if there are any sitemap generators that will automatically update every time you add a new page to your website? Let's say I add 5 new pages to my site; do I have to re-run the program, or is there some auto-generating one? Thanks for the great thread. digitalpoint comes through for me again and again.