Is it better to have a sitemap in plain .xml, or to have a normal HTML page with all the links (with anchor text) and use that as a sitemap? For example, a working HTML page listing all of your website's pages, i.e. articles etc. Or have both? Would Google Webmaster Tools accept an HTML page as a sitemap? Which is better for your site in terms of SEO?
There are two types of sitemap that we use for any site: 1) A high-weight sitemap (it may be HTML or any programming language). 2) A low-weight sitemap (XML sitemap). To maintain a regular site structure, we create the high-weight sitemap, and search engines always take a good impression from it. But for search engine crawling, you should always prefer the XML sitemap so that all your pages get crawled through it. Thanks, Marketing expert
So which one is more beneficial? I always submit my XML sitemap to Google when I update; would it be more beneficial to have a "High Weight Site map" for SEO? Or can you do both?
Depends on your site. The main benefit of sitemaps is to get diagnostic information. If you have a site that has thousands of pages that are not href-linked from within the site, then an XML sitemap will help. All sites should have an HTML sitemap with the main or important pages listed. This is a given and is a benefit to visitors; the only exception is if you have a crappy one-page affiliate website.
Hmm, thanks, so both maps are beneficial? I only have an XML version, and I wanted to make it a better website for the user.
Sitemap.xml is accepted by all three major search engines: Google, Yahoo, and MSN. And I am using both sitemaps, sitemap.xml and sitemap.htm: 1. sitemap.xml for search engines 2. sitemap.htm for visitors
Definitely go with sitemap.xml. I'm of the belief that you don't need a visitor sitemap if you have a proper and obvious link structure. If your server/software supports it and you have an extremely large site (50k+ pages), consider gzip-compressing your sitemap; the major search engines will unzip it.
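For what it's worth, the compression step above is trivial to script. Here's a minimal sketch in Python (file names are just examples, not anything the search engines require beyond the .gz extension):

```python
# Gzip-compress an existing sitemap so large files transfer faster.
# The search engines that support Sitemaps will decompress it themselves.
import gzip

def compress_sitemap(src="sitemap.xml", dst="sitemap.xml.gz"):
    # Read the XML as raw bytes and write it back through a gzip stream.
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        f_out.write(f_in.read())
```

You'd then submit sitemap.xml.gz instead of the uncompressed file.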
XML is the way to go. It is widely accepted.

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is relative to other URLs on the site) so that search engines can more intelligently crawl the site.

Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data, allowing crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but it provides hints that help web crawlers do a better job of crawling your site. Sitemap 0.90 is offered under the terms of the Attribution-ShareAlike Creative Commons License and has wide adoption, including support from Google, Yahoo!, and Microsoft.

The Sitemap protocol format consists of XML tags. All data values in a Sitemap must be entity-escaped, and the file itself must be UTF-8 encoded. The Sitemap must:

* Begin with an opening <urlset> tag and end with a closing </urlset> tag.
* Specify the namespace (protocol standard) within the <urlset> tag.
* Include a <url> entry for each URL, as a parent XML tag.
* Include a <loc> child entry for each <url> parent tag.

All other tags are optional, and support for them may vary among search engines; refer to each search engine's documentation for details. Also, all URLs in a Sitemap must be from a single host, such as www.example.com or store.example.com. For more, see www.sitemaps.org
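To make the four rules above concrete, here is a minimal sketch in Python that builds a valid Sitemap from a list of URLs (the example.com URLs are illustrative, not real pages):

```python
# Build a minimal Sitemap 0.90 file: one <urlset> with the protocol
# namespace, one <url> per page, each with a required <loc> child.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    # <urlset> opens and closes the document and declares the namespace.
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")  # parent tag for each URL
        loc = ET.SubElement(url, "loc")     # required child tag
        loc.text = page                     # ElementTree entity-escapes & < > for us
    # tostring with encoding="unicode" returns the XML as a str;
    # write it out as UTF-8 when saving to disk, per the protocol.
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/articles",
])
print(sitemap)
```

The optional tags (lastmod, changefreq, priority) would be added the same way, as further children of each url element.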
Always try to make your sitemap in XML so that you can submit it in Google Webmaster Tools. An XML sitemap can be crawled by the different search engines.