Hi all, I've seen MANY people submit a super-large sitemap to Google containing every single valid URL of their site. Is that really the correct procedure? I usually submit a sitemap containing all the important directories of my site, along with an update frequency and a priority, and I assume Google is smart enough to pick up the links from there. Done this way, there's no need to update the sitemap every week. What's your position on this? Is it really necessary to submit every valid URL of a site?
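For reference, one entry in the sitemap XML I submit looks roughly like this, per the standard sitemaps.org format (the URL and the values are just placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/some-directory/</loc>
    <lastmod>2007-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>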
Proper SEO and web publishing means that every single page has its own individual content and differs from all the others in at least some important points. Hence, to give searchers/surfers the maximum value, all valid pages need to be known to the search engine. One efficient and systematic method is to include every valid content page in the sitemap and submit the sitemap to the search engine; the other, slower method is to leave the work of finding all that content to the search engine. The benefit of giving all pages to the search engine is two-fold: surfers find the very best match, and you get the highest possible number of visitors, which eventually also means the highest possible advertising revenue for you. The latter in turn helps you invest more and create even more content for everyone.
Thanks for the info, Hans! I usually use an automatic XML sitemap generator for my website, which currently has 30+ pages.
Where and how? I'm having difficulty producing a sitemap for my site, so much so that I just made a UK town list to at least get the spiders to start looking at all the subdomains. How on earth do you get 25,000 links into a sitemap? It would be over 1 MB and take 5 minutes for a dial-up user to view. James
Sitemaps are usually never created over dial-up but directly by tools installed on the server or by available online tools. The only problem area for creating sitemaps is dynamic pages, but even that is possible with available tools and some tweaking. Search the Google sitemap pages for sitemap tools other than the Google sitemap creator. For the static part of my site I use http://enarion.net/google/ (a PHP script), and for the dynamic part I use the above as well and/or http://www.softswot.com/sitemapdl.php - this one is fast and the only tool that handles the most important part of my dynamic pages cleanly; it is also available as a free trial version online or for download. There are several other tools available online or for installation on your site. A few thousand links or even more is zero problem for static pages and a matter of seconds; the same with dynamic pages may take minutes, but it runs on the server and hence is no problem at all either. For an overview of all Google-recommended third-party sitemap tools, see http://code.google.com/sm_thirdparty.html
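If you prefer rolling your own for the dynamic part, the idea is simply to pull the records out of your database and print one <url> entry per record. A rough Python sketch, assuming a hypothetical SQLite database "site.db" with an "articles" table and example.com as the site - adapt the query and URL pattern to your own setup:

# Rough sketch of a dynamic-page sitemap, assuming a hypothetical SQLite
# database "site.db" with an "articles" table (id, updated) and
# example.com as the site URL - not any particular tool's method.
import sqlite3

conn = sqlite3.connect("site.db")
rows = conn.execute("SELECT id, updated FROM articles")

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for article_id, updated in rows:
    print("  <url>")
    print("    <loc>http://www.example.com/article.php?id=%d</loc>" % article_id)
    print("    <lastmod>%s</lastmod>" % updated)
    print("  </url>")
print("</urlset>")
conn.close()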
The most I have submitted to Google is 41,400 pages, built using a custom script. Just keep in mind to set the change frequency right. If you have a small site under 1,000 pages, try using this: http://www.octadyne.com/sitemap/
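For anyone curious what such a custom script boils down to, here is a stripped-down Python sketch that just walks a folder of static HTML files and writes one entry per file with a lastmod date and a fixed change frequency. The document root, base URL, and "monthly" value are placeholders for your own setup, not what my script actually uses:

# Sketch of a custom sitemap script: walk the document root and emit one
# <url> entry per .html file with lastmod and changefreq.
# DOCROOT and BASE_URL are placeholders.
import os
import time

DOCROOT = "/var/www/html"
BASE_URL = "http://www.example.com"

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for root, dirs, files in os.walk(DOCROOT):
    for name in files:
        if not name.endswith(".html"):
            continue
        path = os.path.join(root, name)
        rel = os.path.relpath(path, DOCROOT).replace(os.sep, "/")
        lastmod = time.strftime("%Y-%m-%d", time.gmtime(os.path.getmtime(path)))
        print("  <url>")
        print("    <loc>%s/%s</loc>" % (BASE_URL, rel))
        print("    <lastmod>%s</lastmod>" % lastmod)
        print("    <changefreq>monthly</changefreq>")
        print("  </url>")
print("</urlset>")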
Google's XML sitemap can be up to 50,000 URLs and under 10 MB when uncompressed. I haven't seen an XML sitemap go over that - if it did, there would be too much in the URLs to begin with, and long query strings will hurt you on their own. Listing all of your pages is good, but be sure to list each one only once. Some pages have sort parameters in the URL, and when every column can be sorted up and down, one page can end up with 50 variants of the same URL with the same content. AutoMapIt.com has filters to craft the URLs that get used in the sitemap so you can avoid pages like these. Aside from that kind of page that repeats itself under different names, every unique page should be listed, from your privacy policy to your terms to your homepage.
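To illustrate the "list each page once" point, here is a small Python sketch that strips sort-type parameters from query strings before the URLs go into the sitemap, so the 50 sorted variants collapse into one canonical URL. The parameter names are only examples of my own, not what AutoMapIt.com's filters actually do:

# Sketch of the "list every page once" filter: drop sort/view parameters
# from query strings so sorted variants of a page collapse into one URL.
# The parameter names in IGNORED_PARAMS are only examples.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"sort", "dir", "order", "view"}

def canonical(url):
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), ""))

urls = [
    "http://www.example.com/catalog.php?cat=5&sort=price&dir=asc",
    "http://www.example.com/catalog.php?cat=5&sort=name&dir=desc",
    "http://www.example.com/catalog.php?cat=5",
]
for u in sorted(set(canonical(u) for u in urls)):
    print(u)  # all three collapse to .../catalog.php?cat=5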