My particular question: do you use a sitemap, and if yes, how many URLs are in it? I'm asking since I usually submit a sitemap with about 2,000 URLs or so. Problem is... Amazon stores can theoretically have several hundred thousand pages, and I definitely have a LOT of Amazon products. I see queries in Google WM Tools where people come to my site using specific keywords, e.g. specific products. So logic says it's good when Googlebot crawls all my pages, since then more products are in the catalogue and people find those keywords on Google. My question was whether a sitemap with (say) 2,000 URLs *limits* Googlebot so it crawls only those 2,000 URLs... but I just checked: NO, I *do* have about 30k pages in Google, most of them supplemental (of course). The first 7 pages are not, though. Would I benefit somehow from deleting my sitemap altogether?
For my AOM stores I don't submit a sitemap at all, due to the large number of pages. A 2,000-URL sitemap will not limit Google to crawling only those, but it won't ensure that those pages get visited either.
I've made sitemaps for 2 of my stores using GSiteCrawler. For the first one I had about 1,900 links, but for the second I had about 100,000 links. With this tool you can split your sitemap when it reaches whatever number of links you want, and you can submit more than one sitemap for a single site. But for that second store I gave up and made a sitemap with only 1,900 URLs. Google will index the rest of them on its own.
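For what it's worth, the splitting case described above is exactly what the sitemaps.org protocol covers: a site can publish several sitemap files (each limited to 50,000 URLs) and tie them together with a sitemap index file, which is the file you then submit. A minimal sketch, with hypothetical file names and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each <sitemap> entry points to one sitemap file of up to 50,000 URLs -->
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2007-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

So even a 100,000-link store can be covered by two or three sitemap files behind one index, rather than dropping back to a partial 1,900-URL sitemap.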