www.lightzoneindia.com The site has been up since Jan 2006 and is listed in Google, Yahoo, and MSN. Strangely, MSN is able to crawl all the pages and index their actual content, but Google does not, or is not able to. My site is a PHP gallery and all the pages show the same title, so could that be the reason Googlebot thinks the same page is being repeated? Googlebot visits daily and the site is updated once a week. Any suggestions would be helpful. Ranjan
Yes, I have done so, but it shows an error saying "unsupported file format". I am new to all this and finding my way to get this sitemap working. Is that the reason Googlebot is not able to crawl further? My understanding of a sitemap is that it helps Google locate the site structure and find newly updated pages, but the old pages should still be crawlable.
The easiest way to do a sitemap is to put each URL on its own line. Try doing it that way if you haven't already:

http://www.domain.com/index.html
http://www.domain.com/links.html
http://www.domain.com/contact.html
http://www.domain.com/about.html
On my website the PHP gallery dynamically generates the links, so how do I add all the 400-plus pages? Do I put all 400 URLs in a text file and upload that text file to the root of my site? Is that what you are suggesting?
Yeah, you can put all of the URLs in the file, name it something like sitemap.txt, and upload it to the root folder. Then point Google Sitemaps to that file and it should be good. As for how to get all of the URLs into the file, there are a number of sitemap generators on the web. Do a search on Google for "Google Sitemap Generator" and use one of those.
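If you'd rather script it yourself than hunt for a generator, here's a rough sketch of the idea in Python. It assumes your gallery URLs follow a simple numbered pattern like gallery.php?page=N; that pattern, the base URL, and the page count are just placeholders you'd adjust to match how your PHP gallery actually builds its links:

```python
# Sketch: write one URL per line to sitemap.txt, the plain-text
# sitemap format described above.
# ASSUMPTION: gallery pages follow a numbered pattern
# (gallery.php?page=N) -- change base_url, num_pages, and the
# URL pattern to match your real site.

base_url = "http://www.lightzoneindia.com"
num_pages = 400  # however many gallery pages you have

with open("sitemap.txt", "w") as f:
    f.write(base_url + "/index.php\n")
    for n in range(1, num_pages + 1):
        f.write("%s/gallery.php?page=%d\n" % (base_url, n))
```

Run it, upload the resulting sitemap.txt to your site root, and submit that file in Google Sitemaps.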