Hello, I need help with this. There are currently over 400 pages on my site, but my Google sitemap only seems to contain around 360 pages. It should be over 400. How can I fix this? Thanks for your answers.
Search Google for a free online XML sitemap generator. Use that to generate a sitemap, then upload the XML file to your server. Then go to Google Webmaster Tools and submit the sitemap there. Hope this helps.
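For reference, the file those generators produce is just XML in the sitemaps.org format. Here's a minimal sketch of building one by hand with Python's standard library; the URLs are placeholders, not the poster's actual pages:

```python
# Build a minimal sitemaps.org-format XML sitemap from a list of URLs.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        # <loc> holds the page address; other tags like <lastmod> are optional
        ET.SubElement(url_el, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://example.com/", "https://example.com/about"]  # placeholder URLs
xml_text = build_sitemap(pages)
print(xml_text)
```

You would save that output as `sitemap.xml`, upload it to the site root, and submit its URL in Webmaster Tools.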
Have you tried generating your sitemap with another sitemap generator? Here's the A1 Sitemap Generator checklist for when your XML sitemap has fewer links than expected. Most of the points apply to other sitemap tools as well. If you still have problems, perhaps you should give some examples of URLs that you believe are missing.
If I'm correct, you created your sitemap using a sitemap generator tool. When you create a sitemap with a tool, it searches for all the pages by following links from the home page and collects the list of URLs. But some pages can't be found by the tool, because those pages might be hidden or not linked from any other page...
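That discovery process is essentially a breadth-first crawl, and it's why "orphan" pages get left out of the sitemap. A small sketch, using an in-memory stand-in for a website instead of real HTTP (all the page names here are made up):

```python
# Sketch of how a sitemap generator finds pages: start at the home page
# and follow links breadth-first. A page nothing links to never appears.
from collections import deque

# Hypothetical site: each page maps to the pages it links to.
site = {
    "/":                ["/about", "/products"],
    "/about":           ["/"],
    "/products":        ["/products/widget"],
    "/products/widget": [],
    "/orphan":          [],  # exists on the server, but no page links to it
}

def crawl(start="/"):
    found, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for link in site.get(page, []):
            if link not in found:
                found.add(link)
                queue.append(link)
    return found

discovered = crawl()
print(sorted(discovered))       # "/orphan" is missing from the result
print(set(site) - discovered)   # prints {'/orphan'}
```

If your site has ~40 such unlinked pages, that alone would explain a 400-vs-360 gap.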
What tool did you use to generate the initial sitemap? You need to re-run the sitemap generator and check the results.
A sitemap generator has to be run and re-run on a regular basis to update the sitemap, unless of course it's on a blog and you can possibly set it up to run as a cron job.
Surely this has to be a case for regular re-runs of the sitemap procedure. Not too sure how you would run this as a cron job, especially if the site is a static site.
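For what it's worth, if the generator is a script on the server, scheduling it is just a crontab entry; both the time and the paths below are hypothetical:

```
# Hypothetical crontab entry: regenerate the sitemap nightly at 02:30.
30 2 * * * /usr/bin/python3 /var/www/tools/generate_sitemap.py
```

A static site works fine with this too, as long as the script can crawl or list the pages; what you can't do is run a cron job if you're on shared hosting that doesn't expose cron at all.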
Google normally takes some time to crawl all these pages, so be patient, and make sure that all pages are easily accessible when the Google spider decides to crawl them.
Comedy1: after you've created your XML sitemap, please submit it to Google Webmaster Central for analysis. You'll see that the crawling of your site's sub-pages becomes much quicker.
After adding it to Webmaster Tools, it will show you the list of URLs. If some of your URLs are missing, it may mean you have redirects or perhaps some unethical practices; take it as an indication from the Google crawler that you should fix them.
I would recommend xml-sitemaps.com. If you can afford it and have several sites, buy the full version; it's worth the money. You can set it to spider the site with a cron job, create personalized templates, and it will ping Google every time the sitemap is updated.
Normally no sitemap generator makes mistakes in fetching site data unless you have blocked some pages with robots.txt or given a "nofollow" attribute to a few links on the home page.
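The robots.txt case is easy to check from Python's standard library; the rules and URLs below are made-up examples, not taken from the poster's site:

```python
# Check whether specific URLs are blocked by robots.txt rules, using
# the standard-library parser (no network access needed: we feed it
# the rule text directly).
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/page.html"))  # True (allowed)
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False (blocked)
```

Running your "missing" URLs through a check like this quickly tells you whether robots.txt is the culprit.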
If your site has clean links throughout, you will get all your pages crawled. Broken links and misspelled hrefs will only delay the crawling. It's more of a "fix it and resubmit" process in the beginning, as you will no doubt have some broken links, maybe even links to content that doesn't exist. What nobody has mentioned here is how to get the sitemap re-read on subsequent tries. Sometimes you have to nudge Google Sitemaps and ping them.
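The "ping" is just a GET request to the endpoint Google provided at the time, with your sitemap's URL as a query parameter. A quick sketch that only builds the ping URL (the sitemap address is a placeholder; no request is actually sent):

```python
# Build the Google sitemap "ping" URL for a given sitemap address.
from urllib.parse import urlencode

sitemap_url = "https://example.com/sitemap.xml"  # placeholder sitemap address
ping_url = "http://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})
print(ping_url)
# To actually send the nudge, you would issue a GET request to ping_url,
# e.g. with urllib.request.urlopen(ping_url).
```

A cron job that regenerates the sitemap can fire this ping as its last step, so Google re-reads the file soon after each update.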