Hi, how can I fix this? In Google Webmaster Tools only 9 of my 37 pages/posts are being indexed. http://www.mcmnwindows.ie/sitemap/ The website has 8 sitemaps generated by the Yoast SEO plugin: http://www.mcmnwindows.ie/sitemap_index.xml This article talks about simplifying the management of your sitemaps: https://support.google.com/webmasters/answer/75712?hl=en The website currently has 37 pages/posts, but Webmaster Tools says 19 were submitted and only 9 are indexed. There are also 30 crawl errors in Webmaster Tools, most of them unimportant.
First of all, the 30 errors should be shown in this post so that I can offer some help. Beyond that, indexing takes time. When the bot comes to your site it starts crawling, and if it runs into an unknown script, a Flash item, or anything else unexpected in its path, it will not skip it, it will still try to index it. The bot only visits for a limited time frame; if your website is clean within that window and the bot can crawl it, your pages get indexed, otherwise it will come back later (it returns daily or weekly to index the site properly). So, as a solution: first, check whether there are any unknown scripts, Flash items, or errors in your source code; second, make a customised robots.txt file for your website.
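If you want a quick first pass at that check, here is a rough Python sketch (standard library only) that looks at a single page for the most common indexing blockers: an error status code, an X-Robots-Tag header, or a noindex robots meta tag. The URL in it is just the homepage from your post, so swap in any page that Webmaster Tools reports as not indexed; the meta-tag check is deliberately crude.

import re
import urllib.request

url = "http://www.mcmnwindows.ie/"  # example only: replace with any page reported as not indexed

req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    status = resp.status
    x_robots = resp.headers.get("X-Robots-Tag", "")
    body = resp.read().decode("utf-8", errors="replace")

# A robots meta tag containing "noindex" keeps a page out of the index even
# when the sitemap lists it. Crude check: attribute order in real HTML can vary.
meta_noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', body, re.I)

print("HTTP status: ", status)                   # anything other than 200 shows up as a crawl error
print("X-Robots-Tag:", x_robots or "(not set)")  # "noindex" in this header also blocks indexing
print("meta noindex:", bool(meta_noindex))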
Hi, this is my robots.txt:

User-agent: *
Allow: /wp-content/uploads/
Allow: /*/20*/wp-*
Allow: /*/wp-content/online/*
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: ?wptheme=
Disallow: /wp-
Disallow: /clients/
Disallow: /xmlrpc.php
Disallow: /cgi-bin/
Disallow: /trackback
Disallow: /comments
Disallow: /feed/
Disallow: /blackhole/
Disallow: /category/*/*
Disallow: /transfer/
Disallow: /tweets/
Disallow: /mint/
Disallow: /*?*
Disallow: /*?
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$

User-agent: ia_archiver
Disallow: /

User-agent: ia_archiver/1.6
Disallow: /
Allow: /wp-content/gallery/

Sitemap: http://www.mcmnwindows.ie/sitemap.xml

User-agent: ia_archiver
Disallow: /

# Google Image
User-agent: Googlebot-Image
Disallow:
Allow: /*

# Google AdSense
User-agent: Mediapartners-Google*
Disallow:
Allow: /

# digg mirror
User-agent: duggmirror
Disallow: /

It's a WordPress website and it's really hard to validate with the W3 validator. Do you know a good app that shows me the errors in the WordPress files?
Google has only spent resources crawling 9 pages so far; over time the rest of your content will be indexed too, unless you are blocking it. I find your robots.txt file far too aggressive, so try to optimise it. Content that Google needs could be sitting in any of the folders you have disallowed.
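A quick way to see what those rules actually block is Python's built-in robots.txt parser. Here is a rough sketch; the test URLs are only guesses at typical WordPress paths, so replace them with real URLs from your sitemap. One caveat: the standard-library parser only does simple prefix matching and ignores Google's * and $ wildcard extensions, so treat this as a first pass and use the robots.txt Tester in Webmaster Tools as the authoritative check for the wildcard rules.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.mcmnwindows.ie/robots.txt")
rp.read()  # fetches and parses the live robots.txt

test_urls = [
    "http://www.mcmnwindows.ie/",
    "http://www.mcmnwindows.ie/sitemap/",
    "http://www.mcmnwindows.ie/feed/",                                   # Disallow: /feed/ applies here
    "http://www.mcmnwindows.ie/wp-content/themes/example/style.css",    # hypothetical theme path; blocked CSS can hurt how Google renders pages
]

for url in test_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print("ALLOWED" if allowed else "BLOCKED", url)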
I changed it to the WordPress default, is it better?

Sitemap: http://www.mcmnwindows.ie/sitemap.xml

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Your sitemap is incomplete. There are far more pages on your site than the ones appearing in the sitemap.xml. Use another sitemap generator; the best is the one from Screaming Frog. It will also give you additional answers.
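If you want a quick count of what the current Yoast index actually exposes before switching tools, here is a minimal sketch using only the Python standard library. It assumes the sitemap_index.xml URL from earlier in the thread and simply counts the loc entries in each child sitemap, so you can compare the total against your 37 pages/posts.

import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
INDEX = "http://www.mcmnwindows.ie/sitemap_index.xml"

def fetch_xml(url):
    # Some sitemap plugins refuse requests without a browser-like User-Agent
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        return ET.fromstring(resp.read())

total = 0
index = fetch_xml(INDEX)
# A sitemap index lists its child sitemaps in <sitemap><loc> elements
for loc in index.findall("sm:sitemap/sm:loc", NS):
    child_url = loc.text.strip()
    child = fetch_xml(child_url)
    urls = child.findall("sm:url/sm:loc", NS)
    print(f"{child_url}: {len(urls)} URLs")
    total += len(urls)

print("Total URLs across all sitemaps:", total)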