Hello, does anyone know why the Google crawler won't crawl one particular page? I have a web design and development related site, and one or two pages of my site have not been crawled by Googlebot. What is the reason behind this, why does it happen, and what should I do to get those pages crawled by Googlebot? If anyone knows, please give me a suggestion. Thanks, Noel Christian
There are many possible reasons for one particular page not being crawled. Create unique content for the page, create a sitemap.xml and submit it to the search engines, upload a robots.txt file, and submit your site and that page to social bookmarking and social media sites.
There can be several reasons; the likely ones are:
- robots.txt is not configured correctly
- wrong meta tags (noindex, nofollow)
- your website hasn't been visited by a search engine spider yet
- you are banned from the search engines

So what you should do: check your robots.txt and your meta tags. If those are alright, set up your sitemap.xml (if you haven't already) and build some quality backlinks. Want to check whether you are banned? Search the web for a Google ban/sandbox checker.
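To show what the first point above can look like in practice, here is a sketch of a robots.txt that accidentally blocks a page (the `/services/` path is just a hypothetical example, not taken from the poster's site):

```
# robots.txt — placed at the root of the site, e.g. http://www.example.com/robots.txt
# This rule tells ALL crawlers (including Googlebot) to skip everything under /services/.
# If the uncrawled page lives under a disallowed path like this, Google will never fetch it.
User-agent: *
Disallow: /services/
```

For the second point, look in the page's HTML `<head>` for a tag like `<meta name="robots" content="noindex, nofollow">` — that tells Google not to index the page and not to follow its links, so removing it (or changing it to `index, follow`) is the fix if it's there by mistake.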
I agree with bownsmith; he is right. Please submit a sitemap.xml and check your robots.txt. One more important thing: submit your sitemap and site to Google Webmaster Tools. Thanks!
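For reference, a minimal sitemap.xml looks like the sketch below (the example.com URLs are placeholders for your own pages). The optional `<priority>` element is how you give one page a higher relative priority within your own site, as suggested elsewhere in this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <!-- Hypothetical entry for the page that is not being crawled -->
    <loc>http://www.example.com/uncrawled-page.html</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Save it as sitemap.xml in your site root and submit it through Google Webmaster Tools so Google knows the page exists.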
The reason behind it may be that your page has poor SEO. Complete all the SEO activities: you can increase the page's priority in sitemap.xml and do off-page activities for the page. Your page should be cached within a month.
If your site is well optimized, your robots.txt is fine, and Google Webmaster Tools is not showing any errors, then check the quality of your content. Is it original, or somehow duplicated? If it's duplicate, work on it and improve your content; then Google should cache your site and its pages.