I have a website with around 10,000 pages and I am finding it hard to get my webpages indexed. I know it's not guaranteed that Google will index each and every page, but the indexing rate is very low: out of 10,000 pages, only about 500 are indexed. What may be the issue, given that Webmaster Tools is also telling me there is no issue with the sitemap? Should I ping them manually or resubmit the sitemap? Or is there any other way to resolve this problem?
There might be some problems with the content of those 10,000 pages, such as too many duplicate or low-quality ones. You should worry more about making the content better than about the indexing rate. Google is smart enough to tell good from bad.
You should check your Webmaster Tools account thoroughly, and also check your sitemap to see whether all of your website's pages are listed in it.
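If it helps, here is a minimal sketch for counting the URLs listed in a sitemap with Python's standard library, so you can compare that number against the 10,000 pages you expect. It assumes a single sitemap file (not a sitemap index) at a hypothetical address; substitute your own URL:

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical; use your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

# Every page should appear as a <loc> entry inside a <url> element
urls = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", NS)]
print(len(urls), "URLs listed in the sitemap")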
I will suggest the following activities: 1. Create an XML sitemap with a paid tool, assign a priority to each URL, and submit it (a minimal example of such a sitemap is sketched below) 2. Daily, fetch 'x' amount of URLs in Google Webmaster Tools for indexing
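For reference, a sitemap with per-URL priorities can be generated with Python's standard library; this is only a sketch, and the example.com URLs and priority values are placeholders rather than recommendations:

import xml.etree.ElementTree as ET

# Hypothetical pages and priorities; replace with your own list
pages = [
    ("https://example.com/", "1.0"),
    ("https://example.com/category/widgets", "0.8"),
    ("https://example.com/widgets/blue-widget", "0.5"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, priority in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = priority

# Writes sitemap.xml, ready to upload and submit in Webmaster Tools
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)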
In Google Webmaster Tools, in the right sidebar, choose: Fetch as Google. Enter the URLs you'd like to index, and a few minutes later they will all be appearing in Google. Already indexed, of course.
Agreed. This can be an important reason why the pages are not being indexed. Please do check the quality of the content and look out for duplicates.
Add your website to Google Webmaster Tools. Once you add it, the spider will find your pages quickly. It is also used for finding your website's errors and for adding the sitemap.
I suggest you check the robots.txt file. It may contain a rule that is blocking your webpages. Interlinking is also important here: if your webpages have a poor internal linking structure, the search engine crawler cannot reach all the pages of your website. A quick way to test your robots.txt rules is sketched below. I hope this works.
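Here is a minimal sketch using Python's standard library to test whether robots.txt blocks Googlebot from particular pages; the example.com URLs are hypothetical, so substitute your own:

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()

# Test a few representative pages against Googlebot's rules
for url in ["https://example.com/widgets/blue-widget",
            "https://example.com/category/widgets"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "BLOCKED by robots.txt")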
You can build some backlinks to some of the pages; it will help get these pages indexed along with the others.
You should also submit your pages manually on some other sites, and check that every page has proper, quality content.
Check your articles in the search engines by typing site:yourdomain.com to see whether they have all been indexed.
Since your sitemap is not the issue, there may be a couple of other causes:
i) your server is limiting the bandwidth allocated to crawlers
ii) some of your pages are marked with noindex
iii) you are blocking access to your CSS and JS files (search engines don't like that, because they can't analyse full pages)
iv) you don't have backlinks to your site
v) there is a high level of thin content, low-quality content, or near-duplicate content
vi) you are using some black hat techniques
vii) your site is super slow, so slow that crawlers give up...
viii) if you are using URL parameters, they may not have been configured properly in GWT and Bing
ix) if you have moved your site to a new URL, did you implement 301 redirects to keep the juice?
x) if the domain was previously owned, did you check the backlink profile and its history?
A quick way to check ii) and iii) yourself is sketched below.
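As a minimal sketch with Python's standard library, this checks a single page for ii) a noindex directive (in the robots meta tag or the X-Robots-Tag header) and iii) robots.txt blocking of a CSS/JS asset; the example.com URLs are hypothetical placeholders:

import urllib.request
import urllib.robotparser
from html.parser import HTMLParser

PAGE_URL = "https://example.com/widgets/blue-widget"  # hypothetical; use your own
ASSET_URL = "https://example.com/static/site.css"     # hypothetical CSS/JS asset

class RobotsMetaParser(HTMLParser):
    # Collects the content of any <meta name="robots"> tag
    def __init__(self):
        super().__init__()
        self.directives = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append(attrs.get("content") or "")

with urllib.request.urlopen(PAGE_URL) as resp:
    # ii) a noindex can also be sent as an HTTP header
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
    parser = RobotsMetaParser()
    parser.feed(resp.read().decode("utf-8", errors="replace"))
    print("robots meta tags:", parser.directives or "none")

# iii) check whether robots.txt blocks Googlebot from a CSS/JS file
rp = urllib.robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()
print("asset allowed for Googlebot:", rp.can_fetch("Googlebot", ASSET_URL))

If the asset comes back as blocked, look for a Disallow line covering your CSS/JS directories in robots.txt.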
I wouldn't do that, as pinging a website is not a good idea (especially if you do it several times!). I would create a sitemap, or even post the index on a social media site for example (but don't SPAM!), and if there is GOOD inner-page linking, it will work like a charm!
It will take Google some time to index 10K pages, so don't worry about the slow indexing rate. As long as you aren't blocking Google from indexing those pages, they will eventually get indexed. Consider building links to some of the pages that haven't been indexed and interlink your articles properly to facilitate deeper crawling of your website.
How can I check whether my CSS and JS files are blocked? I guess this may be the problem. I hope to see a response ASAP, as I am having difficulties.