Hello everyone, I just want to know how I can get my website's 30+ pages indexed in Google. When I search my domain name it only shows 3 pages. Thanks in advance, waiting for suggestions on what I can learn and work on to fix this.
Hi, it is advisable to avoid duplicate content and to update all pages with fresh content for better indexing in Google. Along with this, you also need to submit a freshly generated sitemap in the Google Webmasters/Search Console tool.
You will likely NEVER have all of your pages indexed. Google picks and chooses which to index and which not to. Read Google's index policy and you will discover that they will CONSIDER indexing an indexable page, but there is NO guarantee that they will index it. Based on over 20 years of experience with my websites, Google seldom indexes more than a third of a site's pages. I have over 13,000 indexable pages on the web, but Google only indexes about 5,000 and excludes the rest, even though Google acknowledges that they could be indexed.
As I understand it, Google actually indexes most of the pages of your site, but those pages do not all appear in the search results. Probably a better question to ask yourself is how you can make sure that visitors who land on a page do not bounce. If people stay on your site, browse further, or spend a long time reading a page, then Google is definitely going to index those kinds of pages.
Thank you for the suggestion, I will remember this and work on it.
Go to the Google Webmasters tool and submit your sitemap. If you do not have a sitemap, you can create one here: Site Map Generator. Also, on the main dashboard in Google Webmasters there is a search bar. Paste your link there and you will have an option to request indexing.
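If you prefer to build the sitemap yourself rather than use a generator, here is a minimal sketch of what the file should contain, written in Python's standard library. The page URLs are hypothetical placeholders; swap in your own site's pages before uploading the resulting sitemap.xml and submitting it in Search Console.

```python
# Minimal sitemap.xml generator sketch using only the standard library.
# The URLs below are placeholders; replace them with your real pages.
import xml.etree.ElementTree as ET

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/services",
]

def build_sitemap(urls, out_path="sitemap.xml"):
    # The sitemaps.org namespace is required for Google to accept the file.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(PAGES)
```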
Go to the Google search bar and type your domain name after the link: operator (link:domainname); it will tell you how many pages of your website are in Google's index. Otherwise, you can use the Index Coverage status report. If you do not find any pages, that means your website has not been indexed in Google. This may be due to a blockage via robots.txt. To check that, add a slash after your URL and write robots.txt (domainurl/robots.txt); it will show you whether you have allowed or disallowed the Google crawlers from crawling your pages. Correct it accordingly! If you are still not finding your pages, that may be because of low-quality or duplicate pages or broken links. To index your pages in Google, the easiest way is to submit your URL in Fetch as Google. To increase your indexing potential, use internal and external links, disallow low-quality pages from being crawled, add your important pages to your sitemap, and share your pages on high-traffic websites that are regularly crawled.
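To make the robots.txt check concrete, here is a small sketch using Python's standard urllib.robotparser to test whether Googlebot is allowed to crawl a few paths. The domain and paths are placeholders for illustration; point it at your own site.

```python
# Sketch: check whether Googlebot is allowed to crawl given paths,
# based on the site's live robots.txt. Domain/paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"

rp = RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in ["/", "/blog/", "/private/"]:
    allowed = rp.can_fetch("Googlebot", SITE + path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'} for Googlebot")
```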
It should be easy to get a 30-page site indexed with a good internal link structure and backlinks. How old is the site? It can take time for Google to discover and index new pages.
You can check in Google Webmasters whether there are any site crawling errors and fix them. You can also submit a fresh sitemap; hopefully that would fix your issue.
You can increase indexation with an XML sitemap, breadcrumb navigation, and internal links, and by submitting each URL in Search Console to inform Google you want it indexed. Make sure the pages aren't blocked in robots.txt and don't have a meta robots noindex tag or a canonical tag pointing elsewhere.
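As a rough illustration of that last check, here is a standard-library Python sketch that fetches a page and reports whether it carries a meta robots noindex directive or a canonical link. The URLs are hypothetical; substitute the pages you are worried about.

```python
# Sketch: report meta robots "noindex" and canonical tags on a page,
# either of which can keep it out of (or consolidate it in) the index.
# The URLs below are placeholders for illustration.
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadTagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def check(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = HeadTagChecker()
    checker.feed(html)
    print(url, "| noindex:", checker.noindex, "| canonical:", checker.canonical)

for page in ["https://www.example.com/", "https://www.example.com/about"]:
    check(page)
```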
You can get the pages of your personal website indexed in Google by submitting sitemaps. google.com/addurl has unfortunately been shut down.
It is simply not true, as some here are saying, that Google does not index all your pages. I have never had a site that does not have most, if not all, of its pages indexed. I am guessing you are either having issues with non-indexable pages due to robots.txt or metadata, or you have duplicate content, thin content, or you are trying to index pages with no value to the search engines. Without the website to actually review, it's impossible to say the reason; I would need to know a lot more about the site, such as its age, the number and quality of inbound links, and a review of its content and code. Does it have a sitemap set up? Does it have a good link structure throughout the website?
Firstly, have a sitemap with working pages. Also, go to the Webmaster tool / Search Console, where you can fetch your URLs in URL Inspection one by one and ask for each page to be indexed. This would definitely help. Thanks
There are some steps to index all pages of a website in Google. The steps are as follows:
a) Log into Google Search Console
b) Navigate to Crawl > Fetch as Google
c) Take the URL you'd like indexed and paste it into the search bar
d) Click the Fetch button
e) After Google has found the URL, click Submit to Index
At 2 to 10 minutes per fetch/submission, it would take FOREVER to index 14,000 pages like I have. Even just the 30 pages the OP has would take anywhere from 1 to 5 hours to submit for indexing, with no guarantee that Google would even index them, so your way is not really feasible for more than 5 or so pages, and definitely not feasible for a major website.