Hi guys. I have a question and I can't find a proper answer anywhere, so I'm putting it here. How can you check whether a specific page (a landing page, not the whole domain) is indexed in Google? Is the "site:url" command in Google the right one for it? We use the tool MonitorBacklinks, which shows some pages as indexed and some as not, but when I check those pages with Google's site: command I get different results, and in most cases Google can't find the exact page. Can someone explain how to check whether an exact page is indexed in Google and get reliable results? Thanks!
"site:url" would be the correct way to check on Google. You can't always rely on 3rd party solutions as they're not always accurate. If some of your site is indexed and some of it isn't, check your sitemap. How often the G Bot crawls your site depends on how often your site gets updated. You can also use social signals to speed up the crawl rate most of the time if you're not updating very often. Don't forget to interlink your content within your site, this also helps getting pages crawled.
Thanks for the answer. What about a particular backlink: once we get one, how long does it usually take for Google to index it? I can see pages on the same domain where one link was created long before another and still isn't indexed, while one created a couple of days ago already is. How does this work?
Indexing has become a problem since Google made an update a few months ago. Things are taking a lot longer to get indexed, especially "new" links, meaning links built on new pages/URLs that didn't exist before. The more Authority/Trust the linking site has, the better the chances the backlink will get indexed. Sending social signals directly to the URL has helped get new backlinks indexed (this sends "trust"). You can also manually submit the URL to Google through Search Console. Most indexing services no longer work; I was actually refunded because the service I was using stopped working a few months back.
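On the manual-submission side, besides the Search Console button, Google also accepts a sitemap "ping". A minimal sketch is below, assuming your sitemap sits at the usual /sitemap.xml location; it only asks Google to re-fetch the sitemap and doesn't guarantee anything gets indexed.

import urllib.parse
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"  # assumed sitemap location for the example

ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")
resp = requests.get(ping, timeout=10)
print(resp.status_code)  # 200 means the ping was accepted, nothing more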
The timeframe for indexing depends on the ranking of the website where you have the backlink, how well that website is organized, whether it has a sitemap, and other factors. Nothing you can control.
The ranking of a site has nothing to do with how quickly a link gets indexed. Any site you get a link from should (and almost always will) have a sitemap; if it doesn't, the site probably isn't even worth the money they're spending to host it, which means it's more than likely a useless site for a backlink anyway. Sitemaps are website 101 and common sense. You can influence the indexing result with a few tricks.
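If you want to vet a linking site's sitemap situation quickly, here's a small sketch: it reads Sitemap: lines from robots.txt and falls back to the conventional /sitemap.xml path. Both locations are just conventions, so an empty result isn't conclusive on its own (example.com is a placeholder).

from urllib.parse import urljoin
import requests

def find_sitemaps(site_root):
    # Prefer sitemaps declared in robots.txt.
    sitemaps = []
    try:
        robots = requests.get(urljoin(site_root, "/robots.txt"), timeout=10)
        if robots.ok:
            sitemaps = [line.split(":", 1)[1].strip()
                        for line in robots.text.splitlines()
                        if line.lower().startswith("sitemap:")]
    except requests.RequestException:
        pass
    # Fall back to the conventional location if nothing was declared.
    if not sitemaps:
        fallback = urljoin(site_root, "/sitemap.xml")
        try:
            if requests.head(fallback, timeout=10).ok:
                sitemaps = [fallback]
        except requests.RequestException:
            pass
    return sitemaps

print(find_sitemaps("https://example.com"))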
OK... so regarding the guys who sell backlinks as a service (guest blogging) on high-DA/PR blogs: if the page doesn't get indexed, what could be the reason? In that case the backlink is worthless, I guess...
First, make sure the site you're getting a link from hasn't been de-indexed. Then send a few dozen social signals to the URL; links from FB/Twitter/G+/etc. are trusted sites and will help Google crawl that URL sooner. You could also spam the crap out of it with GSA SER or a similar service; I don't recommend this myself, but it usually works.
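The de-index check can reuse the looks_indexed() sketch from earlier in the thread, just pointed at the bare domain instead of one page; zero results for the whole domain is the red flag (the domain below is made up).

# Assumes the looks_indexed() helper from the earlier sketch is defined in the same script.
if not looks_indexed("example-linking-site.com"):
    print("No indexed pages for the whole domain - the site may be de-indexed.")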
Thank you for confirming my post. Nowhere in that video does Matt say anything about it depending on a site's ranking. CNN was given as an example because CNN is an authoritative site with lots of trust, not because it "ranks" well for "news". I stated in an earlier post that Authority/Trust is what matters. The number of links a site has can also be misleading: I can take a new domain and build 10,000-100,000 links to it, but that won't make its links index any faster. It's all about Authority/Trust, which this new domain would have none of, even with thousands of links (as mentioned before, "new" links).
Great info, thanks! So building a social presence for a backlink will help bring it "alive", kind of... makes sense.
Yes, I do this for all new posts before building any backlinks, because it looks more natural to have traffic, user interaction, and sharing before a URL picks up links. Posting a new article today and having it gain 10-20 backlinks tomorrow without any social interaction or traffic looks very unnatural, haha. You want as many of your links as possible coming from authoritative/trusted sites.
I just checked 50 links I recently built for my site (local SEO) and they're not indexed. These links are built on a Google property (Google Maps) and still don't get indexed instantly. So I'll submit one URL per day for 50 days through Search Console. That way my links are spread out over time and look more natural, plus I'm already ranking well, so I'm in no hurry.
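For anyone copying the one-per-day approach, a tiny sketch like this can assign each URL a calendar date so you know what to submit when; the urls.txt file and the start date are made up for the example.

from datetime import date, timedelta

with open("urls.txt") as fh:  # assumed: one backlink URL per line
    urls = [line.strip() for line in fh if line.strip()]

start = date.today()
for offset, url in enumerate(urls):
    print((start + timedelta(days=offset)).isoformat(), url)  # submit this URL on this day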
You guys either don't speak English or you live in a parallel universe. By ranking I meant PageRank. Matt Cutts: http://techpatio.com/2009/search-engines/google/matt-cutts-google-crawls-pagerank-video-bloggers
You're bringing up old news (2009); you do know it's 2015 on Earth, right? SEO has changed a lot in the past six years, haha. Also, Matt Cutts is long gone; he's old news. Not to mention PR (PageRank) doesn't even get updated publicly anymore, so how would you know the PR of any site from the past year or more since updates stopped? Just admit you're wrong, move on, and join us here on Earth.
Search your site on Google using "site:example.com". That lets you find a specific page of your site; if the URL you're looking for isn't there, it isn't indexed.