My new website was recently crawled by Google and indexed. The site: command shows about 54 pages, while the Google API (via the DP backlink tracker) says I only have 2 pages in the index. How long does it usually take for the API to report all of the pages?
I've had a 1500+ page site stuck at 1000 on the API for about six months now. Even though I add content on a weekly basis, the API count never goes above 1000.
The API will probably never show the same number of pages as the site: command; I have never seen that happen. I think it is because Google will almost always count some of your site's pages as supplemental results. What you can do to get more pages into the API count is add more pages, get more links to your site, and wait for Googlebot.
"even though I add content on a weekly basis the API never goes up from 1000 " Get more links. Also, make sure to get deep links. The more important Google thinks you are (by incoming links), the more pages it will let you have in the index. The same thing is true with deep pages.
I have been unable to get the site: command to work reliably at all. What method do you use to check whether a page exists in Google's index? I just want to do a simple check and make sure the pages are indexed. Does anyone have a solution?
I use the "Cached snapshot of page" feature in google toolbar-or are you referring to something else?