Backlinks changed several times during the last 12 hours... some sites gained a few backlinks earlier today... and now they went down again. I also saw some changes in "pages in URL" and I don't think it's over yet.
Yep, it is - and yes, it does say "about 26", but unfortunately when I actually try to look at them it shrinks to 19. Also, "251 with @URL" - that sounds good! - but I've not heard of this before - do I just type @ and then my URL? And does this "@URL" actually list links, or just instances of my URL, be it in link form or just text?
Hey, I got the @URL working (and yes, it says 251) - and guess what? When I look at the results (go to page 2 or 3) it shrinks to... (can you guess?) 19! Yay! OK, so it's not quite as bad as that - at least it gives me the opportunity to "include omitted results", and when I do it creeps back up to about 81 - but unfortunately most of these are my own pages and forum sigs from here. Does anyone know why it gives a figure like 250 and, even when you include the omitted results, still only shows 80? Such an odd thing. I suppose none of it matters, though, as G should still be counting the ones it's not showing.
Will, did you test it recently? Many people feel there has been a change in the link: output this summer, and several people have reported seeing lower PR links only. That includes me, although I did not perform any scientific tests. A complicating factor might be the lack of up-to-date toolbar PR.
I've been looking at some 30 pages with backlinks... and most links are from low PR pages, but I've found a couple of higher links too (like links from PR5 and PR6 pages) today.
Since we're talking about backlinks and page counts: if I use Google directly, or one of the tools that reports multiple Google data sites, and enter the link:nowhere.com-type command, I get link counts ("1 - 10 of X") where "X" corresponds exactly to what the Digitalpoint tool shows when I refresh it. So far, so good. But when I do site:nowhere.com on G directly, I get page counts that are drastically different from what the tool reports. Generally, for sites with high counts listed, the G results are significantly higher (and, I believe, closer to correct, though still shy of the right sum), whereas for sites where G has only picked up a few of the pages so far, the G counts are a few higher than those from the tool. If both kinds of results -- links and pages -- differed from G to tool, OK, there are various possible reasons; but when one kind is right on and the other is not, I am perplexed.

On another point: could someone list all the Google special commands that are not already listed by G, and what they accomplish? I see people using link: and suchlike commands, and wonder if they could be tabled here.
As for why the API returns different results with the site: command: it returns what you can actually see, for whatever reason. For example, look at this: http://www.google.com/search?q=site:www.sendwu.com&num=100&start=900&filter=0 Notice it says over 1,000 results, but there is no way to view them. The API only returns what you can actually see.
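If you want to see for yourself where the viewable results stop, something like the rough Python sketch below will walk the start= parameter for you. It's only an illustration, not a supported tool: it assumes the plain search URL pattern above keeps working and isn't blocked, the regexes for the result count and result headings are guesses about Google's markup, and automated queries like this may run against Google's terms.

# Rough illustration only: page through the query above with &filter=0 (the
# "include omitted results" switch) and count how many results are actually
# viewable, then compare that with the "of about X" estimate on page one.
# Assumptions: the plain /search URL pattern still works, the request isn't
# blocked, and the two regexes below still match Google's markup -- none of
# which is guaranteed.
import re
import urllib.parse
import urllib.request

QUERY = "site:www.sendwu.com"   # the query from the example URL above

def fetch(start):
    url = ("http://www.google.com/search?q=%s&num=100&start=%d&filter=0"
           % (urllib.parse.quote(QUERY), start))
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urllib.request.urlopen(req).read().decode("utf-8", "replace")

first_page = fetch(0)
match = re.search(r"of about ([\d,]+)", first_page)           # markup guess
estimate = int(match.group(1).replace(",", "")) if match else None

viewable, start = 0, 0
while start < 1000:                  # Google never shows past 1,000 anyway
    page = first_page if start == 0 else fetch(start)
    hits = len(re.findall(r"<h3", page))                      # markup guess
    if hits == 0:
        break
    viewable += hits
    start += 100

print("Estimated:", estimate, "Actually viewable:", viewable)

If the two numbers differ, that gap is the same thing the tool is reporting: the API only counts what can actually be paged through.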
OK, I see that. I don't understand it. If the site: command says "Results 1 - 10 of about 13,000" and the API, as the tool reports, says "Pages in URL: 7140", then one can only see 7,140 pages of the site from Google. Understood. But what is the status, to Google, of the 5,860 or so other pages? Google seems to understand that they are there, so why can't it, or won't it, display them? 'Tis a puzzlement. Also (a different question): how does one get only external links? I tried link:nowhere.com -site:nowhere.com and got no results at all.
That doesn't necessarily give links. That just tells you the pages that have the string .nowhere.com on them.
@www.domain.com, @:www.domain.com, and :www.domain.com all give the same result, which as far as I can see is also the same as "www.domain.com". Why it "shrinks" in your searches, I don't know -- it doesn't happen when I do it.
There really doesn't seem to be any rule of thumb to describe the backlinks Google displays any more -- through the last three "adjustments", including the "current" or most recent one, the link:url request shows a mixture of high PR, high ranking links with low/no PR "garbage" links. As far as I can tell, link:URL is now a request for a (quasi-)random sample. I suspect that's exactly the way Google wants it.
Minstrel, could be, but I still have my doubts. I'm not in a position to run a large-scale test, but I just checked two sites I know very well, and I'm sure none of the most important links are shown.
I wasn't doubting what you are seeing, Jan, nor implying that you are lying. I'm just saying that the results I see are different. I have no idea what it means: is it transitional? That is, are we drawing from different datacenters? Some of the differences in what people report do seem to occur for that reason, although it's usually a short-lived discrepancy as updates or adjustments propagate to different datacenters -- and that doesn't seem to be what's going on here. I no longer know what Google is doing, and I doubt that anyone but Google does. Backlink results are very odd, and there are some other odd things too, such as established sites whose home pages dropped to PR=0 in the June adjustment while their internal pages retained PR, with no evidence at all of a site/page penalty or anything like that. I don't think this is evidence for the "Google is broken" theory at all. But something rather unusual has been happening since June...
My theory is that Google is in the middle of changing from one type of system to another. There have been many reports about them having run out of memory; Compar points to http://www.w3reports.com/index.php?itemid=549 in another thread. For a company like Google, changing their system will take careful planning and time. They certainly won't want anyone to know what they are doing with their system change; one reason for that may be to lower the threat of a security breach in the new system. I wouldn't be surprised if everything is back to normal soon.
I don't think it's a lack of memory. If you run out, you buy more -- very simple. But the 32-bit (4-byte unsigned integer) theory is intriguing. A migration from 32-bit to, say, 64-bit page IDs would be a major software upgrade, especially in a large-scale 24/7 system. It could be compared to changing the wheels of a train while it's moving. That takes a while.
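Just to put rough numbers on that 4-byte theory -- this is pure arithmetic, nothing confirmed about Google's internals:

# Illustration only -- nothing here is confirmed about how Google stores
# page IDs. It just shows the ceiling a 4-byte (32-bit) unsigned integer
# imposes, and what moving to 64-bit would buy.
ids_32bit = 2**32    # 4,294,967,296 distinct IDs, i.e. about 4.3 billion
ids_64bit = 2**64    # about 1.8e19 distinct IDs

print("32-bit ID ceiling: %d (about %.2f billion)" % (ids_32bit, ids_32bit / 1e9))
print("64-bit ID ceiling: %d (about %.1e)" % (ids_64bit, float(ids_64bit)))

So if the index really were bumping against a 32-bit document ID space, the fix wouldn't be more memory but a format change across the whole system, which fits the changing-the-wheels-while-driving picture.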