Initially it should only block sites in the SERPs on your machine... but it makes perfect sense that Google would aggregate the data and use it to figure out which sites suck based on how many times they get blocked from the results. Democratic spam removal by the masses; viva la revolución! Edit: Oh, I see he mentions this.
And see also this part: as Matt says, this would be an easy, obvious, and unfair way for someone to try to blackball a competitor if it were used as a "vote" in Google rankings. Additionally, the Google gang is much too smart not to see the Alexa-like pitfalls of such a program.
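(For anyone curious how the aggregation idea and the blackballing concern fit together, here is a minimal sketch. Everything here is hypothetical, nothing about Google's actual system: the function name, the `min_unique_users` threshold, and the event format are all made up for illustration. The point is just that counting distinct users rather than raw block clicks blunts the "one angry competitor blocks a rival a thousand times" attack.)

```python
from collections import defaultdict

# Hypothetical sketch, not Google's actual system: aggregate per-user
# "block this site" actions into a spam signal, while limiting how much
# any single user can hurt one domain.

def block_signal(block_events, min_unique_users=50):
    """block_events: iterable of (user_id, domain) pairs.

    Counts each user at most once per domain, and only emits a signal
    for domains blocked by a meaningful number of distinct users.
    Both choices make a blackballing campaign harder to pull off.
    """
    unique_blockers = defaultdict(set)
    for user_id, domain in block_events:
        unique_blockers[domain].add(user_id)  # set membership dedupes repeats

    return {
        domain: len(users)
        for domain, users in unique_blockers.items()
        if len(users) >= min_unique_users
    }

# One user blocking a rival 1,000 times still counts once;
# 79 distinct users blocking the same domain clears the threshold.
events = [("u1", "spamsite.example")] * 1000 + [
    (f"u{i}", "spamsite.example") for i in range(2, 80)
]
print(block_signal(events))  # {'spamsite.example': 79}
```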
Yeah, I read about this last night. At first it sounded interesting, but then it sounded like something that will get little to no use. The reality is that, given the way the average person uses search engines, I don't see this being of tremendous benefit, even if they do carry blocks over from one search to the next (which it currently does not do, so it is almost totally useless). It's an interesting idea and may lead to other things, but blocking pages from a search result just so you don't see them isn't useful; it actually takes more effort and energy than just ignoring the listing. As mentioned, it can't really be used to judge websites: way too much opportunity for abuse.
Yahoo has had it for a while. I've barely even noticed it in the listings at all. It's easier to just scroll past a listing than to click ignore on the sites that suck.
Yeah, can you imagine if there were a Google-bomb-style attack to remove the whitehouse.gov or michaelmoore.com site? Nice idea for end-user searches, though; it will let me remove all those spammy sites from my search results for personal browsing.