http://googlewebmastercentral.blogspot.com/ Rumor has it that SPAMoogle will use that blog to give us updates on where they are with fixes to search result screw-ups, which would include, for example, the site: screw-up and the sub-domain SPAM stuff, and to explain what a 'bad data push' is. Oh, wait... that's just a dream!
A simple blog, but since it's Google's, it's very nice. I was wondering how it would actually help people. Anyway, thanks for the link.
It's good to see that they have started at least some sort of communication with webmasters. I don't expect them to explain anything even remotely related to their algorithms, but I look forward to it, G. P.S. Thank you for the link, Nintendo.
"How may we provide you with excellent service today?" Posted by Vanessa Fox, 8/04/2006 01:23:00 PM. Umm... fix your damn search engine?
Their search engine isn't broken! It's your search habits. You obviously are stupid and don't understand how to search properly. Re-educate yourself and do it right! (Lots of sarcasm)
Odd. That's pretty much what Matt Cutts keeps saying. Could it be that yfs1 and Matt Cutts and GoogleGuy are all the same person?
No, I'm not, but with that behind us, I do happen to know that Issue #676 will be fixed in the 74th wave of the Genesis update coming early next month. Of course, that update will generate problems #677, #678, and #679. Here, have another meg on your Gmail account... that oughta hold you. Seriously though, I still don't get how Cutts can whip out his laptop and tell people their exact linking strategy, including who's being paid, yet they can't stop the current scraper duplicate sites that get 100,000 pages indexed a few days after registering the domain name.
Perhaps, but since yfs1 left out the part about it being the webmaster's fault, I'm leaning towards no. Dave
For some reason, when I saw that headline "How may we provide you with excellent service today?", the phrase "adding insult to injury" kept running through my mind...
I guess they are trying to help the seriously undereducated with this: http://www.google.com/support/webmasters/bin/answer.py?answer=44231 Preferred domain... do you want that crawled with or without a www? I thought that was what .htaccess was for... I can't quite put my finger on why that bothers me, but if they would quit following 302 redirects, they would kill a lot of spam... I assume there are a lot of .htaccess files out there redirecting improperly. Recently, I found a page of mine they had indexed despite a robots.txt exclusion and noindex, nofollow, nocache meta tags. Perhaps they should first worry about properly handling what the website is serving them. Priorities seem a little off, don't they?
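For anyone who would rather settle the www question in .htaccess than in Google's console, a minimal sketch (assuming Apache with mod_rewrite enabled; example.com is a placeholder domain, not anyone's actual site):

```apache
# Canonicalize the bare domain to www with a 301 (permanent) redirect.
# Requires mod_rewrite; example.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 is the important part: a permanent redirect tells crawlers which hostname is canonical, which is exactly what the 302s mentioned above fail to do.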
Stopping 302s would kill a lot of spam, but there are genuine non-spammy uses for them. Also, as you rightly say, there are a lot set up that shouldn't be. I don't think there's a "silver bullet" solution.
I have the same thing with my site. Google shows pages I excluded more than a month ago. More than that, under "URLs restricted by robots.txt", Google Sitemaps reports that I have this URL in my sitemap, even though I deleted it from the sitemap at the same time I put it in robots.txt.
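One possible culprit here (this matches how Googlebot is generally understood to behave, not anything Google has confirmed in this case): a robots.txt Disallow blocks crawling, not indexing. If a page is blocked, Googlebot can never fetch it to see a noindex meta tag, so a URL-only entry can linger in the index. The path below is a placeholder:

```
# robots.txt (served from the site root)
# Stops crawling of the URL, but the URL itself can still be listed.
User-agent: *
Disallow: /old-page.html
```

If the goal is de-indexing, the usual trick is the opposite: let the page be crawled so it can serve the noindex tag itself, or use the URL removal console.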
Click tracking, for one, but if Google follows the links, it skews the results. Matt Cutts discussed a few examples, but it seems to me it's just bad webmastery... that Google is trying to fix. Perhaps that's why the "destination url" bit bothers me. Mine was at least six months old and had already been removed once via the Google URL removal console. I tried it again and it would not take the entry... More than just the search engine needs fixing...
This has been happening to me for several months. Google is not only still showing old (deleted) pages in its index and SERPs, it is still spidering pages that have not existed on my site for several months (since at least April '06). I have no idea how they're able to do that. But I'm pretty sure it's all my fault. I mean, all I did was delete pages. My mistake was assuming Google would stop spidering deleted pages. Apparently I have to send them a personal note every time I delete a page or modify my site.
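For what it's worth, a plain 404 seems to invite Googlebot to keep retrying. One thing worth trying for deleted pages (a sketch, assuming Apache with mod_alias; the path is a hypothetical deleted URL) is answering with an explicit 410 Gone instead:

```apache
# Tell crawlers this page is gone for good (410), not just missing (404).
# /old-section/page.html is a placeholder for a deleted URL.
Redirect gone /old-section/page.html
```

A 410 is a stronger hint than a 404 that the URL should be dropped rather than re-crawled, though there's no guarantee of how quickly Google acts on it.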
I am still seeing requests in my logs for pages that haven't existed since the summer of 2005. I did file a deletion request for some of those pages, but the logs still show Google trying to spider them.