I give up. Where is that damn unsubscribe? Yes, let's ban the New York Times, WebmasterWorld, IEEE, and every other credible source that needs a sign-up, and load the search engines with even more crap (MFA). That will surely make the Web more usable for all of us.
- it is cloaking. it doesn't matter whether it shows keyword garbage or real content to the search engine.
- it is cloaking. it doesn't matter whether some big guys are doing it or not. remember, even wordpress.org is banned for invisible content. what would happen if you had said "invisible content is ok, even wordpress uses it"? it's the same thing.
- it is cloaking. it doesn't matter whether the membership is free or not. that isn't even related.
So, you want information in a search engine that you cannot access? Yes, I can access IEEE - but I spend a good amount of money for that. When I want an IEEE article, I search for it in the Digital Library. If I want a NYT article, I log in and search for it there. If I want a thread from WMW, then I log in (yes, I pay for that too - groan) and search for it there. I am sick and tired of searching for something in Google, then having to log in to see if it is really relevant. The quality of the material has nothing to do with the technical method - cloaking in this case - used to deliver it to the search engine bots.
I agree, technically, this is cloaking. The same with del.icio.us, which cloaks on the User-Agent: browsers see <meta name="robots" content="noarchive,nofollow,noindex"/> but Googlebot doesn't, and the pages get indexed. I don't say it's good or bad, but whatever the reasons, if the definition of cloaking is "showing different results depending on who requests them," then it is cloaking.
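Just to make the mechanism concrete, here is a minimal sketch of what User-Agent based cloaking of the robots meta tag might look like. This is not del.icio.us's actual code, and the simple substring check is an assumption; it only illustrates the idea of serving a different tag to the crawler than to a browser.

```python
# Hypothetical sketch: serve a different robots meta tag depending on who asks.
# Not the actual del.icio.us implementation; purely illustrative.

def robots_meta_for(user_agent: str) -> str:
    """Return the robots meta tag to embed in the page for this requester."""
    if "Googlebot" in (user_agent or ""):
        # The crawler sees no restrictions, so the page ends up in the index.
        return ""
    # Regular browsers are served the noindex/noarchive tag instead.
    return '<meta name="robots" content="noarchive,nofollow,noindex"/>'

if __name__ == "__main__":
    print(robots_meta_for("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))
    print(robots_meta_for("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
```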
When you give the SE spider one thing but show something different to a searcher, it is cloaking; it doesn't matter how someone tries to explain it. If a small new site did the same as the NY Times or WebmasterWorld, I'll bet Google would waste no time banning that site. Google is letting those sites "bend" the rules probably because Google considers them "authoritative" sites.
Exactly! That's the point. Regulations should be the same for everyone. I don't care who spams the search engines with cloaking trash pages.
I must admit - I've not found webmaster world particularly valuable. On the other hand, what's a free email account? Not a major issue either way.
Or from the consumer's perspective - why the heck do you NEED a free account to log in and see the content? Indeed - it is silly on both sides.
it is cloaking, yes. but cloaking isn't always blackhat, nor is cloaking always grounds for being banned in google. there are plenty of legitimate reasons for cloaking. and I think a lot of people are confused: the vast majority of webmasterworld threads do not require a paid subscription, they just require a free membership. only threads in the supporters forum require you to pay.
Don't you think they could have a legitimate reason to require logging in? Like scraping issues or something?
that's definitely a likely factor. brett tabke's mentioned a number of times that scraping is the most significant problem webmasterworld faces. it's possible to scrape content even with something that requires an account + login, but it's more difficult and stops a lot of people. I don't think that's the only reason he's doing it though; I'm sure he's getting lots of subscriptions as a result of how he's structured things. I wouldn't call it blackhat, I'd call it pretty damn clever.
This is where I am ignorant - my understanding was that Google had a zero-tolerance attitude towards cloaking. But you are saying that Google does allow it? I have a PAID account, but it is still painful to have to log in (because I clear my cookies every now and again) when I do a Google search and happen upon a WMW thread in the SERPs. People complain about SE spam? I see this as no different.
yes, they do. cloaking is not automatically unacceptable. there are plenty of legitimate ways to cloak. it's still a risky road to go down, but plenty of sites cloak in a legitimate manner (webmasterworld would be one of them) and see no negative effects as a result.
Thanks Monty - that is an excellent example of what I thought Google's stand was on cloaking! As usual, Google makes it clear as mud...
I don't think you can take what they say on the guidelines page at face value. they also say:

"Make pages for users, not for search engines." -- every SEO is guilty here. we do take SEs into account when making a page; if we didn't, we wouldn't be doing search engine optimization.

"Don't participate in link schemes designed to increase your site's ranking or PageRank." -- don't participate in link building projects? again, every SEO is guilty here.

"Avoid "doorway" pages created just for search engines." -- if any page created with SEs in mind is a "doorway page," then again, almost every SEO is guilty here.

the webmaster guidelines are not absolute. any of these things, when taken to an extreme, can obviously cause you major problems, but on a minor scale google tolerates them, as can be seen from webmasterworld's and the NYTimes' cloaking.
I agree with you, but this is not the webmaster guidelines; this is Matt Cutts making a clear distinction between IP delivery and cloaking, and he says:
I'm not sure if that is cloaking; there are methods to allow Googlebot to index your content while not allowing other users to view it, such as paid member forums.
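For what it's worth, here is a hedged sketch of how such a gate might work: the spider gets the full thread to index, logged-in members get the same content, and everyone else hits a login page. The function and variable names are made up for illustration, and a bare User-Agent check is easy to spoof; a real site would also verify the bot (for example via a reverse DNS lookup on the requesting IP) rather than trust the header alone.

```python
# Sketch only: gate content behind a login except for known search engine spiders.
# Names (handle_request, KNOWN_BOTS, thread_html) are hypothetical, not from any real forum.

KNOWN_BOTS = ("Googlebot", "Slurp", "msnbot")

def handle_request(user_agent: str, logged_in: bool, thread_html: str) -> str:
    if any(bot in (user_agent or "") for bot in KNOWN_BOTS):
        return thread_html  # spider gets the full thread so it can be indexed
    if logged_in:
        return thread_html  # members see the same content the spider saw
    return "<p>Please log in to view this thread.</p>"  # everyone else hits the wall
```

Whether that counts as cloaking or as a legitimate membership wall is exactly the question this thread is arguing about.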