Hi, I just want to know: what is cloaking, and how is it done? Is there any harm if a site is doing cloaking? It would be a great help if you could explain this. Thanks, Jack
Cloaking is where you attempt to serve one version of a site to the search engines and another to visitors. The search engine version is stuffed full of keywords that make no sense to a human, but it contains a JavaScript redirect that the search engines can't (or couldn't) follow. When a robot crawls the site it sees only the keyword stuffing; when a person visits in a browser they are automatically redirected to the real site. Cloaking is not a good technique for long-term SEO! If Google discovers it, your site will be banned from the search engines. Peter
Cloaking is a way of showing search engines one site while redirecting visitors to another. It's considered a black-hat technique because the page optimized for the search engines — the one full of targeted keywords — redirects to a different site, so what the search engine sees is different from what your visitors see. Yes, your site can be penalized once Google finds this out.
But if you're using PPC to get traffic, you don't care about SEO, and you really want your affiliate sales, I say do it.
So cloaking is a black-hat technique where a page redirects visitors to another site... is that right, Hersheys? And is a doorway page an example of cloaking? Just asking....
As far as I know, cloaking simply means that you serve certain search engines different content than you serve visitors. A JavaScript redirect isn't required — you can instead identify the different user agents, such as Googlebot or Yahoo! Slurp, and branch on those. Cloakers also tend to cover their tracks: they disable the public cached copy with the noarchive meta tag, so nobody can see the cloaked page in the search engine's cache and spot the extra risk.
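The user-agent branching described above can be sketched in a few lines of PHP. This is a deliberately naive illustration: the bot substrings are real crawler user-agent tokens, but the function names and file names are made up for the example, and (as pointed out later in this thread) the User-Agent header is trivially spoofed, which is why serious cloakers moved to IP-based checks.

```php
<?php
// Naive user-agent cloaking sketch (illustration only -- the bot
// substrings are real crawler UA tokens, but the function names and
// file names are hypothetical).
function is_search_bot(string $ua): bool {
    foreach (['Googlebot', 'msnbot', 'Yahoo! Slurp'] as $bot) {
        if (stripos($ua, $bot) !== false) {
            return true;  // request claims to come from a known crawler
        }
    }
    return false;
}

// Pick which page to serve: crawlers get the keyword-stuffed page,
// humans get the real one.
function page_for(string $ua): string {
    return is_search_bot($ua) ? 'keyword-stuffed.html' : 'real-site.html';
}

// In a live handler this would be wired up roughly as:
// readfile(page_for($_SERVER['HTTP_USER_AGENT'] ?? ''));
```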
Cloakers don't rely on the user-agent anymore, either. Any decent setup is mostly (or entirely) IP-based. Yeah, it's where I make most of my money. But cloaked sites do get banned sometimes, especially when cloaking is used in conjunction with link spam.
If I understand you correctly, decent cloakers identify the search engines by IP. Wouldn't the cloaking code be shorter if the robots were identified by their user agent? Or is that not safe enough?
I thought your answer would be something like this. What do you think about identifying bots with this PHP code?

    $ua = $_SERVER['HTTP_USER_AGENT'];
    if (stristr($ua, 'msnbot') || stristr($ua, 'Googlebot') || stristr($ua, 'Yahoo! Slurp')) {
        $ip = $_SERVER['REMOTE_ADDR'];
        $hostname = gethostbyaddr($ip);
        if (!preg_match("/\.googlebot\.com$/", $hostname)
            && !preg_match("/search\.live\.com$/", $hostname)
            && !preg_match("/crawl\.yahoo\.net$/", $hostname)) {
            // Claims to be a crawler, but the reverse DNS doesn't
            // match any search engine: block it.
            header("HTTP/1.0 403 Forbidden");
            exit;
        } else {
            // Forward-confirm: the hostname must resolve back to
            // the same IP, or the reverse DNS was faked.
            $real_ip = gethostbyname($hostname);
            if ($ip != $real_ip) {
                header("HTTP/1.0 403 Forbidden");
                exit;
            } else {
                echo "eat this";
            }
        }
    }

Do you think you could get past it with a fake general.useragent.override value?
Not bad ;-) Factor in crawlers without reverse DNS, the fake referrers the SEs test with, and a bit of referrer filtering as well (so you can send the real visitors somewhere), and you've got an OK script there.
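The referrer-filtering piece mentioned above can be sketched the same way: before sending a human visitor on to the real destination, check whether they actually arrived by clicking a search result, since direct hits (and quality checks sent with blank or odd referrers) are the riskier traffic. The search engine hostnames below are real, but the function name is made up for this example.

```php
<?php
// Referrer filtering sketch: returns true only when the request's
// Referer header points at a known search engine, i.e. the visitor
// clicked through from a results page. (Function name is hypothetical.)
function came_from_search(string $referrer): bool {
    $host = parse_url($referrer, PHP_URL_HOST);
    if ($host === null || $host === false) {
        return false;  // no referrer, or one that doesn't parse
    }
    foreach (['google.', 'bing.', 'search.yahoo.'] as $se) {
        if (stripos($host, $se) !== false) {
            return true;
        }
    }
    return false;
}

// In a live handler, roughly:
// if (came_from_search($_SERVER['HTTP_REFERER'] ?? '')) {
//     header('Location: http://example.com/money-page');  // made-up URL
//     exit;
// }
```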
A doorway page is one part of the cloaking technique: a page stuffed with keywords purely to rank in the SERPs is what you'd call a doorway page. Such pages are usually created for spamdexing.
Cloaking is done through doorway pages to spamdex the search engines. It is considered an unethical SEO technique and does not work for long.....