Through programming, you can show one page to search engines and a different page to users. This is known as cloaking in SEO.
Cloaking is a spamming search engine optimization (SEO) technique: some webmasters show one version of their site's content to the search engine spider so they can rank higher for their keywords, while showing something else to visitors. It is done by checking IP addresses or the User-Agent HTTP header with a server-side script.
Cloaking means presenting different things to the bot and to visitors. Showing visitors limited content while letting crawlers see more content than the actual visitor can see, through programming or scripting techniques, is called cloaking. Misuse of the noscript tag is also included in such techniques.
Cloaking can also take the form of redirecting to another site. For example, when you open google.com, it sometimes redirects you to google.co.in if you open it from India.
A lot of people seem to like Air's post, but I've always found it skirted the real purpose of cloaking as it relates to SEOs: cloaking to obtain better search engine rankings. Air's example uses the language setting of the browser to selectively serve content in different languages. SEOs are more interested in selectively serving content based on the identity of the requestor.

Here's a simple example of how SEO-style cloaking works. You have a cloaked web page on your server. This 'web page' is actually a CGI script which reads the IP address of the requestor and compares it against a list of IP addresses known to belong to search engine spiders. If a match is found, the requestor is identified as a search engine spider. If no match is found, the requestor is identified as a 'human'. The CGI script can then selectively serve different content based on the identity of the requestor. If the requestor is human, it can serve the home page of the domain, or some other web page that is not highly optimized. If the requestor is a search engine spider, it can serve a highly optimized web page. The point of all this is that highly optimized web pages are served to search engine spiders while human visitors get a different web page... all from the same URL. I have used IP addresses in my example as the method of identification, but the User-Agent string can be used as well (with less reliability).

Now that the mechanics of cloaking have been discussed, I'll get into the risk-management issues. Cloaking is risky. Search engines have publicly stated that they frown on cloaking, and that websites can be punished (with penalties or banning) if caught.
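The server-side check described above can be sketched in a few lines. This is a minimal illustration, not a working cloaking system: the spider IP list and both page bodies below are invented placeholders, not real crawler addresses.

```python
# Sketch of IP-based cloaking: serve one page to known spider IPs,
# another to everyone else. All addresses and pages are made-up examples.

KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}  # placeholder addresses

OPTIMIZED_PAGE = "<html><body>Keyword-rich page served to spiders</body></html>"
REGULAR_PAGE = "<html><body>Ordinary home page served to visitors</body></html>"

def serve_page(requestor_ip: str) -> str:
    """Return the optimized page for spider IPs, the regular page otherwise."""
    if requestor_ip in KNOWN_SPIDER_IPS:
        # Match found: requestor is identified as a search engine spider.
        return OPTIMIZED_PAGE
    # No match: requestor is treated as a human visitor.
    return REGULAR_PAGE

print(serve_page("192.0.2.10"))   # a listed IP gets the optimized page
print(serve_page("203.0.113.7"))  # any other IP gets the regular page
```

Both requests hit the same URL; only the lookup against the IP list decides which content comes back, which is exactly why search engines consider the technique deceptive.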
To keep your risks to a minimum, there are several strategies you would be wise to practice: keep your cloaked pages on a separate domain from your primary website, maintain accurate and up-to-date IP lists of the search engine spiders, and keep the cloaked pages out of the search engines' caches.

What is meant by keeping web pages on a separate domain? Well, you already have a primary website... it is the site you are trying to promote in the search engines. The last thing you want to happen is for this site to be penalized. Get another domain name and host it on a different server. Preferably, the whois information for this new domain should be different from your primary domain as well. One way to accomplish this legally is to register the new domain under a D.B.A. and have the address be a post office box. This strategy keeps a buffer between your primary domain and your cloaked domain: if the new domain is ever penalized or banned for having cloaked pages, your primary domain won't be affected.

The second strategy involves the IP addresses your cloaked pages use to identify search engine spiders. You need to maintain a complete and accurate list of IP addresses. Search engine spiders are always crawling the web from new IP addresses, and your cloaked pages need to be able to recognize them. Also, search engines sometimes abandon old IP addresses, and these need to be removed from the list.

Many search engines offer a cached copy of a web page they have spidered. The cached copy defeats the purpose of cloaking by showing any human requestor the highly optimized version of your cloaked page. Make sure that you use the ROBOTS NOARCHIVE meta tag on the optimized version of your cloaked pages, and exclude any search engines that do not follow this convention.

Finally, let's discuss an important question: "Should you do it?" There is an ethical slant to this question as well as a practical one. Many folks claim cloaking is sneaky and unethical.
Their claim is that you are lying to the search engines about the content of your website in an attempt to get better rankings under keyword phrases the website doesn't deserve. The other side of this argument is that cloaked web pages are trying to provide relevant results to search engines... after all, what webmaster wants irrelevant traffic flooding their server? The other slant is practicality. How much time is it going to take you to set up and maintain a cloaking system? Will the traffic it brings be worth the cost of the software (or the time spent programming your own script), plus the cost of the new domain and hosting?
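The third risk-management strategy mentioned in the post above (keeping cloaked pages out of engine caches with the ROBOTS NOARCHIVE meta tag) is simple to sketch. The helper name and the sample page below are hypothetical; only the meta tag itself is the real convention.

```python
# Sketch: stamp the NOARCHIVE meta tag into the optimized version of a page
# so engines that honor the convention do not show a cached copy.

NOARCHIVE_TAG = '<meta name="robots" content="noarchive">'

def add_noarchive(html: str) -> str:
    """Insert the NOARCHIVE meta tag right after the opening <head> tag."""
    return html.replace("<head>", "<head>" + NOARCHIVE_TAG, 1)

optimized = "<html><head><title>Optimized</title></head><body>...</body></html>"
print(add_noarchive(optimized))
```

Note this only helps against engines that respect the tag; as the post says, engines that ignore it have to be excluded by other means.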
Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index. Some examples of cloaking include:

* Serving a page of HTML text to search engines, while showing a page of images or Flash to users.
* Serving different content to search engines than to users.
Cloaking is considered spamming. Don't do it. It might get you results in the short run, but the long-term results will disappoint: you will most probably get yourself blacklisted or penalized.
Cloaking is a black-hat SEO technique in which the content shown to users is different from what is shown to search engines. It is a way to get good rankings in search engines, but it is a spamming technique.
Cloaking means using a script on the server to show a high-PR version of a site that is not what users actually see. You can check for it with the following steps: open the URL of the site in the browser's address bar, then check the search engine's cached URL for the same page. If both show the same content, the site is not using cloaking; if they are different, the site is using cloaking.
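The manual comparison above can be automated crudely. This sketch leaves out the actual fetching of the live and cached pages and only shows the comparison step; both function names are invented, and the whitespace normalization is deliberately simplistic.

```python
# Sketch: decide whether a live page and its cached copy differ enough
# to suggest cloaking. Fetching the two HTML strings is left to the caller.

import re

def normalize(html: str) -> str:
    """Collapse whitespace and case so trivial formatting differences are ignored."""
    return re.sub(r"\s+", " ", html).strip().lower()

def looks_cloaked(live_html: str, cached_html: str) -> bool:
    """True when the live page and the cached copy differ after normalization."""
    return normalize(live_html) != normalize(cached_html)

print(looks_cloaked("<p>Hello</p>", "<p>Hello</p>"))         # same content
print(looks_cloaked("<p>Hello</p>", "<p>Keyword spam</p>"))  # pages differ
```

In practice pages change legitimately between crawls (dates, ads, session IDs), so a real check would need to tolerate small differences rather than demand an exact match.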
According to me, cloaking is also used to hide your affiliate reference or affiliate ID, and to make your URL shorter and keyword-rich.
Cloaking is a search engine optimization (SEO) technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable. The purpose of cloaking is sometimes to deceive search engines so they display the page when it would not otherwise be displayed (black hat SEO). However, it can also be a functional (though antiquated) technique for informing search engines of content they would not otherwise be able to locate because it is embedded in non-textual containers such as video or certain Adobe Flash components. source: en.wikipedia.org/wiki/Cloaking
Cloaking is one of the best-known search engine optimization techniques in which the content presented to the spider is different from that presented to the user's browser.