Webmasters have long argued over whether cloaking is an acceptable technique. Most are firmly against it, while a few defend it, saying, "It does no harm if done carefully." In reality, cloaking is an attempt to trick search engines, and any method built on deception is not legitimate.
Definition:
Cloaking is the practice of showing one page to search engines and a totally different page to human visitors. Search engines build their ranking algorithms around signals such as META tags, titles, headings, and inbound and outbound links to rank webpages.
Cloaked pages tick all of these boxes: their META tag text, titles, headings, and links are crafted to rank high in the SERPs, so they easily satisfy the search engine spiders and earn a good place in the index. But when a visitor actually lands on the page, s/he doesn't get an accurate answer to the query.
When users don't get what they were actually searching for, it's a bad experience both for the website and for the search engine. That's why search engines never like to be tricked: their whole purpose is to provide accurate results to searchers.
How to detect a cloaked page: Google provides a 'Cached' link next to most search results. Clicking it shows the version of the page that Google actually indexed. Compare this cached copy with the live page you reach through the result; if the two differ substantially, the page is likely cloaked.
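A related check can be scripted: request the same URL once with an ordinary browser user-agent and once with a crawler-style user-agent, then compare what comes back. The sketch below uses only Python's standard library; the URL is hypothetical, and since many cloakers key on crawler IP addresses rather than user-agents, a difference found this way is only a hint, not proof.

# Rough cloaking check: fetch the same URL with a browser-style and a
# crawler-style User-Agent and compare the responses. Assumption: the
# cloaking keys on the User-Agent header (IP-based cloaking will slip past).
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
              "+http://www.google.com/bot.html)")

def fetch(url, user_agent):
    """Return the response body for `url` when requested with `user_agent`."""
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read()

def looks_cloaked(url):
    """Very rough heuristic: flag the page if the two fetches differ a lot."""
    browser_page = fetch(url, BROWSER_UA)
    crawler_page = fetch(url, CRAWLER_UA)
    if browser_page == crawler_page:
        return False  # identical bodies: no user-agent cloaking here
    # Dynamic pages differ a little on every request, so only flag large gaps.
    return abs(len(browser_page) - len(crawler_page)) > 0.2 * len(browser_page)

if __name__ == "__main__":
    print(looks_cloaked("http://example.com/"))  # hypothetical URL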
Why is cloaking used?
Webmasters cloak pages primarily for SEO. There are, however, situations where site owners feel they have little choice. Think of a webmaster who builds an attractive, Flash-heavy site but still wants it to rank well. It's well known that search engines have trouble with Flash- and image-gallery-based websites, since there is little text for them to index.
In that situation the webmaster may see no option other than cloaking. With cloaking, s/he shows the Flash version of the page to users to grab their attention, while a separate, purely text-based version is delivered to the search engine for indexing.
In another case, some webmasters cloak selected pages simply because they don't want to reveal their optimization techniques to other webmasters, so they intentionally show one page to search engines and a different page to the public.
How is it done?
Cloaking is typically set up through the '.htaccess' file. The Apache server includes a module called "mod_rewrite", and webmasters apply their cloaking rules in the '.htaccess' file with the help of this module.
Webmasters collect the IP addresses and user-agent strings of search engine crawlers such as Googlebot, and supply this information to the "mod_rewrite" rules.
If the module finds that the requesting IP address or user-agent belongs to a search engine, it delivers the cloaked page; if it does not, the normal webpage is served instead.
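The article describes this decision in terms of Apache's mod_rewrite, but the underlying logic is easier to see written out. Below is a minimal sketch of that same conditional delivery as a tiny Python WSGI app; the crawler user-agent list, the example IP, and the two HTML filenames are all hypothetical, not part of any real mod_rewrite setup.

# The conditional delivery a cloaking rewrite rule encodes, sketched as a
# small WSGI app. Crawler lists and the files "crawler_version.html" and
# "visitor_version.html" are assumed for illustration only.
from wsgiref.simple_server import make_server

CRAWLER_USER_AGENTS = ("googlebot", "bingbot", "slurp")  # assumed list
CRAWLER_IPS = {"66.249.66.1"}                            # assumed example IP

def is_search_engine(environ):
    """Decide whether the request appears to come from a crawler."""
    user_agent = environ.get("HTTP_USER_AGENT", "").lower()
    remote_ip = environ.get("REMOTE_ADDR", "")
    return remote_ip in CRAWLER_IPS or any(
        bot in user_agent for bot in CRAWLER_USER_AGENTS)

def app(environ, start_response):
    # Serve the text-only page to crawlers, the real page to human visitors.
    filename = ("crawler_version.html" if is_search_engine(environ)
                else "visitor_version.html")
    with open(filename, "rb") as page:
        body = page.read()
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()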
Cloaking is never a recommended technique:
Cloaking is generally regarded as an unethical tactic that tries to fool both search engines and visitors. By using it you might gain good rankings for a short period, but sooner or later you will be detected. Once a search engine finds a cloaked page on your site, it can penalize or ban the entire site for a long time.