Cloaking is a touchy subject in SEO circles. Typically falling into the category of “gray hat,” it can be fairly safe or incredibly risky to your site’s search engine rankings, depending on how and why it is done. Unfortunately, the growing literature around cloaking does little to dispel the confusion over acceptable use.

Oftentimes the method doesn’t matter as much as the motive. Why are you cloaking? Is it for a legitimate reason, such as a better user experience? Or are you trying to pull a fast one on the search engines? Here are some common motives for cloaking, along with the risks associated with each.

I’m cloaking in order to rank better.
I’ll go ahead and start with the obvious. Cloaking has a poor reputation because it can be used to manipulate search results, which of course violates Google’s webmaster guidelines. If you’re using it to rank better, you’ll get in trouble. Even if you’re using it for a legitimate reason, however, cloaking can still be construed as manipulative. In other words: cloak at your own risk.

I’m delivering different versions of the same content to different browsers.
Even with the best intentions, cloaking by user agent can be risky. Straight from Google’s help center: “Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.” If you can, find other ways (CSS hacks, for instance) to cater to different browsers. If cloaking is the only way to go, stay safe by serving search engine spiders the same default version of your site that you serve to most browsers.
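To make that concrete, here’s a minimal sketch of the safer pattern, assuming a Python/Flask app (the framework and template names are my own illustration, not anything prescribed by Google): only a known problem browser gets a special build, and everyone else, spiders included, gets the same default page.

```python
from flask import Flask, request, render_template

app = Flask(__name__)

@app.route("/")
def home():
    ua = request.headers.get("User-Agent", "").lower()
    # Special-case only the browsers that genuinely need a different build.
    if "msie 6" in ua:
        return render_template("home_ie6.html")
    # Everyone else -- modern browsers and search engine spiders alike --
    # gets the same default version. No branch ever checks for "googlebot",
    # so spiders see exactly what most human visitors see.
    return render_template("home.html")
```

The key design choice is that the spider is never singled out; it simply falls through to the default, same as the majority of your audience.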

I’m serving crawlable content to search engine spiders and dynamic content to users.
Sorry, but Google’s stance is to use alt, noscript, noembed, and similar static-within-dynamic HTML elements to serve up crawlable content. Really, you should do this for the sake of all the atypical users who visit your site, not just search engine spiders. Put simply, cloaking is not the answer to crawlability, especially now that search engines are getting smart enough to crawl Flash.
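As an illustration, here’s a hedged sketch of the static-within-dynamic approach in the same assumed Flask setup (the file name and copy are placeholders): the Flash object and its crawlable fallback live in one page, and every visitor, spider or human, receives the identical bytes.

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# One page for everyone: the dynamic object and its crawlable fallback
# live side by side in the same HTML, so there is nothing to cloak.
PAGE = """
<object data="promo.swf" type="application/x-shockwave-flash">
  <!-- Fallback shown to spiders, screen readers, and visitors without Flash -->
  <h1>Spring Widget Sale</h1>
  <p>Save 20% on all widgets through March 31.</p>
</object>
"""

@app.route("/promo")
def promo():
    # Spiders and humans request the same URL and get the same markup.
    return render_template_string(PAGE)
```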

I’m providing restricted content to search engine spiders.
It can be tempting to give Google back-door access to all of your juicy content so it can be ranked, while at the same time password protecting it from unregistered human users. Google condones this, so it’s naturally considered permissible in SEO circles. The problem is, search engine spiders save cached copies of every page they visit. More importantly, they share these caches with users on demand. Thus, if Google can see it, any savvy user can see it. Even cookies will do little good in restricting access. Dan Goodin over at The Register shows how this exploit is being used by less scrupulous users to view restricted forum postings.
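For clarity, this is the back-door pattern the paragraph describes, sketched under the same Flask assumption (the user-agent check, cookie name, and template are all hypothetical). Note the comment on the privileged branch: the full page it returns is precisely what lands in the search engine’s public cache.

```python
from flask import Flask, request, render_template, redirect

app = Flask(__name__)

def looks_like_googlebot(req):
    # Naive check; careful implementations also verify the claim with a
    # reverse-DNS lookup, since anyone can forge a User-Agent header.
    return "googlebot" in req.headers.get("User-Agent", "").lower()

@app.route("/article/<slug>")
def article(slug):
    if looks_like_googlebot(request) or "member_session" in request.cookies:
        # Full text for spiders and logged-in members. The catch: whatever
        # the spider sees here ends up in Google's cache, one click away
        # from any unregistered visitor.
        return render_template("article_full.html", slug=slug)
    return redirect("/login")
```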

I’m delivering different content based on location.
This one has been debated and cleared as acceptable. You can perform geolocation, even to the point of blocking certain geographical regions, so long as you treat search engine spiders the same as you would any user from the same region.
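Here’s a minimal sketch of the acceptable version, assuming the MaxMind geoip2 library with a local GeoLite2 database file (the template names are placeholders): content varies only with the visitor’s IP-derived country, and no branch ever inspects the user agent.

```python
import geoip2.database
from geoip2.errors import AddressNotFoundError
from flask import Flask, request, render_template

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # assumed local DB file

@app.route("/")
def home():
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except AddressNotFoundError:
        country = None  # e.g. private addresses during local testing
    # The decision depends only on the visitor's location, never on the
    # user agent, so a spider crawling from Germany is treated exactly
    # like a human visitor from Germany.
    template = "home_de.html" if country == "DE" else "home_default.html"
    return render_template(template)
```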

My site is big and important, and I’m cloaking because I feel like it.
It’s an open secret that Google won’t penalize sites that are big and important enough, no matter what black-hat tactics those sites use. Apple’s iTunes store was recently outed for cloaking. NYTimes.com, Forbes.com, CareerBuilder.com, CNet.com, Nike.com, and WSJ.com are just a few others known to cloak with impunity. The trick here is to be so big and important that users would stop searching Google if they couldn’t find your site on it. The rest of us need to keep our noses clean.

If you’ve been paying attention, you’ll have noticed the common theme: if you treat search engine spiders the same as human users, you’re probably okay. If you give them special treatment, you place your site’s rankings at risk. When in doubt, remember the first and fourth universal truths of SEO: “Don’t Game the System” and “Users First; Search Engines Second.”
