Free SEO Audit
User-Agent · IP · JavaScript · Hidden Text · Redirect Cloaking

Website Cloaking
Checker Free

Instantly check if a website is cloaking content from Google. Compares what real users see against what Googlebot and Bingbot receive — content similarity, HTTP codes, redirect destinations, hidden text and JS bot-detection patterns. Free, instant, no login.

3-way fetch: User / Google / Bing
Content similarity score
JS bot-detection patterns
Hidden text & CSS checks
🕵️
Bot Detection
UA & referrer checks
📊
Similarity Score
N-gram text comparison
🔀
Redirect Check
Destination per UA
👁️
Hidden Content
CSS & noscript scan
🕵️ Free Website Cloaking Checker

Fetches the URL as a real browser, Googlebot and Bingbot simultaneously, then compares all three responses.
Also try our Website Malware Scanner for injected code detection.

-- /100
Cloaking Score
--
Overall cloaking risk for this URL.
📊 Content Similarity Scores
User vs Googlebot --
User vs Bingbot --
🔍 Cloaking Signals Found
Content comparison, source-code patterns and hidden-text analysis
🔎 Response Comparison: User vs Googlebot
The Detection Process

How This Free Cloaking Checker Works

Five detection stages run server-side — your URL is fetched three ways in parallel and compared in under 15 seconds.

1

Triple Fetch

The URL is fetched simultaneously as a real Chrome browser, as Googlebot/2.1, and as Bingbot/2.0 using server-side cURL requests with the exact official user-agent strings.
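The triple fetch can be sketched as follows. This is an illustrative Python sketch, not the tool's actual implementation (which uses server-side cURL); the function names are hypothetical, but the Googlebot and Bingbot user-agent strings are the published official ones.

```python
# Sketch of the triple-fetch stage: the same URL requested as a real
# Chrome browser, as Googlebot/2.1 and as Bingbot/2.0 concurrently.
from concurrent.futures import ThreadPoolExecutor
import urllib.request

USER_AGENTS = {
    "user": ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/124.0.0.0 Safari/537.36"),
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
    "bingbot": ("Mozilla/5.0 (compatible; bingbot/2.0; "
                "+http://www.bing.com/bingbot.htm)"),
}

def fetch_as(url, agent_key, timeout=10):
    """Fetch `url` with one of the three user-agent strings."""
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENTS[agent_key]})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return {"status": resp.status, "final_url": resp.url,
                "body": resp.read().decode("utf-8", errors="replace")}

def triple_fetch(url):
    """Fetch the URL as user, Googlebot and Bingbot at the same time."""
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = {k: pool.submit(fetch_as, url, k) for k in USER_AGENTS}
        return {k: f.result() for k, f in futures.items()}
```

Fetching all three in parallel matters: a cloaking script that reacts to rapid sequential hits from the same IP cannot easily distinguish the three requests.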

2

HTTP Comparison

Response codes, redirect chains and final destination URLs are compared across all three user agents. Mismatches in HTTP status or redirect target are immediate cloaking flags.
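The comparison logic for this stage amounts to checking each bot response against the user response. A minimal sketch, assuming the responses are dicts with `status` and `final_url` keys (the field names are assumptions, not the tool's real schema):

```python
def compare_http(responses):
    """Flag cloaking signals from HTTP status codes and redirect
    destinations across the three user agents."""
    findings = []
    user = responses["user"]
    for bot in ("googlebot", "bingbot"):
        if responses[bot]["status"] != user["status"]:
            findings.append(f"HTTP status differs for {bot}: "
                            f"{responses[bot]['status']} vs {user['status']}")
        if responses[bot]["final_url"] != user["final_url"]:
            findings.append(f"Redirect destination differs for {bot}")
    return findings
```

A non-empty result here is treated as an immediate flag, regardless of how similar the text content turns out to be.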

3

Content Similarity

All scripts, styles and comments are stripped from each response and the visible text is compared using a character n-gram similarity algorithm. A score below 85% raises a finding.
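The exact similarity algorithm is not published, but the stage can be sketched with a regex-based text extractor and a Jaccard similarity over character n-grams — one plausible choice for "character n-gram similarity":

```python
import re

def visible_text(html):
    """Strip scripts, styles and comments, then collapse tags to text."""
    html = re.sub(r"<script\b.*?</script>", " ", html, flags=re.S | re.I)
    html = re.sub(r"<style\b.*?</style>", " ", html, flags=re.S | re.I)
    html = re.sub(r"<!--.*?-->", " ", html, flags=re.S)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip().lower()

def _grams(s, n=3):
    return {s[i:i + n] for i in range(len(s) - n + 1)}

def ngram_similarity(a, b, n=3):
    """Jaccard similarity over character n-grams, in [0, 1].
    A value below 0.85 would raise a finding."""
    ga, gb = _grams(a, n), _grams(b, n)
    if not ga or not gb:
        return 1.0 if a == b else 0.0
    return len(ga & gb) / len(ga | gb)
```

Stripping scripts and styles first is what keeps dynamic boilerplate (analytics snippets, inline CSS) from dragging the score down on perfectly honest pages.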

4

Source Pattern Scan

The page source is scanned for 12+ cloaking code patterns including navigator.userAgent bot checks, document.referrer branching, conditional redirects, and IP whitelist signatures.
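A pattern scan of this kind is essentially a list of named regexes run against the raw source. The sketch below shows three illustrative patterns — a small assumed subset, not the tool's actual 12+ signatures:

```python
import re

# Illustrative subset of bot-detection source patterns.
CLOAKING_PATTERNS = {
    "navigator.userAgent bot check":
        re.compile(r"navigator\.userAgent[^;\n]*(bot|crawl|spider|google)", re.I),
    "document.referrer branching":
        re.compile(r"document\.referrer[^;\n]*(google|bing|search)", re.I),
    "server-side UA switch":
        re.compile(r"HTTP_USER_AGENT[^;\n]*(googlebot|bingbot)", re.I),
}

def scan_source(html):
    """Return the names of cloaking code patterns found in the source."""
    return [name for name, pat in CLOAKING_PATTERNS.items() if pat.search(html)]
```

A match here is a signal, not proof — a security plugin may legitimately read the user agent — which is why source patterns are weighed alongside the content and HTTP comparisons.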

5

Hidden Content

The HTML is audited for 7 hidden-text patterns — display:none, visibility:hidden, zero font-size, off-screen positioning, white-on-white text, overflow traps and keyword-rich noscript blocks.
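The hidden-content audit works the same way — named CSS patterns matched against the markup. A sketch covering a few of the seven patterns (the pattern list and names are illustrative assumptions):

```python
import re

# Illustrative subset of the hidden-text patterns described above.
HIDDEN_TEXT_PATTERNS = {
    "display:none": re.compile(r"display\s*:\s*none", re.I),
    "visibility:hidden": re.compile(r"visibility\s*:\s*hidden", re.I),
    "zero font-size": re.compile(r"font-size\s*:\s*0", re.I),
    "off-screen positioning": re.compile(r"(left|top)\s*:\s*-\d{3,}px", re.I),
}

def scan_hidden(html):
    """Return the names of hidden-text CSS patterns present in the HTML."""
    return [name for name, pat in HIDDEN_TEXT_PATTERNS.items() if pat.search(html)]
```

Note that `display:none` on its own is common in legitimate UI (dropdown menus, modals), so a real verdict depends on whether the hidden element is keyword-rich, not merely on the CSS property being present.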

Complete Cloaking Guide

Website Cloaking — Everything SEOs Need to Know

What cloaking is, how Google detects it, what the penalties look like, and how to fix accidental cloaking before it costs you your rankings.

🕵️

What Is Website Cloaking and Why Does Google Penalise It?

Cloaking is the practice of showing different content or URLs to search engine crawlers than to human visitors. A cloaked page might show Google a text-heavy, keyword-stuffed version while serving a clean visual design to users — or redirect Googlebot to a completely different URL. Google's core principle is that the content it indexes should be exactly what users see. Any deliberate divergence is a violation of the Google Search Essentials.

Why Google Considers Cloaking Deceptive

Google's entire value proposition to users is that search results accurately represent real page content. Cloaking breaks this contract — it allows sites to rank for content that users never actually see. Unlike most algorithmic issues, cloaking receives a manual action penalty from Google's search quality team, not just a ranking drop. Manual actions require a human reviewer, a reconsideration request, and can take weeks to resolve even after the cloaking is fully removed.

Accidental Cloaking Is Just as Dangerous

Many SEOs are penalised for cloaking they never intentionally implemented. A/B testing tools that vary content by session, security plugins that block Googlebot, lazy-loading systems that never hydrate for crawlers, and CDN-level geo-redirects can all create genuine cloaking conditions. Regular checks with a free cloaking checker are part of responsible technical SEO hygiene.

🔀

The 6 Types of Cloaking — Detected by This Tool

Cloaking is not a single technique. It is a family of related methods that all produce the same outcome: crawlers and users see different things. This free cloaking detection tool tests for all six major types in a single scan.

User-Agent Cloaking

The most common form. The server reads the HTTP User-Agent header and serves different content when it detects Googlebot, Bingbot or other known crawler strings. This can happen server-side in PHP or .htaccess, or client-side in JavaScript using navigator.userAgent. Our tool fetches with both the official Googlebot user-agent string and a real Chrome user-agent and compares the responses.
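To make the technique concrete, here is the shape of server-side branching this check is designed to catch — a deliberately simplified Python sketch of the offending logic, shown only so you can recognise it in your own codebase. Do not deploy anything like this.

```python
# The kind of user-agent branching that constitutes cloaking.
BOT_TOKENS = ("googlebot", "bingbot")

def choose_response(user_agent_header):
    """Return which page variant a UA-cloaking server would serve."""
    ua = user_agent_header.lower()
    if any(tok in ua for tok in BOT_TOKENS):
        return "keyword-stuffed variant for crawlers"
    return "normal page for humans"
```

Because the branch keys entirely off the User-Agent header, fetching once with the official Googlebot string and once with a Chrome string exposes it immediately.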

IP-Based Cloaking

Google publishes its crawler IP ranges. Sophisticated cloaking systems whitelist these IP ranges and serve optimised content only to requests originating from Google's infrastructure. This tool scans the page source for code that references Google's IP ranges or reverse-DNS crawler patterns, which is the only way to detect IP cloaking from the outside without access to the server.

JavaScript and Referrer Cloaking

Client-side cloaking uses JavaScript to detect bots or traffic sources and modify the page after load. Common implementations check navigator.userAgent for bot keywords, read document.referrer to detect search engine traffic, or use screen.width to infer a headless crawler environment. The source pattern scan in this tool checks for all three approaches. Pair this with our DOM Size Checker to audit JS-heavy pages for rendering issues.

Hidden Text Cloaking

Rather than serving different pages, hidden text cloaking injects additional keyword-rich content that is invisible to users through CSS trickery — display:none, visibility:hidden, zero-pixel fonts, off-screen positioning, or white text on white backgrounds. The hidden-content scan in this tool checks for all seven standard hidden-text patterns.

⚠️

Cloaking Penalties: What Happens and How to Recover

Unlike most SEO problems that can be resolved algorithmically by fixing the technical issue and waiting for a recrawl, cloaking penalties follow a different, stricter recovery path because they trigger a manual action rather than an algorithmic signal.

The Manual Action Process

When Google's search quality team confirms cloaking, they issue a manual action that can range from partial (affecting specific pages or patterns) to site-wide (removing the entire domain from search results). The manual action is visible in Google Search Console under Security & Manual Actions. You will not receive an email — many site owners only discover a cloaking penalty when their traffic drops to near-zero and they check Search Console.

Recovery Steps

Recovery requires three sequential steps. First, completely remove all cloaking code — every conditional branch, every bot-detection pattern, every hidden-text element. Second, verify the fix using this cloaking checker and re-confirm clean results across all user-agent tests. Third, submit a reconsideration request in Search Console that describes exactly what was found, what was removed, and what processes are in place to prevent recurrence. Google's review typically takes 1 to 4 weeks, and incomplete fixes result in an immediate rejection that resets the clock.

Distinguishing Cloaking from Legitimate Personalisation

Not all content variation is cloaking. Serving different language versions via hreflang, showing a logged-in vs logged-out UI, or geo-targeting landing pages are all acceptable provided the core content remains consistent for crawlers and users. The test is simple: if Google were to show a user the cached version of your page and it looked substantially different from what they see when they click through, that divergence is a cloaking risk. Use our HTML Headings Checker alongside this tool to audit what content structure Google actually indexes.

🔍

How Google Detects Cloaking — and Why This Tool Finds It First

Google runs a sophisticated, multi-layered cloaking detection system that has been continuously refined since the late 1990s. Understanding how Google finds cloaking helps you understand why the checks in this tool are structured the way they are.

Googlebot's Official vs Undercover Crawls

Google crawls the web using two different modes. The official crawl uses the well-known Googlebot user-agent string and comes from published Google IP ranges — this is what sites that do IP cloaking optimise for. But Google also runs a second, undercover crawl using unannounced user-agents and IP addresses that do not identify as Googlebot. Content that differs between these two crawl modes is flagged automatically for human review. This is why sophisticated IP-based cloaking is extremely difficult to maintain long-term.

Google's Rendering Pipeline

Modern Googlebot executes JavaScript as part of its crawl, which means JavaScript cloaking is more detectable than ever. After an initial HTML-only crawl, pages are queued for full JavaScript rendering using a headless Chromium instance. If the rendered content significantly differs from the pre-render HTML, or if JavaScript redirects trigger during rendering, Google flags the discrepancy. This is particularly relevant for single-page applications and frameworks that conditionally render content based on environment detection.

How This Tool Mirrors Google's Approach

This free cloaking checker mirrors Google's official-crawl detection by fetching with the published Googlebot user-agent and comparing against a real browser fetch. It also scans for the source-level patterns that Google's quality team looks for during manual review. The content similarity threshold of 85% reflects industry consensus on what constitutes meaningful content divergence versus normal dynamic variation. For complete technical SEO coverage, combine this tool with the Website Malware Scanner — injected cloaking code is one of the most common outcomes of a site compromise.

Frequently Asked Questions

Website Cloaking — Common Questions

Answers to the most common questions about cloaking detection, Google penalties and prevention using this free checker.

What is cloaking in SEO?

Cloaking is a black-hat SEO technique where a website shows different content or URLs to search engine crawlers compared to what human visitors see. Google explicitly lists cloaking as a violation of its Search Essentials and can issue manual action penalties that remove pages or entire sites from search results. Common forms include user-agent cloaking, IP-based cloaking, JavaScript cloaking, hidden-text cloaking and redirect cloaking.

How does this free cloaking checker detect cloaking?

The tool fetches the target URL three times simultaneously — as a real Chrome browser, as Googlebot/2.1, and as Bingbot/2.0. It then compares HTTP status codes, final redirect destinations, page titles, visible word counts and full text content across all three responses using an n-gram similarity algorithm. A score below 85% flags cloaking. The tool also scans the page source for JavaScript bot-detection code, hidden-text CSS patterns and referrer-based content switching.

What are the main types of website cloaking?

The six main types are: user-agent cloaking (different content based on the HTTP User-Agent header), IP cloaking (different content for Google's crawler IP ranges), JavaScript cloaking (navigator.userAgent or document.referrer checks in client-side code), redirect cloaking (different destination URLs for bots vs users), hidden-text cloaking (CSS-hidden keyword content), and cookie-based cloaking (using cookies to distinguish first-time crawlers from returning users). This tool detects signals across all six types.

Does Google penalise websites for cloaking?

Yes — cloaking is one of the few violations that Google explicitly names as grounds for a manual action penalty. Unlike algorithmic drops that recover when you fix the technical issue, manual actions require submitting a reconsideration request via Google Search Console after removing the cloaking. The review process can take one to four weeks, and incomplete fixes reset the clock entirely.

Can cloaking happen accidentally on my website?

Yes. Accidental cloaking is common and just as penalisable as intentional cloaking. Common causes include A/B testing frameworks that serve different variants per session, security plugins that block crawlers, lazy-loading implementations whose JS-rendered content never reaches crawlers, CDN-level geo-redirects, and personalisation engines that vary content by device. Running regular cloaking checks on your own site is a standard technical SEO practice.

What content similarity score indicates cloaking?

This tool uses an 85% similarity threshold. Pages with a user vs Googlebot text similarity below 85% are flagged with a finding. Scores below 70% are marked high risk, and below 50% are marked critical. Some natural variation below 85% is acceptable — dynamic elements like dates, personalised greetings and live pricing can cause minor divergence. The tool considers this alongside other signals (HTTP codes, redirects, source patterns) before issuing a verdict.
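The tiers described here reduce to a simple threshold mapping. A sketch, using the 85/70/50 cut-offs stated above (the tier labels are assumptions about how the tool names them):

```python
def risk_tier(similarity_pct):
    """Map a user-vs-Googlebot similarity score (0-100) to a risk tier."""
    if similarity_pct >= 85:
        return "ok"
    if similarity_pct >= 70:
        return "finding"
    if similarity_pct >= 50:
        return "high risk"
    return "critical"
```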

How do I fix a cloaking penalty from Google?

Remove all cloaking code completely — every user-agent check, every conditional redirect, every hidden-text element. Verify the fix is clean using this cloaking checker across all three user-agent tests. Then submit a reconsideration request in Google Search Console under Security and Manual Actions. In the request, describe exactly what was found, what was removed, and what monitoring processes you have added to prevent recurrence. Vague requests are rejected — specificity is what gets approvals.

Is cloaking the same as a doorway page?

No. Cloaking serves different content at the same URL depending on who requests it. A doorway page is a separate URL built specifically to rank for a keyword before funnelling users elsewhere. Both are Google policy violations, but they are different techniques. This tool checks for cloaking on a single URL. For broader site architecture audits, use our Free SEO Audit Tool.

Free Technical SEO Tools
Built for SEOs Who Ship

Behind the Search builds free, no-nonsense technical SEO and security tools that run instantly in your browser. No login, no Chrome extension, no limits. Just reliable data.

View All 57 Free SEO Tools
3
User Agents Tested
20+
Cloaking Patterns
100%
Free, No Login
0s
Data Stored