Compare the live content of any URL with Google's cached version to detect cloaking — the SEO black-hat tactic of showing different content to crawlers vs human users. Returns a similarity percentage.
Cloaking is the practice of serving different HTML to search engines than to regular visitors. It's a Google Webmaster Guidelines violation that can trigger manual penalties. Common cloaking patterns include hiding spammy keywords from users while showing them to crawlers, redirecting based on user-agent, or rendering completely different pages depending on the visitor type.
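As a concrete (and deliberately simplistic) illustration of the user-agent pattern, here is a hypothetical server-side handler; the function and names such as `choose_page` and `CRAWLER_TOKENS` are invented for this sketch and not part of any real site:

```python
# Sketch of user-agent cloaking -- the pattern this tool helps detect.
# Shown only so the tactic is concrete; serving pages this way violates
# Google's guidelines.
CRAWLER_TOKENS = ("googlebot", "bingbot")

def choose_page(user_agent: str) -> str:
    """Return different HTML depending on who appears to be asking."""
    if any(token in user_agent.lower() for token in CRAWLER_TOKENS):
        # Keyword-stuffed page served only to crawlers.
        return "<html><body>cheap widgets best widgets buy widgets</body></html>"
    # Clean page served to human visitors.
    return "<html><body>Welcome to our widget store.</body></html>"
```

Because the decision hinges on the `User-Agent` header alone, fetching the page twice with different headers (or comparing the live page to Google's cache, as this tool does) exposes the divergence.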
Enter a URL. The tool fetches both the live page and Google's cached snapshot through a CORS-friendly proxy, extracts the body text from each, and computes a word-set similarity percentage. High similarity (above 70%) usually means no cloaking; low similarity is suspicious.
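The word-set comparison described above can be sketched as a Jaccard similarity over the two pages' word sets. The tool's exact metric isn't specified here, so treat this as one plausible implementation, not the tool's actual code:

```python
import re

def word_set_similarity(text_a: str, text_b: str) -> float:
    """Jaccard similarity between the word sets of two texts, as a percentage."""
    words_a = set(re.findall(r"[a-z0-9']+", text_a.lower()))
    words_b = set(re.findall(r"[a-z0-9']+", text_b.lower()))
    if not words_a and not words_b:
        return 100.0  # two empty pages are trivially identical
    shared = words_a & words_b
    total = words_a | words_b
    return 100.0 * len(shared) / len(total)

live = "Welcome to our store. Quality widgets at fair prices."
cached = "Welcome to our store. Quality widgets at fair prices. Buy cheap pills now!"
print(round(word_set_similarity(live, cached), 1))
```

A set-based metric deliberately ignores word order and repetition, so routine rewording scores high while injected spam vocabulary (as in the cached example above) drags the score down.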
Use it when investigating a competitor for unfair tactics, when troubleshooting your own site for accidental cloaking (often caused by misconfigured A/B tests or geo-redirects), or when auditing inherited sites you suspect have black-hat history.
Some legitimate sites show slightly different content to crawlers (e.g. AMP versions, mobile redirects) — these aren't cloaking if they serve users equivalent content. True cloaking always involves deception. If you suspect a competitor of cloaking, report it via Google's spam report form rather than retaliating.
No. Serving personalised content to known users is personalisation, not cloaking; it's fine as long as the public-facing version Google indexes is real and accessible.
If the tool reports low similarity for your own site, possible causes include dynamic content that changed between Google's cache snapshot and now, A/B test variants, or JS-only rendering (the cached copy may reflect unrendered HTML). Investigate before assuming malicious intent.
Yes. Cloaking remains an explicit Webmaster Guidelines violation and can trigger manual actions against the offending site.
Explore more website tracking tools on the tool hub, or jump straight to the Link Tracker, Check Server Status, or Page Comparison Tool.