Tech Stack Detector
Detect 654+ technologies across 30 categories incl. AI builders (v0, Lovable), code assistants (Claude, Cursor), BaaS (Supabase, Firebase), feature flags, realtime. 5-source scan: HTML + Headers + Cookies + DNS + executed JS globals.
What this tech stack detector finds
This tool scans FIVE data sources in parallel: HTML source, HTTP response headers, cookies, DNS TXT records, and — for tools that leave no DOM trace — executed JavaScript globals via a headless Chromium (Puppeteer). Total coverage: 654+ signatures across 30 categories — CMS, analytics, tag managers, consent tools, SEO plugins, A/B testing, frameworks, CDNs, hosting, domain verifications, Backend-as-a-Service (Supabase, Firebase, Convex), feature flags (LaunchDarkly, Statsig), booking tools (Calendly, Cal.com), real-time (Pusher, Liveblocks), AI site builders / code assistants (v0, Lovable, Claude, Cursor), and modern CSS frameworks (Tailwind, UnoCSS, Panda).
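The five-source idea can be sketched as a signature table where each entry declares which evidence bundle it matches against. This is an illustrative sketch, not the tool's actual signature database (names, shapes, and patterns here are assumptions):

```javascript
// Minimal sketch of multi-source signature matching. Signature shapes
// and patterns are illustrative — the real database covers 654+ tools.
const SIGNATURES = [
  { name: 'WordPress',        source: 'html',   pattern: /wp-content\// },
  { name: 'Cloudflare',       source: 'header', key: 'server', pattern: /cloudflare/i },
  { name: 'GA4',              source: 'cookie', pattern: /^_ga/ },
  { name: 'Google Workspace', source: 'dns',    pattern: /google-site-verification=/ },
];

// Each evidence bundle is scanned independently, so one tool can be
// confirmed by several sources at once.
function detect({ html = '', headers = {}, cookies = [], txtRecords = [] }) {
  const hits = [];
  for (const sig of SIGNATURES) {
    let found = false;
    if (sig.source === 'html')   found = sig.pattern.test(html);
    if (sig.source === 'header') found = sig.pattern.test(headers[sig.key] ?? '');
    if (sig.source === 'cookie') found = cookies.some(c => sig.pattern.test(c));
    if (sig.source === 'dns')    found = txtRecords.some(r => sig.pattern.test(r));
    if (found) hits.push({ name: sig.name, source: sig.source });
  }
  return hits;
}
```

The executed-JS source works the same way, except the evidence bundle is a snapshot of `window` globals collected by the headless browser.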
Was the site built with AI?
This is a growing question in 2026 as AI site builders (v0.dev, Lovable, Bolt.new) and AI code assistants (Claude Code, Cursor, Copilot) ship more production sites. The tool surfaces two classes of evidence: direct builder signatures (v0 deployment markers, Lovable hostnames, explicit HTML comments like <!-- generated with Claude -->) and heuristic indicators (shadcn/ui + Radix + lucide-react stack, unusually long Tailwind utility chains, AI-typical marketing phrasing). The "AI Build Signals" panel classifies findings as strong (explicit signature), medium (several heuristics), or weak (one heuristic). Important: a skilled human using shadcn/ui matches the same heuristics — treat them as "maybe" unless combined with an explicit attribution.
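The strong/medium/weak classification described above reduces to a simple decision over the two evidence classes. A minimal sketch, with thresholds that are illustrative assumptions rather than the tool's exact rules:

```javascript
// Sketch of the AI Build Signals classification. Thresholds are
// assumptions; the real panel may weight individual heuristics.
function classifyAiSignals({ explicitSignatures = 0, heuristics = 0 }) {
  if (explicitSignatures > 0) return 'strong'; // e.g. a "generated with …" comment
  if (heuristics >= 2) return 'medium';        // several stack heuristics line up
  if (heuristics === 1) return 'weak';         // single heuristic — treat as "maybe"
  return 'none';
}
```

Note that even `medium` only means the stack looks AI-typical; a human using the same component libraries produces identical heuristics.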
Why five sources matter: an HTML-only scanner sees scripts loaded via <script src>, but misses what's in response headers (CDN, hosting, server stack), DNS records (SaaS integrations like Google Workspace, Microsoft 365, Stripe), and executed JavaScript globals (tools that register on window without touching the DOM). It also misses tools that are only visible via cookies (e.g., GA4 cookies still present after a CMP blocks the script loader). Each detection shows source badges — HTML, HDR, CK, DNS — so you know exactly how it was found.
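The DNS leg is the least obvious one: SaaS vendors ask customers to publish TXT records for domain verification, and those records are public. A sketch of mapping TXT records to integrations (the prefixes shown are common verification formats; the tool's actual list is larger):

```javascript
// Sketch: classify public DNS TXT records into SaaS integrations.
// Prefixes are commonly used verification formats — an assumption
// about the tool's internals, not its exact rule set.
function classifyTxtRecords(records) {
  const found = new Set();
  for (const txt of records) {
    if (txt.startsWith('google-site-verification=')) found.add('Google Workspace / Search Console');
    if (/^MS=ms\d+/i.test(txt))                      found.add('Microsoft 365');
    if (txt.startsWith('v=spf1'))                    found.add('SPF (outbound mail)');
  }
  return [...found];
}
```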
Why does it show Cloudflare for a site hosted on Netlify/Vercel?
Because Cloudflare sits in front. When a domain uses Cloudflare DNS, Cloudflare terminates the TLS connection, sets its own Server: cloudflare and CF-RAY headers, then proxies to Netlify/Vercel behind the scenes. The browser only sees the Cloudflare response. That's normal — Cloudflare in front of platform hosting is a common, valid setup.
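Detecting this situation from headers alone is straightforward, since Cloudflare replaces the origin's headers with its own. A minimal sketch (header names are the real Cloudflare ones; the detection logic is an assumption):

```javascript
// Sketch: spot Cloudflare in front of the real host from response
// headers alone — the browser never sees the origin's own headers.
function isBehindCloudflare(headers) {
  // Normalize header names, since HTTP header case is not significant.
  const h = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v])
  );
  return /cloudflare/i.test(h['server'] ?? '') || 'cf-ray' in h;
}
```

This is also why the hosting platform behind the proxy often has to be inferred from secondary evidence (HTML markers, DNS) rather than from headers.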
Privacy & Consent audit — what does it actually check?
If a consent manager is detected (Cookiebot, OneTrust, Usercentrics, Borlabs, etc.) and tracking scripts are also on the page, the tool checks whether any tracker loader appears BEFORE the CMP script in the HTML source. Trackers-before-consent is a classic GDPR violation: the CMP is there, but the trackers already fired. This is a static-HTML proxy check — runtime behavior (Consent Mode v2 default state, actual fire-on-load) still needs a Network-tab verification in Chrome DevTools.
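The static check boils down to comparing string positions in the HTML source: does a known tracker loader URL appear before the CMP loader URL? A sketch, assuming illustrative patterns (the CMP and tracker URL fragments shown are the real vendors' CDN hosts, but the tool's full pattern list is larger):

```javascript
// Static-HTML proxy for the trackers-before-consent check.
// Patterns are illustrative; runtime behavior (Consent Mode v2,
// actual fire-on-load) still needs a DevTools Network-tab check.
const CMP_PATTERN = /consent\.cookiebot\.com|cdn\.cookielaw\.org|usercentrics/i;
const TRACKER_PATTERNS = [
  /googletagmanager\.com\/gtag\/js/i, // GA4 loader
  /connect\.facebook\.net/i,          // Meta pixel loader
];

function trackersBeforeCmp(html) {
  const cmpIndex = html.search(CMP_PATTERN);
  if (cmpIndex === -1) return []; // no CMP detected — nothing to compare
  return TRACKER_PATTERNS
    .map(p => ({ pattern: p, index: html.search(p) }))
    .filter(t => t.index !== -1 && t.index < cmpIndex)
    .map(t => t.pattern.source);
}
```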
How accurate is version detection?
Versions are extracted from three sources, in order of trust: the <meta name="generator"> tag, explicit regex patterns tuned per tool (like wp-emoji-release.min.js?ver=X.Y.Z for WordPress core), and a generic name@version fallback. WordPress, jQuery, Elementor, Next.js, React and a handful of others have custom version patterns. Versions below a safe threshold get a ⚠ outdated flag — useful for security audits.
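The three-tier fallback can be sketched as one function that tries each source in order of trust. This is an illustrative reconstruction, not the tool's actual extractor (it assumes the tool name is regex-safe and handles only the attribute order shown):

```javascript
// Sketch of the three-tier version extraction (trusted first):
// generator meta → per-tool pattern → generic name@version fallback.
function extractVersion(html, tool) {
  // 1. <meta name="generator" content="WordPress 6.4.2">
  const gen = html.match(
    new RegExp(`<meta[^>]+name=["']generator["'][^>]+content=["']${tool}\\s+([\\d.]+)`, 'i')
  );
  if (gen) return gen[1];
  // 2. Per-tool pattern (here: the WordPress core emoji script).
  if (tool === 'WordPress') {
    const m = html.match(/wp-emoji-release\.min\.js\?ver=([\d.]+)/);
    if (m) return m[1];
  }
  // 3. Generic name@version fallback, e.g. CDN URLs like /react@18.2.0/.
  const at = html.match(new RegExp(`${tool.toLowerCase()}@([\\d.]+)`, 'i'));
  return at ? at[1] : null;
}
```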
Tech stack detector vs. Wappalyzer vs. BuiltWith
Wappalyzer relies on in-browser detection (window globals, DOM state), which catches a lot but requires a browser extension. BuiltWith has historical data going back years. This tool runs server-side, spinning up a headless browser only for the executed-JS checks, scans five data sources at once, and adds privacy/consent and GEO-readiness audits on top — which neither of the others does. For a one-off competitor lookup, it's fast; for ongoing monitoring or historical drift, use BuiltWith.
Is using a tech stack detector legal?
Yes. The HTML and DNS are served publicly to anyone who requests them. What you shouldn't do is crawl at scale or redistribute scraped fingerprint databases. The detection itself is fair game.
Explore more tools
Meta Tag Analyzer
Full meta tag audit for any URL.
Crawler Access Checker
Check AI & search crawler access.
Link Analyzer
Internal/external link analysis.
Security Headers
Check HTTP security headers and server config.
PageSpeed Insights
Full Lighthouse audit with Core Web Vitals.
FAQ
Lumina shows analytics, CMS, consent tools, and SEO plugins automatically — on every page, for free.
Add Lumina to Chrome — Free