Every term used in BotWatcher.ai reports — explained twice. Once for the developer who needs the spec, and once for anyone who just wants to know what it means and whether to care. 29 terms across 5 categories.
Why it matters: TTFB is not itself a Core Web Vital, but it sets the floor for Largest Contentful Paint, which is a direct Google ranking signal. Every extra 100ms compounds down the page-load chain.
Why it matters: Uncompressed responses waste bandwidth and slow first paint, especially on mobile. Search crawlers have limited bandwidth budgets per domain.
Why it matters: Missing Cache-Control means crawlers and users re-download unchanged resources every visit, inflating server cost and slowing page loads for everyone.
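A typical split, with illustrative values. For fingerprinted static assets (e.g. app.3f2a1c.js) that never change under the same URL:

```http
Cache-Control: public, max-age=31536000, immutable
```

For HTML documents, which should be cached but revalidated before every reuse:

```http
Cache-Control: no-cache
```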
Why it matters: Without ETags (or Last-Modified), browsers and CDNs can't revalidate with conditional requests. Every cached resource re-downloads from scratch after expiry instead of getting a cheap 304 Not Modified.
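The server side of that revalidation can be sketched in a few lines (hypothetical function, assuming the server stores a strong ETag per resource):

```python
def conditional_response(request_headers: dict, current_etag: str, body: bytes):
    """Return (status, body) for a GET, honoring If-None-Match revalidation.
    Sketch only: a real server also handles If-Modified-Since and weak ETags."""
    if request_headers.get("If-None-Match") == current_etag:
        return 304, b""   # Not Modified: client reuses its cached copy
    return 200, body      # ETag changed or absent: send the full body
```

The 304 path sends headers only, so an unchanged 2 MB asset costs a few hundred bytes to revalidate.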
Why it matters: Every hop in a redirect chain is a real-world latency cost. Google recommends zero redirect chains; each one is a small crawl budget penalty.
Why it matters: Without HSTS, someone on a coffee shop Wi-Fi can intercept the first HTTP request before the server redirects to HTTPS. HSTS closes that window.
Why it matters: XSS (cross-site scripting) is consistently in the OWASP Top 10. A strong CSP is the most effective single mitigation. A CSP with unsafe-inline provides almost no protection.
Why it matters: MIME sniffing is a browser-level execution bypass. One missing header can turn an uploaded image into an attack vector.
Why it matters: Clickjacking is cheap to execute and completely invisible to the target user. This header costs nothing to add and eliminates the whole class of attack.
Why it matters: Without a Referrer-Policy, your full page URLs — including query strings with user data — leak to every third-party resource loaded on your page.
Why it matters: Third-party scripts commonly attempt feature access beyond what your site needs. Without this header you have no control over what embedded code can request.
Why it matters: JavaScript timing attacks can read memory across browser contexts. COOP + COEP together enable the cross-origin-isolated context needed to block this class of speculative-execution vulnerability.
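An illustrative response-header baseline covering the protections above. Values are common defaults, not a drop-in policy; in particular, a real CSP must list every origin your pages actually load scripts and styles from:

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
Content-Security-Policy: default-src 'self'; object-src 'none'; frame-ancestors 'none'
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
Referrer-Policy: strict-origin-when-cross-origin
Permissions-Policy: camera=(), microphone=(), geolocation=()
Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp
```

Note that COEP require-corp will break third-party embeds that don't send CORP/CORS headers, so it is usually rolled out last.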
Why it matters: An absent or misconfigured robots.txt forces crawlers to guess, wasting crawl budget on unwanted pages and potentially preventing important pages from being discovered.
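A minimal robots.txt sketch (paths, bot choice, and sitemap URL are hypothetical):

```text
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

The file is advisory: compliant crawlers honor it, but it is a policy statement, not access control.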
Why it matters: Sites without sitemaps depend entirely on link discovery. Newly published or lightly linked pages may take weeks to be indexed. A sitemap shortens that to hours.
Why it matters: Without canonicals, Google may split the authority of a page across two URL variants, diluting both rather than consolidating their ranking signals into one.
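The fix is one line in the head of every variant, pointing at the version you want ranked (URL hypothetical):

```html
<link rel="canonical" href="https://example.com/widgets" />
```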
Why it matters: A good meta description can double click-through rates from the same ranking position, and CTR is widely treated as a secondary ranking signal, so the improvement feeds back on itself.
Why it matters: As AI-generated answers increasingly bypass traditional search results, llms.txt is an emerging (not yet standardized) convention for content owners to signal how AI systems should use their content. Early adopters signal seriousness to the AI ecosystem.
Why it matters: This is the most common cause of mysterious indexation failures. A misconfigured server header can silently de-index pages that look fine in source code.
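The header form, which never appears in page source; crawlers honor the more restrictive of this header and any in-page meta robots tag, which is why it can silently de-index a page whose HTML looks clean:

```http
X-Robots-Tag: noindex, nofollow
```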
Why it matters: A missing og:image means shared links show a blank preview. In contexts where every message competes for attention, a blank card versus a rich card is the difference between a click and a scroll-past.
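A minimal card setup (URL hypothetical; roughly 1200×630 is the commonly recommended image size for large link cards):

```html
<meta property="og:title" content="Page title" />
<meta property="og:image" content="https://example.com/preview.png" />
```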
Why it matters: Cloaking is why this tool probes your site with 20 different user agents and compares the results. The divergence between what a Chrome browser gets vs. what Googlebot gets is the signal.
Why it matters: Most access control rules for crawlers are written against User-Agent strings. This tool probes with the real official UAs of every major bot to test those rules accurately.
Why it matters: Running GET requests for 20 bots would download 20 full pages per audit. HEAD requests give us the same access signals in a fraction of the time and bandwidth.
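A stdlib-only sketch of such a probe (function names are ours, not the tool's). One detail matters for access audits: urllib raises HTTPError for 4xx/5xx responses, and for this use case those are exactly the signals to record, not errors to discard:

```python
import urllib.request
from urllib.error import HTTPError

def build_head_probe(url: str, user_agent: str) -> urllib.request.Request:
    """Build a HEAD request carrying a specific bot's User-Agent string."""
    return urllib.request.Request(url, method="HEAD",
                                  headers={"User-Agent": user_agent})

def probe(url: str, user_agent: str, timeout: float = 10.0):
    """Send the probe; return (status, headers) even when the bot is blocked.
    Sketch only: a real audit adds retries, redirect tracking, and rate limits."""
    try:
        with urllib.request.urlopen(build_head_probe(url, user_agent),
                                    timeout=timeout) as resp:
            return resp.status, dict(resp.headers)
    except HTTPError as err:  # 403/404/etc. still carry the access signal
        return err.code, dict(err.headers)
```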
Why it matters: Policy-response mismatches are often invisible without this kind of cross-check. A WAF rule added months ago may be blocking crawlers your robots.txt still says are welcome.
Why it matters: For sites with more than a few hundred pages, crawl budget is a real constraint. Every optimization in this audit — faster TTFB, no redirect chains, clean robots.txt — directly increases how efficiently Google indexes your site.
Why it matters: Getting the right status code for the right situation is how you tell Google immediately whether a page should stay in the index. Wrong codes leave ghost pages indexed for months.
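An illustrative mapping (ours, not the tool's internal table) of common codes to the indexing behavior they request:

```python
# What each status code tells a search engine to do with the index entry.
INDEX_SIGNAL = {
    301: "moved permanently: transfer ranking signals to the new URL",
    302: "moved temporarily: keep the original URL indexed",
    404: "not found: drop from the index eventually, after repeated crawls",
    410: "gone: drop from the index faster than a 404",
    503: "temporary outage: retry later, keep the page indexed",
}
```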
Why it matters: Without confidence scoring, a 73/100 security score from a failed main fetch looks identical to a 73/100 from full data. This metric makes that difference visible.
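The idea can be sketched as pairing each score with the fraction of checks that actually produced data (the report shape here is hypothetical, not the tool's format):

```python
def scored_with_confidence(score: float, completed: int, attempted: int) -> dict:
    """Attach a data-completeness confidence to a raw score, so a 73/100 built
    from partial data is distinguishable from a 73/100 built from full data."""
    confidence = completed / attempted if attempted else 0.0
    return {"score": score, "confidence": round(confidence, 2)}
```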
Why it matters: Surface-level scores hide contradictions. A site can score 78/100 on security with a hidden contradiction that means the real state is either 95 or 30. The contradiction panel tells you which findings to audit manually.
Why it matters: A cloaking detection algorithm that only checks one bot at a time misses systematic patterns. Comparing 20 probes simultaneously reveals both individual bot blocks and class-level differentiation.
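The core of that comparison can be sketched as a majority-baseline check (hypothetical function; a real detector presumably also compares headers and body fingerprints, not just status codes):

```python
from collections import Counter

def divergent_bots(probe_status: dict) -> dict:
    """Given {bot_name: HTTP status} from simultaneous probes, return the bots
    whose response differs from the most common ('baseline') status."""
    baseline, _ = Counter(probe_status.values()).most_common(1)[0]
    return {bot: s for bot, s in probe_status.items() if s != baseline}
```

A single 403 against a 200 baseline flags an individual bot block; several AI crawlers diverging together suggests class-level differentiation.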
Why it matters: Edge network latency, rate limiting, and temporary service disruptions can all cause partial data. Without data quality metrics, a partial result and a complete result look the same to the score reader.