We built a tool to track and improve your AI citation rate
April 2026
Spotlight tracks which of your pages get cited by AI tools when users ask questions in your space — and tells you specifically what to change to win more citations.
Here's what prompted it, what we learned, and what it actually does.
The observation that started this
We kept noticing the same thing: a page ranking #1 on Google for a query would be completely absent from AI search responses for the same query. Meanwhile, a page at position #8 on the same SERP was getting cited in nearly every AI response.
The content was being found — it was ranking. But it wasn't getting cited. Those are different problems with different solutions.
That asymmetry kept showing up across different sites and different topics. We wanted to measure it systematically and understand what was causing it.
What we found
After running this analysis across a lot of sites, a few patterns held consistently:
Answer structure matters more than keyword density. LLMs synthesize answers from retrieved chunks, typically 300–500 tokens. If a page's main answer is buried in paragraph three after a preamble, it often doesn't make it into the synthesized response even if the page gets retrieved. Pages that lead with the direct answer get cited significantly more often.
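To make the chunking effect concrete, here's a minimal sketch. It assumes fixed 400-word chunks with whitespace tokenization as a rough stand-in for the 300–500 token chunks retrievers actually use; real systems use tokenizers and overlapping windows, so treat this as an illustration, not Spotlight's implementation.

```python
def chunk_words(text, chunk_size=400):
    """Split text into fixed-size chunks of whitespace tokens
    (a rough proxy for the 300-500 token chunks LLM retrievers use)."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]

def answer_chunk_index(page_text, answer):
    """Return the index of the first chunk containing the answer, or -1."""
    for i, chunk in enumerate(chunk_words(page_text)):
        if answer in chunk:
            return i
    return -1

# A page that leads with the answer vs. one that buries it after preamble.
answer = "Notion syncs in real-time across devices"
direct = answer + " " + "filler " * 800
buried = "filler " * 800 + answer

print(answer_chunk_index(direct, answer))  # 0 -> lands in the first chunk
print(answer_chunk_index(buried, answer))  # 2 -> two chunks of preamble deep
```

If the retriever only pulls the top-scoring chunk or two, an answer sitting at index 2 may never reach the model at all, which is the failure mode described above.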
Entity specificity matters. Pages that name specific things — tools, companies, numbers, named concepts — get cited more than pages that talk about the same topic vaguely. "We help teams collaborate better" never gets cited. "Notion syncs in real-time across devices with offline support" might.
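A crude way to see the difference is a specificity heuristic: count tokens that contain digits or look like proper nouns. This is not how any LLM actually scores content, and the stoplist below is an arbitrary assumption; it's just a quick proxy for "names specific things" that separates the two example sentences.

```python
import re

# Capitalized function words we don't want to count as entities (assumed list).
STOP = {"we", "the", "a", "an", "it", "this", "that", "our", "your", "i"}

def specificity_score(text):
    """Fraction of tokens that carry a digit or a capitalized non-stopword.
    A rough proxy for entity density, nothing more."""
    tokens = text.split()
    if not tokens:
        return 0.0
    specific = sum(
        1 for t in tokens
        if re.search(r"\d", t)
        or (t[:1].isupper() and t.lower() not in STOP)
    )
    return specific / len(tokens)

vague = "We help teams collaborate better"
concrete = "Notion syncs in real-time across devices with offline support"
print(specificity_score(vague) < specificity_score(concrete))  # True
```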
Backlinks don't transfer. Link authority drives Google rankings, but LLMs don't use PageRank. A page with zero external links but a clean, direct answer to a specific question will outperform a heavily linked page with a vague, meandering treatment of the same topic. This was the most consistent finding.
The competitive gap is real and measurable. For most queries, there are one or two sites cited consistently, then a long tail that never appears. The sites at the top aren't necessarily better companies — they've just accidentally written content in a format that works for LLMs.

What Spotlight does
You enter your URL. Spotlight crawls your site, identifies the intents (questions) that AI tools associate with your space, then runs each intent against AI tools to see what gets cited.
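The scan loop described above can be sketched roughly as follows. All the names here are hypothetical (`ask_ai` stands in for querying an AI tool and extracting cited URLs; the crawler and intent extraction are the product's internals and aren't shown):

```python
from dataclasses import dataclass, field

@dataclass
class CitationResult:
    intent: str                    # the question asked
    cited_urls: list = field(default_factory=list)  # URLs the AI cited
    ours: bool = False             # did one of our pages appear?

def scan(site_urls, intents, ask_ai):
    """Run each intent through an AI tool (ask_ai: question -> cited URLs)
    and record whether any of our pages was cited."""
    ours = set(site_urls)
    results = []
    for intent in intents:
        cited = ask_ai(intent) or []
        results.append(CitationResult(
            intent=intent,
            cited_urls=cited,
            ours=any(u in ours for u in cited),
        ))
    return results

# Stubbed AI responses for illustration.
fake = {"what is X?": ["https://competitor.com/x"],
        "how to Y?": ["https://oursite.com/y"]}
out = scan(["https://oursite.com/y"], list(fake), fake.get)
print([(r.intent, r.ours) for r in out])
# [('what is X?', False), ('how to Y?', True)]
```

The per-intent win/loss list is the raw material for the "competitor gets cited instead" output described below.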
The output isn't a score. It's specific: this page gets cited for this intent; your competitor gets cited instead for these three intents; here's why, and here's what to change.
The widget side of the product is separate — it's an embeddable search tool (⌘K) for your own site's visitors, backed by the same content index. That's been useful for support and discovery, but the citation tracking is what we built this for.
What we're still figuring out
AI search is moving fast. The citations we track today may weight differently in three months as these models update. We're trying to make Spotlight useful for the current state while being honest that this is a moving target.
We also haven't fully solved the feedback loop: we can tell you what to change, but verifying the change improved things requires another scan, which takes time. We're working on making that faster.
Free plan available. The audit runs in about 2 minutes. Try it here.
Happy to answer questions about the technical approach or what we found.