AI can find your website. But can it understand it?
I fixed how AI understands my site, then open-sourced the fix
I ran into this problem firsthand while building Anima Felix, an anxiety relief app. When I asked ChatGPT or Claude about our site, they could locate it, but their answers were generic and sometimes outright wrong about what we do.
The content was fine. The problem was structure.
Most websites are built for humans and search engine crawlers. But LLMs and AI agents read pages differently. They need explicit structured signals to understand what a page is about, who it’s for, and what it offers. Without those signals, AI just guesses.
I spent time fixing this for Anima Felix: adding the right schema markup, semantic metadata, and agent-readable structure. The improvement in how AI tools described us was immediate and noticeable.
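To make the idea concrete, here is a minimal sketch of the kind of structured signal involved: a schema.org JSON-LD block embedded in a page's head. All the values below are hypothetical placeholders, not Anima Felix's actual markup, and the helper function is just for illustration.

```javascript
// Sketch: build a schema.org JSON-LD object for an app's landing page.
// Property names follow the schema.org MobileApplication type; the
// specific values here are placeholders, not real markup from any site.
function buildAppJsonLd({ name, description, category, url }) {
  return {
    "@context": "https://schema.org",
    "@type": "MobileApplication",
    name,
    description,
    applicationCategory: category,
    url,
  };
}

const jsonLd = buildAppJsonLd({
  name: "Example App",
  description: "A mobile app for anxiety relief.",
  category: "HealthApplication",
  url: "https://example.com",
});

// Serialize it into the <script> tag crawlers and LLM agents look for.
const tag =
  `<script type="application/ld+json">${JSON.stringify(jsonLd)}</script>`;
```

A block like this gives an AI agent an explicit, machine-readable statement of what the page is, rather than leaving it to infer that from prose.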
Then I thought: every website probably has this gap.
So I packaged the approach into an open source project called agentmarkup. It ships as a plugin for Next.js, Astro, and Vite, and includes a free checker that audits any URL in seconds.
I see it as complementary to what companies like Genezio are building. They focus on how AI sees and recommends a brand. agentmarkup focuses on helping AI read and understand the website itself.
The space between SEO and what I’d call “AI readability” is still wide open. Most sites are optimized for Google but invisible to GPT. That gap is only going to matter more.
You can try the checker at agentmarkup.dev. I'd love to hear what it finds about your own site.