feat: agent engine optimization (AEO) for AI crawler visibility #45
Merged: francisfuzz merged 3 commits into main, Apr 16, 2026
Conversation
Add structured data, llms.txt/llms-full.txt, and robots.txt so AI agents can reliably discover, parse, and cite this site's content.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

- Convert llms.txt to Jekyll template so the post index auto-updates
- Add /gear/ page to llms.txt and llms-full.txt
- Add claude-web crawler to robots.txt (Anthropic's third agent)
- Fix publisher type to Organization in BlogPosting JSON-LD
- Add jsonify filter to bare Liquid interpolations in JSON-LD
- Fix placeholder channel metadata in public/feed.xml

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
What this does
Implements Agent Engine Optimization (AEO): the practice of structuring and serving site content so AI agents can discover, parse, and accurately cite it. Five targeted changes, prioritized by hard data.
Why
AI agents (ChatGPT, Claude, Perplexity, Gemini) are increasingly how people find information. They don't use traditional search the way humans do; they crawl, parse, and reason over structured content. Without explicit signals, agents either miss the site entirely or hallucinate identity details.
Key research driving these changes:

- Pages with schema.org JSON-LD get 3.2× more citations in AI answers; engines parse this structured data during AI response generation.
- llms-full.txt receives dramatically more AI agent traffic than llms.txt, with ChatGPT accounting for the majority; agents prefer embedding full content upfront over making follow-up fetches.
- Anthropic publishes llms.txt and llms-full.txt for its own docs, a first-party signal that ClaudeBot uses these files.
- A missing robots.txt silently denies AI agents access; it's a required precondition.

Changes
1. `_layouts/default.html`: `Person` JSON-LD schema (highest ROI)

Adds `schema.org/Person` structured data to every page's `<head>`. This tells Google, Bing, and AI agents exactly who Francis is: name, job title, employer, description, and social profile URLs. Confirmed by Google/Microsoft to directly influence AI Overview citations.
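The Person schema can be sketched roughly as below. The property values and the `site.author.*` config keys are illustrative assumptions, not the exact markup merged here; the `jsonify` filter JSON-escapes each Liquid value (including the surrounding quotes), which is why the interpolations are bare:

```html
<!-- Sketch for _layouts/default.html <head>; keys and values are illustrative -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": {{ site.author.name | jsonify }},
  "jobTitle": {{ site.author.job_title | jsonify }},
  "worksFor": {
    "@type": "Organization",
    "name": {{ site.author.employer | jsonify }}
  },
  "description": {{ site.description | jsonify }},
  "url": {{ site.url | jsonify }},
  "sameAs": [
    {{ site.author.github_url | jsonify }},
    {{ site.author.linkedin_url | jsonify }}
  ]
}
</script>
```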
2. `_layouts/post.html`: `BlogPosting` JSON-LD schema

Adds `schema.org/BlogPosting` structured data to each post: headline, publish date, description, author attribution, and canonical URL. Enables accurate authorship attribution when AI agents cite individual posts.
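A minimal sketch of the per-post schema, assuming standard Jekyll page variables (`page.title`, `page.date`, `page.description`) and a hypothetical `site.author.name` config key; the `Organization` publisher type matches the fix noted in the commit list:

```html
<!-- Sketch for _layouts/post.html; assumes standard Jekyll front matter -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": {{ page.title | jsonify }},
  "datePublished": {{ page.date | date_to_xmlschema | jsonify }},
  "description": {{ page.description | jsonify }},
  "author": {
    "@type": "Person",
    "name": {{ site.author.name | jsonify }}
  },
  "publisher": {
    "@type": "Organization",
    "name": {{ site.title | jsonify }}
  },
  "mainEntityOfPage": {{ page.url | absolute_url | jsonify }}
}
</script>
```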
3. `llms-full.txt`: full site content for agent consumption

A Jekyll-templated file at `/llms-full.txt` that outputs clean, stripped text for the About section, Résumé, and all posts (auto-updating via a Liquid loop). Mintlify's data shows `llms-full.txt` gets ~25× more AI traffic than `llms.txt` because agents prefer full content upfront. This is the file ChatGPT reads most.
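The auto-updating template might look roughly like this. The front matter, heading text, and date format are assumptions; the essential parts are the Liquid loop over `site.posts` and the `strip_html` filter that emits clean plain text (the About and Résumé sections are omitted here for brevity):

```liquid
---
layout: null
permalink: /llms-full.txt
---
# {{ site.title }}: full site content for AI agents

{% for post in site.posts %}
## {{ post.title }} ({{ post.date | date: "%Y-%m-%d" }})

{{ post.content | strip_html | strip }}
{% endfor %}
```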
4. `llms.txt`: structured site index

A root-level markdown file at `/llms.txt` following the llms.txt standard. Provides a token-efficient index: who Francis is, what each page covers, and direct URLs. Used by Anthropic, Cloudflare, and Stripe for their own properties. 844,000+ sites have adopted it.
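Per the llms.txt convention, the index is an H1 name, a blockquote summary, and link lists under H2 sections. The URLs and descriptions below are placeholders, not the merged content:

```markdown
# Francis

> Personal site: About, Résumé, Gear, and blog posts. (Placeholder summary.)

## Pages

- [About](https://example.com/about/): who Francis is and what he works on
- [Résumé](https://example.com/resume/): work history and experience
- [Gear](https://example.com/gear/): hardware and software setup

## Posts

- [Example post title](https://example.com/posts/example/): one-line summary
```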
5. `robots.txt`: explicit AI crawler permissions

Explicitly allows all major AI crawlers: `anthropic-ai`, `ClaudeBot`, `GPTBot`, `OAI-SearchBot`, `PerplexityBot`. Per Addy Osmani's AEO framework, a missing `robots.txt` will silently block agents. Also points crawlers to the RSS feed as a sitemap.
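A sketch of the allow-list; the `claude-web` entry reflects the follow-up commit in this PR, and the sitemap URL is a placeholder:

```text
# Explicitly allow the major AI crawlers
User-agent: anthropic-ai
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: claude-web
Allow: /

User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Point crawlers at the RSS feed as a sitemap (placeholder URL)
Sitemap: https://example.com/feed.xml
```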
Expected impact

- `llms-full.txt`
- `llms.txt`
- `robots.txt`

What was deprioritized and why
- `skill.md`: Addy Osmani's AEO framework targets API documentation sites. This is a personal blog; there is no API surface to describe.
- Meta descriptions: `jekyll-seo-tag` already outputs correct `<meta name="description">` tags from post front matter `description` fields. No work needed.

🤖 Generated with Claude Code