Agentic engine optimization: Google AI director outlines new content playbook

Understanding the Shift: What is Agentic Engine Optimization?

The landscape of digital content is undergoing its most significant transformation since the invention of the search engine. For decades, Search Engine Optimization (SEO) has been the primary framework for how information is organized, discovered, and consumed online. However, as artificial intelligence transitions from simple chatbots to autonomous “agents,” a new discipline is emerging.

Addy Osmani, a Director of Engineering at Google Cloud AI, has recently introduced a new framework called Agentic Engine Optimization (AEO). While the acronym is sometimes shared with “Answer Engine Optimization,” Osmani’s definition is distinct and far more technical. It refers specifically to the process of making web content usable, parsable, and actionable for AI agents—autonomous systems designed to fetch, analyze, and execute tasks on behalf of a user.

In this new paradigm, the target audience is no longer just a human reader scrolling through a browser. Instead, the audience is an agent that skips the user interface entirely, extracting raw data to complete a multi-step workflow. This shift demands a complete rethink of how we structure, format, and deliver content.

How AI Agents Are Redefining the Web Experience

To understand AEO, one must first understand the behavior of an AI agent. Unlike a traditional human user, an agent does not “browse.” It does not appreciate high-resolution hero images, it does not click on internal links to explore a brand’s story, and it certainly does not engage with “sticky” navigation or pop-up newsletters.

AI agents collapse the traditional browsing experience into a single request. If a user asks an agent to “find the best shipping rates for a 5lb package and generate a comparison table,” the agent identifies relevant sources, extracts the specific pricing data, and returns the final result.

Because of this, traditional engagement metrics—such as bounce rate, time on page, and scroll depth—become secondary or even irrelevant. If an agent visits your site, it intends to extract value in milliseconds. If your site structure prevents that extraction, the agent will move on to a competitor’s site that is better optimized for machine readability.

The Token Economy: The New Currency of Content

One of the most critical insights from Osmani’s guidance is the role of the “token.” In the world of Large Language Models (LLMs), text is processed in chunks called tokens. Every AI model has a “context window,” which is the maximum number of tokens it can process at one time.

Osmani highlights that token limits are a primary constraint shaping content performance. When a webpage is too wordy, filled with unnecessary “fluff,” or structurally complex, it consumes a large portion of the agent’s context window. This leads to three significant problems:

1. Truncated Information

If an agent’s context window is filled with your site’s header navigation, sidebar links, and a 500-word introductory anecdote, it may run out of space before it ever reaches the actual data it needs. This results in the agent “dropping” the most important parts of your content.

2. Skipped Pages

Agents are designed for efficiency. If a page appears too dense or computationally “expensive” to parse without a clear payoff, the agent may simply skip the page entirely in favor of a more concise source.

3. Hallucinated Outputs

When an agent is forced to work with truncated or fragmented data due to token limits, the likelihood of “hallucination”—where the AI fills in the gaps with incorrect information—increases dramatically. By providing concise, token-efficient content, you reduce the risk of an AI misrepresenting your brand or data.

Consequently, token count is becoming a primary optimization metric, much like page load speed or keyword density used to be.
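If token count is a metric, it helps to be able to measure it. The sketch below uses the common rule-of-thumb of roughly four characters per token for English text; exact counts depend on the specific model's tokenizer, and the 500-token budget simply mirrors the front-loading guidance. All names here are illustrative, not part of any official tooling.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the common ~4-characters-per-token
    heuristic for English text. Real counts depend on the model's
    tokenizer (e.g. tiktoken for OpenAI models)."""
    return max(1, len(text) // 4)

def fits_front_load_budget(text: str, budget: int = 500) -> bool:
    """Check whether a content block fits within an assumed
    front-load budget of ~500 tokens."""
    return estimate_tokens(text) <= budget

summary = "Flat-rate shipping for a 5 lb package: Carrier A $8.95, Carrier B $9.40."
print(estimate_tokens(summary))          # → 18, well under budget
print(fits_front_load_budget(summary))   # → True
```

A real audit would swap the heuristic for the tokenizer of the model you care about, but even this approximation quickly flags pages where boilerplate dwarfs the payload.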

Restructuring Content for Machine Patience

For years, SEO experts have debated the value of “long-form content.” While long-form remains valuable for human readers who want deep dives, AI agents have what Osmani describes as “limited patience.” To optimize for these agents, content creators must adopt a “Front-Loaded” strategy.

The First 500 Tokens

Osmani recommends placing the core answers or data points as early as possible—ideally within the first 500 tokens of a page. This ensures that even if the agent has a limited context window, it captures the most vital information immediately.

The End of “Burying the Lead”

In traditional blogging, it is common to use a “hook” or a long preamble to build rapport with the reader. For AEO, this is counterproductive. Agents want structured data, clear definitions, and direct answers. Subheadings should be descriptive and functional, and paragraphs should be compact and focused on a single concept.

Markdown: The Language of the Agentic Web

Perhaps the most technical recommendation in the AEO playbook is the move toward Markdown over HTML. While HTML is the foundation of the visual web, it is inherently “noisy.” A single paragraph of text in HTML is often wrapped in dozens of lines of code, including div tags, classes, styles, and scripts.

For an AI agent, this code is digital clutter. It costs tokens to process and makes parsing more difficult. Osmani suggests that businesses should consider serving clean Markdown (.md) versions of their pages alongside their traditional HTML versions.

The Benefits of Markdown for AEO

Markdown is lightweight and focuses entirely on content hierarchy. It uses simple symbols to denote headings, lists, and tables, which LLMs are natively designed to understand. By making .md versions of documentation or data directly accessible, you provide a “high-speed lane” for AI agents.

This doesn’t mean deleting your website’s design. Instead, it involves creating a parallel, machine-readable infrastructure. This could be as simple as providing a “View as Markdown” link or using server-side logic to detect an AI crawler and serve it a simplified version of the content.
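The server-side approach can be as simple as a user-agent check. The sketch below is a minimal illustration, assuming a small allowlist of crawler identifiers (the names shown are examples of known AI crawlers, but any real deployment would need an actively maintained list):

```python
# Sketch of server-side content negotiation: route known AI crawlers
# to a lean Markdown variant, and everyone else to the full HTML page.
# The crawler tokens below are illustrative, not exhaustive.
AI_CRAWLER_TOKENS = ("gptbot", "claudebot", "perplexitybot", "google-extended")

def pick_variant(user_agent: str) -> str:
    ua = (user_agent or "").lower()
    if any(token in ua for token in AI_CRAWLER_TOKENS):
        return "page.md"    # lightweight, token-efficient Markdown
    return "page.html"      # full visual experience for humans

print(pick_variant("Mozilla/5.0 (compatible; GPTBot/1.0)"))  # → page.md
print(pick_variant("Mozilla/5.0 (Windows NT 10.0)"))         # → page.html
```

The same logic drops into any web framework's request handler; the point is that the Markdown variant is generated from the same source content, so nothing is maintained twice.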

Discovery and Structure: The New Standards

Just as SEO has sitemaps and robots.txt, Agentic Engine Optimization is seeing the emergence of new standards designed to help agents navigate codebases and content libraries. Osmani points to several files that act as “shortcuts” for AI systems:

llms.txt

A proposed standard, the llms.txt file serves as a structured index of a website’s documentation specifically for LLMs. It provides a map that tells the agent which pages are most relevant and how they are structured, allowing the agent to skip the discovery phase and go straight to the content.
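Following the structure of the llms.txt proposal (a Markdown file served from the site root, with an H1 title, a blockquote summary, and H2 sections of annotated links), a minimal file might look like this, with all URLs and names purely illustrative:

```markdown
# Acme Shipping Docs

> API documentation and rate tables for Acme's shipping services.

## Docs

- [Rate tables](https://example.com/docs/rates.md): Current pricing by weight and zone
- [API reference](https://example.com/docs/api.md): REST endpoints for quotes and labels

## Optional

- [Changelog](https://example.com/docs/changelog.md): Release history
```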

skill.md

This file is used to define the specific capabilities of a tool or a set of content. It tells the agent, “Here is what I can help you do,” which is essential for agents looking for specific functional outcomes rather than just general information.
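Formats for skill files vary between agent frameworks, but a common pattern pairs machine-readable frontmatter (a name and a description the agent can match against a user's request) with step-by-step instructions. The sketch below assumes that pattern; every name in it is hypothetical:

```markdown
---
name: shipping-rate-lookup
description: Look up shipping rates by package weight and destination, and return a comparison table.
---

# Shipping rate lookup

1. Fetch the current rate table from /docs/rates.md.
2. Filter rows by the requested weight band.
3. Return the results as a Markdown table sorted by price.
```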

AGENTS.md

Particularly relevant for technical sites and code repositories, an AGENTS.md file provides a machine-readable entry point. It explains the purpose of the code, how to interact with it, and the intended workflows, reducing the time an agent spends “guessing” how to use the information.
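AGENTS.md is freeform Markdown by convention; what matters is that an agent landing in the repository can find setup commands, conventions, and boundaries without trial and error. A hypothetical example:

```markdown
# AGENTS.md

## Project overview
A Node.js client for the Acme shipping API.

## Setup
- Install dependencies: `npm install`
- Run the test suite: `npm test`

## Conventions
- All public functions live in `src/index.js`.
- Do not edit generated files under `dist/`.
```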

The Internal Debate: AEO vs. Traditional SEO

It is important to note that the world of Google is not a monolith. While Addy Osmani (representing the Cloud AI side) is advocating for these agent-centric optimizations, other prominent figures at Google have a more cautious take.

John Mueller, a Senior Search Analyst at Google, has recently suggested that creating separate Markdown pages specifically for LLMs may not be necessary for traditional search ranking. Furthermore, Google Search has stated that it does not currently use the llms.txt file for organic ranking purposes.

This creates a strategic fork in the road for digital publishers:
– **If your goal is ranking in Google Search:** Continue focusing on traditional SEO, high-quality HTML content, and user experience.
– **If your goal is utility within AI workflows:** You must begin incorporating AEO principles, such as token optimization and machine-readable formats.

The reality for most modern businesses is that they need to do both. You want your content to be found by humans on Google, but you also want it to be accurately used by an AI agent when a user asks their assistant to perform a task using your data.

Practical Steps to Implement AEO Today

Transitioning to an AEO-friendly strategy does not require a total website overhaul. Instead, it is about adding a layer of machine-readability to your existing assets. Here are practical steps to get started:

1. Audit Your Token Usage

Use an LLM tokenizer tool to see how many tokens your top-performing pages consume. Identify areas where navigation menus, footers, and scripts are “stealing” tokens from your core content.
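A quick way to see how much of a page's token budget is markup rather than content is to compare the raw HTML against its extracted text. The sketch below uses Python's standard-library HTML parser and the same rough four-characters-per-token heuristic; a production audit would use the target model's actual tokenizer.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, discarding tags, scripts, and styles."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)  # rough ~4 chars/token heuristic

page = """<div class="wrap"><nav><a href="/">Home</a></nav>
<style>.wrap{margin:0}</style>
<p>Flat-rate shipping for 5 lb: $8.95.</p></div>"""

extractor = TextExtractor()
extractor.feed(page)
clean = " ".join(p.strip() for p in extractor.parts if p.strip())
print(estimate_tokens(page), "tokens as HTML vs", estimate_tokens(clean), "as plain text")
```

On real pages the gap is usually far larger than in this toy snippet: navigation, scripts, and inline styles routinely account for the majority of the raw token count.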

2. Create a “Quick Facts” or “Data Summary” Block

At the top of your articles or product pages, include a concise, bulleted summary of the key information. This ensures that agents get what they need within that critical first 500-token window.
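In practice, a summary block like this can be a short list of labeled facts at the very top of the page (the values below are invented for illustration):

```markdown
## Quick facts

- **Service:** Acme flat-rate shipping
- **Price (5 lb, domestic):** $8.95
- **Delivery time:** 2–4 business days
- **Free returns:** Yes, within 30 days
```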

3. Implement Structured Data (Schema.org)

While Osmani emphasizes Markdown, traditional Schema markup remains a vital bridge. Highly structured JSON-LD data helps agents understand the relationships between different data points (e.g., price, availability, and reviews) without needing to parse the natural language text.
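For example, a product page might embed JSON-LD tying price, availability, and reviews together in one machine-readable block (product details here are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Acme Flat-Rate Shipping Box",
  "offers": {
    "@type": "Offer",
    "price": "8.95",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "213"
  }
}
</script>
```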

4. Experiment with an llms.txt File

For sites with heavy documentation or technical content, creating an llms.txt file is a low-effort, high-reward move. It signals to the developer community and AI companies that your site is “agent-ready.”

5. Optimize Table Structures

Agents are excellent at parsing tables, but only if they are structured logically. Avoid merged cells or complex nesting. Use clear, descriptive headers for every column.
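A well-structured table keeps one value per cell, one unit per column, and a descriptive header row, as in this illustrative example:

```markdown
| Carrier   | Service   | Price (5 lb) | Delivery time |
|-----------|-----------|--------------|---------------|
| Carrier A | Ground    | $8.95        | 2–4 days      |
| Carrier B | Expedited | $14.50       | 1–2 days      |
```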

The Future: From Traffic to Outcomes

The most profound takeaway from the AEO playbook is the shift in the “why” behind content creation. In the SEO era, the goal was almost always “the visit.” We wanted users to click a link and land on our site so we could show them ads, capture their email, or sell them a product.

In the AEO era, the goal is “the outcome.” If an agent uses your data to successfully answer a user’s question or complete a task, your brand has provided value—even if the user never actually visits your homepage.

This presents a challenge for traditional monetization models, but it also opens new doors. Being the “trusted source” for an AI agent can lead to higher-order conversions and brand authority in an ecosystem where human attention is increasingly filtered through AI assistants.

As Addy Osmani’s guidance suggests, the web is no longer just for people. It is for agents. Those who optimize for this new reality will be the ones who remain relevant in the AI-first world. While the tools of the trade are shifting from keywords and backlinks to tokens and Markdown, the fundamental principle remains the same: make your information as accessible and useful as possible.
