
How to revise your old content for AI search optimization

If your brand has been producing digital content for several years, you are likely sitting on a goldmine of information. However, the way that information is accessed is changing fundamentally. We are moving away from an era defined solely by traditional Search Engine Optimization (SEO) and into the age of Answer Engine Optimization (AEO). While the two disciplines overlap, the rise of Large Language Models (LLMs) and AI-driven search features means your old content needs a fresh coat of paint to remain visible.

I am frequently asked by brand marketers how they can gain traction in AI-generated answers. My favorite response is often the simplest: "Revise your old content." This usually sparks an "aha" moment. Because AEO feels so futuristic, many people forget that the most valuable data an AI can find is often already living on their own servers, buried in blog posts from 2021 or white papers from 2022. The challenge lies in reformatting that legacy content so it is legible to AI systems while remaining engaging for human readers.

How do you reformat content for better AEO performance?

The transition from SEO to AEO requires a shift in mindset. Traditional SEO focused on helping a crawler index a page based on keywords. AEO focuses on helping an AI model "understand" and "retrieve" specific facts. When I approach a content refresh for AI optimization, I lean on three core principles: topical breadth and depth, chunk-level retrieval, and answer synthesis.

Optimize for topical breadth and depth

To succeed in an AI-driven search environment, your website must be viewed as an authority on its core subjects. The best way to achieve this is through a hub-and-spoke model. This structure organizes your site into logical clusters that AI models can easily map. For every primary category or keyword theme, you should build a comprehensive "hub" page. This page introduces the broader topic and serves as a central directory. From that hub, you link out to "spoke" pages—articles that dive deep into specific facets of the topic.

Each spoke page should have a clear, distinct purpose and address a specific query intent. Because user questions in AI search often branch into niche directions, having a wide variety of "angles" covered helps expand your overall topical reach. By linking related spoke pages to one another and consistently back to the hub, you provide AI systems with clear signals about the semantic relationships between your topics.

Optimize for chunk-level retrieval

One of the most significant shifts in the AI era is how information is consumed. We can no longer rely on the AI model using the entire page for context. Instead, AI systems often use a process called Retrieval-Augmented Generation (RAG) to pull specific "chunks" of text to answer a user's prompt. If your content is buried in long, rambling paragraphs, the AI might fail to extract the relevant data.

To fix this, each section of your article should be independently understandable. Keep your passages semantically tight and self-contained. The goal is "one idea per section." If an AI model "lifts" a single paragraph from your site to answer a question, that paragraph should contain all the necessary context to make sense on its own. Companies like Our Family Wizard have successfully implemented this by breaking complex topics into highly focused, bite-sized sections that are easy for both bots and humans to digest.
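To make "one idea per section" concrete, here is a minimal sketch of a self-contained, chunk-friendly section. The markup and wording are illustrative rather than a prescribed template:

    <!-- Illustrative only: a self-contained "one idea per section" block.
         The heading, direct answer, and supporting context live together,
         so a RAG system can lift the passage without losing meaning. -->
    <section>
      <h2>What is chunk-level retrieval?</h2>
      <p>Chunk-level retrieval is the process by which AI search systems
      pull individual passages, rather than entire pages, to answer a
      prompt. Because only the passage is retrieved, each section must
      carry its own context instead of relying on the paragraphs
      around it.</p>
    </section>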
Optimize for answer synthesis

AI models are designed to summarize. You can make their job easier—and increase your chances of being cited—by doing the summarization for them. Start your sections with direct, concise sentences that answer a question immediately. Avoid "fluff" and introductory throat-clearing.

A highly effective strategy is to include a "Summary" or "Key Takeaways" section at the top of long-form posts. This provides a "TL;DR" (Too Long; Didn't Read) that an AI model can quickly synthesize. When formatting these summaries, favor a plain, factual, and non-promotional tone. AI models are trained to look for objective information, and overly "salesy" language can sometimes be filtered out or ignored in favor of more clinical sources. Baseten, for example, uses this approach by placing easily digested summaries at the top of their technical posts, providing a clear roadmap for any AI system scanning the page. For those looking to dive deeper into this concept, you can explore how to keep your content fresh in the age of AI to ensure your updates stay relevant as models evolve.

How will humans react to that formatting?

A common concern among marketers is that optimizing for AI will make their content unreadable for humans. However, the opposite is usually true. AI readability is fundamentally about clarity, and human readers—especially those browsing on mobile devices—crave clarity and speed. AI systems favor content where:

Answers are explicitly named rather than vaguely inferred.
Sections have a clear, singular intent.
Key points can be understood without reading the entire document.

In practice, this means being more explicit than traditional SEO ever required. You should define terms directly, summarize sections, and state your conclusions early. This is the antithesis of the old-school "keyword-stuffed" content that was often overwritten to meet an arbitrary word count that creators thought the Google algorithm preferred. By getting to the point quickly, you improve the experience for the human user who is looking for a quick answer.

However, there is a risk: oversimplification. Not every page should be reduced to a single atomic answer. Content that is meant to be strategic, opinionated, or narrative still requires a certain flow. I try to strike a balance by following a specific hierarchy:

Explain the core concept first.
Elaborate with nuances later.
Label your insights clearly.
Provide proof or data to back them up.
Make the answer obvious before adding layers of sophistication.

When this balance is achieved, the content satisfies the AI's need for data and the human's need for context. But a word of caution: beware of the "AI look." LLM-produced content has a very recognizable footprint—think of the generic posts saturating LinkedIn. You must edit rigorously so the finished piece keeps a recognizably human voice.
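As an illustration of the "Key Takeaways" pattern described above, a summary block might look like the sketch below. The structure is a suggestion, not a required format:

    <!-- Illustrative "Key Takeaways" block for the top of a long-form post. -->
    <section>
      <h2>Key takeaways</h2>
      <ul>
        <li>Organize content into hub-and-spoke topic clusters so AI systems can map your authority.</li>
        <li>Keep each section self-contained so retrieval systems can lift it cleanly.</li>
        <li>Lead with a direct, factual answer; add nuance afterward.</li>
      </ul>
    </section>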


Google publishes Universal Commerce Protocol help page

The landscape of digital commerce is undergoing a radical transformation, shifting from a model of discovery and referral to one of seamless, integrated transactions. In a significant move that signals the future of "agentic shopping," Google has officially published a comprehensive help page detailing the Universal Commerce Protocol (UCP). This move provides merchants with the technical blueprint and operational guidance necessary to participate in Google's evolving ecosystem, where the line between search and purchase is becoming increasingly blurred.

The Universal Commerce Protocol is not just a minor update to the Google Merchant Center; it represents a fundamental shift in how Google handles transactions across its various surfaces, including Search, YouTube, and the AI-driven Gemini. By enabling a native checkout experience directly within Google's environment, UCP aims to eliminate the friction that often leads to cart abandonment during the transition from a search engine to a merchant's website.

What is the Universal Commerce Protocol (UCP)?

At its core, the Universal Commerce Protocol (UCP) is a standardized framework that allows merchants to offer a native "Buy" button on Google surfaces. Unlike traditional Google Shopping ads, which redirect users to a merchant's website to complete a purchase, UCP-powered checkout allows the entire transaction to happen within the Google interface.

However, a critical distinction remains: the merchant stays the "seller of record." This means that while the user interacts with Google to select items and confirm payment, the merchant is still responsible for fulfillment, customer service, and handling returns. Google acts as the facilitator and the interface, but the legal and operational responsibility for the sale remains with the business selling the product. This hybrid model aims to provide the convenience of a marketplace with the brand autonomy of a direct-to-consumer store.

How UCP-Powered Checkout Functions

The UCP-powered checkout experience is designed to be as frictionless as possible. When a user finds a product they want to buy on a Google surface—whether through an AI Overview, a YouTube video, or a standard search result—they can click a native purchase button. This triggers a checkout flow that utilizes the user's stored information in their Google account.

Payments are processed using Google Wallet credentials. This is a strategic move for Google, as it leverages the millions of saved payment methods already stored in Google accounts worldwide. For the merchant, this means the technical infrastructure must be able to support Google Pay tokens. The transaction data is passed securely from Google to the merchant's backend, where the order is processed as if it had occurred on the merchant's own site.

The Implementation of the native_commerce Attribute

For merchants looking to activate this feature, the technical gateway lies within the Google Merchant Center. Specifically, Google has introduced the native_commerce attribute. By implementing this attribute in their product feeds, merchants signal to Google that their products are eligible for native checkout via UCP. This attribute serves as the "on switch" for the protocol. Without it, products will continue to be displayed as standard listings that redirect to external websites.
The introduction of this attribute suggests that Google is moving toward a self-service model for native commerce, allowing any merchant with the right technical setup to opt in to this high-conversion experience.

The Shift Toward Agentic Shopping and Gemini Integration

The timing of the UCP documentation release is no coincidence. Google has been aggressively pivoting toward "agentic" search—an approach where AI doesn't just provide information but also completes tasks on behalf of the user. In the context of e-commerce, this means an AI agent like Gemini could research products, compare reviews, and eventually finalize the purchase for the user.

Without UCP, an AI agent would have to "hand off" the user to a third-party website, where the AI might lose its ability to assist or track the transaction. With UCP, the entire funnel—from the initial query to the final confirmation—happens within an environment that the AI can navigate. This is particularly relevant for "AI Mode" in Google Search and the Gemini app, where a seamless "Buy" button makes the transition from conversation to conversion instantaneous.

Reducing Friction in the Path to Purchase

In digital marketing, friction is the enemy of conversion. Every extra click, every page load, and every form field a user has to fill out increases the likelihood that they will abandon their purchase. UCP addresses this by removing the need for a user to navigate a new website, create a new account, or manually enter credit card details on a mobile device. For mobile users, who often struggle with small-screen navigation and clunky checkout forms, UCP is a game-changer. By using Google Wallet and a unified interface, Google is essentially providing a "one-click" experience across the entire web for participating merchants.

Merchant as the Seller of Record: Why It Matters

One of the most important aspects of the UCP documentation is the clarification that the merchant remains the seller of record. This distinguishes UCP from traditional third-party marketplaces where the platform might take a more significant role in the transaction and the customer relationship. Being the seller of record has several implications for merchants:

Tax Compliance: Merchants are responsible for calculating and remitting sales tax based on the customer's location.
Customer Data: While Google facilitates the transaction, the merchant still receives the necessary order data to build a relationship with the customer.
Brand Experience: The merchant's name is front and center during the transaction, ensuring that the brand identity isn't lost within the Google ecosystem.
Fulfillment Control: The merchant manages their own shipping carriers, packaging, and delivery timelines.

This model is particularly attractive to mid-to-large scale retailers who want the reach of Google but are unwilling to give up control of their customer lifecycle to a marketplace entity.

Technical Requirements and Payment Processing

The new help page provides a roadmap for the technical requirements merchants must meet to support UCP. Beyond the native_commerce attribute, the most significant requirement is payment processor compatibility. Because UCP relies on Google Pay tokens, a merchant's payment gateway must be capable of processing those tokens.
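As a rough sketch of what enabling the attribute might look like in an XML product feed, consider the entry below. The element name and accepted values are assumptions for illustration; verify the exact syntax against Google's help page before implementing:

    <!-- Hypothetical feed entry; the native_commerce element shown here
         is illustrative, so confirm the exact attribute syntax in
         Google's Merchant Center documentation. -->
    <item>
      <g:id>SKU-12345</g:id>
      <g:title>Trail Running Shoe, Size 10</g:title>
      <g:price>129.99 USD</g:price>
      <g:native_commerce>true</g:native_commerce>
    </item>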


How Google’s Universal Commerce Protocol changes ecommerce SEO

For decades, the relationship between ecommerce brands and Google was defined by a predictable exchange: Google provided the traffic, and the merchant's website provided the storefront. Success was measured by rankings, click-through rates (CTR), and the ability of a landing page to convert a visitor into a buyer. In this model, SEO was essentially a logistics operation for human attention, ensuring that a website appeared at the exact moment a user expressed intent through a search query.

That traditional model has officially been disrupted. With the introduction of the Universal Commerce Protocol (UCP) and the deepening integration of AI Mode within the Google Search ecosystem, the search engine is no longer just a digital signpost directing users elsewhere. It is evolving into a transaction layer—a decentralized storefront where discovery, comparison, and the final purchase all happen within a single AI-driven interface. Search is moving from a traffic channel to a commerce engine.

For SEO professionals and digital marketers, this shift represents a move "upstream." Visibility in the age of agentic commerce is no longer just about appearing on page one; it is about ensuring your product data is the primary choice made by an AI agent acting on behalf of the consumer. When the AI makes the recommendation and facilitates the checkout, the battle for the "click" is replaced by the battle for the "selection."

The shift to agentic commerce

On January 11, Google launched the Universal Commerce Protocol (UCP), a move that signaled a fundamental change in how the web handles commercial data. UCP is an open standard designed to enable AI agents to discover, evaluate, and purchase products across the internet seamlessly. Unlike previous iterations of Google Shopping, which largely indexed existing web pages, UCP creates a framework where AI can "understand" a product's lifecycle and utility well enough to represent it inside Gemini or other AI-powered experiences.

What makes UCP particularly significant is the ecosystem Google has built to support it. This wasn't a solo venture. Google collaborated with major industry players including Shopify, Etsy, Wayfair, Target, and Walmart to ensure that the protocol was integrated with the world's largest inventory and payment networks from day one. This level of institutional support suggests that UCP is the new foundation for the "agentic web"—a web where AI agents perform tasks for users rather than just providing links.

Alongside UCP, Google has rolled out three distinct capabilities that transform the shopping journey within its ecosystem:

1. The Business Agent

The Business Agent acts as a brand's digital representative inside Search and the Gemini app. It is not a simple chatbot; it is an AI-powered entity trained on a brand's specific product data, policies, and brand voice. Shoppers can ask the Business Agent nuanced questions—such as "Is this fabric ethically sourced?" or "Will this part fit a 2022 model?"—and receive authoritative answers without ever navigating to the brand's actual website.

2. Direct Offers

In the traditional model, a discount was something a user found on a site or through an email. Direct Offers allow merchants to inject exclusive discounts and promotional pricing directly into Google's AI Mode. This means that when an AI compares two similar products, a merchant can programmatically offer a deal that lives inside the recommendation engine itself, influencing the AI's final suggestion in real time.
3. Checkout in AI Mode

Perhaps the most disruptive element is the ability to complete a purchase entirely within the Google interface. By integrating payment credentials and shipping information directly into the AI Mode experience, Google eliminates the friction of the "handoff" to a mobile site. For the consumer, this is a massive convenience; for the merchant, it means the traditional "site experience" is being bypassed in favor of a universal transaction layer.

This shift allows Google to turn natural language conversations into immediate commerce opportunities. A user no longer needs to search for "hiking boots size 10." Instead, they can tell Gemini, "I'm planning a three-day trip to the Pacific Northwest in October and I need gear that can handle rain." The AI then pulls live inventory, cross-references weather data, compares durability reviews, and offers a curated selection for instant purchase.

What this means for ecommerce strategy

The fundamental challenge for ecommerce brands today is that the storefront has moved. For years, marketing teams focused on optimizing the homepage, the category pages, and the checkout flow. While those remain important for direct-to-site traffic, they are increasingly becoming "backend" infrastructure for the AI. If the AI agent never chooses your product to show the user, the quality of your website's UX is irrelevant. In the past, search engines looked for keywords. Today, AI looks for solutions.

Consider the "use case" problem. Many brands struggle to surface the right products because their data is too rigid. A candle retailer might have a product tagged as "Lavender Scent" and "12oz Jar." However, the consumer isn't necessarily searching for those attributes. They might be searching for "something to help me relax after a stressful day" or "a candle that eliminates pet odors without smelling like chemicals." Traditional SEO often failed to bridge this gap unless a specific landing page was built for every possible intent. With UCP and Gemini, the AI can map the shopper's situational need to the product's inherent qualities—but only if the product data is rich enough to support that reasoning. If your data only lists scents and sizes, the AI won't know your lavender candle is the perfect solution for a "stressful day" query.

This creates a new competitive landscape. Brands are no longer just competing for a high "Rank." They are competing for "Inclusion." When the AI filters a million products down to the top three recommendations, being number four is the same as being invisible. The criteria for inclusion are no longer just about backlinks or keyword density; they are about data completeness, accuracy, and the ability of the AI to "trust" the information provided through the protocol.

The new playbook: How SEO and AI optimization help


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most powerful traffic drivers in the digital publishing ecosystem. Since it began gaining significant traction around 2021, many SEOs and site owners have watched as it surpassed traditional organic search in terms of raw click volume. However, because Discover is a "push" medium—meaning Google serves content based on user interests rather than specific search queries—it can feel volatile and unpredictable.

The Discover feed is highly personalized. It tracks interests with an almost uncomfortable level of precision, cycling through topics like local news, specific hobbies, sports teams, and professional interests like SEO or world events. It is also pervasive, appearing on Chrome new tabs, the Google mobile app, Android home screens, and the mobile version of Google.com. For publishers and tech-heavy sites, capitalizing on this surface is no longer optional; it is a necessity for growth.

While content remains king in the Discover ecosystem, technical roadblocks often prevent high-quality articles from ever appearing in a user's feed. By implementing specific technical fixes, you can ensure your site meets Google's rigorous requirements for Discover eligibility and visibility.

Essential Considerations Before Optimizing for Discover

Before diving into the technical nuances, it is vital to understand the nature of Discover traffic. Unlike traditional search, which can provide a steady stream of "evergreen" traffic over years, Discover is often a game of peaks and valleys. If your brand does not fit the profile of a Discover-friendly site, technical fixes alone may not be enough.

Discover Favors Timely, Authoritative Content

The content that dominates Discover is almost exclusively time-sensitive. Google prioritizes fresh stories, breaking news, and trending topics from authoritative sources. While evergreen content occasionally makes an appearance, it is the exception rather than the rule. Because of this, sites that focus on news, entertainment, and trending tech updates often see their Discover traffic far outweigh their traditional search traffic.

The Impact of Recent Algorithm and Interface Changes

The landscape of Discover is shifting. Many publishers have reported a decline in traffic recently, largely due to updates Google made in late 2025. The Discover feed now integrates a significant volume of social media posts and AI-generated summaries of major news stories. These summaries aggregate information from multiple sources, sometimes displacing individual article links that previously occupied prime real estate.

Google is also leaning more into social signals. They have recently begun beta testing the ability to track traffic to social platforms within Search Console, reflecting the reality that Discover is becoming more "social" in its curation. This means that your technical strategy must also account for how your brand is perceived as an entity across the broader web, not just on your own domain.

Why Technical Fixes Still Matter

Even though user interests are constantly changing, a technically sound website provides the foundation for Google's crawlers to understand and "trust" your content. Technical optimizations for Discover are generally low-effort, template-level changes that do not harm traditional SEO. In many cases, these fixes improve your overall site health and can lead to unexpected spikes in traffic for non-publisher brands.
Technical Optimization 1: Refining Your Publisher Profile

Google Discover relies heavily on the concept of "entities." Google needs to know exactly who you are, what you write about, and where else you exist on the internet. This is managed through your Discover publisher profile.

Auditing Your Entity Status

The first step is to check your Discover publisher profile to ensure your website and social profiles are correctly linked. To do this, you can use specialized tools like the one developed by Damian Tsuabaso. By inserting your brand's name, URL, or entity ID, you can see how Google categorizes your publisher profile. These profile pages are often linked directly to your Knowledge Graph ID (KGMID), which is the unique identifier Google uses to track your brand's presence across the web.

If you have recently rebranded or been acquired, your publisher profile might be outdated or unclear. Fixing this requires a deep dive into Knowledge Graph optimization, ensuring that Google's understanding of your brand entity is consistent across all platforms.

Linking Social Media and Website Signals

As social media posts take up more space in Discover, it is critical that Google recognizes your social accounts as belonging to your primary brand entity. To facilitate this link, follow these steps:

Organization Schema: Use "sameAs" elements within your Organization schema to explicitly list your official social media profiles (X, LinkedIn, Facebook, etc.).
Reciprocal Linking: Ensure your website footer contains direct links to these social accounts, and conversely, ensure your social media "About" sections link back to your primary domain.
Consistent Naming: Use consistent brand names and handles across platforms to help Google's entity recognition algorithms connect the dots.

Technical Optimization 2: High-Resolution Visuals and Tags

Google's own documentation is clear: images are a primary driver of Discover performance. Discover is a visual feed, and the "card" that represents your article needs to be compelling and high-quality to earn a click.

The max-image-preview:large Meta Tag

One of the most frequent technical oversights is the omission of the "max-image-preview:large" tag. This robots meta tag tells Google that it has permission to use your high-resolution images as the large preview card in the Discover feed. Without this tag, Google may only show a small, low-engagement thumbnail or skip your article entirely. Check your site's header for this tag: <meta name="robots" content="max-image-preview:large">. Many Content Management Systems (CMSs) do not include this by default in their standard article templates, so it may require a manual addition or a plugin configuration.

Image Dimensions and Hero Images

Google recommends that images used for Discover be at least 1,200 pixels wide. This applies specifically to your "hero" image—the main image at the top of your article. While the image might be resized for the user's mobile screen, the source file provided to Google must meet this minimum width to qualify for the high-engagement large card format.

Open Graph (OG) Tag Optimization

Google often pulls the Discover card's preview image and title from your Open Graph tags, so they should accurately describe the article and point to a high-resolution image.
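Pulling the tags above together, a minimal article head section might look like the sketch below; the URLs and titles are placeholders:

    <!-- Grants permission for large image previews in Discover -->
    <meta name="robots" content="max-image-preview:large">
    <!-- Open Graph tags the preview card may draw from; the og:image
         source should be at least 1,200 pixels wide -->
    <meta property="og:title" content="Example article headline">
    <meta property="og:image" content="https://www.example.com/hero-1600w.jpg">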


How Google’s Universal Commerce Protocol changes ecommerce SEO

Understanding the Seismic Shift in Ecommerce SEO

For decades, the foundation of ecommerce success was built on a predictable, linear model. Google functioned as the world's largest traffic controller, directing users toward your website through organic listings and paid search ads. Once the user landed on your site, it was your responsibility to convert them through high-quality landing pages, persuasive copy, and a seamless checkout process. In this era, the primary metrics were rankings, click-through rates (CTR), and conversion rates.

That model has been fundamentally disrupted. With the official introduction of the Universal Commerce Protocol (UCP) on January 11, Google has signaled a transition from being a mere discovery engine to becoming a comprehensive transaction layer. Through the integration of UCP and AI Mode, Google is no longer just showing links; it is discovering, comparing, and completing purchases entirely within its own ecosystem.

This shift represents the dawn of agentic commerce. For SEO professionals and digital marketers, the challenge is no longer just about ranking on page one. It is about ensuring that your product data is the preferred choice for an AI agent that is making decisions on behalf of the consumer. Visibility is being replaced by selection, and the storefront is moving upstream into the AI recommendation layer.

What is the Universal Commerce Protocol?

The Universal Commerce Protocol is a new open standard designed to harmonize how products are discovered, evaluated, and purchased across the web. While Google is the primary architect, the protocol was developed in collaboration with major industry players, including Shopify, Etsy, Wayfair, Target, and Walmart. This level of coordination suggests that UCP is not a temporary experiment but a long-term infrastructure play for the future of the internet.

The core objective of UCP is to allow AI agents—such as Google Gemini—to interact with retail data in real time. By standardizing how inventory, pricing, and product attributes are shared, Google enables its AI to act as a digital concierge. It can pull live data from a variety of retailers, compare them based on a user's specific needs, and facilitate a purchase without the user ever having to visit a traditional website.

This environment creates a "headless" shopping experience. In this scenario, the retailer provides the supply and the fulfillment, while Google provides the interface and the intelligence. This changes the role of SEO from site optimization to data optimization, where the quality and depth of your product feed become your most important competitive advantages.

The Three Pillars of Google's New Commerce Ecosystem

To understand how UCP changes the landscape, we must look at the three platform-level capabilities Google has introduced to make this agentic future a reality for everyday shoppers.

1. The Business Agent

The Business Agent is essentially an AI-powered brand representative that lives within Search and the Gemini app. Instead of a shopper reading a static FAQ page, they can engage in a dialogue with a brand's agent. Shoppers can ask specific questions like, "Which of these jackets is best for a rainy trip to Seattle?" or "Does this vacuum come with a HEPA filter?" The agent uses the brand's own data to provide authoritative, brand-level guidance. This keeps the conversation within the Google ecosystem while maintaining the brand's voice and expertise.

2. Direct Offers

Promotion strategies are also being reimagined.
Through Direct Offers, merchants can inject exclusive discounts and promotional pricing directly into Google's AI Mode. These offers are not just banners or pop-ups; they are integrated into the recommendation engine itself. If an AI agent recommends a product to a user, it can simultaneously present a tailored offer that makes the purchase more attractive. This means promotions now live inside the decision-making process rather than just on the product page.

3. Checkout in AI Mode

The final and perhaps most disruptive pillar is Checkout in AI Mode. By integrating payment networks directly into the search experience, Google allows users to complete a transaction instantly. This removes the friction of site redirects, account creations, and traditional checkout flows. For the consumer, it is a massive convenience. For the retailer, it means that the "site experience" is no longer the final hurdle to a sale—the AI selection is.

The Shift to Contextual and Intent-Based Recommendations

Traditional SEO was built on keywords. If a user searched for "scented candles," you optimized your page for that phrase. However, consumer behavior is often much more nuanced. People don't just shop for products; they shop for solutions to specific problems or to satisfy emotional needs.

Consider a shopper looking for a candle to mask pet odors without the scent being overbearing. In the old model, if your product data only included "Vanilla Scent" and "12oz Jar," the search engine might never connect your product to that specific need. You would have to hope the user typed in your specific keywords or navigated through several site filters to find the "pet-friendly" section.

With UCP and Gemini, the interaction becomes conversational. A user might say, "I need something that kills pet odor but doesn't smell like fake fruit." Because the AI understands the attributes and use cases of products, it can map that natural language prompt to the right item in your catalog. It understands that a "citrus and charcoal" candle is better for this specific request than a "strawberry shortcake" candle, even if neither is explicitly labeled for "pets."

This transition from keyword matching to reasoning-based selection is the hallmark of agentic commerce. The AI is looking for meaning, not just strings of text. This makes high-quality, descriptive content more valuable than ever before.

How SEO Strategy Must Evolve for AI Selection

As the "storefront" moves into the AI layer, the playbook for SEO professionals must change. Optimization is no longer just about technical site health and backlink profiles; it is about "feed health" and "attribute density."

Optimizing for the Recommendation Layer

To win in this new environment, you must ensure that Google's AI fully understands what your product is, who it is for, and the specific scenarios in which it will be used.
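As a rough illustration of "attribute density," structured data can spell out the use cases an AI might reason over. The Product schema sketch below uses standard schema.org vocabulary, but the property values and the "useCase" label are invented for the example:

    <!-- Illustrative Product markup; the attribute values are invented. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Citrus and Charcoal Candle, 12oz Jar",
      "description": "Odor-neutralizing soy candle that absorbs pet smells without a heavy artificial fragrance; suited to small apartments with dogs or cats.",
      "additionalProperty": [
        { "@type": "PropertyValue", "name": "useCase", "value": "pet odor elimination" }
      ]
    }
    </script>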


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most significant traffic drivers for modern publishers. Since roughly 2021, SEOs and digital marketers have watched as Discover surpassed traditional organic search in click volume for many news-heavy and lifestyle websites. It is a highly personalized feed that seems to understand users with an almost uncanny level of precision, surfacing content based on their browsing history, interests, and location.

Unlike traditional search, which relies on a user entering a specific query, Google Discover is proactive. It delivers content before the user even knows they want to read it. It lives in multiple high-visibility locations: the Google app, the "New Tab" page in Chrome for mobile, Android home screens, and even the mobile version of Google.com. Because Discover is so pervasive, optimizing for it is no longer optional for brands looking to maintain a dominant digital presence. However, securing a spot in this elusive feed requires more than just good writing; it requires a foundation of specific technical fixes and authority signals.

Essential considerations before we begin optimizing for Discover

Before diving into the technical nuances, it is important to understand the fundamental nature of Google Discover. It is not a "one size fits all" platform, and its algorithm behaves very differently from the standard Search Engine Results Pages (SERPs). Managing expectations and understanding the current landscape of the feed is the first step toward a successful strategy.

Discover favors timely content

Google Discover is primarily a discovery engine for what is happening "now." Content that performs well is almost always highly time-relevant. This is why major news publishers often see the lion's share of Discover traffic. While evergreen content can occasionally surface if it aligns perfectly with a user's sudden interest in a specific hobby or topic, the feed generally prioritizes breaking news, trending stories, and fresh perspectives on current events.

For many sites, this creates a unique traffic profile. It is common for a website to receive millions of clicks from Discover while seeing significantly less volume from traditional search. This is because Discover captures "passive" intent, whereas search captures "active" intent. If your site doesn't focus on timely or trending topics, your Discover potential may be lower than that of a news-oriented competitor.

The shifting landscape: Discover traffic is declining

It is worth noting that many publishers have reported a decline in Discover traffic over the past year. This is largely due to structural changes Google has made to the feed. As of late 2025, Google has integrated a larger volume of social media posts and AI-generated summaries of major stories into the Discover experience. These elements often displace individual articles that previously dominated the feed. This shift reflects Google's desire to keep users within its ecosystem while providing a broader range of perspectives.

Interestingly, as social media becomes a larger part of the Discover experience, Google has begun beta testing features in Search Console to track traffic specifically from social platforms. This suggests that the lines between "web search" and "social discovery" are blurring.

The balance of content and technicality

No amount of technical optimization can save poor content. Google Discover's primary goal is to provide a high-quality, relevant experience for the user.
If your content doesn't resonate, or if it uses "clickbaity" tactics that lead to high bounce rates, Google will quickly stop surfacing your articles. However, if you have great content that isn't showing up, the problem is likely technical. Technical fixes ensure that your content is "eligible" and "attractive" to the algorithm, allowing your editorial efforts to actually reach the audience they deserve.

Technical optimizations for Discover

When auditing a website for Discover performance, there are three primary pillars to address: the publisher profile, image optimization, and transparency signals. These elements form the technical backbone of how Google identifies and trusts your content enough to put it in front of millions of users.

Optimizing your Discover publisher profile

Your Discover publisher profile is the entity-level representation of your brand within Google's ecosystem. It is essential to ensure that your website and all associated social profiles are correctly linked. To do this, you first need to identify your publisher profile page. Since Google doesn't provide a direct dashboard for this, SEOs often use third-party tools like Damian Tsuabaso's profile builder. While the interface may be in Spanish, it is highly effective: you simply enter your brand name, URL, or Entity ID to find your profile.

These profile pages are intrinsically linked to your entity's Knowledge Graph ID (KGMID). This is a crucial detail because it means Discover isn't just looking at your website; it's looking at what Google "knows" about your brand as a whole. If your brand has recently undergone a rebranding or an acquisition, your publisher profile may be fragmented or unclear. This requires Knowledge Graph optimization to clarify your brand's entity relationship to Google.

To strengthen your publisher profile, focus on the following:

Social Media Aggregation: Google is increasingly pulling social media posts directly into the Discover feed. To ensure your social accounts are linked to your publisher profile, you must use Organization schema with "sameAs" elements that list your official profiles.
Cross-Linking: Link to your social accounts clearly in your website's footer. Conversely, ensure your social media bios link back to your primary website. This creates a closed loop of authority that helps Google verify your brand's legitimacy.

High-impact image optimization

If there is one technical "fix" that offers the highest return on investment for Discover, it is image optimization. Google's own documentation explicitly states that large, high-quality images are essential for visibility. When an article appears in Discover, it appears as a card. If the image is small or low-resolution, the click-through rate (CTR) will plummet, and Google will eventually stop showing the content.

To optimize your images for the Discover feed, follow these specific technical requirements:

The max-image-preview:large tag: This is a non-negotiable meta tag. It tells Google that it has permission to display your high-resolution images as large preview cards in the feed.
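To illustrate the "sameAs" linking described in the publisher profile section above, here is a minimal Organization schema sketch; all names and URLs are placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Publisher",
      "url": "https://www.example.com",
      "sameAs": [
        "https://x.com/examplepublisher",
        "https://www.linkedin.com/company/examplepublisher",
        "https://www.facebook.com/examplepublisher"
      ]
    }
    </script>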


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most significant traffic drivers for modern publishers. First catching major attention around 2021, it began delivering millions of monthly clicks to sites that understood how to play by its unique rules. Unlike traditional search, where users actively type a query, Discover is a proactive, AI-driven feed that anticipates what a user wants to see based on their interests, browsing history, and location.

Today, Discover is integrated into almost every corner of the Google ecosystem. It appears on Chrome new tabs, within the dedicated Google app, on Android home screens, and even on the Google.com mobile homepage. For many publishers, Discover traffic now frequently exceeds traditional organic search traffic. However, capturing this audience requires more than just high-quality writing; it demands a specific set of technical optimizations and a deep understanding of how Google's entity-based ranking works.

Essential considerations before optimizing for Discover

Before diving into technical fixes, it is crucial to understand that Google Discover operates differently than the standard Search Engine Results Page (SERP). It is not a viable traffic source for every brand or every type of content. Success in Discover is built on a foundation of timeliness, authority, and user relevance.

Discover favors timely, trending content

The content that thrives in the Discover feed is almost always time-sensitive. Google prioritizes breaking news, trending topics, and fresh perspectives from authoritative sources. While evergreen content occasionally surfaces, it is the exception rather than the rule. If your content strategy focuses solely on "how-to" guides from three years ago, you may find Discover a difficult nut to crack. Sites that see the highest engagement often function as news outlets or topical authorities that publish frequently on current events.

The changing landscape of Discover traffic

It is important to note that many publishers have seen a decline in Discover traffic recently. This is largely due to Google's evolving strategy for the feed. As of the September 2025 updates, the Discover feed has begun to integrate a significant volume of social media posts and AI-generated summaries of major news stories. These AI summaries aggregate information from multiple sources, often displacing individual article links that used to dominate the space.

Google is also placing a heavier emphasis on social signals. In fact, Google has been beta testing the ability to track traffic to social platforms within Search Console. This suggests that a brand's presence on platforms like X (formerly Twitter), LinkedIn, or YouTube now plays a direct role in how its web content is surfaced in Discover. If your social media strategy is lagging, your Discover performance likely is too.

The foundation of high-quality content

Technical SEO can provide the infrastructure for success, but it cannot save poor content. Google's algorithms are increasingly sophisticated at determining user satisfaction. If users consistently swipe past your articles or click and immediately bounce, Google will stop showing your content. Relevance is the ultimate currency in Discover. Before exploring technical causes for a traffic drop, always review your recent content to see if it still aligns with the shifting interests of your target audience.
Technical optimizations for Google Discover

When auditing a website for Discover performance, there are three primary technical pillars to address: the Publisher Profile, image optimization, and transparency signals. These elements help Google understand who you are, what you publish, and how to display your content most effectively to users.

Mastering the Discover publisher profile

One of the most overlooked aspects of Discover SEO is the Publisher Profile. This is the entity-level representation of your brand within Google's Knowledge Graph. To maximize your visibility, you must ensure your website and social profiles are correctly linked in Google's eyes.

To audit your status, you can use specialized tools like the one developed by Damian Tsuabaso. By inserting your brand name, URL, or Entity ID, you can view your profile page as Google sees it. This is vital because Discover profile pages are intrinsically linked to your Knowledge Graph ID (KGMID). The URL string on these profile pages is often a tokenized version of your KGMID, serving as a digital fingerprint for your brand's authority.

When reviewing your profile, ask two critical questions. First, does the profile accurately reflect your current brand? If you have recently rebranded, been acquired, or changed niches, Google may still be associating you with old entities. This requires Knowledge Graph optimization to clarify your brand's identity. Second, are your social media accounts appearing on this profile? Because social posts now occupy significant real estate in the Discover feed, having your accounts linked is no longer optional.

To bridge the gap between your website and your social profiles, you should:

Ensure your Organization schema includes "sameAs" elements that list all official social media URLs.
Include clear links to your social profiles in the website footer.
Ensure your social media bios link back to your primary website domain.

Optimizing images for the 1,200px standard

Google's own documentation makes it clear: large, high-quality images are the single most effective way to increase click-through rates (CTR) in Discover. When an article appears in the feed, it is represented by a card. A large, compelling image makes that card significantly more attractive to the user. There are three technical requirements for images in Discover:

1. The max-image-preview tag: Many Content Management Systems (CMS) do not include the "max-image-preview:large" robots meta tag by default. This tag tells Google that it has permission to use your high-resolution images as the article preview. Without it, Google may default to a small thumbnail, which drastically reduces engagement.

2. Image resolution and width: Your hero images must be at least 1,200 pixels wide. While the actual rendered size on a mobile screen might be smaller, providing a high-resolution source file ensures that Google can display a crisp, full-width preview on high-DPI devices. Ensure that your CMS is not automatically compressing or resizing these images to a smaller width before they reach the user.

3. Open Graph (OG) tags: Google often draws the preview card's image and title from your Open Graph markup, so og:image should point to a source file that meets the 1,200-pixel standard.
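One way to serve appropriately sized files to browsers while still exposing a source that meets the 1,200-pixel standard is responsive image markup; the file names below are placeholders:

    <!-- Hero image with a 1,600 px source available for the large
         preview card, while browsers pick a smaller file for
         narrow viewports -->
    <img src="https://www.example.com/hero-1600w.jpg"
         srcset="https://www.example.com/hero-800w.jpg 800w,
                 https://www.example.com/hero-1600w.jpg 1600w"
         sizes="100vw"
         alt="Descriptive caption for the hero image">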


Gen Z Preference For TikTok Over Google Drops 50%, Data Shows

The Shifting Landscape of Search: Why Gen Z is Returning to Google

For the past several years, a dominant narrative has permeated the digital marketing industry: Google is losing its grip on the younger generation. Reports suggested that TikTok was becoming the search engine of choice for Gen Z, with visual-first discovery replacing the traditional text-based query. However, new data suggests that the "death of Google" may have been greatly exaggerated.

Recent survey data reveals a significant shift in user behavior. While nearly half of all U.S. consumers—49% to be exact—have utilized TikTok as a search tool, the overwhelming preference for TikTok over Google among Gen Z has seen a dramatic 50% decline. This pivot marks a critical moment for SEO professionals, brand managers, and content creators who have been pivoting their strategies entirely toward social media discovery.

Understanding this shift requires a deep dive into how search intent is evolving. It is no longer a winner-take-all battle between social media and traditional search engines. Instead, we are entering an era of search fragmentation, where users choose their platform based on the specific type of information they need.

Deconstructing the Data: Usage vs. Preference

To understand why Gen Z's preference for TikTok over Google has dropped so sharply, we must first distinguish between usage and preference. The fact that 49% of consumers use TikTok for search is an impressive feat for a platform that began as a short-form video app. It confirms that "Social SEO" is a legitimate and necessary discipline. However, usage does not always equate to loyalty or primary reliance.

The data indicates that while Gen Z continues to use TikTok to discover trends, recipes, and fashion, they are returning to Google for more "high-stakes" or complex information. The 50% drop in preference suggests that the initial novelty of TikTok search is wearing off, or perhaps more accurately, that the limitations of social media search are becoming more apparent to the users themselves.

For marketers, this means the strategy cannot be "either-or." It must be "both." The decline in preference for TikTok does not mean Gen Z is abandoning the platform; rather, it suggests they are rediscovering the utility and reliability of a traditional search engine for specific tasks.

Why TikTok Initially Gained Ground

TikTok's rise as a search engine was driven by a desire for authenticity and visual proof. Gen Z, often characterized by their skepticism of polished corporate advertising, turned to TikTok to see "real people" using products or visiting locations. When a user searches for a restaurant on Google, they get a list of links, a map, and perhaps some static photos. When they search on TikTok, they get a first-person perspective of the atmosphere, the food being plated, and an honest review of the service. This "experiential search" was something Google struggled to replicate for years.

Furthermore, TikTok's algorithm is exceptionally good at understanding user intent through engagement. The "For You Page" (FYP) functions as a passive search engine, serving content that users didn't even know they wanted. This led to a natural transition where users began using the search bar to find more of what the algorithm was already showing them.
The Limitations of Social Search

Despite its strengths, TikTok search has inherent flaws that have likely contributed to the 50% drop in preference. The primary issue is the spread of misinformation. Because TikTok prioritizes engagement and virality, content that is sensationalized or factually incorrect can often rise to the top of search results. For informational queries—such as medical advice, financial planning, or technical troubleshooting—Gen Z is realizing that a 60-second video from an unverified creator may not be the most reliable source. Google's emphasis on E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) provides a level of security that social platforms currently cannot match.

Another factor is the "noise" of social media. On TikTok, search results are often cluttered with trending sounds, memes, and tangentially related content. When a user needs a quick, direct answer—such as "what time does the pharmacy close?" or "how to fix a leaking pipe"—the speed and precision of Google's snippets and local business listings remain superior.

Google's Response: Adapting to a Visual World

Google did not sit idly by while TikTok encroached on its territory. The search giant has undergone significant transformations to appeal to younger audiences who crave visual and interactive content. One of the most notable changes is the integration of "Perspectives" and the increased visibility of Reddit and Quora threads within the Search Engine Results Pages (SERPs). By surfacing first-hand human experiences from forums, Google is effectively competing with the "authenticity" that users previously sought on TikTok.

Additionally, Google has improved its ability to index and display short-form video. It is now common to see a "Short videos" carousel in Google mobile search results, featuring content from YouTube Shorts, Instagram Reels, and even TikTok. By bringing social content directly into the SERP, Google has removed the need for users to leave the platform to find the visual answers they want.

The introduction of AI Overviews (formerly SGE) also plays a role. By providing synthesized, conversational answers at the top of the page, Google is meeting the demand for instant gratification that social media popularized, while still backing those answers with reputable web sources.

The Role of AI in Reclaiming Search Dominance

Artificial intelligence is the new frontier in the battle for search supremacy. As Google integrates Gemini into its search ecosystem, the way users interact with information is changing. For Gen Z, who are early adopters of AI tools, the ability to ask complex, multi-layered questions and receive a comprehensive summary is a major draw. TikTok is also experimenting with AI search features, but Google has a decades-long head start in data crawling and information organization. The ability to link an AI-generated answer to a wide array of reliable sources gives Google a competitive edge in "verification search"—a behavior where users double-check information they first encountered elsewhere before acting on it.


See how leaders bridge the engagement divide by attending ‘Engage with SAP Online’ by SAP Engagement Cloud

In the rapidly evolving landscape of digital marketing and customer experience, a fundamental question is keeping CMOs and digital strategists awake at night: How healthy are your customer relationships? It is a question that goes far beyond open rates, click-through percentages, or quarterly sales targets. It strikes at the heart of brand longevity and consumer trust. For most organizations, the answer is more complex than it appears.

Over the last two decades, the corporate world has meticulously built its infrastructure around channels. We have email teams, social media departments, e-commerce specialists, and in-store coordinators. Each of these silos typically operates with its own technology stack, its own key performance indicators (KPIs), and its own definition of what a "win" looks like. From an internal perspective, these departments often look successful. But from the customer's perspective, the experience is often fragmented, repetitive, and frustrating.

To address this growing disconnect, SAP Engagement Cloud is hosting a landmark virtual event, "Engage with SAP Online," on March 11, 2026. This half-day digital summit is designed specifically for leaders who are ready to stop optimizing channels in isolation and start building a cohesive, relationship-driven strategy that actually resonates with modern consumers.

Understanding the Engagement Divide

The central theme of the upcoming event is the "Engagement Divide." This term describes the widening gap between what customers expect from a brand and what those brands are actually capable of delivering. While many companies believe they are providing a seamless experience, the data suggests a different reality.

According to the SAP Engagement Index—a comprehensive global study of 10,000 consumers and 4,800 senior decision-makers—75% of consumers report being significantly put off by disorganized brands. These are brands that pass customers between multiple departments to resolve a single issue or fail to remember a customer's previous interactions. Conversely, 77% of brands claim their engagement strategies already deliver seamless experiences. This discrepancy highlights a massive blind spot in the modern enterprise.

The Engage with SAP Online event aims to bridge this divide by providing leaders with the tools, data, and insights necessary to align their internal processes with external customer expectations. It is no longer enough to be "omnichannel" in name only; brands must be "relationship-centric" in practice.

Key Speakers and Industry Insights

The event features a high-caliber lineup of speakers who bring both academic rigor and real-world experience to the table. The day opens with Sara Richter, the CMO of SAP Engagement Cloud. Richter will unveil the full findings of the SAP Engagement Index, providing a data-driven foundation for the sessions that follow.

The keynote address will be delivered by Mark Ritson, a professor, founder of MiniMBA, and one of the most respected voices in the marketing world. Known for his "no-nonsense" approach, Ritson's session, "Trends Shaping Customer Experience: What's Real, What's Not, and What Matters Most Now," will cut through the industry hype. Ritson is expected to challenge the traditional view that loyalty lives solely within the marketing department, arguing instead that every touchpoint—from sales to service—shapes the brand's health.
Following the keynote, attendees will hear from practitioners who are implementing these shifts in large-scale organizations:

Jutta Richter (Head of 1:1 Campaign Management, BMW Group): Richter will discuss the complexities of modern customer journeys and how a premium brand like BMW ensures relevance in an era where customers often make up their minds before even speaking to a representative.

Daniele Tedesco (Ecommerce Global Process Owner, Essity) and Venky Naravulu (Director of Partner Solutions, Sinch): This duo will join Mark Ritson to discuss the practicalities of modernizing engagement through AI and connected systems. They will provide a deep dive into how technology can facilitate human-like continuity at scale.

The Urgency of a Relationship-First Model

Why is this conversation happening now? Several macroeconomic and technological factors have converged to make customer engagement more difficult—and more critical—than ever before.

First, the cost of customer acquisition (CAC) has skyrocketed across nearly every industry. When winning a new customer costs five to ten times more than retaining an existing one, the stakes for every interaction become incredibly high.

Second, the erosion of third-party tracking and the shift toward privacy-first browsing mean that brands can no longer rely on external data to "find" their customers. They must build their own first-party data ecosystems, which are only possible if customers feel a sense of value and trust in their relationship with the brand.

Finally, the rise of AI has empowered consumers. With AI tools at their fingertips, shoppers can compare prices, read reviews, and find alternatives in seconds. If a brand fails to recognize a customer or provides a disjointed experience, that customer has the tools and the motivation to switch to a competitor instantly.

The Mismatch in Channel Strategy

One of the most revealing aspects of the SAP Engagement Index is the mismatch between where customers are and where brands are spending their energy. The research indicates that:

41% of consumers prefer to shop and interact via mobile apps, but only 28% of brands have a significant engagement presence there.
43% of consumers prefer online and e-commerce shopping, yet only 26% of brands successfully engage via these web channels in a personalized way.

This "channel mismatch" is a symptom of a larger problem: legacy thinking. Many organizations are still catching up to the digital shifts of the last decade, while their customers have already moved on to the next phase of engagement. The "Engage with SAP Online" event will explore how to reallocate resources to meet customers where they actually spend their time.

Moving Toward Maturity: The 21% Goal

SAP's research assessed how well organizations align their people, processes, and technology around customer engagement. The results were sobering. Only 21% of organizations scored at a "high maturity" level. These are the leaders who have unified their data, broken down departmental silos, and can deliver a consistent experience across every touchpoint. The vast majority of businesses—63%—sit in the middle. These "middle-tier" companies can handle basic personalization, such as putting a customer's name in an email, but they struggle to deliver consistent, connected experiences across channels.

Uncategorized

Anthropic’s Claude Bots Make Robots.txt Decisions More Granular via @sejournal, @MattGSouthern

The landscape of web crawling and data indexing is undergoing a monumental shift as artificial intelligence companies seek more efficient ways to interact with the open web. Anthropic, the developer behind the Claude AI family of models, recently updated its crawler documentation to introduce a more nuanced approach to how its bots interact with website content. By moving away from a monolithic crawling system and toward a more granular set of user-agents, Anthropic is providing webmasters and SEO professionals with unprecedented control over how their data is consumed by AI.

This update is particularly significant in an era where the tension between content creators and AI developers is at an all-time high. Publishers are increasingly concerned about how their proprietary information is used to train large language models (LLMs) without compensation or attribution. Conversely, AI developers need access to high-quality, up-to-date information to remain competitive. Anthropic’s new granular bot system aims to strike a balance, offering transparency and choice through the standard robots.txt protocol.

Understanding the Shift to Granular Bot Control

Traditionally, web crawlers were relatively straightforward to manage. Googlebot, Bingbot, and a few others dominated the landscape, and their purpose was clear: index content for search engine results. However, the rise of generative AI has complicated this dynamic. AI companies now crawl the web for multiple reasons, ranging from long-term model training to real-time information retrieval on behalf of a specific user.

Anthropic has addressed this complexity by categorizing its crawlers into three distinct entities. Each bot serves a specific purpose, and by separating them, Anthropic allows site owners to decide whether they want their content used for training, for real-time user requests, or for search-style indexing. This move represents a major step forward in technical transparency and digital rights management for publishers.

The Three Faces of Claude: Identifying the New Bots

The update to Anthropic’s documentation outlines three primary user-agents that webmasters should be aware of. Understanding the difference between these is essential for any SEO strategy that seeks to protect intellectual property while maintaining digital visibility.

1. Anthropic-ai (The Training Bot)

The “anthropic-ai” crawler is designed specifically for data collection that will be used to train future versions of the Claude model. When this bot visits a site, it is gathering information to expand the model’s foundational knowledge. For publishers, this is often the most controversial bot, as it involves the consumption of content that might later be synthesized by the AI without driving direct traffic back to the source.

2. Claude-user (The Real-Time Request Bot)

The “claude-user” agent functions differently. This bot is triggered when a person using the Claude interface specifically asks the AI to visit a URL, summarize a page, or analyze specific live web data. This is an “on-demand” crawler. If a site owner blocks this bot, they are essentially telling Claude users that they cannot interact with that site’s content through the AI interface. This has significant implications for user experience and how information is shared in AI-driven workflows.
3. Claude-web-search (The Indexing Bot)

While its role is still being refined, the “claude-web-search” bot appears to be aimed at more traditional indexing tasks that support Claude’s ability to “search” the web for answers. This suggests a move toward a more integrated search-AI experience, similar to what we see with Perplexity AI or SearchGPT. By allowing this bot while blocking others, a publisher might permit their site to be found in AI search results while still opting out of having their data used for general model training.

The Technical Implementation: Using Robots.txt

For SEOs and developers, managing these bots is handled through the standard robots.txt file. This file acts as a gatekeeper, telling automated systems which parts of a site are off-limits. Anthropic’s decision to respect these directives is a sign of good faith in the broader ecosystem. To block the training bot specifically, a webmaster would add the following to their robots.txt file:

User-agent: anthropic-ai
Disallow: /

However, if they want users to still be able to bring Claude into the conversation to summarize their articles, they need to ensure that the “claude-user” agent remains unblocked. A “Disallow: /” rule aimed at the “anthropic-ai” agent does not block “claude-user”; that agent is only affected if it is named explicitly or if the site uses a generic wildcard block that applies to all bots.

The Strategic Trade-offs: Visibility vs. Protection

The introduction of granular controls presents a strategic dilemma for digital publishers. It is no longer a simple “yes” or “no” decision regarding AI crawling. Instead, it is about weighing the trade-offs of visibility versus data protection.

If you block all Anthropic bots, you are effectively taking your site off the map for one of the world’s most popular AI platforms. This means your brand, your data, and your perspectives will not be represented in the answers Claude provides to millions of users. For news organizations, this might mean a loss of influence. For niche technical blogs, it could mean that the AI will provide outdated or incorrect information about their area of expertise because it lacks access to the primary source.

On the other hand, allowing the “anthropic-ai” training bot means your content is being used to build a product that may eventually compete with you for user attention. This is the “cannibalization” fear that keeps many publishers awake at night. By providing granular options, Anthropic is allowing sites to opt in to the utility (claude-user) while opting out of the data harvesting (anthropic-ai).

Impact on SEO and Crawl Budgets

From a technical SEO perspective, the proliferation of AI bots adds another layer to crawl budget management. Every time a bot visits your site, it consumes server resources. For large-scale enterprise sites with millions of pages, managing how many different bots are hitting the server simultaneously is a legitimate concern. By categorizing their bots, Anthropic allows SEOs to prioritize which crawlers deserve those server resources. If a site’s primary goal is to provide real-time utility to users, they might prioritize the “claude-user” bot and set crawl restrictions, such as directory blocks or crawl-delay rules, for the agents that matter less to their strategy.
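As a rough illustration of that kind of prioritization, the sketch below keeps user-initiated fetches unrestricted while steering the indexing bot away from resource-heavy paths. The directory names are placeholders, and Crawl-delay is a non-standard directive that many crawlers ignore, so whether Anthropic’s agents honor it should be confirmed against their current documentation:

# Keep on-demand, user-initiated fetches unrestricted
User-agent: claude-user
Allow: /

# Steer indexing away from resource-heavy, low-value paths
# (/search/ and /cart/ are placeholder paths; Crawl-delay support varies by crawler)
User-agent: claude-web-search
Disallow: /search/
Disallow: /cart/
Crawl-delay: 10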
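Tying the article’s broader trade-off together, a site that wants to remain visible in Claude’s user-facing features while opting out of model training might publish something like the following minimal sketch. It assumes the user-agent tokens described above; the canonical names should be verified in Anthropic’s documentation before deployment:

# Opt out of data collection for model training
User-agent: anthropic-ai
Disallow: /

# Permit real-time, user-initiated visits through the Claude interface
User-agent: claude-user
Allow: /

# Permit search-style indexing so the site can surface in AI answers
User-agent: claude-web-search
Allow: /

Because a crawler that finds no matching group in robots.txt is allowed by default, the two Allow groups mainly document intent and ensure those agents are not swept up by a generic “User-agent: *” block elsewhere in the file.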
