How Google’s Universal Commerce Protocol changes ecommerce SEO

For decades, the relationship between ecommerce brands and Google was defined by a predictable exchange: Google provided the traffic, and the merchant’s website provided the storefront. Success was measured by rankings, click-through rates (CTR), and the ability of a landing page to convert a visitor into a buyer. In this model, SEO was essentially a logistics operation for human attention, ensuring that a website appeared at the exact moment a user expressed intent through a search query.

That traditional model has officially been disrupted. With the introduction of the Universal Commerce Protocol (UCP) and the deepening integration of AI Mode within the Google Search ecosystem, the search engine is no longer just a digital signpost directing users elsewhere. It is evolving into a transaction layer—a decentralized storefront where discovery, comparison, and the final purchase all happen within a single AI-driven interface. Search is moving from a traffic channel to a commerce engine.

For SEO professionals and digital marketers, this shift represents a move “upstream.” Visibility in the age of agentic commerce is no longer just about appearing on page one; it is about ensuring your product data is the primary choice made by an AI agent acting on behalf of the consumer. When the AI makes the recommendation and facilitates the checkout, the battle for the “click” is replaced by the battle for the “selection.”

The shift to agentic commerce

On January 11, Google launched the Universal Commerce Protocol (UCP), a move that signaled a fundamental change in how the web handles commercial data. UCP is an open standard designed to enable AI agents to discover, evaluate, and purchase products across the internet seamlessly. Unlike previous iterations of Google Shopping, which largely indexed existing web pages, UCP creates a framework where AI can “understand” a product’s lifecycle and utility well enough to represent it inside Gemini or other AI-powered experiences.
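As a concrete illustration of what machine-readable product data looks like, here is a minimal sketch using the public schema.org Product vocabulary. The article does not publish UCP's actual feed format, so the field choices and all product details here are illustrative assumptions, not the protocol's real schema:

```python
import json

# Illustrative only: this uses schema.org-style Product markup because UCP's
# actual feed format is not shown in the article. All product details are
# invented for the example.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Lavender Calm Candle",
    "size": "12oz Jar",
    "material": "Soy wax",
    # Free-text and keyword fields are where situational use cases can live,
    # giving an AI agent something to map a query like
    # "help me relax after a stressful day" onto.
    "description": "A slow-burning lavender soy candle designed for evening wind-down routines and stress relief.",
    "keywords": ["relaxation", "stress relief", "aromatherapy", "evening routine"],
}

print(json.dumps(product, indent=2))
```

The point is the contrast between rigid attributes ("Lavender Scent", "12oz Jar") and fields that describe situations a shopper might actually voice.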
What makes UCP particularly significant is the ecosystem Google has built to support it. This wasn’t a solo venture. Google collaborated with major industry players including Shopify, Etsy, Wayfair, Target, and Walmart to ensure that the protocol was integrated with the world’s largest inventory and payment networks from day one. This level of institutional support suggests that UCP is the new foundation for the “agentic web”—a web where AI agents perform tasks for users rather than just providing links.

Alongside UCP, Google has rolled out three distinct capabilities that transform the shopping journey within its ecosystem:

1. The Business Agent

The Business Agent acts as a brand’s digital representative inside Search and the Gemini app. It is not a simple chatbot; it is an AI-powered entity trained on a brand’s specific product data, policies, and brand voice. Shoppers can ask the Business Agent nuanced questions—such as “Is this fabric ethically sourced?” or “Will this part fit a 2022 model?”—and receive authoritative answers without ever navigating to the brand’s actual website.

2. Direct Offers

In the traditional model, a discount was something a user found on a site or through an email. Direct Offers allow merchants to inject exclusive discounts and promotional pricing directly into Google’s AI Mode. This means that when an AI compares two similar products, a merchant can programmatically offer a deal that lives inside the recommendation engine itself, influencing the AI’s final suggestion in real time.

3. Checkout in AI Mode

Perhaps the most disruptive element is the ability to complete a purchase entirely within the Google interface. By integrating payment credentials and shipping information directly into the AI Mode experience, Google eliminates the friction of the “handoff” to a mobile site.
For the consumer, this is a massive convenience; for the merchant, it means the traditional “site experience” is being bypassed in favor of a universal transaction layer.

This shift allows Google to turn natural language conversations into immediate commerce opportunities. A user no longer needs to search for “hiking boots size 10.” Instead, they can tell Gemini, “I’m planning a three-day trip to the Pacific Northwest in October and I need gear that can handle rain.” The AI then pulls live inventory, cross-references weather data, compares durability reviews, and offers a curated selection for instant purchase.

What this means for ecommerce strategy

The fundamental challenge for ecommerce brands today is that the storefront has moved. For years, marketing teams focused on optimizing the homepage, the category pages, and the checkout flow. While those remain important for direct-to-site traffic, they are increasingly becoming “backend” infrastructure for the AI. If the AI agent never chooses your product to show the user, the quality of your website’s UX is irrelevant.

In the past, search engines looked for keywords. Today, AI looks for solutions. Consider the “use case” problem. Many brands struggle to surface the right products because their data is too rigid. A candle retailer might have a product tagged as “Lavender Scent” and “12oz Jar.” However, the consumer isn’t necessarily searching for those attributes. They might be searching for “something to help me relax after a stressful day” or “a candle that eliminates pet odors without smelling like chemicals.” Traditional SEO often failed to bridge this gap unless a specific landing page was built for every possible intent. With UCP and Gemini, the AI can map the shopper’s situational need to the product’s inherent qualities—but only if the product data is rich enough to support that reasoning.
If your data only lists scents and sizes, the AI won’t know your lavender candle is the perfect solution for a “stressful day” query.

This creates a new competitive landscape. Brands are no longer just competing for a high “rank.” They are competing for “inclusion.” When the AI filters a million products down to the top three recommendations, being number four is the same as being invisible. The criteria for inclusion are no longer just about backlinks or keyword density; they are about data completeness, accuracy, and the ability of the AI to “trust” the information provided through the protocol.

The new playbook: How SEO and AI optimization help


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most powerful traffic drivers in the digital publishing ecosystem. Since it began gaining significant traction around 2021, many SEOs and site owners have watched as it surpassed traditional organic search in terms of raw click volume. However, because Discover is a “push” medium—meaning Google serves content based on user interests rather than specific search queries—it can feel volatile and unpredictable.

The Discover feed is highly personalized. It tracks interests with an almost uncomfortable level of precision, cycling through topics like local news, specific hobbies, sports teams, and professional interests like SEO or world events. It is also pervasive, appearing on Chrome new tabs, the Google mobile app, Android home screens, and the mobile version of Google.com. For publishers and tech-heavy sites, capitalizing on this surface is no longer optional; it is a necessity for growth.

While content remains king in the Discover ecosystem, technical roadblocks often prevent high-quality articles from ever appearing in a user’s feed. By implementing specific technical fixes, you can ensure your site meets Google’s rigorous requirements for Discover eligibility and visibility.

Essential Considerations Before Optimizing for Discover

Before diving into the technical nuances, it is vital to understand the nature of Discover traffic. Unlike traditional search, which can provide a steady stream of “evergreen” traffic over years, Discover is often a game of peaks and valleys. If your brand does not fit the profile of a Discover-friendly site, technical fixes alone may not be enough.

Discover Favors Timely, Authoritative Content

The content that dominates Discover is almost exclusively time-sensitive. Google prioritizes fresh stories, breaking news, and trending topics from authoritative sources. While evergreen content occasionally makes an appearance, it is the exception rather than the rule.
Because of this, sites that focus on news, entertainment, and trending tech updates often see their Discover traffic far outweigh their traditional search traffic.

The Impact of Recent Algorithm and Interface Changes

The landscape of Discover is shifting. Many publishers have reported a decline in traffic recently, largely due to updates Google made in late 2025. The Discover feed now integrates a significant volume of social media posts and AI-generated summaries of major news stories. These summaries aggregate information from multiple sources, sometimes displacing individual article links that previously occupied prime real estate.

Google is also leaning more into social signals. It has recently begun beta testing the ability to track traffic to social platforms within Search Console, reflecting the reality that Discover is becoming more “social” in its curation. This means that your technical strategy must also account for how your brand is perceived as an entity across the broader web, not just on your own domain.

Why Technical Fixes Still Matter

Even though user interests are constantly changing, a technically sound website provides the foundation for Google’s crawlers to understand and “trust” your content. Technical optimizations for Discover are generally low-effort, template-level changes that do not harm traditional SEO. In many cases, these fixes improve your overall site health and can lead to unexpected spikes in traffic for non-publisher brands.

Technical Optimization 1: Refining Your Publisher Profile

Google Discover relies heavily on the concept of “entities.” Google needs to know exactly who you are, what you write about, and where else you exist on the internet. This is managed through your Discover publisher profile.

Auditing Your Entity Status

The first step is to check your Discover publisher profile to ensure your website and social profiles are correctly linked.
To do this, you can use specialized tools like the one developed by Damian Tsuabaso. By inserting your brand’s name, URL, or entity ID, you can see how Google categorizes your publisher profile. These profile pages are often linked directly to your Knowledge Graph ID (KGMID), which is the unique identifier Google uses to track your brand’s presence across the web.

If you have recently rebranded or been acquired, your publisher profile might be outdated or unclear. Fixing this requires a deep dive into Knowledge Graph optimization, ensuring that Google’s understanding of your brand entity is consistent across all platforms.

Linking Social Media and Website Signals

As social media posts take up more space in Discover, it is critical that Google recognizes your social accounts as belonging to your primary brand entity. To facilitate this link, follow these steps:

Organization Schema: Use “sameAs” elements within your Organization schema to explicitly list your official social media profiles (X, LinkedIn, Facebook, etc.).

Reciprocal Linking: Ensure your website footer contains direct links to these social accounts, and conversely, ensure your social media “About” sections link back to your primary domain.

Consistent Naming: Use consistent brand names and handles across platforms to help Google’s entity recognition algorithms connect the dots.

Technical Optimization 2: High-Resolution Visuals and Tags

Google’s own documentation is clear: images are a primary driver of Discover performance. Discover is a visual feed, and the “card” that represents your article needs to be compelling and high-quality to earn a click.

The max-image-preview:large Meta Tag

One of the most frequent technical oversights is the omission of the “max-image-preview:large” tag. This robots meta tag tells Google that it has permission to use your high-resolution images as the large preview card in the Discover feed.
Without this tag, Google may only show a small, low-engagement thumbnail or skip your article entirely. Check your site’s header for this tag: <meta name="robots" content="max-image-preview:large">. Many content management systems (CMSs) do not include this by default in their standard article templates, so it may require a manual addition or a plugin configuration.

Image Dimensions and Hero Images

Google recommends that images used for Discover be at least 1,200 pixels wide. This applies specifically to your “hero” image—the main image at the top of your article. While the image might be resized for the user’s mobile screen, the source file provided to Google must meet this minimum width to qualify for the high-engagement large card format.

Open Graph (OG) Tag Optimization

Google often pulls the Discover
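The check for the robots directive can be automated. Below is a small standard-library sketch that tests a page's HTML for max-image-preview:large; the class and function names are mine, not part of any Google tooling, and a real audit would fetch live pages rather than a string:

```python
from html.parser import HTMLParser

# A standard-library sketch of the audit described above. The class and
# function names are invented for this example; this is not an official tool.
class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            content = attrs.get("content") or ""
            # Robots directives are comma-separated inside the content attribute.
            self.directives += [d.strip().lower() for d in content.split(",")]

def allows_large_preview(html: str) -> bool:
    """Return True if the page's robots meta tag permits large image previews."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "max-image-preview:large" in parser.directives

page = '<html><head><meta name="robots" content="index, max-image-preview:large"></head></html>'
print(allows_large_preview(page))  # True
```

Running this across a sample of article templates is a quick way to spot a CMS that strips the tag.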


How Google’s Universal Commerce Protocol changes ecommerce SEO

Understanding the Seismic Shift in Ecommerce SEO

For decades, the foundation of ecommerce success was built on a predictable, linear model. Google functioned as the world’s largest traffic controller, directing users toward your website through organic listings and paid search ads. Once the user landed on your site, it was your responsibility to convert them through high-quality landing pages, persuasive copy, and a seamless checkout process. In this era, the primary metrics were rankings, click-through rates (CTR), and conversion rates.

That model has been fundamentally disrupted. With the official introduction of the Universal Commerce Protocol (UCP) on January 11, Google has signaled a transition from being a mere discovery engine to becoming a comprehensive transaction layer. Through the integration of UCP and AI Mode, Google is no longer just showing links; it is discovering, comparing, and completing purchases entirely within its own ecosystem.

This shift represents the dawn of agentic commerce. For SEO professionals and digital marketers, the challenge is no longer just about ranking on page one. It is about ensuring that your product data is the preferred choice for an AI agent that is making decisions on behalf of the consumer. Visibility is being replaced by selection, and the storefront is moving upstream into the AI recommendation layer.

What is the Universal Commerce Protocol?

The Universal Commerce Protocol is a new open standard designed to harmonize how products are discovered, evaluated, and purchased across the web. While Google is the primary architect, the protocol was developed in collaboration with major industry players, including Shopify, Etsy, Wayfair, Target, and Walmart. This level of coordination suggests that UCP is not a temporary experiment but a long-term infrastructure play for the future of the internet.

The core objective of UCP is to allow AI agents—such as Google Gemini—to interact with retail data in real time.
By standardizing how inventory, pricing, and product attributes are shared, Google enables its AI to act as a digital concierge. It can pull live data from a variety of retailers, compare options based on a user’s specific needs, and facilitate a purchase without the user ever having to visit a traditional website.

This environment creates a “headless” shopping experience. In this scenario, the retailer provides the supply and the fulfillment, while Google provides the interface and the intelligence. This changes the role of SEO from site optimization to data optimization, where the quality and depth of your product feed become your most important competitive advantages.

The Three Pillars of Google’s New Commerce Ecosystem

To understand how UCP changes the landscape, we must look at the three platform-level capabilities Google has introduced to make this agentic future a reality for everyday shoppers.

1. The Business Agent

The Business Agent is essentially an AI-powered brand representative that lives within Search and the Gemini app. Instead of a shopper reading a static FAQ page, they can engage in a dialogue with a brand’s agent. Shoppers can ask specific questions like, “Which of these jackets is best for a rainy trip to Seattle?” or “Does this vacuum come with a HEPA filter?” The agent uses the brand’s own data to provide authoritative, brand-level guidance. This keeps the conversation within the Google ecosystem while maintaining the brand’s voice and expertise.

2. Direct Offers

Promotion strategies are also being reimagined. Through Direct Offers, merchants can inject exclusive discounts and promotional pricing directly into Google’s AI Mode. These offers are not just banners or pop-ups; they are integrated into the recommendation engine itself. If an AI agent recommends a product to a user, it can simultaneously present a tailored offer that makes the purchase more attractive.
This means promotions now live inside the decision-making process rather than just on the product page.

3. Checkout in AI Mode

The final and perhaps most disruptive pillar is Checkout in AI Mode. By integrating payment networks directly into the search experience, Google allows users to complete a transaction instantly. This removes the friction of site redirects, account creations, and traditional checkout flows. For the consumer, it is a massive convenience. For the retailer, it means that the “site experience” is no longer the final hurdle to a sale—the AI selection is.

The Shift to Contextual and Intent-Based Recommendations

Traditional SEO was built on keywords. If a user searched for “scented candles,” you optimized your page for that phrase. However, consumer behavior is often much more nuanced. People don’t just shop for products; they shop for solutions to specific problems or to satisfy emotional needs.

Consider a shopper looking for a candle to mask pet odors without the scent being overbearing. In the old model, if your product data only included “Vanilla Scent” and “12oz Jar,” the search engine might never connect your product to that specific need. You would have to hope the user typed in your specific keywords or navigated through several site filters to find the “pet-friendly” section.

With UCP and Gemini, the interaction becomes conversational. A user might say, “I need something that kills pet odor but doesn’t smell like fake fruit.” Because the AI understands the attributes and use cases of products, it can map that natural language prompt to the right item in your catalog. It understands that a “citrus and charcoal” candle is better for this specific request than a “strawberry shortcake” candle, even if neither is explicitly labeled for “pets.”

This transition from keyword matching to reasoning-based selection is the hallmark of agentic commerce. The AI is looking for meaning, not just strings of text.
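The difference between the two models can be caricatured in a few lines of code. This is a toy illustration with invented product data and logic, not how Gemini actually ranks products:

```python
# A toy contrast (invented data, not Google's actual ranking logic) between
# keyword matching and reasoning over structured product attributes.
products = {
    "Citrus & Charcoal Candle": {"odor_neutralizing": True, "scent_family": "fresh"},
    "Strawberry Shortcake Candle": {"odor_neutralizing": False, "scent_family": "sweet/fruity"},
}

def keyword_match(query: str, name: str) -> bool:
    # Old model: a product surfaces only if the query shares literal words with it.
    return any(word in name.lower() for word in query.lower().split())

def attribute_match(need_odor_control: bool, avoid_fruity: bool) -> list:
    # Agentic model: the shopper's need is mapped onto structured attributes.
    return [
        name
        for name, attrs in products.items()
        if attrs["odor_neutralizing"] == need_odor_control
        and (not avoid_fruity or "fruity" not in attrs["scent_family"])
    ]

query = "something that kills pet odor but doesn't smell like fake fruit"
print([name for name in products if keyword_match(query, name)])   # []
print(attribute_match(need_odor_control=True, avoid_fruity=True))  # ['Citrus & Charcoal Candle']
```

The keyword model finds nothing because the query shares no words with either product name; the attribute model surfaces the right candle because the need was translated into structured properties.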
This makes high-quality, descriptive content more valuable than ever before.

How SEO Strategy Must Evolve for AI Selection

As the “storefront” moves into the AI layer, the playbook for SEO professionals must change. Optimization is no longer just about technical site health and backlink profiles; it is about “feed health” and “attribute density.”

Optimizing for the Recommendation Layer

To win in this new environment, you must ensure that Google’s AI fully understands what your product is, who it is for, and the specific scenarios


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most significant traffic drivers for modern publishers. Since roughly 2021, SEOs and digital marketers have watched as Discover surpassed traditional organic search in click volume for many news-heavy and lifestyle websites. It is a highly personalized feed that seems to understand users with an almost uncanny level of precision, surfacing content based on their browsing history, interests, and location.

Unlike traditional search, which relies on a user entering a specific query, Google Discover is proactive. It delivers content before the user even knows they want to read it. It lives in multiple high-visibility locations: the Google app, the “New Tab” page in Chrome for mobile, Android home screens, and even the mobile version of Google.com. Because Discover is so pervasive, optimizing for it is no longer optional for brands looking to maintain a dominant digital presence. However, securing a spot in this elusive feed requires more than just good writing; it requires a foundation of specific technical fixes and authority signals.

Essential considerations before we begin optimizing for Discover

Before diving into the technical nuances, it is important to understand the fundamental nature of Google Discover. It is not a “one size fits all” platform, and its algorithm behaves very differently from the standard search engine results pages (SERPs). Managing expectations and understanding the current landscape of the feed is the first step toward a successful strategy.

Discover favors timely content

Google Discover is primarily a discovery engine for what is happening “now.” Content that performs well is almost always highly time-relevant. This is why major news publishers often see the lion’s share of Discover traffic.
While evergreen content can occasionally surface if it aligns perfectly with a user’s sudden interest in a specific hobby or topic, the feed generally prioritizes breaking news, trending stories, and fresh perspectives on current events.

For many sites, this creates a unique traffic profile. It is common for a website to receive millions of clicks from Discover while seeing significantly less volume from traditional search. This is because Discover captures “passive” intent, whereas search captures “active” intent. If your site doesn’t focus on timely or trending topics, your Discover potential may be lower than that of a news-oriented competitor.

The shifting landscape: Discover traffic is declining

It is worth noting that many publishers have reported a decline in Discover traffic over the past year. This is largely due to structural changes Google has made to the feed. As of late 2025, Google has integrated a larger volume of social media posts and AI-generated summaries of major stories into the Discover experience. These elements often displace individual articles that previously dominated the feed. This shift reflects Google’s desire to keep users within its ecosystem while providing a broader range of perspectives.

Interestingly, as social media becomes a larger part of the Discover experience, Google has begun beta testing features in Search Console to track traffic specifically from social platforms. This suggests that the lines between “web search” and “social discovery” are blurring.

The balance of content and technicality

No amount of technical optimization can save poor content. Google Discover’s primary goal is to provide a high-quality, relevant experience for the user. If your content doesn’t resonate or if it uses “clickbaity” tactics that lead to high bounce rates, Google will quickly stop surfacing your articles. However, if you have great content that isn’t showing up, the problem is likely technical.
Technical fixes ensure that your content is “eligible” and “attractive” to the algorithm, allowing your editorial efforts to actually reach the audience they deserve.

Technical optimizations for Discover

When auditing a website for Discover performance, there are three primary pillars to address: the publisher profile, image optimization, and transparency signals. These elements form the technical backbone of how Google identifies and trusts your content enough to put it in front of millions of users.

Optimizing your Discover publisher profile

Your Discover publisher profile is the entity-level representation of your brand within Google’s ecosystem. It is essential to ensure that your website and all associated social profiles are correctly linked.

To do this, you first need to identify your publisher profile page. Since Google doesn’t provide a direct dashboard for this, SEOs often use third-party tools like Damian Tsuabaso’s profile builder. While the interface may be in Spanish, it is highly effective: you simply enter your brand name, URL, or entity ID to find your profile.

These profile pages are intrinsically linked to your entity’s Knowledge Graph ID (KGMID). This is a crucial detail because it means Discover isn’t just looking at your website; it’s looking at what Google “knows” about your brand as a whole. If your brand has recently undergone a rebranding or an acquisition, your publisher profile may be fragmented or unclear. This requires Knowledge Graph optimization to clarify your brand’s entity relationship to Google.

To strengthen your publisher profile, focus on the following:

Social Media Aggregation: Google is increasingly pulling social media posts directly into the Discover feed. To ensure your social accounts are linked to your publisher profile, you must use Organization schema with “sameAs” elements that list your official profiles.

Cross-Linking: Link to your social accounts clearly in your website’s footer.
Conversely, ensure your social media bios link back to your primary website. This creates a closed loop of authority that helps Google verify your brand’s legitimacy.

High-impact image optimization

If there is one technical “fix” that offers the highest return on investment for Discover, it is image optimization. Google’s own documentation explicitly states that large, high-quality images are essential for visibility. When an article appears in Discover, it appears as a card. If the image is small or low-resolution, the click-through rate (CTR) will plummet, and Google will eventually stop showing the content.

To optimize your images for the Discover feed, follow these specific technical requirements:

The max-image-preview:large tag: This is a non-negotiable meta tag. It tells Google that it has permission to display your


How to increase Google Discover traffic with technical fixes

Google Discover has transformed from a niche mobile feature into one of the most significant traffic drivers for modern publishers. First catching major attention around 2021, it began delivering millions of monthly clicks to sites that understood how to play by its unique rules. Unlike traditional search, where users actively type a query, Discover is a proactive, AI-driven feed that anticipates what a user wants to see based on their interests, browsing history, and location.

Today, Discover is integrated into almost every corner of the Google ecosystem. It appears on Chrome new tabs, within the dedicated Google app, on Android home screens, and even on the Google.com mobile homepage. For many publishers, Discover traffic now frequently exceeds traditional organic search traffic. However, capturing this audience requires more than just high-quality writing; it demands a specific set of technical optimizations and a deep understanding of how Google’s entity-based ranking works.

Essential considerations before optimizing for Discover

Before diving into technical fixes, it is crucial to understand that Google Discover operates differently than the standard search engine results page (SERP). It is not a viable traffic source for every brand or every type of content. Success in Discover is built on a foundation of timeliness, authority, and user relevance.

Discover favors timely, trending content

The content that thrives in the Discover feed is almost always time-sensitive. Google prioritizes breaking news, trending topics, and fresh perspectives from authoritative sources. While evergreen content occasionally surfaces, it is the exception rather than the rule. If your content strategy focuses solely on “how-to” guides from three years ago, you may find Discover a difficult nut to crack. Sites that see the highest engagement often function as news outlets or topical authorities that publish frequently on current events.
The changing landscape of Discover traffic

It is important to note that many publishers have seen a decline in Discover traffic recently. This is largely due to Google’s evolving strategy for the feed. As of the September 2025 updates, the Discover feed has begun to integrate a significant volume of social media posts and AI-generated summaries of major news stories. These AI summaries aggregate information from multiple sources, often displacing individual article links that used to dominate the space.

Google is also placing a heavier emphasis on social signals. In fact, Google has been beta testing the ability to track traffic to social platforms within Search Console. This suggests that a brand’s presence on platforms like X (formerly Twitter), LinkedIn, or YouTube now plays a direct role in how its web content is surfaced in Discover. If your social media strategy is lagging, your Discover performance likely is too.

The foundation of high-quality content

Technical SEO can provide the infrastructure for success, but it cannot save poor content. Google’s algorithms are increasingly sophisticated at determining user satisfaction. If users consistently swipe past your articles or click and immediately bounce, Google will stop showing your content. Relevance is the ultimate currency in Discover. Before exploring technical causes for a traffic drop, always review your recent content to see if it still aligns with the shifting interests of your target audience.

Technical optimizations for Google Discover

When auditing a website for Discover performance, there are three primary technical pillars to address: the publisher profile, image optimization, and transparency signals. These elements help Google understand who you are, what you publish, and how to display your content most effectively to users.

Mastering the Discover publisher profile

One of the most overlooked aspects of Discover SEO is the publisher profile.
This is the entity-level representation of your brand within Google’s Knowledge Graph. To maximize your visibility, you must ensure your website and social profiles are correctly linked in Google’s eyes.

To audit your status, you can use specialized tools like the one developed by Damian Tsuabaso. By inserting your brand name, URL, or entity ID, you can view your profile page as Google sees it. This is vital because Discover profile pages are intrinsically linked to your Knowledge Graph ID (KGMID). The URL string on these profile pages is often a tokenized version of your KGMID, serving as a digital fingerprint for your brand’s authority.

When reviewing your profile, ask two critical questions. First, does the profile accurately reflect your current brand? If you have recently rebranded, been acquired, or changed niches, Google may still be associating you with old entities. This requires Knowledge Graph optimization to clarify your brand’s identity. Second, are your social media accounts appearing on this profile? Because social posts now occupy significant real estate in the Discover feed, having your accounts linked is no longer optional.

To bridge the gap between your website and your social profiles, you should:

Ensure your Organization schema includes “sameAs” elements that list all official social media URLs.

Include clear links to your social profiles in the website footer.

Ensure your social media bios link back to your primary website domain.

Optimizing images for the 1,200px standard

Google’s own documentation makes it clear: large, high-quality images are the single most effective way to increase click-through rates (CTR) in Discover. When an article appears in the feed, it is represented by a card. A large, compelling image makes that card significantly more attractive to the user.

There are three technical requirements for images in Discover:

1.
The max-image-preview tag: Many Content Management Systems (CMS) do not include the “max-image-preview:large” robots meta tag by default. This tag tells Google that it has permission to use your high-resolution images as the article preview. Without it, Google may default to a small thumbnail, which drastically reduces engagement.

2. Image resolution and width: Your hero images must be at least 1,200 pixels wide. While the actual rendered size on a mobile screen might be smaller, providing a high-resolution source file ensures that Google can display a crisp, full-width preview on high-DPI devices. Ensure that your CMS is not automatically compressing or resizing these images to a smaller width before they reach the user.

3. Open Graph (OG)
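As a minimal sketch, requirements 1 and 2 above, together with the Organization “sameAs” markup discussed earlier, could sit together in a page’s head. All domain, image, and profile URLs below are placeholders:

```html
<head>
  <!-- 1. Allow Google to use large image previews in Discover -->
  <meta name="robots" content="max-image-preview:large">

  <!-- 2. Point crawlers at a hero image at least 1,200 px wide -->
  <meta property="og:image" content="https://www.example.com/images/hero-1200.jpg">

  <!-- Organization schema linking the site to its official social profiles -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Publisher",
    "url": "https://www.example.com",
    "sameAs": [
      "https://x.com/example",
      "https://www.linkedin.com/company/example",
      "https://www.youtube.com/@example"
    ]
  }
  </script>
</head>
```

The same social URLs listed in “sameAs” should match the profiles linked in the site footer, so Google receives a consistent entity signal from every source.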


Gen Z Preference For TikTok Over Google Drops 50%, Data Shows via @sejournal, @MattGSouthern

The Shifting Landscape of Search: Why Gen Z is Returning to Google

For the past several years, a dominant narrative has permeated the digital marketing industry: Google is losing its grip on the younger generation. Reports suggested that TikTok was becoming the search engine of choice for Gen Z, with visual-first discovery replacing the traditional text-based query. However, new data suggests that the “death of Google” may have been greatly exaggerated. Recent survey data reveals a significant shift in user behavior. While nearly half of all U.S. consumers (49%, to be exact) have utilized TikTok as a search tool, the overwhelming preference for TikTok over Google among Gen Z has seen a dramatic 50% decline. This reversal marks a critical moment for SEO professionals, brand managers, and content creators who have been pivoting their strategies entirely toward social media discovery. Understanding this shift requires a deep dive into how search intent is evolving. It is no longer a winner-take-all battle between social media and traditional search engines. Instead, we are entering an era of search fragmentation, where users choose their platform based on the specific type of information they need.

Deconstructing the Data: Usage vs. Preference

To understand why Gen Z’s preference for TikTok over Google has dropped so sharply, we must first distinguish between usage and preference. The fact that 49% of consumers use TikTok for search is an impressive feat for a platform that began as a short-form video app. It confirms that “Social SEO” is a legitimate and necessary discipline. However, usage does not always equate to loyalty or primary reliance. The data indicates that while Gen Z continues to use TikTok to discover trends, recipes, and fashion, they are returning to Google for more “high-stakes” or complex information.
The 50% drop in preference suggests that the initial novelty of TikTok search is wearing off, or perhaps more accurately, that the limitations of social media search are becoming more apparent to the users themselves. For marketers, this means the strategy cannot be “either-or.” It must be “both.” The decline in preference for TikTok does not mean Gen Z is abandoning the platform; rather, it suggests they are rediscovering the utility and reliability of a traditional search engine for specific tasks.

Why TikTok Initially Gained Ground

TikTok’s rise as a search engine was driven by a desire for authenticity and visual proof. Gen Z, often characterized by their skepticism of polished corporate advertising, turned to TikTok to see “real people” using products or visiting locations. When a user searches for a restaurant on Google, they get a list of links, a map, and perhaps some static photos. When they search on TikTok, they get a first-person perspective of the atmosphere, the food being plated, and an honest review of the service. This “experiential search” was something Google struggled to replicate for years. Furthermore, TikTok’s algorithm is exceptionally good at understanding user intent through engagement. The “For You Page” (FYP) functions as a passive search engine, serving content that users didn’t even know they wanted. This led to a natural transition where users began using the search bar to find more of what the algorithm was already showing them.

The Limitations of Social Search

Despite its strengths, TikTok search has inherent flaws that have likely contributed to the 50% drop in preference. The primary issue is the spread of misinformation. Because TikTok prioritizes engagement and virality, content that is sensationalized or factually incorrect can often rise to the top of search results.
For informational queries, such as medical advice, financial planning, or technical troubleshooting, Gen Z is realizing that a 60-second video from an unverified creator may not be the most reliable source. Google’s emphasis on E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) provides a level of security that social platforms currently cannot match. Another factor is the “noise” of social media. On TikTok, search results are often cluttered with trending sounds, memes, and tangentially related content. When a user needs a quick, direct answer (such as “what time does the pharmacy close?” or “how to fix a leaking pipe”), the speed and precision of Google’s snippets and local business listings remain superior.

Google’s Response: Adapting to a Visual World

Google did not sit idly by while TikTok encroached on its territory. The search giant has undergone significant transformations to appeal to younger audiences who crave visual and interactive content. One of the most notable changes is the integration of “Perspectives” and the increased visibility of Reddit and Quora threads within the Search Engine Results Pages (SERPs). By surfacing first-hand human experiences from forums, Google is effectively competing with the “authenticity” that users previously sought on TikTok. Additionally, Google has improved its ability to index and display short-form video. It is now common to see a “Short videos” carousel in Google mobile search results, featuring content from YouTube Shorts, Instagram Reels, and even TikTok. By bringing social content directly into the SERP, Google has removed the need for users to leave the platform to find the visual answers they want. The introduction of AI Overviews (formerly SGE) also plays a role. By providing synthesized, conversational answers at the top of the page, Google is meeting the demand for instant gratification that social media popularized, while still backing those answers with reputable web sources.
The Role of AI in Reclaiming Search Dominance

Artificial Intelligence is the new frontier in the battle for search supremacy. As Google integrates Gemini into its search ecosystem, the way users interact with information is changing. For Gen Z, who are early adopters of AI tools, the ability to ask complex, multi-layered questions and receive a comprehensive summary is a major draw. TikTok is also experimenting with AI search features, but Google has a decades-long head start in data crawling and information organization. The ability to link an AI-generated answer to a wide array of reliable sources gives Google a competitive edge in “verification search,” a behavior where users


See how leaders bridge the engagement divide by attending ‘Engage with SAP Online’ by SAP Engagement Cloud

In the rapidly evolving landscape of digital marketing and customer experience, a fundamental question is keeping CMOs and digital strategists awake at night: How healthy are your customer relationships? It is a question that goes far beyond open rates, click-through percentages, or quarterly sales targets. It strikes at the heart of brand longevity and consumer trust. For most organizations, the answer is more complex than it appears. Over the last two decades, the corporate world has meticulously built its infrastructure around channels. We have email teams, social media departments, e-commerce specialists, and in-store coordinators. Each of these silos typically operates with its own technology stack, its own key performance indicators (KPIs), and its own definition of what a “win” looks like. From an internal perspective, these departments often look successful. But from the customer’s perspective, the experience is often fragmented, repetitive, and frustrating. To address this growing disconnect, SAP Engagement Cloud is hosting a landmark virtual event, “Engage with SAP Online,” on March 11, 2026. This half-day digital summit is designed specifically for leaders who are ready to stop optimizing channels in isolation and start building a cohesive, relationship-driven strategy that actually resonates with modern consumers.

Understanding the Engagement Divide

The central theme of the upcoming event is the “Engagement Divide.” This term describes the widening gap between what customers expect from a brand and what those brands are actually capable of delivering. While many companies believe they are providing a seamless experience, the data suggests a different reality. According to the SAP Engagement Index (a comprehensive global study of 10,000 consumers and 4,800 senior decision-makers), 75% of consumers report being significantly put off by disorganized brands.
These are brands that pass customers between multiple departments to resolve a single issue or fail to remember a customer’s previous interactions. Conversely, 77% of brands claim their engagement strategies already deliver seamless experiences. This discrepancy highlights a massive blind spot in the modern enterprise. The Engage with SAP Online event aims to bridge this divide by providing leaders with the tools, data, and insights necessary to align their internal processes with external customer expectations. It is no longer enough to be “omnichannel” in name only; brands must be “relationship-centric” in practice.

Key Speakers and Industry Insights

The event features a high-caliber lineup of speakers who bring both academic rigor and real-world experience to the table. The day opens with Sara Richter, the CMO of SAP Engagement Cloud. Richter will unveil the full findings of the SAP Engagement Index, providing a data-driven foundation for the sessions that follow. The keynote address will be delivered by Mark Ritson, a professor, founder of MiniMBA, and one of the most respected voices in the marketing world. Known for his “no-nonsense” approach, Ritson’s session, “Trends Shaping Customer Experience: What’s Real, What’s Not, and What Matters Most Now,” will cut through the industry hype. Ritson is expected to challenge the traditional view that loyalty lives solely within the marketing department, arguing instead that every touchpoint, from sales to service, shapes the brand’s health. Following the keynote, attendees will hear from practitioners who are implementing these shifts in large-scale organizations:

Jutta Richter (Head of 1:1 Campaign Management, BMW Group): Richter will discuss the complexities of modern customer journeys and how a premium brand like BMW ensures relevance in an era where customers often make up their minds before even speaking to a representative.
Daniele Tedesco (Ecommerce Global Process Owner, Essity) and Venky Naravulu (Director of Partner Solutions, Sinch): This duo will join Mark Ritson to discuss the practicalities of modernizing engagement through AI and connected systems. They will provide a deep dive into how technology can facilitate human-like continuity at scale.

The Urgency of a Relationship-First Model

Why is this conversation happening now? Several macroeconomic and technological factors have converged to make customer engagement more difficult, and more critical, than ever before. First, the cost of customer acquisition (CAC) has skyrocketed across nearly every industry. When winning a new customer costs five to ten times more than retaining an existing one, the stakes for every interaction become incredibly high. Second, the erosion of third-party tracking and the shift toward privacy-first browsing mean that brands can no longer rely on external data to “find” their customers. They must build their own first-party data ecosystems, which are only possible if customers feel a sense of value and trust in their relationship with the brand. Finally, the rise of AI has empowered consumers. With AI tools at their fingertips, shoppers can compare prices, read reviews, and find alternatives in seconds. If a brand fails to recognize a customer or provides a disjointed experience, that customer has the tools and the motivation to switch to a competitor instantly.

The Mismatch in Channel Strategy

One of the most revealing aspects of the SAP Engagement Index is the mismatch between where customers are and where brands are spending their energy. The research indicates that:

- 41% of consumers prefer to shop and interact via mobile apps, but only 28% of brands have a significant engagement presence there.
- 43% of consumers prefer online and e-commerce shopping, yet only 26% of brands successfully engage via these web channels in a personalized way.
This “channel mismatch” is a symptom of a larger problem: legacy thinking. Many organizations are still catching up to the digital shifts of the last decade, while their customers have already moved on to the next phase of engagement. The “Engage with SAP Online” event will explore how to reallocate resources to meet customers where they actually spend their time.

Moving Toward Maturity: The 21% Goal

SAP’s research assessed how well organizations align their people, processes, and technology around customer engagement. The results were sobering. Only 21% of organizations scored at a “high maturity” level. These are the leaders who have unified their data, broken down departmental silos, and can deliver a consistent experience across every touchpoint. The vast majority of businesses (63%) sit in the middle. These “middle-tier” companies can handle basic personalization, such as putting a customer’s name in an email,


Anthropic’s Claude Bots Make Robots.txt Decisions More Granular via @sejournal, @MattGSouthern

The landscape of web crawling and data indexing is undergoing a monumental shift as artificial intelligence companies seek more efficient ways to interact with the open web. Anthropic, the developer behind the Claude AI family of models, recently updated its crawler documentation to introduce a more nuanced approach to how its bots interact with website content. By moving away from a monolithic crawling system and toward a more granular set of user-agents, Anthropic is providing webmasters and SEO professionals with unprecedented control over how their data is consumed by AI. This update is particularly significant in an era where the tension between content creators and AI developers is at an all-time high. Publishers are increasingly concerned about how their proprietary information is used to train large language models (LLMs) without compensation or attribution. Conversely, AI developers need access to high-quality, up-to-date information to remain competitive. Anthropic’s new granular bot system aims to strike a balance, offering transparency and choice through the standard robots.txt protocol.

Understanding the Shift to Granular Bot Control

Traditionally, web crawlers were relatively straightforward to manage. Googlebot, Bingbot, and a few others dominated the landscape, and their purpose was clear: index content for search engine results. However, the rise of generative AI has complicated this dynamic. AI companies now crawl the web for multiple reasons, ranging from long-term model training to real-time information retrieval on behalf of a specific user. Anthropic has addressed this complexity by categorizing its crawlers into three distinct entities. Each bot serves a specific purpose, and by separating them, Anthropic allows site owners to decide whether they want their content used for training, for real-time user requests, or for search-style indexing.
This move represents a major step forward in technical transparency and digital rights management for publishers.

The Three Faces of Claude: Identifying the New Bots

The update to Anthropic’s documentation outlines three primary user-agents that webmasters should be aware of. Understanding the difference between these is essential for any SEO strategy that seeks to protect intellectual property while maintaining digital visibility.

1. Anthropic-ai (The Training Bot)

The “anthropic-ai” crawler is designed specifically for data collection that will be used to train future versions of the Claude model. When this bot visits a site, it is gathering information to expand the model’s foundational knowledge. For publishers, this is often the most controversial bot, as it involves the consumption of content that might later be synthesized by the AI without driving direct traffic back to the source.

2. Claude-user (The Real-Time Request Bot)

The “claude-user” agent functions differently. This bot is triggered when a person using the Claude interface specifically asks the AI to visit a URL, summarize a page, or analyze specific live web data. This is an “on-demand” crawler. If a site owner blocks this bot, they are essentially telling Claude users that they cannot interact with that site’s content through the AI interface. This has significant implications for user experience and how information is shared in AI-driven workflows.

3. Claude-web-search (The Indexing Bot)

While still being refined in its utility, the “claude-web-search” bot appears to be aimed at more traditional indexing tasks that support Claude’s ability to “search” the web for answers. This suggests a move toward a more integrated search-AI experience, similar to what we see with Perplexity AI or SearchGPT. By allowing this bot while blocking others, a publisher might permit their site to be found in AI search results while still opting out of having their data used for general model training.
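As a rough sketch, the three user-agents above could be told apart when analyzing server access logs. The token strings below follow the article’s naming (“anthropic-ai”, “claude-user”, “claude-web-search”) and should be verified against Anthropic’s current crawler documentation before being relied on:

```python
# Map the user-agent tokens described above to their crawl purpose.
# NOTE: token strings are taken from the article's naming and may differ
# from Anthropic's current documentation -- verify before relying on them.
ANTHROPIC_BOTS = {
    "anthropic-ai": "training",       # collects data to train future models
    "claude-user": "on-demand",       # a user asked Claude to fetch a URL
    "claude-web-search": "indexing",  # supports Claude's web-search answers
}

def classify_crawler(user_agent: str) -> str:
    """Return the crawl purpose for a known Anthropic bot, else 'other'."""
    ua = user_agent.lower()
    for token, purpose in ANTHROPIC_BOTS.items():
        if token in ua:
            return purpose
    return "other"
```

A site owner could tally these purposes across their logs to see how much server load comes from training crawls versus on-demand fetches, which feeds directly into the blocking decisions discussed below.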
The Technical Implementation: Using Robots.txt

For SEOs and developers, managing these bots is handled through the standard robots.txt file. This file acts as a gatekeeper, telling automated systems which parts of a site are off-limits. Anthropic’s decision to respect these directives is a sign of good faith in the broader ecosystem. To block the training bot specifically, a webmaster would add the following to their robots.txt file:

User-agent: anthropic-ai
Disallow: /

However, if they want users to still be able to bring Claude into the conversation to summarize their articles, they would need to ensure that the “claude-user” agent remains unblocked. A “Disallow: /” rule scoped to the “anthropic-ai” agent does not block “claude-user”; that agent is only blocked if it is named explicitly or if the site uses a generic wildcard rule that affects all bots.

The Strategic Trade-offs: Visibility vs. Protection

The introduction of granular controls presents a strategic dilemma for digital publishers. It is no longer a simple “yes” or “no” decision regarding AI crawling. Instead, it is about weighing the trade-offs of visibility versus data protection. If you block all Anthropic bots, you are effectively taking your site off the map for one of the world’s most popular AI platforms. This means your brand, your data, and your perspectives will not be represented in the answers Claude provides to millions of users. For news organizations, this might mean a loss of influence. For niche technical blogs, it could mean that the AI will provide outdated or incorrect information about their area of expertise because it lacks access to the primary source. On the other hand, allowing the “anthropic-ai” training bot means your content is being used to build a product that may eventually compete with you for user attention. This is the “cannibalization” fear that keeps many publishers awake at night.
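For illustration, a policy that opts out of training while keeping Claude’s user-facing features working could be spelled out explicitly in robots.txt. The agent tokens follow the article’s naming and should be checked against Anthropic’s current documentation:

```text
# Opt out of model-training collection
User-agent: anthropic-ai
Disallow: /

# Keep on-demand fetches by Claude users working
User-agent: claude-user
Allow: /

# Keep search-style indexing enabled
User-agent: claude-web-search
Allow: /
```

The explicit Allow rules are not strictly required (an agent with no matching group is unrestricted by default), but stating them makes the policy self-documenting and protects against an overly broad wildcard block elsewhere in the file.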
By providing granular options, Anthropic is allowing sites to opt in to the utility (claude-user) while opting out of the data harvesting (anthropic-ai).

Impact on SEO and Crawl Budgets

From a technical SEO perspective, the proliferation of AI bots adds another layer to crawl budget management. Every time a bot visits your site, it consumes server resources. For large-scale enterprise sites with millions of pages, managing how many different bots are hitting the server simultaneously is a legitimate concern. By categorizing their bots, Anthropic allows SEOs to prioritize which crawlers deserve those server resources. If a site’s primary goal is to provide real-time utility to users, they might prioritize the “claude-user” bot and set crawl


The 10 Best PPC Ad Networks via @sejournal, @LisaRocksSEM

The digital advertising landscape has undergone a seismic shift as we move further into 2026. For performance marketers, the “set it and forget it” mentality of a decade ago is a relic of the past. Today, Pay-Per-Click (PPC) advertising is a sophisticated dance between human strategy and generative AI. With the phasing out of traditional third-party cookies and the rise of privacy-first tracking, the networks you choose to invest in determine not just your reach, but your brand’s ultimate survival in a competitive market. To maximize your Return on Ad Spend (ROAS), you need to look beyond mere clicks. You need to evaluate platforms based on their targeting depth, the maturity of their automation tools, and their ability to reach users at various stages of the buyer’s journey. Whether you are scaling a niche e-commerce store, promoting a high-growth tech startup, or launching the next big gaming title, these are the 10 best PPC ad networks to prioritize in 2026.

1. Google Ads: The Unrivaled King of Search

Google Ads remains the cornerstone of any comprehensive PPC strategy. By 2026, the platform has fully transitioned into an AI-first ecosystem. While Search remains the crown jewel, Google’s reach extends across YouTube, Gmail, Maps, and millions of partner websites via the Display Network. The biggest evolution in Google Ads is the refinement of Performance Max (PMax) campaigns. PMax now leverages Google’s Gemini AI to dynamically generate ad copy, images, and even short-form video content based on your landing page. For marketers, this means less time spent on manual A/B testing and more time focusing on high-level strategy and creative direction. Google’s Search Generative Experience (SGE) has also fundamentally changed how ads appear. Ads are now integrated directly into AI-generated answers, providing a more conversational and contextual experience for the user. If your goal is high-intent traffic, Google Ads is non-negotiable.

2.
Meta Ads: The Power of Social Influence

Meta (Facebook and Instagram) continues to dominate the social PPC space through sheer scale and the sophistication of its “Advantage+” suite. In 2026, Meta’s algorithm has become so efficient at predicting user behavior that manual audience targeting is often less effective than letting the AI find your customers for you. For tech and gaming brands, Instagram remains a visual powerhouse. The integration of augmented reality (AR) ads allows users to “try on” products or experience a game environment before clicking through. Furthermore, Meta’s Conversions API (CAPI) has become the gold standard for navigating the cookieless world, allowing advertisers to send data directly from their servers to Meta to maintain attribution accuracy.

3. Amazon Advertising: The Retail Media Giant

If you are selling a physical product, Amazon Advertising is no longer optional; it is essential. Amazon has surpassed many traditional networks to become the third-largest digital ad platform. Its greatest strength lies in its first-party data; Amazon knows exactly what people buy and when they buy it. In 2026, Amazon has expanded its PPC offerings beyond simple Sponsored Products. The Amazon Demand-Side Platform (DSP) now allows advertisers to reach audiences across the web and on high-value properties like Twitch and Prime Video. For gaming companies, the synergy between Amazon and Twitch offers a unique opportunity to reach a highly engaged, tech-savvy audience through interactive video ads and sponsored streams.

4. Microsoft Advertising: The B2B and AI Powerhouse

Often overshadowed by Google, Microsoft Advertising (formerly Bing Ads) has seen a massive resurgence. This growth is driven largely by the integration of AI-powered search via Copilot. Users are increasingly turning to Bing for complex queries, and the ad placements within these AI chats offer exceptionally high engagement rates.
Microsoft Advertising also holds a unique advantage for B2B marketers: LinkedIn profile targeting. Because Microsoft owns LinkedIn, you can target users on the Bing search results page based on their job title, company name, or industry. With a typically lower Cost-Per-Click (CPC) than Google, Microsoft Advertising is a high-value network for those looking to stretch their budget further.

5. LinkedIn Ads: Precision Targeting for Professionals

When it comes to B2B lead generation, LinkedIn Ads remains the undisputed leader. No other platform offers the same level of granular targeting for professional demographics. In 2026, LinkedIn has doubled down on “Thought Leader Ads,” allowing companies to sponsor posts from their executives’ personal profiles to build trust and authenticity. While LinkedIn has a reputation for being more expensive, the quality of the leads often justifies the premium. The platform’s “Lead Gen Forms” have become increasingly frictionless, often pre-filling user data to ensure high conversion rates. For tech companies offering SaaS solutions or gaming studios looking for industry partners, LinkedIn is a critical component of the marketing mix.

6. TikTok Ads: The Home of Viral Growth

TikTok has evolved from a simple video-sharing app into a full-fledged search and discovery engine. For the younger demographic of Gen Z and Gen Alpha, TikTok is often the first place they go to search for products, tutorials, or game reviews. TikTok Ads are unique because they require a “native” feel. The phrase “Don’t make ads, make TikToks” is more relevant in 2026 than ever before. The platform’s Spark Ads allow brands to boost organic content, leveraging the power of creators to build social proof. For gaming brands, TikTok is a goldmine for user-generated content (UGC) campaigns and viral challenges that can drive millions of installs overnight.

7.
X (Formerly Twitter) Ads: Real-Time Engagement

Despite a turbulent few years, X remains the go-to platform for real-time news, tech discussions, and the gaming community. Its PPC network is particularly effective for event-based marketing, product launches, and high-velocity conversations. X’s ad platform has pivoted toward more performance-based models in 2026, offering better tracking for app installs and website conversions. For tech news sites and gaming developers, the ability to trend via promoted hashtags or target specific “communities” makes X a powerful tool for maintaining brand relevance in fast-moving industries.

8. Pinterest Ads: The Discovery Engine

Pinterest is often undervalued by PPC marketers, but it occupies a unique space in the funnel: the


New Platforms Won’t Save Social Media: Here’s What’s Actually Shifting via @sejournal, @rio_seo

The Illusion of the New Platform

For the past several years, the digital world has been caught in a cycle of migration. Whenever a major social media platform faces a crisis of leadership, a shift in policy, or a perceived decline in “vibes,” a mass exodus begins. We saw it with the rise of Mastodon, the rapid surge of Threads, and the niche appeal of Bluesky. Each time, the narrative is the same: this new platform will be the one to save our digital social lives. It will be the one to restore the “old internet” or provide a safer, more curated space for discourse. However, the reality is far more complex. The fundamental challenges facing social media (fragmentation, algorithmic fatigue, and the erosion of trust) cannot be solved by simply changing the user interface or moving to a different server. New platforms are merely different containers for the same evolving behaviors. The real transformation isn’t happening in the “where” of social media, but in the “how” and “why.” We are witnessing a monumental shift away from the platform-centric model toward a world defined by machine interpretation, behavioral signals, and critical decision-making moments.

The Death of the Social Graph and the Rise of the Interest Graph

To understand what is actually shifting, we must first look at the decline of the traditional social graph. In the early days of Facebook and Twitter, your experience was defined by who you followed. If you followed your friends, family, and a few celebrities, your feed was a chronological or semi-algorithmic reflection of those connections. This was the “social” in social media. Today, that model is largely obsolete. Led by the success of TikTok, the industry has pivoted toward the “interest graph.” In this new paradigm, the algorithm doesn’t care who you are friends with; it cares about what you are watching, how long you are watching it, and what you do immediately afterward.
Machine interpretation has replaced human connection as the primary architect of the user experience. This shift means that “new platforms” are often just trying to replicate a better version of this machine-led curation. But the machine is only as good as the data it processes. When we move from one platform to another, we are often just feeding the same behavioral data into a different black box. The underlying mechanism (the prioritization of engagement over connection) remains the same.

Machine Interpretation: The New Gatekeeper

One of the most significant shifts in the digital landscape is the move toward advanced machine interpretation of content. In the past, algorithms relied heavily on metadata: tags, keywords, and captions. Today, AI models can “see” and “hear” content with a level of nuance that was previously impossible. They can detect sentiment, identify objects in the background of a video, and understand the cultural context of a meme without a single line of descriptive text. This has profound implications for brands and creators. It means that the old tricks of SEO and “hacking the algorithm” are becoming less effective. You cannot simply optimize for a keyword if the machine interpretation of your video suggests that the content is low-quality or irrelevant to the user’s current mood. The algorithm is no longer just a sorter; it is an interpreter of intent. For marketers, this requires a total rethink of content strategy. It’s no longer about hitting a certain frequency of posts or using the right hashtags. It’s about creating content that provides a clear, interpretable signal to the machine that your content matches a specific user behavior or need. This leads us directly into the next major shift: the rise of decision-making moments.

Social Media as a Decision-Making Engine

We are moving past the era where social media was primarily for “killing time.” Increasingly, social platforms are functioning as search engines and decision-making tools.
Whether it’s a Gen Z user searching for a restaurant on TikTok instead of Google Maps, or a professional looking for B2B software recommendations on LinkedIn, the intent behind social media usage is shifting toward utility. These decision-making moments are where the real value lies for the future of the web. Users are looking for trust and authority in an environment that is increasingly saturated with AI-generated noise. When a user reaches a decision-making moment, they aren’t looking for a “platform”; they are looking for a signal they can trust. This might be a recommendation from a creator they’ve followed for years, or a highly relevant video that demonstrates a product in a real-world setting. The platforms that “win” in this new era won’t necessarily be the ones with the most users, but the ones that successfully facilitate these moments of intent. This is why we see platforms like Instagram and Pinterest leaning so heavily into shopping features. They are trying to close the gap between discovery and action.

The Role of Trust in a Post-Truth Social Landscape

As machine interpretation becomes more sophisticated, the value of human trust skyrockets. We are entering an era of “synthetic abundance,” where AI can generate endless streams of content, images, and even personas. In this environment, the “social” aspect of social media is being redefined as a search for authenticity. Users are becoming hyper-aware of polished, corporate messaging. They are gravitating toward “unfiltered” content and community-driven spaces like Discord or niche Reddit subreddits. This is the “trust shift.” If a new platform wants to “save” social media, it cannot do so with better code alone; it must foster an environment where trust can actually be built and maintained. For brands, this means that the “influencer” model is evolving. It’s no longer enough to have a large following. Influence is being replaced by authority. Can you prove that you know what you’re talking about?
Can you provide value that the machine cannot replicate? Trust is the only currency that isn’t being devalued by the rise of AI.

Why New Platforms Fail to Solve the Core Issues

Every time a new platform launches, it experiences a “honeymoon phase.” The early adopters
