Author name: aftabkhannewemail@gmail.com

Uncategorized

Bing is testing a much larger sponsored product carousel in shopping results

The Evolution of Microsoft Bing’s Shopping Interface

In the competitive world of search engines, real estate is the most valuable currency. For years, Google has dominated the visual landscape of e-commerce search results, but Microsoft Bing is signaling a major shift in its strategy. Recently, Bing has been spotted testing a significantly expanded sponsored products section within its shopping search results. This new layout, which features a prominent double-row carousel, marks a departure from the more conservative, single-row formats users have grown accustomed to.

For digital marketers and e-commerce retailers, this change is more than just a cosmetic update. It represents a fundamental shift in how Microsoft intends to monetize its search traffic and how it plans to challenge Google’s long-standing supremacy in the shopping vertical. By doubling the number of products visible at the top of the Search Engine Results Page (SERP), Bing is effectively prioritizing paid visibility over organic discovery in a way that mirrors the most aggressive layouts seen in modern search technology.

Understanding the New Double-Row Carousel Format

The experimental layout was first brought to light by digital marketer Sachin Patel, who observed the change while performing a search for “cushions” on Bing. Instead of the standard horizontal scroll that displays a handful of products in a single line, the new test presents a grid-like double-row carousel. This format allows for a much higher density of product listings to appear “above the fold,” which is the area of the screen visible without scrolling. This expanded sponsored section pairs large, high-resolution product images with essential metadata such as price, retailer name, and star ratings. Beneath this massive block of advertisements, organic cards from individual websites are still present, but they are pushed further down the page.
The result is a search experience that feels more like an online marketplace—similar to Amazon or Wayfair—than a traditional list of web links. The visual impact of this change cannot be overstated. In a mobile-first world where screen space is limited, or even on large desktop monitors, a double-row carousel captures the user’s immediate attention. It limits the likelihood that a user will scroll past the sponsored content to find organic results, thereby increasing the probability of a click on a paid advertisement.

The Discovery and the Nature of Search Engine Testing

As with many major updates in the search world, this feature was not launched globally with a press release. Instead, it was discovered through live environment testing. Sachin Patel shared his findings on social media, showcasing how the “cushion” search query triggered this massive ad block. However, the search community quickly realized that this was not a universal rollout. Mordy Oberstein, a well-known figure in the search industry, attempted to replicate the results and found a much more compact, traditional layout.

This discrepancy is a classic example of “A/B testing” or “bucket testing.” Search engines like Bing and Google constantly run experiments on a small percentage of their user base to gather data on user behavior. They measure metrics such as click-through rates (CTR), bounce rates, and total revenue generated per search session. Only if the data shows a clear benefit—usually in the form of increased ad revenue without a significant drop in user satisfaction—will the feature be rolled out to the general public.

Why Microsoft is Expanding Its Ad Footprint

Microsoft’s push for larger ad formats is a calculated move driven by several market factors. Over the last two years, Microsoft has successfully integrated AI-driven features like Copilot into Bing, which has led to a modest but notable increase in market share.
With more eyes on the platform, the company is looking for ways to maximize the value of each user session. Retail and shopping queries are among the most lucrative in the advertising world. When a user searches for a product like “cushions,” “gaming laptops,” or “running shoes,” they are signaling high commercial intent. They are not looking for information; they are looking to buy. By expanding the sponsored product carousel, Microsoft is providing advertisers with more opportunities to get their products in front of these ready-to-buy consumers.

Competitive Parity with Google Shopping

For a long time, Microsoft Ads (formerly Bing Ads) was seen as a secondary platform—a place to capture the “leftover” traffic that Google didn’t reach. However, as Google’s SERP has become increasingly cluttered with “People Also Ask” boxes, AI Overviews, and massive shopping grids, Bing has had to evolve to stay competitive. A double-row carousel brings Bing closer to the visual density that Google offers, making it a more attractive platform for retailers who want their product imagery to do the heavy lifting in the sales process.

The Impact on E-Commerce Advertisers

If this format becomes a permanent fixture of the Bing search experience, the implications for retailers running Microsoft Shopping campaigns are significant. There are several key areas where advertisers will feel the impact:

Increased Visibility and Impressions

The most immediate benefit is a surge in impressions. In a single-row carousel, products positioned at the end of the list are rarely seen. In a double-row format, twice as many products are visible at once. This means that mid-tier bidders who might have previously been buried in the third or fourth scroll position could now find themselves visible on the initial page load. For retailers with high-quality product photography, this is a golden opportunity to steal market share from larger competitors.
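Whether an experimental layout like this graduates from a bucket test comes down to straightforward comparisons of metrics such as CTR between user groups. A minimal Python sketch of that comparison, using entirely hypothetical traffic numbers (not real Bing data):

```python
# Sketch: comparing click-through rate (CTR) between two layout buckets,
# as search engines do when bucket-testing a new SERP format.
# All numbers below are hypothetical illustrations.
from math import sqrt

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate: fraction of impressions that produced a click."""
    return clicks / impressions if impressions else 0.0

def two_proportion_z(clicks_a: int, imps_a: int, clicks_b: int, imps_b: int) -> float:
    """Normal-approximation z-score for the difference between two CTRs."""
    p_a, p_b = ctr(clicks_a, imps_a), ctr(clicks_b, imps_b)
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se if se else 0.0

# Hypothetical buckets: control = single-row carousel, variant = double-row.
control_clicks, control_imps = 1_800, 100_000   # 1.8% CTR
variant_clicks, variant_imps = 2_100, 100_000   # 2.1% CTR

uplift = ctr(variant_clicks, variant_imps) / ctr(control_clicks, control_imps) - 1
z = two_proportion_z(variant_clicks, variant_imps, control_clicks, control_imps)
print(f"relative CTR uplift: {uplift:.1%}, z = {z:.2f}")
```

With a z-score above the conventional 1.96 threshold, a platform would read this as a statistically significant CTR gain; in practice, revenue per session and user-satisfaction signals would be weighed alongside it before any rollout decision.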
Potential Changes in Click-Through Rates (CTR)

While impressions will likely go up, the impact on CTR is more complex. With more options on the screen, users may take longer to decide which product to click. This could lead to a more “window-shopping” style of behavior. However, because the ads take up so much space, the “leakage” to organic results is likely to decrease. Retailers will need to ensure their pricing and promotional offers (such as “Free Shipping” or “20% Off”) are clearly visible within the ad unit to win the click in a more crowded field.

Strategic Bidding Adjustments

A larger carousel

SEO leads martech replacements, but not for the reason you think

The marketing technology landscape is undergoing a tectonic shift. For years, the industry has focused on marketing automation platforms (MAPs) and Customer Relationship Management (CRM) systems as the primary drivers of stack evolution. However, 2025 marks a significant turning point. According to the 2025 MarTech Replacement Survey, SEO tools have officially become the most frequently replaced application in the marketing stack.

At a time when search is being fundamentally redefined by Large Language Models (LLMs), AI-generated answers, and the proliferation of zero-click search experiences, one might assume that the high replacement rate of SEO tools indicates a sector in distress. On the surface, it looks like churn—a sign of dissatisfaction or a desperate search for answers in a volatile market. Yet, the data suggests something entirely different. While SEO tools are being swapped out more than any other category, the underlying reason isn’t instability. Instead, we are witnessing a massive “upgrade cycle” driven by the need for advanced AI capabilities, smarter cost management, and a new generation of technical requirements that traditional SEO platforms simply weren’t built to handle.

The SEO Paradox: Most Replaced but Highly Stable

One of the most striking findings of the 2025 MarTech Replacement Survey is that SEO tools topped the replacement list for the first time, ending a five-year streak where marketing automation platforms held that position. But the headline only tells half the story. The paradox of the 2025 data is that while SEO tools are the most replaced category, they are actually being replaced at a slower rate than in previous years. To understand this, we have to look at the maturation of the SEO industry.
In the past, high churn in software categories often signaled that the tools weren’t delivering on their promises or that the users didn’t know how to utilize them effectively. In 2025, the narrative has changed. The decrease in the replacement rate suggests that the category is stabilizing. Marketers aren’t abandoning their SEO tools because they’ve given up on organic search; they are refining their stacks to meet the specific challenges of the AI era.

While SEO tools saw this nuanced stabilization, other major categories experienced much sharper declines in replacement activity. For instance, CRM replacements dropped by more than 12% compared to 2024, hitting the lowest level in the history of the survey. Marketing automation platforms and Content Management Systems (CMS) also saw fewer swaps. This indicates that while the “core” of the marketing stack (CRM and MAP) has become entrenched, the “growth” layer of the stack—specifically SEO—is where the most active innovation and re-evaluation are happening.

The AI Catalyst: Upgrading to AI-Native Capabilities

If the high replacement rate of SEO tools isn’t a sign of industry failure, what is driving it? The primary engine is the rapid integration of artificial intelligence. For the first time, the 2025 survey specifically investigated how AI impacts the decision to swap one tool for another, and the results were definitive. The survey found that 37.1% of marketers cited AI capabilities as a crucial factor in their replacement decisions, while 33.9% explicitly stated they were looking for better AI features when selecting a new tool.

In the world of SEO, this shift is more than just a trend; it is a necessity for survival. Modern SEO is no longer just about tracking a list of 500 keywords and ensuring meta tags are the correct length. The rise of AI-driven Search Generative Experiences (SGE) means that the very nature of a “Search Engine Results Page” (SERP) has changed.
To compete, marketing teams are moving away from legacy platforms and toward tools that offer:

Advanced Content Optimization: Moving beyond simple keyword density toward semantic relevance and intent modeling powered by LLMs.

SERP Analysis: Tools that can analyze AI Overviews and predict how AI-driven search engines will summarize a brand’s content.

Workflow Automation: The ability to automate the technical drudgery of SEO—such as internal linking, schema markup, and image alt-text—allowing human experts to focus on strategy.

In many cases, replacing an SEO tool in 2025 is an act of modernization. Marketers are opting for AI-native platforms that can help them navigate a world where a “search” might result in a synthesized answer rather than a list of links.

Economic Pressures and the ROI of the Tech Stack

While AI is the “shiny” reason for tech stack changes, the second major driver is more grounded in reality: cost. The economic climate of 2025 has forced a rigorous re-evaluation of every line item in the marketing budget. The survey data shows a massive spike in cost-consciousness. In 2025, 43.8% of marketers cited cost reduction as a primary reason for replacing a martech application. This is a dramatic increase from 23% in 2024 and 22% in 2023.

As marketing budgets face increased scrutiny, the “stack bloat” of the last decade is being trimmed. For SEO teams, this often means moving away from a fragmented collection of niche tools—one for backlink analysis, one for rank tracking, one for technical audits—and consolidating into a single, high-performance platform that offers better value. Alternatively, some teams are moving in the opposite direction, ditching expensive “all-in-one” suites that are underutilized in favor of leaner, more specialized tools that perform specific tasks at a lower price point. The takeaway for SEO tool providers is clear: having the best features is no longer enough.
The value proposition must be tied directly to efficiency and cost-effectiveness. Marketers are no longer willing to pay for “shelfware” that looks good on a demo but doesn’t contribute to the bottom line.

The Resurgence of the Homegrown SEO Solution

One of the most unexpected findings of the 2025 MarTech Replacement Survey is the return of the “homegrown” tool. For years, the trend in marketing was “buy, don’t build.” Organizations preferred the security and support of commercial SaaS platforms over the headache of maintaining internal software. However, that trend is beginning to reverse. In 2025, replacing a commercial

Why too many micro-conversions hurt PPC performance

The Modern PPC Paradox: More Data, Less Performance

In the current landscape of digital advertising, we are told that data is the lifeblood of success. AI-powered bidding systems, such as Google’s Smart Bidding and Meta’s Advantage+, are marketed as highly sophisticated engines that thrive on massive datasets. To feed these engines, ad platforms frequently encourage advertisers to track every conceivable user action—from newsletter signups and video views to simple page scrolls. These are known as micro-conversions.

The logic seems sound: the more signals you give the algorithm, the faster it learns who your customers are. However, a growing number of PPC experts are sounding the alarm. While under-signaling leaves an algorithm “blind,” over-signaling creates “noise.” When an account is flooded with loosely defined micro-conversions, the result is often a catastrophic erosion of real-world results, even as platform dashboards report record-breaking performance. Understanding why too many micro-conversions hurt PPC performance is critical for any advertiser looking to maintain a healthy return on ad spend (ROAS) in 2024 and beyond.

The Myth of the Data-Hungry PPC Algorithm

The industry has embraced a fundamental assumption: algorithms need as much data as possible. This “data-hungry” narrative is reinforced by platform documentation, automated recommendations, and endless blog posts. While it is true that bidding systems require a minimum level of signal density to function, they do not benefit from indiscriminate data. In the world of machine learning, quality consistently beats quantity. Machine learning systems do not possess “common sense.” They do not evaluate the strategic relevance of a signal or understand your business goals in a human context. Instead, they evaluate three things: frequency, consistency, and predictability.
When you mix high-intent signals (like a completed purchase) with low-intent signals (like a 30-second stay on a page), the algorithm treats them all as valid targets to optimize toward unless specifically told otherwise. Without a clear value hierarchy, the bidding algorithm naturally gravitates toward the path of least resistance. It will prioritize the actions that are the easiest and cheapest to achieve. This creates a structural bias toward high-frequency, low-value actions, inflating your conversion volume while your actual revenue remains stagnant or declines.

Why PPC Bidding Follows the Path of Least Resistance

To understand why too many micro-conversions are dangerous, we must understand how automation “thinks.” Imagine a bidding algorithm as a river. Water always takes the easiest route to reach the bottom of a hill. In a PPC campaign, the “bottom of the hill” is the conversion goal you have set. If you tell the system that a “Product Page View” is a primary conversion alongside a “Purchase,” the system sees two ways to succeed. Generating a “Product Page View” might cost $0.50, whereas a “Purchase” might cost $50.00. To the algorithm, the $0.50 action is an “efficient” way to hit its conversion volume targets. It will then shift budget away from the high-intent keywords that drive sales and toward broader, cheaper traffic that is likely to click around the site but never buy anything. This is the path of least resistance. The system is performing exactly as instructed—it is maximizing conversions—but the inputs lack the discipline required to drive business growth.

The Specific Risks of Performance Max (PMax)

This dynamic is most visible in Google’s Performance Max and “Search plus PMax” setups. PMax is a “black box” system that optimizes across all of Google’s inventory, including YouTube, Display, Search, Discover, and Gmail.
Because PMax has such a wide reach, it is incredibly efficient at finding the cheapest possible “conversions” available in the entire Google ecosystem. If you include a “Contact Page Visit” as a primary conversion in a PMax campaign, the system might realize it can get thousands of these “conversions” via cheap Display placements or bot-heavy traffic on obscure apps. Your dashboard will show a massive spike in conversions and a plummeting Cost Per Acquisition (CPA). However, your sales team will likely report that lead quality has vanished. The system has optimized for a signal that has no real correlation with revenue.

How False Performance Signals Inflate Platform Metrics

When low-value micro-conversions are treated as primary actions, platform-reported performance becomes disconnected from reality. This creates a “hallucination” of success that can lead to disastrous financial decisions. Advertisers often see the following patterns:

Artificial CPA Reduction: A campaign shows a 40% reduction in CPA because the system is optimizing for “Add to Carts” rather than final sales.

Inflated ROAS: If value-based bidding is used but micro-conversions are assigned high arbitrary values, the reported ROAS may look incredible while the bank account remains empty.

Volume Spikes: Conversion volume may skyrocket, leading advertisers to scale budgets prematurely, only to find that the additional spend is being wasted on low-intent users.

These illusory gains erode contribution margins. Advertisers pay for the “success” reported by the platform, but that success does not translate into the cash flow needed to sustain the business.

The Problem of Diluted Intent and Double-Counting

Tracking too many micro-conversions also leads to the “double-counting” trap. Consider a standard user journey: a customer clicks an ad, views a product, signs up for a newsletter, adds the item to their cart, and finally makes a purchase.
If all of these steps are tracked as primary conversions, a single user journey could be counted as four or five “wins” for the algorithm. This distorts the bidding behavior significantly. The system interprets this single user as a goldmine of conversion activity and begins overbidding on similar profiles. Furthermore, it inflates the perceived value of the traffic. In many accounts, micro-conversions outnumber real conversions by a ratio of 500 to 1. When the signal mix is this unbalanced, the “noise” of the micro-conversions completely drowns out the “signal” of the actual purchase.

When Frequency Overwhelms Value

Even if you assign different values to different actions (Value-Based Bidding), the math can still fail. If you have 500 pageviews worth $1 each and 1 purchase worth $500, the system has 500 signals for the pageviews and only one for the purchase.
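The arithmetic behind this imbalance is easy to verify. A short Python sketch using the hypothetical figures above (500 pageviews at $1 each versus one purchase at $500):

```python
# Sketch: why frequency can overwhelm value in value-based bidding.
# Figures mirror the hypothetical example in the text.
pageviews = {"count": 500, "value_each": 1.0}
purchase = {"count": 1, "value_each": 500.0}

# Total reported conversion value is identical for both actions...
pageview_value = pageviews["count"] * pageviews["value_each"]   # $500
purchase_value = purchase["count"] * purchase["value_each"]     # $500

# ...but the algorithm sees 500 training signals for pageviews and only 1
# for the purchase, so pageview-like behavior dominates what it learns.
signal_share = pageviews["count"] / (pageviews["count"] + purchase["count"])

print(f"pageview value ${pageview_value:.0f} vs purchase value ${purchase_value:.0f}")
print(f"share of training signals that are pageviews: {signal_share:.1%}")
```

Even though the dollar totals are equal, 99.8% of the training examples describe a pageview-like user, which is why the bidder's learned profile of a "converter" skews toward browsers rather than buyers.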

How to run Google Ads in sensitive categories without remarketing

Understanding the Landscape of Sensitive Categories in Google Ads

For digital marketers operating in high-stakes industries like legal services, healthcare, finance, or real estate, the Google Ads dashboard can often feel like a minefield of restrictions. One of the most common and frustrating sights for these professionals is the “Eligible (Limited)” status next to their campaigns. This status indicates that while your ads are running, they are being restricted by Google’s Personalized Advertising policies.

At first glance, it may seem as though Google is intentionally hindering your ability to reach potential clients. However, these regulations are not arbitrary. They are built upon a foundation of legal compliance and ethical standards designed to protect users from predatory practices and maintain a high level of privacy. When you are operating in a sensitive interest category, the platform essentially removes the “identity” layer of targeting, forcing advertisers to rely more heavily on “intent” and “context.” Successfully navigating these restrictions requires a shift in mindset. You cannot rely on traditional remarketing or customer lists to nurture leads. Instead, you must master the art of intent-based search, creative-led qualification, and sophisticated data feedback loops. This guide will explore the depth of these policies and provide actionable strategies to thrive when your usual marketing toolkit is restricted.

The Rationale Behind Personalized Advertising Policies

To overcome the limitations of sensitive categories, it is helpful to understand why they exist in the first place. Google’s policies generally stem from two primary areas: legal requirements and ethical considerations.

Legal Compliance and Anti-Discrimination

In many regions, particularly the United States, strict laws govern how specific services can be marketed.
For example, the Fair Housing Act, the Equal Credit Opportunity Act, and various employment laws prohibit discrimination based on protected characteristics such as race, religion, gender, age, or familial status. If Google allowed a real estate developer to exclude specific zip codes or age groups from seeing their ads, the platform could be held liable for facilitating housing discrimination. Consequently, for industries like housing, credit, and employment (often referred to as HEC categories), Google automatically “strips” certain demographic targeting options to ensure compliance with federal law.

Ethical Standards and User Privacy

The ethical side of these policies concerns the user’s right to privacy and a non-intrusive browsing experience. Imagine a user who has recently searched for addiction recovery services or sensitive medical treatments. If an advertiser were allowed to use remarketing for these services, that user might be followed across the internet by banners reminding them of their personal struggles. This is viewed as predatory and a violation of the “safe space” Google aims to provide its users. By restricting personalized advertising in healthcare and certain legal niches, Google prevents advertisers from targeting individuals based on their personal hardships or health status.

What Features Are Restricted in Sensitive Categories?

When an account or a specific campaign falls under the “sensitive interest” umbrella, several of the most powerful targeting tools in the Google Ads ecosystem become unavailable. Understanding these gaps is the first step toward building an alternative strategy.

Remarketing Lists

Standard website remarketing is the biggest casualty. You cannot tag visitors who come to your site for a sensitive service and then show them ads later. This applies to both the Display Network and Search (RLSA).
If your business relies on a long sales cycle where multiple touchpoints are necessary, you must find other ways to remain top-of-mind without using a remarketing pixel.

Customer Match

Customer Match allows advertisers to upload their own first-party data, such as email addresses or phone numbers, to target specific individuals. In sensitive categories, this is largely prohibited. You cannot upload a list of “lost leads” to try and win them back if those leads were looking for sensitive services like debt consolidation or criminal defense.

YouTube Engagement Audiences

While you can still run ads on YouTube, you cannot build audiences based on how users have interacted with your channel. For instance, you cannot retarget someone who watched your video explaining “how to file for bankruptcy” because that interaction is tied to a sensitive personal interest.

Detailed Demographic Stripping

In specific categories like housing, credit, and employment in the U.S. and Canada, you lose the ability to target or exclude by age, gender, parental status, or specific geographic locations like zip codes. Even Google’s Smart Bidding algorithms are restricted from using these signals as inputs, meaning your bidding strategy must rely on other data points to find the right audience.

What Tools and Tactics Still Work?

While the list of restrictions is long, the list of available tools is even longer. You still have access to the “core engine” of Google Search, which is driven by user intent. The key is to leverage the features that prioritize *what* the user is looking for rather than *who* the user is.

Keyword-Based Targeting

Search keywords remain the most potent tool for any restricted advertiser.
Because keyword targeting is based on the query the user types into the search bar, it is considered “intent-based” rather than “identity-based.” You can still bid on high-intent terms, use negative keywords to filter out irrelevant traffic, and use keyword-rich ad copy to attract the right clicks.

Google-Defined Audiences

While you cannot use *your* audiences (remarketing), you can often still use *Google’s* audiences. Depending on the specific sensitivity, you may still have access to Affinity Segments (people with a long-term interest in a topic), In-Market Segments (people currently researching a purchase), and Life Events (people getting married, moving, or graduating). These are powerful top-of-funnel tools that help you reach users before they even search for your specific service.

Optimized Targeting and AI

Optimized targeting is Google’s AI-driven approach to finding new customers. Even if you cannot provide a remarketing list, Google’s algorithm can analyze your historical conversion data to find patterns among people who have completed a goal on your site. In campaigns like Demand Gen and Performance Max, this AI can effectively “find your people” even without explicit audience targeting.

Content and

20 practical ways to use AI in SEO

Artificial Intelligence has fundamentally reshaped the landscape of digital marketing, particularly within the realm of Search Engine Optimization. After nearly two decades of watching the industry evolve through manual link building, keyword stuffing, and eventual algorithmic sophistication, the arrival of Large Language Models (LLMs) represents a seismic shift. This shift is not about replacing the human element of SEO, but rather about augmenting it—freeing up mental bandwidth, reducing the friction of repetitive tasks, and accelerating the pace of technical analysis.

In a real-world agency or in-house environment, AI isn’t a “magic button” that generates instant rankings. Instead, it serves as a sophisticated tool that makes the arduous parts of the job more manageable. Whether you are managing real-time client deadlines or overseeing a massive content repository, AI allows you to focus on strategy while it handles the heavy lifting of data processing and drafting. Below are 20 practical, tested ways to integrate AI into your SEO workflow to drive efficiency without sacrificing quality.

Content Creation and Copywriting

Content remains the backbone of SEO, but the sheer volume required to stay competitive can lead to burnout. AI’s greatest strength in this category is its ability to act as a collaborative partner rather than a solo author.

1. Writing First Drafts

The most effective way to utilize AI for content is to treat it as a “first-draft machine.” The “blank page syndrome” is one of the biggest bottlenecks in content production. By feeding an AI tool your detailed brief, target keywords, specific audience personas, and a unique angle, you can generate a structured outline and a rough draft in seconds. The key to success here is the “Human-in-the-Loop” model. AI-generated content can often feel generic or “vanilla.” Your role is to inject the draft with your unique voice, industry-specific expertise, and real-world case studies.
Use the AI to build the skeleton, then use your experience to provide the muscle and heart. This approach can cut production time by 50% or more while maintaining the high standards required by Google’s E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness) guidelines.

2. Generating Meta Title and Description Variations

Writing meta tags for a handful of pages is simple; writing them for a site with 5,000 product pages is a logistical nightmare. AI tools like Claude or ChatGPT are exceptionally good at adhering to strict character limits while maintaining a persuasive tone. You can provide the AI with a list of target keywords and page topics and ask for ten variations for each. This allows you to choose the one that best fits the brand’s tone or even A/B test different versions. For large-scale operations, you can export your data to a CSV, upload it to an AI interface, and have it process hundreds of titles and descriptions at once. However, never skip the human review phase—ensure that the AI hasn’t hallucinated details or used repetitive “marketing-speak” that could lower click-through rates.

3. Refreshing Underperforming Content

Content decay is a natural part of the SEO lifecycle. If a previously high-ranking post has slipped to the second or third page, it often just needs a refresh. Instead of reading through the entire piece to find what’s missing, you can paste the text into an AI tool and ask it to identify outdated statistics, missing subtopics, or areas where the competitors are providing more depth. By providing the AI with the current top-ranking results for that keyword, it can act as a gap analysis tool. It might suggest adding a new section on a recent industry trend or updating a guide to reflect changes in software or regulations. This creates a clear roadmap for your content update without requiring hours of manual research.

4. Generating FAQ Sections

Frequently Asked Questions (FAQs) are a goldmine for capturing Featured Snippets and “People Also Ask” (PAA) traffic. AI is highly efficient at identifying common questions surrounding a specific topic. By prompting the AI to generate the most common queries related to your target keyword, you can quickly build out a comprehensive FAQ section. Once the questions are generated, you can cross-reference them with actual PAA data from search result pages. This dual approach ensures your content is not only answering what the AI thinks people want to know but what Google’s data proves they are searching for. This is also an excellent way to perform a quick content gap analysis for your existing pages.

5. Writing Alt Text at Scale

Image accessibility is vital for both SEO and user experience, yet writing descriptive alt text for hundreds of images is a task most SEOs dread. AI can streamline this by analyzing image file names or descriptions and generating contextually relevant alt text. A practical workflow involves using a crawler like Screaming Frog to export all images missing alt text into a CSV. You can then upload this list to an AI tool, providing it with the context of the page each image resides on. If your file names are descriptive (e.g., “blue-nike-running-shoe.jpg”), the AI can generate high-quality, keyword-rich alt text that helps search engines understand your visual content better while improving the experience for visually impaired users.

Technical SEO

Technical SEO often requires a bridge between marketing and web development. AI serves as a translator and a specialized assistant for tasks that usually require coding knowledge.

6. Understanding Error Messages and Log Files

Not every SEO professional is a seasoned developer. When Google Search Console throws a cryptic indexing error or a server log shows a series of confusing status codes, AI can be a lifesaver.
You can paste the raw error message or a snippet of a log file into an AI and ask it to “explain this in plain English.” Beyond just explaining the “what,” you can ask the AI for the “how.” For example, “How do I fix a 5xx error on an Nginx server?” The AI can provide step-by-step instructions that you can either implement yourself or pass along to the development team, significantly reducing the time spent
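The alt-text workflow in tip 5 can be sketched in a few lines. This is a minimal illustration, assuming a Screaming Frog-style CSV export with an "Address" column of image URLs; in practice you would pass page context to an LLM for richer drafts, then review them by hand as the tip advises.

```python
import csv
import io

def alt_text_from_filename(url: str) -> str:
    """Turn a descriptive file name like 'blue-nike-running-shoe.jpg' into draft alt text."""
    name = url.rsplit("/", 1)[-1].rsplit(".", 1)[0]      # strip path and extension
    return name.replace("-", " ").replace("_", " ").capitalize()

def draft_alt_texts(csv_text: str) -> list[tuple[str, str]]:
    """Read an export of images missing alt text and pair each URL with a draft."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row["Address"], alt_text_from_filename(row["Address"])) for row in reader]

export = "Address\nhttps://example.com/img/blue-nike-running-shoe.jpg\n"
print(draft_alt_texts(export))
# → [('https://example.com/img/blue-nike-running-shoe.jpg', 'Blue nike running shoe')]
```

Filename-derived drafts only work when file names are descriptive, which is exactly the caveat the tip makes.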


‘Google Zero’ misses the real problem: Your next visitor isn’t human

For months, the digital marketing world has been gripped by a singular, paralyzing fear often referred to as “Google Zero.” The narrative is simple: as Google integrates generative AI into its search results, the traditional “blue link” will disappear, and with it, the organic traffic that has sustained the open web for three decades. The fear is that we are heading toward a zero-click future where Google becomes a “walled garden” that consumes content without ever sending a visitor back to the source.

However, recent data suggests that the panic over “Google Zero” might be misplaced—not because the threat isn’t real, but because it’s targeting the wrong symptom. While many SEOs are busy tracking a 2% or 3% dip in human referrals, a much more seismic shift is occurring beneath the surface of the internet’s infrastructure. The real problem isn’t just that humans are clicking less; it’s that your next visitor probably isn’t human at all.

The Myth of the Great Traffic Collapse

In a recent edition of the SEO for Google News newsletter, industry veteran Barry Adams published a provocative piece titled “Google Zero is a Lie.” Adams argued that the widespread narrative of disappearing traffic is not only false but dangerously misleading for publishers. He points to data from Similarweb and Graphite showing that, globally, Google traffic to top websites has only declined by approximately 2.5%. Despite the rise of chatbots like ChatGPT and Perplexity, Google still commands nearly 20% of all web visits.

So, where did the “33% decline” figure that frequently makes headlines come from? According to Adams, those numbers—often cited from Chartbeat—are skewed by a small group of massive publishers who were disproportionately hit by specific algorithm updates, particularly the Helpful Content Updates (HCU). For the vast majority of the web, the “collapse” has been more of a slight erosion. Adams warns that publishers who abandon SEO out of panic are creating a self-fulfilling prophecy, effectively handing over their market share to competitors who continue to optimize for human intent.

Adams is correct in his data, but he may be missing the larger evolution. While humans are still clicking on Google results for now, the nature of a “visit” is being fundamentally redefined by the rise of automated traffic.

The Tipping Point: When Machines Outnumbered Humans

The transition from a human-centric web to a machine-centric web is no longer a future prediction; it is a current reality. According to the 2025 Imperva Bad Bot Report, automated traffic has officially surpassed human activity for the first time in a decade. Bots now account for 51% of all web traffic globally. We have officially crossed the tipping point.

This automated traffic isn’t just limited to the “bad bots” that launch DDoS attacks or attempt brute-force logins. The fastest-growing segment of this non-human traffic consists of AI crawlers. These are the engines behind the Large Language Models (LLMs) and AI agents that the world now relies on for information.

Data from Cloudflare’s 2025 Year in Review highlights the scale of this explosion. AI bot crawling has grown more than 15x year-over-year. By late 2025, Cloudflare observed roughly 50 billion AI crawler requests per day. These crawlers now represent 51.69% of all crawler traffic, effectively dethroning traditional search engine crawlers (like Googlebot), which have dropped to 34.46% of the share. Akamai has observed a similar trend, reporting a 300% surge in AI bot activity over the past year. Interestingly, OpenAI alone accounts for a staggering 42.4% of all AI bot requests.

While your analytics might show a stable line of “human” traffic, your server logs are likely screaming with the weight of machines digesting your content to feed the next generation of AI.

The Fraying Social Contract: Take vs. Give

For twenty years, the relationship between publishers and search engines was a “give and take” deal. Search engines like Google were allowed to crawl and index a website’s content; in exchange, they provided a discovery mechanism that sent human visitors back to that website. This was a symbiotic relationship that fueled the growth of the internet.

AI bots operate on a different philosophy. Cloudflare recently published data on “crawl-to-referral” ratios that should give every digital publisher pause. The numbers reveal a predatory imbalance.

The Disproportionate AI Crawl

Anthropic’s ClaudeBot: Crawls 23,951 pages for every single referral it sends back to a website.

OpenAI’s GPTBot: Crawls 1,276 pages for every 1 referral.

Training now drives nearly 80% of all AI bot activity, up from 72% the previous year. This means the vast majority of these “visitors” are not looking to interact with your brand or buy your products; they are there to harvest your data so that a third-party platform can answer a user’s question without that user ever needing to visit your site.

Compare this to the traditional Googlebot model. Historically, Google has sent 831x more visitors to websites than AI systems do. However, even Google is rewriting the terms of this deal. Studies from Ahrefs and Seer Interactive show that queries where Google displays an AI Overview (AIO) see organic click-through rates (CTR) drop by 58% to 61%. Even more concerning is the data on Google’s “AI Mode,” where Semrush has observed a zero-click rate as high as 93%.

The Rise of “Self-Citing” AI

When AI systems do provide citations, they are increasingly circular. An SE Ranking study of over 1.3 million AI Mode citations found that Google.com is the number one cited source in 19 out of 20 niches. Google is essentially citing its own ecosystem—including YouTube and other Google properties—in roughly 20% of all AI Mode sources.
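The crawl-to-referral ratios quoted above are straightforward to reproduce for your own site from server-log tallies. A minimal sketch (the per-bot counts below simply restate the Cloudflare figures cited in this article, i.e. N pages crawled per 1 referral, not live data):

```python
def crawl_to_referral(crawls: int, referrals: int) -> float:
    """Pages crawled per referral sent back; higher means a more extractive crawler."""
    return float("inf") if referrals == 0 else crawls / referrals

# Per-bot (crawls, referrals) tallies, e.g. counted from access logs by user agent.
log_counts = {
    "ClaudeBot": (23_951, 1),
    "GPTBot": (1_276, 1),
}

for bot, (crawls, refs) in log_counts.items():
    print(f"{bot}: {crawl_to_referral(crawls, refs):,.0f} pages per referral")
```

A bot that crawls heavily but sends zero referrals comes out as infinite, which is the "all take, no give" case the article describes.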
This “citation” doesn’t help the independent publisher; it keeps the user within the Google ecosystem.

The Agentic Shift: Moving Beyond Search

If the rise of scraping bots was the first wave, the “Agentic Shift” is the second, much larger wave. We are moving from a world where humans use AI to find information to a world where AI agents act on behalf of humans to execute tasks. In


How to reduce cost-per-hire with LinkedIn recruitment campaigns

LinkedIn has established itself as the gold standard for professional networking and B2B engagement. For recruiters and talent acquisition specialists, it is an indispensable tool for finding high-quality candidates. However, there is a significant difference between using LinkedIn to find people and using LinkedIn to hire people efficiently. Without a refined strategy, recruitment campaigns on the platform can quickly become expensive, leading to an inflated cost-per-hire (CPH) that drains departmental budgets.

The core challenge many organizations face is a focus on reach rather than relevance. In the digital advertising world, it is tempting to chase high impression counts and a high volume of clicks. Yet, in recruitment, a high volume of unqualified applicants is actually a liability. It creates a bottleneck for hiring managers and increases the time-to-hire, which in turn increases the total cost of the recruitment cycle. To reduce cost-per-hire, recruiters must shift their mindset toward intent-based targeting and rigorous pre-qualification.

Shift your strategy: Optimize for intent vs. reach

The foundational mistake in many LinkedIn recruitment campaigns is a reliance on broad targeting. While targeting by job title, industry, and years of experience is a standard starting point, it often results in a “noisy” audience. You may reach people who have the right title but zero interest in moving, or people who are technically in the industry but lack the specific expertise required for your unique role.

High-performing campaigns move beyond these surface-level demographics and focus on intent. This means identifying candidates who are not just qualified but are also psychologically or circumstantially ready to consider a new opportunity. A layered targeting approach is the most effective way to achieve this.

The three layers of intent-based targeting

To maximize the efficiency of your budget, consider your audience through three distinct lenses:

1. Core Fit: This includes the non-negotiables. You target specific job titles, verified skills, and necessary certifications. If you are hiring a Senior DevOps Engineer, your core fit includes specific cloud platform certifications and years of experience in high-stakes environments.

2. Behavioral Signals: LinkedIn provides data on how users interact with the platform. You can target users who have signaled they are “Open to Work,” those who are members of specific professional groups, or those who frequently engage with industry-specific content. These behaviors suggest a candidate who is actively thinking about their professional standing.

3. Career Friction Indicators: This is a more advanced tactic. It involves identifying cohorts of talent who may be experiencing “friction” in their current roles. This could include employees at companies currently undergoing major restructuring or layoffs, or professionals in roles traditionally known for high burnout rates. By positioning your company as the solution to their current professional pain points, your conversion rate on ads will naturally increase.

By combining these layers, you reduce the “waste” in your ad spend. You aren’t just paying for anyone with a specific title to see your ad; you are paying for the right person who is likely ready to listen to your pitch.

Use ad creative to pre-qualify candidates

In most forms of digital marketing, the goal of an ad is to get as many clicks as possible. In recruitment, the goal is different: you want the *right* people to click and the *wrong* people to keep scrolling. Every click from an unqualified candidate is a direct hit to your budget that will never provide a return on investment.

Your ad creative should act as a filter. A strong recruitment ad doesn’t just sell the “dream” of working at your company; it sets clear expectations about the reality of the role. When you use your ad copy to pre-qualify, you save money by discouraging “aspirational” applicants who don’t meet your criteria.

Elements of a pre-qualifying recruitment ad

To create an ad that filters while it attracts, follow this structure:

Address the pain point immediately: Start by calling out a specific challenge your ideal candidate is facing. For example, “Tired of the 80-hour work week in corporate law?” This immediately identifies who the ad is for and, more importantly, who it is not for.

Define the identity: Be explicit about the level of expertise required. Instead of saying “We are hiring engineers,” say “This role is designed for Senior Backend Engineers with 5+ years of Python experience.” This level of specificity signals to junior talent that they should not click, preserving your budget.

Highlight specific value: Why should a happy, well-paid professional leave their current job for yours? Focus on tangible benefits like flexible working arrangements, clear paths to leadership, or the opportunity to work on specific cutting-edge technologies. Generic claims like “great culture” are less effective than “4-day work weeks” or “fully remote options.”

Set boundaries: Don’t be afraid to state what the job isn’t. Phrases like “Not an entry-level position” or “Requires extensive travel” are vital. While they may decrease your total click-through rate (CTR), they will drastically increase your conversion-to-interview rate, which is the metric that actually lowers cost-per-hire.

Structure campaigns by candidate intent level

A “one size fits all” campaign strategy is rarely efficient on LinkedIn. Different candidates are at different stages of their career journey, and your campaign structure should reflect that. By segmenting your campaigns based on intent levels, you can allocate your budget where it will have the most immediate impact while still building a long-term talent pipeline.
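The arithmetic behind trading clicks for conversion rate is worth making explicit. A minimal sketch with invented numbers (the budgets and rates below are illustrative assumptions, not LinkedIn benchmarks): on the same budget, a pre-qualifying ad with fewer clicks but a higher conversion-to-interview rate produces a lower cost-per-hire than a broad ad with cheap, unqualified clicks.

```python
def cost_per_hire(ad_spend: float, clicks: int, click_to_interview: float,
                  interview_to_hire: float) -> float:
    """CPH = total spend / hires, where hires = clicks * both conversion rates."""
    hires = clicks * click_to_interview * interview_to_hire
    return ad_spend / hires

# Same $10,000 budget: a broad ad gets many clicks but few qualify...
broad = cost_per_hire(10_000, clicks=2_000, click_to_interview=0.01, interview_to_hire=0.5)
# ...while a pre-qualifying ad gets fewer clicks at a far higher interview rate.
focused = cost_per_hire(10_000, clicks=500, click_to_interview=0.08, interview_to_hire=0.5)
print(round(broad), round(focused))
# → 1000 500
```

Here a quartered click volume still halves the cost-per-hire, which is why a lower CTR can be the right outcome.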
High-intent (bottom funnel)

These candidates are actively looking for a new job right now. They are the most expensive to “win” in the auction because everyone is bidding for them, but they also offer the fastest time-to-hire. These campaigns should use direct-response messaging like “Apply Now” or “Join Our Team.” Target “Open to Work” users and retarget those who have already visited your careers page.

Warm passive talent (mid funnel)

This is where the most significant cost savings are often found. These professionals aren’t scouring job boards every day, but they are open to the right conversation. Your messaging here should focus on career upgrades—better pay, better balance, or more interesting


SMX Now: Learn how brands must adapt for AI-driven search

The Evolution of Search: From Blue Links to Generative Answers

The digital marketing landscape is currently undergoing its most significant transformation since the invention of the search engine itself. For decades, the primary goal of Search Engine Optimization (SEO) was clear: rank as high as possible in the “ten blue links.” However, the emergence of Large Language Models (LLMs) and generative AI has fundamentally altered how users interact with information online. We are moving away from a world of simple search results and into an era of AI-driven synthesis.

Today, visibility is no longer just about where you appear on a results page; it is about whether your brand’s content is discovered, evaluated, and ultimately selected by an AI to be part of a generated answer. Whether it is Google’s AI Overviews, Perplexity, or ChatGPT Search, these systems act as filters. They don’t just point users toward websites; they interpret information and provide direct answers, citing only the most relevant and authoritative sources. For brands, this shift necessitates a complete overhaul of traditional digital strategies.

SMX Now: A New Series for a New Era of Marketing

To help marketers navigate this complex transition, the new monthly SMX Now webinar series is launching with a deep dive into the mechanics of AI-driven search. On April 1 at 1 p.m. ET, industry leaders will gather to discuss the critical strategies brands must adopt to remain visible in an AI-first world.

The session features experts from iPullRank, a leading agency known for its technical depth and forward-thinking approach to SEO. Zach Chahalis, Patrick Schofield, and Garrett Sussman will lead the discussion, sharing insights into how AI search engines “pick winners” and how brands can position themselves to be among those selected. This webinar is not just a high-level overview; it is a tactical session designed to introduce the framework for the next decade of search visibility.

What is GEO? Understanding Generative Engine Optimization

As traditional SEO tactics face diminishing returns in the age of AI, a new discipline has emerged: Generative Engine Optimization (GEO). While SEO focuses on optimizing for algorithms that rank pages, GEO focuses on optimizing for models that generate responses. The goal of GEO is to ensure that a brand’s content is not only crawlable but also “retrievable” and “citeable” by an LLM.

The SMX Now session will introduce iPullRank’s proprietary Relevance Engineering (r19g) framework. This framework is a systematic approach to executing GEO through an omnichannel content strategy. It acknowledges that AI models do not look at content in a vacuum. They pull from a vast array of sources, including social media, technical documentation, news articles, and user-generated content, to form a cohesive answer. To succeed in GEO, brands must ensure their core messaging is consistent and authoritative across every digital touchpoint.

How AI Search Engines Discover and Select Sources

One of the most technical and fascinating aspects of the upcoming webinar is the exploration of “query fan-outs.” To understand why your brand might be excluded from an AI summary, you first have to understand how these engines process a user’s intent. When a user types a complex question into an AI-powered search engine, the system rarely looks for a single page that answers the whole thing. Instead, it performs a “fan-out,” breaking the main query into multiple sub-queries. It then searches for the best information to satisfy each of those sub-queries. If your content only addresses a broad topic without providing the specific, granular data points the AI is looking for during the fan-out process, you will likely be ignored in favor of a source that provides more precise relevance.

Furthermore, the session will examine the process of Retrieval-Augmented Generation (RAG). This is the mechanism by which an AI fetches facts from an external database (like the web) to inform its response. The speakers will explain how to structure content so it is more easily retrieved and surfaced during this critical window of interaction.

The Three-Tier Measurement Model for AI Visibility

One of the biggest challenges for modern marketers is measurement. In a traditional SEO world, we tracked rankings, click-through rates (CTR), and organic sessions. But in a generative environment, a user might get all the information they need without ever clicking on a link. Does that mean the brand didn’t provide value? Certainly not. It means we need a new way to measure success. The iPullRank team advocates for a three-tier measurement model that tracks the journey of content through an AI’s processing pipeline:

1. Discovery

This is the baseline tier. Is the AI actually finding your content? This involves technical health, indexing, and presence in the datasets that feed the LLMs. If the model doesn’t “know” your content exists, it can never cite it. Discovery measurement looks at how often your brand appears in the pool of potential sources the AI considers.

2. Selection

Once discovered, the AI must decide if your content is the “winner” for a specific part of the query. Selection is the stage where the AI evaluates the relevance and authority of your content against competitors. Measuring selection involves tracking how often your brand’s information is used to synthesize a response, even if a direct link isn’t immediately prominent.

3. Citation Impact

The final tier is citation. Being cited is the gold standard of GEO. It provides the brand with third-party validation and gives the user a path to the website. Measuring citation impact involves looking at the frequency and quality of links provided within generative summaries and the subsequent “referral” traffic that stems from these AI interactions.
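The fan-out-then-retrieve pattern described above can be sketched in a few lines. Everything here is a toy illustration: the sub-queries, the index contents, and the function names are invented for this sketch, not any engine's actual internals.

```python
# Toy illustration of "query fan-out" plus Retrieval-Augmented Generation (RAG).

INDEX = {
    "what is geo": "GEO optimizes content for generative engines.",
    "geo vs seo": "SEO ranks pages, while GEO targets model citations.",
    "how to measure geo": "Track discovery, selection, and citation impact.",
}

def fan_out(query: str) -> list[str]:
    """Break one broad query into narrower sub-queries (a real engine uses an LLM here)."""
    return ["what is geo", "geo vs seo", "how to measure geo"]

def retrieve(sub_query: str) -> str:
    """Fetch the best passage for one sub-query: the 'retrieval' step of RAG."""
    return INDEX.get(sub_query, "")

def synthesize(query: str) -> str:
    """Stitch retrieved passages into one answer: the 'generation' step, minus the LLM."""
    return " ".join(p for p in (retrieve(sq) for sq in fan_out(query)) if p)

print(synthesize("explain generative engine optimization"))
```

Content that never matches any sub-query simply contributes nothing to the final answer, which is the exclusion risk the article describes.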
Why Success in GEO Isn’t Universal

A key takeaway from the upcoming SMX Now session is that there is no “one-size-fits-all” strategy for AI search. What works for a B2B SaaS company might not work for a local retail brand or a lifestyle blog. AI models treat different intents with different levels of scrutiny. Success in this new era requires constant testing and tailored strategies. Brands must experiment with content formats—ranging from structured


YouTube adds AI creator matching and ad formats to its partnerships platform

In the rapidly evolving landscape of digital advertising, the bridge between brands and content creators has often been fraught with friction. While influencer marketing has matured into a billion-dollar industry, the processes of discovering the right talent, managing partnerships, and accurately measuring return on investment (ROI) have remained significant hurdles for many organizations. YouTube, the world’s largest video-sharing platform, is looking to solve these challenges through a suite of sophisticated new tools.

During its recent NewFront presentation, YouTube unveiled a major upgrade to its Creator Partnerships platform. By integrating Gemini—Google’s most capable AI model—YouTube is fundamentally changing how advertisers interact with the platform’s massive ecosystem of three million creators. These updates go beyond simple search functions, introducing AI-powered creator matching, enhanced measurement tools, and innovative ad formats designed to turn creator-driven content into high-performance paid media.

The Power of Gemini in Creator Discovery

The primary challenge for brands today is not a lack of creators, but a surplus of them. With more than three million creators currently participating in the YouTube Partner Program (YPP), finding the specific voice that aligns with a brand’s values, audience, and campaign goals is a daunting task. Traditionally, this required manual vetting or the use of third-party influencer databases that often lacked real-time internal data.

YouTube’s new Gemini-powered matching engine aims to cut through this noise. By leveraging generative AI and deep data analysis, the platform can now recommend creators based on specific campaign objectives. Instead of relying on basic filters like category or subscriber count, advertisers can input complex requirements—such as target audience sentiment, niche content themes, or historical performance metrics—and receive a curated list of potential partners.

This level of precision is designed to maximize “brand fit.” When a creator’s audience naturally aligns with a product, the resulting content feels more authentic and less like a traditional commercial. By automating the discovery phase, YouTube is lowering the barrier to entry for brands that previously felt overwhelmed by the scale of the creator economy.

Creator Partnerships Boost: Bridging Organic and Paid Media

One of the most significant announcements from the NewFront presentation is the “Creator Partnerships boost.” This feature allows brands to take content created by their partners and run it directly as paid advertisements across the platform. Specifically, these ads can be deployed as YouTube Shorts and in-stream video ads.

This approach addresses a long-standing divide in digital marketing: the gap between organic influencer content and paid performance advertising. In the past, a brand might pay a creator for a video, hope it goes viral, and then separately create a high-production studio ad for their paid campaigns. The “boost” feature merges these two worlds. It allows the authentic, relatable voice of the creator to reach a much wider, targeted audience through paid amplification.

The results of this hybrid approach are already showing promise. YouTube reports that running creator-made content as paid ads delivers an average 30% lift in conversions compared to standard brand-produced ads. This lift is likely attributed to the “trust factor” that creators build with their audiences, which translates into higher engagement and lower resistance from viewers when that content appears as an advertisement.

Driving Results with YouTube Shorts

The inclusion of Shorts in this new ad infrastructure is no coincidence. YouTube Shorts has seen explosive growth, now surpassing 70 billion daily views. For advertisers, Shorts represent a unique opportunity to capture attention in a fast-paced, vertical viewing environment that is particularly popular with younger demographics like Gen Z and Millennials.

By enabling creator-led content to be boosted as Shorts ads, YouTube is directly competing with other short-form video platforms. The advantage YouTube offers, however, is its deep integration with the broader Google ecosystem. Brands can now use AI to find a creator who excels at short-form storytelling, collaborate on a piece of content, and then use YouTube’s robust targeting tools to ensure that Short is seen by the people most likely to convert.

Improving ROI and Measurement Accountability

For a long time, influencer marketing was criticized for being difficult to measure. “Vanity metrics” like likes and comments often failed to provide a clear picture of how a campaign impacted the bottom line. YouTube’s updated platform aims to silence these criticisms by providing stronger, more transparent measurement tools.

Because these partnerships are now more closely integrated with YouTube’s ad-buying tools, advertisers can track the entire customer journey. From the moment a viewer sees a boosted creator Short to the final purchase or sign-up, the data is captured within the same dashboard used for standard Google Ads. This level of visibility allows marketing teams to prove ROI with the same level of confidence they have in search or display campaigns.

Building on the Foundation of BrandConnect

These new features aren’t an entirely new direction for YouTube; rather, they are a significant evolution of BrandConnect. Formerly known as FameBit, BrandConnect has been YouTube’s internal influencer marketing platform for years. It was designed to help creators monetize their work while helping brands find authentic ways to reach viewers.

By doubling down on BrandConnect’s infrastructure and layering Gemini AI on top of it, YouTube is signaling that the creator economy is no longer a peripheral content strategy. Instead, it is becoming a central growth lever for the platform’s advertising business. YouTube is effectively positioning itself as a full-service agency and platform combined, providing the talent, the creative canvas, the amplification tools, and the analytical data all in one place.

The Competitive Landscape: YouTube vs. TikTok and Meta

YouTube’s latest moves are a direct response to the increasing competition for creator talent and advertiser dollars. Platforms like TikTok have thrived by making creator-brand collaborations central to their business model through tools like the TikTok Creator Marketplace. Similarly, Meta has integrated creator ads across Instagram and Facebook.

However, YouTube’s advantage lies in its diversity of formats. While TikTok is predominantly short-form, YouTube offers a “multi-format” ecosystem where a single creator might produce a 20-minute deep-dive video, a 60-second Short, and a live stream. The new AI


YouTube adds AI creator matching and ad formats to its partnerships platform

The Evolution of Influencer Marketing on YouTube

The digital advertising landscape is currently witnessing a massive transformation, driven largely by the intersection of artificial intelligence and the creator economy. During its recent NewFront presentation, YouTube announced a significant suite of updates to its Creator Partnerships platform, signaling a shift in how brands and creators interact. By integrating Gemini-powered AI and introducing sophisticated new ad formats, YouTube is aiming to solve the two most persistent challenges in influencer marketing: discovery and measurement.

For years, influencer marketing was often viewed as a “top-of-funnel” strategy—useful for brand awareness but difficult to track in terms of direct conversions. YouTube’s latest moves seek to change that perception. By leveraging the power of Google’s Gemini AI to match brands with the right creators and providing tools to run creator-made content as paid advertisements, the platform is turning social influence into a performance-driven powerhouse.

Gemini AI: Solving the Discovery Problem at Scale

One of the biggest hurdles for any marketing team is finding the right voice to represent their brand. With over three million creators currently enrolled in the YouTube Partner Program (YPP), the task of manual scouting has become nearly impossible. Brands often find themselves stuck in a cycle of working with the same “mega-influencers,” missing out on the high-engagement niche audiences found in the “long tail” of the creator ecosystem.

YouTube is addressing this by integrating Gemini, Google’s advanced multimodal AI, into its creator matching engine. This isn’t just a basic search tool that looks for keywords in video titles. Gemini-powered matching analyzes a vast array of data points, including content sentiment, audience demographics, historical performance, and the nuanced “vibe” of a creator’s channel.

This AI-driven approach allows advertisers to input specific campaign goals—such as increasing brand favorability among Gen Z tech enthusiasts or driving sales for a new skincare line—and receive highly curated recommendations. By cutting through the noise of millions of channels, Gemini ensures that partnerships are based on data-backed compatibility rather than guesswork. This efficiency saves brands hundreds of hours in the research phase and opens doors for smaller creators who have highly loyal, specialized followings.

Bridging the Gap Between Organic Content and Paid Media

Historically, there has been a divide between a creator’s organic post and a brand’s paid ad campaign. A brand might pay a creator for a sponsored video, but that video’s reach was largely limited to the creator’s existing subscribers and the YouTube recommendation algorithm. If the brand wanted to put “ad spend” behind that content, the process was often clunky and didn’t always feel native to the platform.

The updated Creator Partnerships platform introduces a revamped “Creator Partnerships boost.” This feature allows brands to take content created by their partners and run it directly as Shorts or in-stream ads. This is a strategic move that acknowledges the high trust factor associated with creator-led content. When a viewer sees a standard corporate ad, their “ad radar” often goes up, leading to skips or disengagement. However, when the ad is a piece of authentic creator content, engagement levels tend to be significantly higher.

YouTube reports that utilizing creator content as paid ads can lead to an average 30% lift in conversions. This statistic is a game-changer for performance marketers. It proves that the “creator touch” doesn’t just build brand affinity; it drives actual sales and measurable actions. By streamlining the ability to turn a viral Short into a high-performing ad unit, YouTube is providing a seamless bridge between creative storytelling and hard-hitting performance marketing.

The Power of YouTube Shorts as an Ad Format

YouTube Shorts has grown at an explosive rate, now amassing billions of views daily. As consumer habits shift toward short-form, vertical video, advertisers have been looking for ways to capitalize on this trend without losing the depth of traditional YouTube advertising. The new ad formats introduced at NewFront specifically prioritize Shorts.

Running creator content as a paid Short allows for a more immersive and native advertising experience. These ads appear in the Shorts feed, indistinguishable from organic content except for a “Sponsored” label. Because these ads are created by individuals who understand the visual language of the platform—fast cuts, trending audio, and direct-to-camera addresses—they resonate more deeply with the mobile-first audience.

Furthermore, these boosted Shorts are backed by YouTube’s robust targeting and measurement tools. Advertisers can now apply the same level of precision to a creator’s video as they would to a standard Google Search or Display ad. This includes targeting by interest, geography, and even remarketing to users who have previously interacted with the brand.

Enhanced Measurement and Proving ROI

For a long time, the ROI of influencer marketing was measured in “vanity metrics”—likes, shares, and comments. While these are useful for tracking engagement, they don’t always satisfy the needs of data-driven CMOs who need to justify marketing spend. YouTube’s updated platform addresses this by offering stronger measurement tools integrated directly into the partnership workflow.

By running creator content through the official ad infrastructure, brands gain access to full-funnel analytics. This includes view-through conversions, click-through rates, and even the ability to track how a creator’s video influenced a purchase that happened days or weeks later. This level of transparency is essential for the long-term growth of the creator economy, as it allows brands to treat creator partnerships as a reliable, scalable media channel rather than an experimental tactic.

Expanding the BrandConnect Infrastructure

These new features are built upon the foundation of BrandConnect, YouTube’s existing creator monetization and brand safety toolset. BrandConnect (formerly FameBit) has long served as the intermediary that facilitates deals between advertisers and creators. The addition of Gemini and the new “Boost” formats represents the next stage in the evolution of this infrastructure.

By doubling down on BrandConnect, YouTube is emphasizing that the creator economy is a core growth lever for the entire platform. It is no longer just about hosting videos; it is about creating a sophisticated marketplace where creativity meets commerce. This integration also ensures that brand safety
