The Evolution of Search and the Rise of AI Citations
The digital marketing landscape is currently undergoing its most significant transformation since the invention of the search engine itself. For decades, the primary goal of Search Engine Optimization (SEO) was to secure a position in the “ten blue links” on the first page of Google. However, with the emergence of Large Language Models (LLMs) and generative AI search tools like ChatGPT, Perplexity, and Google’s own AI Overviews, the metric for success is shifting. Today, visibility is increasingly defined by “citations”—the references and links provided by AI models when they answer user queries.
As these AI tools become more integrated into the daily search habits of millions, a critical question has emerged among SEO professionals and digital publishers: Is there a direct link between traditional organic search performance and AI citation frequency? Recent research, including a notable analysis by Lily Ray, suggests that the answer is yes: there appears to be a strong correlation between a website’s health in Google’s organic index and its likelihood of being cited as a source by generative AI.
This relationship suggests that the foundational principles of SEO—authority, relevance, and helpfulness—are not just relics of the past but are the very pillars that support visibility in the future of AI-driven discovery.
The Direct Link: Analyzing the Correlation
The core of the recent investigation into AI visibility centered on an analysis of 11 specific websites. These sites were selected because they had all experienced significant declines in organic visibility following major Google algorithm updates, such as the Helpful Content Update (HCU) and various Core Updates. By tracking how these sites performed in AI search environments during their period of decline in Google, a clear pattern emerged.
When a website loses its “trust” or ranking power in Google’s eyes, it simultaneously begins to vanish from the citation lists of AI search engines. The trend was most pronounced in ChatGPT’s search results: as these 11 sites saw their organic traffic from Google crater, their presence as sources in ChatGPT’s responses dropped in near-unison.
This correlation is not a coincidence. It reflects the technical reality of how AI search engines function. While an LLM like GPT-4 is trained on a massive static dataset, modern “AI search” features rely on Retrieval-Augmented Generation (RAG). This process involves the AI searching the live web to find the most relevant, high-quality information to satisfy a user’s prompt. If a site is no longer deemed authoritative or “helpful” by the primary gatekeepers of the web (search engines), the AI tools that use those search indexes as their source material will naturally stop citing them.
How AI Search Engines Source Information
To understand why Google visibility impacts AI citations, one must understand how AI search engines “read” the internet. Tools like ChatGPT (with Search), Perplexity AI, and Google AI Overviews do not simply guess the answers. They operate as sophisticated aggregators.
When a user asks a complex question, the AI performs a search—often using existing search engine APIs like Bing or Google—to retrieve a set of documents. It then synthesizes the information from those documents into a natural language response. The websites that appear at the top of these real-time search results are the ones most likely to be cited by the AI.
Therefore, if a website is penalized or demoted in traditional search results, it essentially becomes invisible to the RAG process. If you aren’t on the first page of the search results that the AI “reads,” you won’t be included in the AI’s summary. This creates a double-jeopardy scenario for publishers: a loss in Google rankings leads to a simultaneous loss in AI referral traffic and brand mentions.
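The retrieval-then-synthesis pipeline described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual implementation: `search_index()` stands in for a live search API (such as Bing’s or Google’s), `generate_answer()` stands in for the LLM synthesis step, and the corpus entries are invented. The point it demonstrates is structural: the AI can only cite documents that the retrieval step surfaces, so a site demoted out of the top results never enters the candidate pool.

```python
def search_index(query, top_k=5):
    # Hypothetical stand-in for a live search API: returns the top-ranked
    # documents for the query. In a real system, a site demoted in the
    # underlying index simply never appears in this list.
    corpus = [
        {"url": "https://example.com/guide", "rank": 1, "text": "In-depth guide..."},
        {"url": "https://example.org/news", "rank": 2, "text": "Recent coverage..."},
    ]
    return sorted(corpus, key=lambda d: d["rank"])[:top_k]

def generate_answer(query, documents):
    # Hypothetical stand-in for the LLM synthesis step: the answer is
    # composed from, and cites, only the retrieved documents.
    citations = [d["url"] for d in documents]
    return {"answer": f"Synthesized answer to: {query}", "citations": citations}

query = "what is retrieval-augmented generation?"
response = generate_answer(query, search_index(query))
print(response["citations"])
# → ['https://example.com/guide', 'https://example.org/news']
```

Whatever never makes it into `documents` cannot appear in `citations`, which is the double-jeopardy mechanism in code form.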
The Impact of Google’s Helpful Content Updates
The 11 sites analyzed were primarily victims of Google’s shift toward prioritizing “helpful content” created for humans rather than search engines. Over the past two years, Google has refined its ability to identify sites that exist primarily to capture search traffic through mass-produced, low-value, or overly optimized content.
When the Helpful Content Update (HCU) hits a site, it often results in a site-wide suppression of visibility. The analysis shows that AI models are effectively “inheriting” these quality signals. If Google determines that a site lacks E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness), AI models seem to reach the same conclusion, likely because they rely on Google’s (or Bing’s) index to filter for quality.
ChatGPT, in particular, showed the strongest correlation in the study. This suggests that OpenAI’s search integrations are heavily reliant on the authority signals already established by major search engines. For publishers, this means that the “quality” of their content is being judged by a singular standard that governs both traditional and generative search.
ChatGPT vs. Perplexity: Different Degrees of Impact
While the correlation between Google visibility and AI citations is broad, the degree of impact varies across different platforms. The analysis noted that while ChatGPT showed a very tight correlation with Google’s organic losses, other platforms like Perplexity AI sometimes showed more resilience—though they were not entirely immune.
ChatGPT’s search functionality appears to prioritize highly authoritative, “mainstream” sources that are already dominant in search engine result pages (SERPs). When a niche site loses its standing in Google, ChatGPT is quick to replace it with a more “stable” source like Wikipedia, a major news outlet, or a high-authority Reddit thread.
Perplexity, on the other hand, occasionally sources from a wider variety of “long-tail” results. However, even in Perplexity, the downward trend for the 11 impacted sites was visible. This indicates that while different AI models have different “sorting” algorithms for their citations, they all rely on the same fundamental data: the searchable web. If a site is excluded from the top tier of the web index, it loses its “sourceability” across the entire AI ecosystem.
The Role of E-E-A-T in the AI Era
Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T) have been the cornerstone of Google’s search quality evaluator guidelines for years. The recent data suggests that E-E-A-T is now the cornerstone of AI optimization as well.
AI models are designed to provide accurate, safe, and helpful information. To do this, they must prioritize sources that have a high degree of perceived authority. When an AI search engine chooses which site to cite, it looks for signals that indicate the site is a leader in its field. These signals include:
1. Backlink Profile: High-quality links from other authoritative sites remain a primary signal of trust that both Google and AI models recognize.
2. Brand Recognition: If a brand is frequently mentioned across the web in a positive context, AI models are more likely to include it in their knowledge base and citations.
3. Content Depth and Originality: AI models are increasingly adept at identifying “copycat” content. Sites that provide unique data, first-hand experience, and original perspectives are more likely to be cited as primary sources.
For the 11 sites in the study, a common thread was a perceived lack of these deep E-E-A-T signals, which led to their decline in both Google and AI search results.
The Future of “Generative Engine Optimization” (GEO)
As the link between traditional organic visibility and AI citations becomes clearer, the strategy for SEO is evolving into what some call Generative Engine Optimization (GEO). However, GEO is not a replacement for SEO; it is an extension of it.
To maintain visibility in AI search, publishers must focus on:
Maintaining Organic Authority
Since AI search engines use traditional search indexes to find sources, you cannot have high AI visibility without a healthy organic presence. This means continuing to follow SEO best practices: optimizing for Core Web Vitals, maintaining a clean site architecture, and ensuring content is crawlable and indexable.
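Crawlability is the easiest of these to verify programmatically. As a minimal sketch using only Python’s standard-library robots.txt parser, the following checks whether a given crawler user agent may fetch a URL; the robots.txt rules and paths here are illustrative (GPTBot is OpenAI’s published crawler user agent, but your own site’s rules will differ):

```python
from urllib.robotparser import RobotFileParser

# Parse an illustrative robots.txt: GPTBot is blocked from /private/,
# all other agents are allowed everywhere.
rp = RobotFileParser()
rp.parse([
    "User-agent: GPTBot",
    "Disallow: /private/",
    "",
    "User-agent: *",
    "Disallow:",
])

print(rp.can_fetch("GPTBot", "https://example.com/articles/seo"))  # True
print(rp.can_fetch("GPTBot", "https://example.com/private/page"))  # False
```

A page that a search or AI crawler cannot fetch can never be retrieved, ranked, or cited, so a check like this is a sensible first diagnostic when AI visibility drops.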
Niche Dominance and Topical Authority
AI models tend to cite sites that are “authorities” on a specific topic. Instead of trying to cover everything, publishers should focus on becoming the go-to resource for a specific niche. This topical authority makes it more likely that an AI will select your site when a user asks a question within that specific domain.
Structuring Content for LLM Consumption
While AI models are good at reading natural language, structured data (Schema markup) and clear, concise headings help AI models parse and extract information more efficiently. Using clear H2 and H3 tags, bulleted lists, and direct answers to common questions increases the “citability” of your content.
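Schema markup of the kind mentioned above is typically embedded as JSON-LD. As a hedged sketch (the question and answer text are invented for illustration), this builds a Schema.org `FAQPage` object, one of the structured-data types that gives parsers a machine-readable question-and-answer pair to extract:

```python
import json

# Illustrative Schema.org FAQPage structured data. The Q&A text is a
# placeholder; real markup should mirror content visible on the page.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Does Google visibility affect AI citations?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Sites that lose organic visibility in Google tend "
                        "to lose AI citations as well.",
            },
        }
    ],
}

# The serialized object goes inside a <script type="application/ld+json">
# tag in the page's <head>.
print(json.dumps(faq, indent=2))
```

The direct question-plus-answer shape mirrors the advice above: a concise, self-contained answer is exactly the unit an AI model can lift and cite.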
Why High-Quality Content is the Only Path Forward
The data from the analysis of these 11 sites serves as a warning for the digital publishing industry. The era of “gaming the system” with low-quality, high-volume content is coming to an end. In the past, a site might lose its ranking in Google but still find traffic through other channels. In the new era, a loss of organic visibility in Google is a signal of a broader loss of digital authority that will exclude a site from the AI conversation entirely.
AI models are the ultimate “distillate” of the web. They take the vast amount of information available and filter it down to the most essential truths and sources. If your content is not deemed essential, high-quality, or trustworthy by the initial filters of the web (search engines), it will never make it into the distilled output of an AI response.
Conclusion: The Convergence of Search and AI
The analysis of AI citations and Google organic visibility changes confirms that we are not entering a post-SEO world, but rather a more rigorous SEO world. The correlation between losing Google visibility and losing AI citations is a clear indicator that the “signals” Google uses to rank sites are the same signals AI models use to select their sources.
For SEOs and site owners, the mission remains the same: create the most helpful, authoritative, and trustworthy content possible. The platforms may change, and the way users interact with information may evolve from clicking links to reading AI summaries, but the value of a high-quality source remains constant. If you want to be cited by the AI of tomorrow, you must be respected by the search engines of today.
As AI search continues to mature, we can expect the integration between search indexes and LLMs to become even tighter. The sites that thrive will be those that view SEO and AI visibility as two sides of the same coin, focusing on brand authority and content integrity above all else.