
Do AI Tools Matter for Search Yet?



We looked at our own data — and the answer is complicated

AI tools are changing how people find information. That’s not just a marketing observation, it’s a search behavior one. Whether you run a public website, a product catalog, a knowledge platform, or an enterprise search tool, what’s happening in public AI search is relevant to you: not just because it might send you traffic, but because it’s reshaping what people expect from search everywhere they encounter it.

The data on AI referral traffic offers a useful reality check on a lot of overheated claims. But the more interesting question sits behind the numbers. What does AI-assisted search behavior tell us about how people now expect to find things? And what does that mean for the search experiences you’re responsible for?

We looked at our own data to find out. The short answer: AI tools matter more than the traffic numbers suggest, and the implications extend well beyond your analytics dashboard.

What we can measure: the referral traffic picture

The data is consistent across multiple independent sources. SEO platform BrightEdge analyzed thousands of queries across major brands from January to August 2025, and the results were a bit surprising.

Their research concluded that AI traffic currently accounts for less than 1% of total referral traffic.

Source: BrightEdge AI Search Report, 2025.

Organic search remained the primary driver of conversions across all industries. Other research tells a similar story — Conductor’s analysis across 10 industries put AI referral share at around 1% through late 2025.

Comparing regular search traffic to AI search traffic on your website

Our own data from Google Analytics — July 2025 to March 2026 — reinforces these findings. At 2.3%, our AI referral share is above the broad industry average, likely reflecting an audience of search practitioners and technology evaluators who tend to be early AI adopters. But in volume terms, traditional search isn’t under threat.
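Measuring your own AI referral share is straightforward, because visits from standalone AI tools carry identifiable referrer hostnames. Below is a minimal sketch: the session rows are made-up stand-ins for an analytics export, and the hostname list is illustrative rather than exhaustive.

```python
# Sketch: estimate AI referral share by classifying referrer hostnames.
# The hostname list is illustrative, not exhaustive; session rows are toy data.
from urllib.parse import urlparse

AI_REFERRER_HOSTS = {
    "chatgpt.com",
    "chat.openai.com",
    "perplexity.ai",
    "gemini.google.com",
    "claude.ai",
}

def is_ai_referral(referrer_url: str) -> bool:
    """True if the referrer hostname matches a known AI tool domain."""
    host = (urlparse(referrer_url).hostname or "").lower().removeprefix("www.")
    return host in AI_REFERRER_HOSTS

sessions = [  # toy rows standing in for an analytics export
    {"referrer": "https://www.google.com/"},
    {"referrer": "https://chatgpt.com/"},
    {"referrer": "https://www.perplexity.ai/search"},
    {"referrer": ""},  # direct traffic, no referrer
]

ai_sessions = sum(is_ai_referral(s["referrer"]) for s in sessions)
share = ai_sessions / len(sessions)
print(f"AI referral share: {share:.1%}")
```

Note the important caveat discussed below: this approach only captures standalone AI tools. Clicks from Google's AI Overviews arrive with a Google referrer and cannot be separated out this way.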

But the quality signal is striking

When we looked at how those AI-referred visitors actually behaved on site, the picture shifted. Across ChatGPT, Perplexity, Gemini, and Claude.ai, visitors engaged at an average rate of 44.7% — just five percentage points below Google organic.

The explanation is intuitive. Someone who has already asked an AI tool a specific research question and then clicked through to your site isn’t browsing; they’re verifying. That’s a fundamentally different level of intent from someone clicking the third result on a search results page. Independent research backs this up: SEMrush found LLM visitors convert at 4.4 times the rate of standard organic search visitors — suggesting the quality gap between AI-referred and organic traffic may be even smaller than engagement rates alone indicate.

Note that the differences between individual tools — ChatGPT at 46.4%, Perplexity at 45.8%, Gemini at 38.5%, Claude.ai at 36.4% — shouldn’t be over-interpreted yet. Each is at a different point on its adoption curve, and the variation likely reflects that more than any meaningful difference in traffic quality.

AI referral traffic isn’t a volume story. It’s a quality story, and the quality is already comparable to organic search.

Honestly, the data was more reassuring than we expected. But one significant gap remains: buried inside our Google traffic is a number we simply can’t see. And that matters.

The Google blind spot

There’s an important caveat to all of this: Google’s AI Overviews and AI Mode traffic is invisible in standard analytics reporting. When Google serves an AI-generated summary at the top of a search results page, any click from that summary is attributed to ‘google / organic’ – indistinguishable from a standard blue-link click.

That matters because AI Overviews are now ubiquitous. Research from TrustRadius found that 72% of B2B buyers encounter Google’s AI Overviews during their research. The Pew Research Center’s controlled study of 900 participants found that searches displaying AI Overviews saw click-through rates drop to 8%, compared to 15% for traditional results – but crucially, TrustRadius found that 90% of buyers click through to sources actually featured in the overview.

Being cited in a Google AI Overview is therefore very different from being absent from one. And your analytics won’t tell you which of the two is happening.

But the click is only half the story

AI tools are influencing how people research, compare, and shortlist options long before they visit any website, meaning the decisions that matter are often already forming before your analytics register a single session. That’s as true for someone evaluating an enterprise search vendor as it is for someone choosing a SaaS product, an information service, or a platform tool.

The trajectory reinforces the point. AI-referred sessions grew 527% year-on-year through 2025; this is not a channel to park in the “monitor later” bucket. By the time the volume becomes undeniable, the citation patterns and brand associations will already be established — much like organic search in the early 2000s, when the companies that understood PageRank early built advantages that took years for latecomers to close.

What public AI search behavior means for search on your own platform

The habits people are forming with ChatGPT, Google AI Mode, and Perplexity don’t stay in those tools. They come with users when they arrive on your website, your product catalog, your member portal, or your knowledge platform. Google’s own data shows “Tell me about” queries jumped 70% year-over-year in 2025. Users are increasingly treating search as a conversation, not a keyword-matching exercise. The tolerance for a search box that returns a list of links is dropping fast.

This plays out differently depending on your context:

  • E-commerce and product search — Users now expect to describe what they need conversationally rather than navigate category trees or guess the right keyword. Site search that can’t handle natural language is falling behind user expectations, not just technical benchmarks.
  • Information publishers and knowledge platforms — The competitive question is no longer “does your search return the right document?” It’s “does your search answer the question?” That’s a fundamentally different design problem, and one that’s increasingly urgent as public AI tools improve.
  • Enterprise and internal search — People using AI search are not looking for a list of places to go. They want an answer complete enough to act on. That expectation doesn’t switch off when someone opens an internal tool.

Public AI search behavior is a product signal, not just a marketing metric. The experience your users are getting from Google AI Mode and ChatGPT is actively setting the bar for what they’ll expect from search everywhere — including yours.

It’s a question we work through with our customers every day – and the answer almost always starts with understanding how their users actually search, not just what they’re searching for.

What this means in practice

For your external presence:

  • Good SEO is still the foundation. AI tools predominantly cite sources that already rank well organically. Ahrefs found pages ranking #1 in Google are 3.5× more likely to be cited by ChatGPT than those outside the top 20. The best investment in AI visibility is continued investment in content quality and authority.
  • Structure and directness matter more than they used to. Content that directly answers questions, uses clear headings, and makes specific citable claims performs well in both traditional and AI search. No new strategy required, just a higher bar on execution.
  • Your whole information footprint counts. Reviews, analyst mentions, case studies, and press coverage all contribute to how AI tools represent your brand — not just your own domain.

For your own search experience:

  • Mind the expectation gap. Users arriving from AI tools have been trained to ask questions conversationally, not just type keywords. Ask whether the search you’re delivering still matches what they now expect: the gap between public AI tools and most site search or knowledge platforms is widening, not narrowing.
  • Hybrid search is increasingly the right prescription. Search that combines keyword precision with semantic understanding handles both ends of the spectrum — a simple keyword query and a complex conversational question — without forcing a choice. It’s what users have been trained to expect from public AI tools, and it’s fast becoming the baseline for any serious search experience.
  • You don’t need an expensive platform to start understanding your gap. Systematically asking the major AI tools the questions your customers ask, and noting whether your content appears, costs nothing and tells you a lot about where your visibility and your search experience both stand.
  • Watch your AI referral sources as a leading indicator. The volume is small now but the trend is the signal, and it’s one worth tracking before it becomes impossible to ignore.
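The hybrid approach above can be sketched with reciprocal rank fusion (RRF), one common way to merge a keyword ranking with a semantic one without having to reconcile their incompatible score scales. The document IDs and rankings here are illustrative; `k=60` is the conventional RRF constant.

```python
# Sketch: reciprocal rank fusion (RRF) for hybrid search.
# Each input is a ranked list of document IDs; scores are 1 / (k + rank),
# so documents that rank well in BOTH lists rise to the top.
def rrf_fuse(rankings, k=60):
    """Combine multiple ranked lists of doc IDs into one fused ranking."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_ranking = ["doc_a", "doc_b", "doc_c"]    # e.g. BM25 order
semantic_ranking = ["doc_c", "doc_a", "doc_d"]   # e.g. embedding-similarity order

fused = rrf_fuse([keyword_ranking, semantic_ranking])
print(fused)  # doc_a ranks highly in both lists, so it comes out on top
```

Production systems often use learned weights or more sophisticated fusion, but RRF illustrates the core idea: neither the keyword signal nor the semantic signal has to win outright.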

The verdict

Do AI tools matter for search yet? Yes, but probably not in the way most of the coverage suggests.

The referral volume is still small, and the data was more reassuring than we expected going in. But referral clicks were always an incomplete way to measure influence, and the behavioral shift happening in public AI search is real regardless of what your analytics show. Users are being trained — right now — to expect conversational, synthesized, direct answers from search. That expectation doesn’t stay in ChatGPT or Google AI Mode. It comes with them everywhere they search.

That’s the part worth acting on. Not panic-buying a GEO platform but honestly asking whether the search experiences you’re responsible for are keeping pace with what your users now expect. The gap between public AI search and most site search, knowledge platforms, and internal tools is widening. Hybrid search is a practical starting point. Better content structure and visibility across the broader information ecosystem are the foundation.

The early SEO parallel is worth taking seriously. The companies that paid attention early built advantages that took latecomers years to close. The same dynamic is forming here, quietly, in referral logs that still look small.

If you’d like to discuss what this means for your website, platform, or search experience, we’d love to talk.

– The Pureinsights Team
