AI Max for Search vs DSAs: what you lose when you accept the trade
Google is replacing Dynamic Search Ads with AI Max for Search. Better automated targeting and easier set-up in exchange for less visibility into which query matched, which page got served, and what triggered the auction. The five-year pattern: slightly better metrics, slightly less transparency. Here is how to take the upside without losing the audit layer.
Google is sunsetting Dynamic Search Ads (DSAs) in favour of AI Max for Search. The pitch is the one Google has been making for five years on every product cycle: better automated targeting, smarter use of landing-page content, less manual work. The migration is happening regardless of whether you like it: DSAs will stop accepting new budgets, then stop serving entirely.
The fine print: less visibility into what triggered your ad, which query matched, which page was served. More decisions made inside a black box you can audit only through aggregate reporting that Google itself controls.
This is not new. It is the same trade Google has been offering since broad-match modifier disappeared, since search-terms reports started hiding low-volume queries, since Performance Max blurred the line between Search and the rest of the platform. Each product cycle: slightly better benchmark performance in exchange for slightly less control.
What AI Max actually changes
AI Max for Search is, in practical terms, three things bundled together. Understanding what each does is the difference between using it competently and being used by it.
- Generative ad copy. Headlines and descriptions written by Google's models, served dynamically, optimised against engagement signals you cannot see directly. Replaces the discipline of writing your own RSA variants and watching which combinations perform.
- Asset-driven targeting. The system pulls from your landing pages, sitelinks, and uploaded assets to decide what user intent each impression is targeting. Replaces the precision of keyword-level intent matching with a model's inference.
- Bid logic tied to "value". AI Max prefers bidding strategies that optimise toward conversion value (or proxy values it infers). Replaces tCPA-style discipline with a model that decides what each click is worth.
Each of these moves a decision that used to be the operator's into the platform. The marginal click probably converts at a slightly better rate when the system makes those choices. The catch is what happens at the margins you cannot see: the queries you would never have bid on, the audiences you would have excluded, the pages you would have refused to serve from.
What you actually lose
Three specific losses that show up 30-60 days after the migration:
- Search-term exposure shrinks further. AI Max search-term reports surface aggregate categories, not individual queries. You can see "branded vs non-branded" or "high-intent vs research" splits. You cannot drill down to the specific query that was expensive last week, the way DSA search-term reporting lets you. That kind of waste is invisible by design.
- Assisted-conversion paths become harder to verify. Because AI Max optimises against blended value signals, the attribution path for any given conversion is reconstructed inside the model. Last-click reporting still works; understanding which keywords actually moved revenue in the upper funnel does not, without separate analysis infrastructure.
- Account-level "what changed?" stops being answerable in the UI. When something shifts in CPA or volume, you used to be able to pull a search-term report, an impression-share report, and a placement report and triangulate. AI Max consolidates these into a smaller set of dashboards optimised for explanation, not investigation.
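One way to keep "what changed?" answerable once the UI consolidates its reports is to snapshot whatever aggregate export you still get each week and diff the snapshots yourself. A minimal sketch in Python: the `CategoryWeek` shape and the category names are assumptions for illustration, not a Google Ads export format.

```python
from dataclasses import dataclass

@dataclass
class CategoryWeek:
    category: str      # an aggregate bucket, e.g. "high-intent commercial", not a query
    cost: float
    conversions: int

def flag_drift(last_week, this_week, cpa_jump=0.25):
    """Flag categories whose CPA moved more than `cpa_jump` (relative) week over week."""
    prev = {row.category: row for row in last_week}
    flags = []
    for row in this_week:
        old = prev.get(row.category)
        if not old or old.conversions == 0 or row.conversions == 0:
            continue  # new category or no conversions: nothing comparable to diff
        old_cpa = old.cost / old.conversions
        new_cpa = row.cost / row.conversions
        if abs(new_cpa - old_cpa) / old_cpa > cpa_jump:
            flags.append((row.category, round(old_cpa, 2), round(new_cpa, 2)))
    return flags
```

Run weekly against two consecutive snapshots and anything the function flags becomes the starting point for investigation, which is exactly the triangulation the consolidated dashboards no longer support.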
Practitioners who resist AI Max entirely tend to lose on headline metrics: the system genuinely does deliver decent default performance, and refusing to use it leaves volume on the table. Practitioners who accept it wholesale tend to miss the inefficiencies that aggregate reporting conceals, because the audit layer they used to rely on is no longer in the box.
What to actually do
Building an independent audit layer is the answer. Specifically:
- Run a parallel "old-school" Search campaign alongside AI Max. Same products or services, exact-match keywords, manual structure. Use it as a benchmark: when AI Max says you cannot bid lower on a query, the parallel campaign tells you what the true incremental cost is on that intent.
- Pull search-term reports weekly. Even with aggregated reporting, weekly cadence catches drift. What was "high-intent commercial" two weeks ago might be "research-level" now if the model's inference shifted. See the most common ways Google Ads accounts leak money for the longer version of why this matters.
- Wire your own conversion value model into the account. Enhanced Conversions for Leads plus offline conversion import plus Conversion Value Rules. If AI Max is optimising toward "value", you control what value means rather than letting the platform's defaults define it. Deeper version in Conversion Value Rules: the most overlooked setting in Google Ads.
- Hold a quarterly structural review against business outcomes. AI Max performance against an in-platform target says nothing about whether the campaigns are serving the business. Compare AI Max conversion volume against actual closed deals, qualified leads, lifetime value in the CRM. If the platform reports growth and the business does not see it, something in the modelling is wrong.
- Document what you are giving up. When AI Max becomes the default, write down the specific controls you used to have and where their substitutes live now. Someone joining the team six months from now needs to understand the gap.
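If you are defining "value" yourself rather than accepting platform defaults, the mapping from lead stage to value is worth making explicit before you encode it as Conversion Value Rules or an offline import. A minimal sketch; the stage names and figures below are illustrative assumptions, not recommendations, and should come from your own CRM economics.

```python
# Hypothetical stage-to-value mapping; replace with figures from your own CRM.
STAGE_VALUES = {
    "form_fill": 5.0,      # raw lead, mostly unqualified
    "qualified": 40.0,     # passed sales qualification
    "closed_won": 400.0,   # actual revenue event
}

def lead_value(stage: str) -> float:
    """Value to report back to the ad platform for a lead at `stage`.

    Unknown stages are worth 0.0 so the platform never optimises toward them."""
    return STAGE_VALUES.get(stage, 0.0)
```

The point of writing it down as code, even trivially, is that the definition of value is now versioned by you rather than living implicitly in the platform's defaults.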
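The quarterly review above can be reduced to one tracked number: CRM-verified outcomes per platform-reported conversion. A minimal sketch, assuming you can pull both counts for the same date ranges; the 20% divergence threshold is an arbitrary illustration, not a benchmark.

```python
def reconciliation_trend(quarters):
    """quarters: list of (platform_conversions, crm_outcomes) pairs, oldest first.

    Returns the per-quarter ratio of CRM-verified outcomes to platform-reported
    conversions. A falling ratio means reported growth the business cannot see."""
    ratios = []
    for platform, crm in quarters:
        ratios.append(round(crm / platform, 3) if platform else 0.0)
    return ratios

def diverging(quarters, drop=0.2):
    """True if the latest ratio has fallen more than `drop` (relative) vs the first."""
    ratios = reconciliation_trend(quarters)
    if len(ratios) < 2 or ratios[0] == 0:
        return False
    return (ratios[0] - ratios[-1]) / ratios[0] > drop
```

If `diverging` comes back true, the platform's modelled conversions are growing faster than the business results, which is precisely the "growth the business does not see" failure mode described above.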
The deeper point
If the only lens you have is Google's reporting, you are evaluating their product with their ruler. The ruler is not neutral: the company designing it has commercial reasons to define performance in ways that reward continued spend on its platforms. None of this is malicious. It is just what self-reported metrics from any commercial vendor look like.
The answer is not to boycott AI Max. The platform is moving, the volume is real, and refusing to migrate is a way to lose efficiency without saving any independence. The answer is to build the audit layer that sits beside it: independent conversion tracking, independent business-outcome measurement, independent decision-making on what gets retained when the platform changes its defaults next quarter.
This is the same principle that drives every part of our approach to PPC management. The platforms are useful tools. They are not the source of truth about whether your advertising is working. That truth lives in your CRM, your accounting, your customer support inbox, and the audit work that connects platform spend to those outcomes is the part of "management" most agencies do badly and most clients do not realise they are missing.
If you want a free review of how your account would look under AI Max with a proper audit layer in place, book a free audit. We will show you what's likely to move when the migration completes, where the gaps in your tracking will become visible, and what to put in place before they do.
Get a free PPC audit from the team that wrote this.
We'll review your Google Ads or Microsoft Ads account and show you three specific things we'd change in the first 30 days.
