How to Find Wasted Spend in Google Ads: A Diagnostic Framework
Most Google Ads accounts have 20 to 40 percent wasted spend hiding behind healthy averages. Here is the diagnostic framework that finds it.
Last quarter I audited an account spending $25,000 per month on Google Ads. Total conversions looked healthy. CPA was within the target range. The marketing manager reported to leadership that paid search was performing well.
Here is what the audit found. $7,200 per month flowing to search terms for competitor products nobody had intended to bid on. $3,100 going to Display placements through two Search campaigns that had the Display Network enabled since launch. $1,800 on keywords with quality scores of 1 and 2, paying a 400% premium per click. Location targeting set to “Presence or Interest,” leaking budget to users in states the business does not serve.
Total wasted spend: roughly $14,000 per month. 56% of budget. The account looked fine because the winners were strong enough to mask the losers.
This is not an outlier. In most accounts I audit, Google Ads wasted spend represents 20 to 40 percent of total budget. Here is the diagnostic framework I use to find it.
Why Google Ads wasted spend hides behind healthy averages
Waste does not show up in aggregate metrics. It hides behind them.
An account has 10 campaigns. Three produce conversions at $40 CPA against a $75 target. Four produce conversions at $90 CPA. Above target, but not alarming on their own. Three produce zero conversions while spending $4,000 each per month.
Blended CPA across all 10 campaigns: $72. Under the $75 target. The account looks fine.
But those three zero conversion campaigns represent $12,000 per month in pure waste. Remove them and the effective CPA of the remaining seven drops to $58. That $12,000 could fund more of the $40 CPA campaigns that are already working.
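The masking effect is easy to verify with a few lines of arithmetic. The campaign figures below are illustrative, in the spirit of the 10-campaign example above rather than its exact numbers; a minimal sketch:

```python
# Illustrative campaign figures (not the exact numbers above): three
# winners, four above-target campaigns, three spending with zero conversions.
campaigns = (
    [{"spend": 4000, "conversions": 100}] * 3    # $40 CPA winners
    + [{"spend": 3600, "conversions": 40}] * 4   # $90 CPA, above target
    + [{"spend": 4000, "conversions": 0}] * 3    # pure waste
)

def blended_cpa(rows):
    spend = sum(r["spend"] for r in rows)
    conversions = sum(r["conversions"] for r in rows)
    return spend / conversions if conversions else float("inf")

overall = blended_cpa(campaigns)
productive = [r for r in campaigns if r["conversions"] > 0]
effective = blended_cpa(productive)

print(f"blended CPA:   ${overall:.0f}")    # the number leadership sees
print(f"effective CPA: ${effective:.0f}")  # cost of the spend that works
```

With these figures the blended number sits near target while the zero-conversion campaigns quietly burn $12,000. The same two-line comparison works on any campaign export.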
Most teams check overall CPA, overall ROAS, or total conversions. Those numbers tell you whether the account is working in aggregate. They do not tell you where money is being burned. Finding wasted spend in Google Ads requires going layer by layer.
Layer one: campaign settings that leak budget from day one
Four campaign settings in Google Ads default to configurations that leak money from the moment a campaign is created. In the $25,000 per month account, these settings accounted for $4,900 of the waste.
Display Network on Search campaigns. Enabled by default at campaign creation. This sends a portion of search budget to display placements where purchase intent is near zero. In this account, two campaigns had it enabled. The display placements generated 14,000 impressions, 83 clicks, and zero conversions. Total cost: $3,100 per month. To check: open Campaign Settings, look under Networks. If “Google Display Network” is checked on a Search campaign, uncheck it.
Location targeting. This account targeted the United States. But the business only served 12 states. The location report showed 15% of clicks coming from California, Texas, and Florida, where the company had no presence. Those clicks cost $1,800 per month. Zero converted. The default setting “Presence or Interest” was the cause. It showed ads to anyone who searched for topics related to the target area, not just people physically there. Switching to “Presence only” eliminated this waste.
Search Network partners. Enabled by default. In this account, partner traffic represented 9% of spend and converted at roughly one third the rate of Google Search. A smaller leak, but a consistent one.
Auto applied recommendations. Google had automatically added broad match versions of keywords that already existed as exact match. Both versions triggered the same search terms, competing against each other in the same auction and inflating CPCs by 15 to 20% on the affected keywords.
These settings are covered in more depth in the 15 point audit checklist I run on every account. The point here is the dollar impact. $4,900 per month from settings that were never reviewed after launch.
Layer two: keywords spending without converting
The next layer requires going keyword by keyword. The threshold I use: any keyword that consumed more than three times the target CPA without a single conversion has had enough data to prove it does not work. In this account, with a $75 target CPA, that means any keyword that spent $225 or more with zero conversions.
Sorting active keywords by spend and filtering for zero conversions surfaced 12 keywords above the threshold. Total spend: $4,800 in the past 30 days.
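The filter itself is simple to script. The keyword rows below are hypothetical stand-ins for a 30-day keyword export; a sketch of the three-times-CPA threshold:

```python
# Hypothetical keyword export rows (keyword, 30-day spend, conversions).
# The threshold is 3x the target CPA: enough spend to call a keyword out.
TARGET_CPA = 75
THRESHOLD = 3 * TARGET_CPA  # $225

keywords = [
    {"keyword": "enterprise resource planning consulting", "spend": 380, "conversions": 0},
    {"keyword": "business software solutions",             "spend": 290, "conversions": 0},
    {"keyword": "erp software pricing",                    "spend": 140, "conversions": 0},  # below threshold
    {"keyword": "sap consulting services",                 "spend": 610, "conversions": 9},  # converting
]

# Flag anything that consumed 3x target CPA with nothing to show for it.
flagged = [k for k in keywords if k["conversions"] == 0 and k["spend"] >= THRESHOLD]

for k in flagged:
    print(f'{k["keyword"]}: ${k["spend"]}, 0 conversions -> review search terms before pausing')
```

Note the deliberate asymmetry: a keyword below the threshold is left alone, because it has not yet had enough budget to prove anything either way.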
But before pausing any of them, there is a surgical step that most teams skip.
One keyword, “enterprise resource planning consulting,” spent $380 with zero conversions. I checked its search terms. It had triggered “ERP implementation consulting for manufacturing” (3 clicks, $45) and “SAP consulting services for mid market” (7 clicks, 2 conversions, $105). The keyword itself looked like a loser. But it was generating high intent terms that actually converted. I saved those terms as exact match keywords, then paused the broad match parent. The discovery value was preserved. The waste was cut.
Another keyword, “business software solutions,” spent $290 with zero conversions. Its search terms were all generic: “software solutions,” “business solutions meaning,” “free software solutions.” No specificity. No intent. This keyword was pure waste.
Then the quality score layer. Eight keywords in the account had quality scores between 1 and 3. At those scores, Google charges a 200 to 400% premium per click compared to a keyword with a quality score of 7. Combined spend on these eight keywords: $2,100 per month. Conversions: 2. Effective CPA: $1,050. The account average was $72.
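The effective CPA math on those low quality score keywords is worth running explicitly, using the figures just cited:

```python
# Figures from the audit above: eight keywords with quality scores of 1 to 3.
low_qs_spend = 2100        # combined monthly spend on those keywords
low_qs_conversions = 2
account_avg_cpa = 72

effective_cpa = low_qs_spend / low_qs_conversions
premium = effective_cpa / account_avg_cpa
print(f"effective CPA on low-QS keywords: ${effective_cpa:.0f} "
      f"(~{premium:.0f}x the account average)")
```

A keyword does not need a large budget to be expensive; it only needs a bad ratio.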
I go deeper on the process of reading search terms surgically in how to read a search terms report like a senior account manager.
Layer three: search terms you are paying for but never chose
This is where the largest volume of wasted spend typically hides. Not in a few large line items, but in hundreds of small ones.
This account generated 1,400 unique search terms in 30 days. Of those, 890 had spend with zero conversions. Individually, most spent under $20. No single term looked significant enough to trigger a review. In aggregate, those 890 terms consumed $8,200. One third of total budget.
The same pattern appears in nearly every account running broad match or phrase match with smart bidding. Google expands match types aggressively. The algorithm tests thousands of query variations. Some work. Many do not. Without regular search term reviews, the waste accumulates silently.
I categorized the 890 non converting terms into four groups.
Competitor and brand names. 180 terms, $1,900. Searches like “Smith and Associates law firm” and “Jones Legal Group reviews” that triggered because broad match keywords saw loose semantic similarity to the business category. Unless there is a deliberate competitor strategy, this is budget going to people looking for someone else.
Irrelevant queries. 340 terms, $3,400. “Law school requirements,” “how to become a lawyer,” “legal assistant salary,” “paralegal certification online.” Completely unrelated to the service but matched because the words overlap.
Informational and low intent. 220 terms, $1,800. “Do I need a business lawyer,” “how much does a commercial lease lawyer cost,” “types of business attorneys.” People researching, not hiring. These may have a place in a deliberate awareness strategy but are waste when running on campaigns optimized for leads.
Relevant but not converting. 150 terms, $1,100. “Commercial real estate attorney downtown,” “business contract lawyer for startups.” These match the service and show real intent. The issue may be the landing page, the offer, or simply insufficient data. Do not add these as negatives too quickly.
The fix is not adding 890 individual negative keywords. It is building themed negative keyword lists. A “careers and education” list with 15 entries covers 200 or more of those irrelevant terms. A “competitor” list with 12 entries covers all 180 competitor queries. Each themed list blocks dozens of future matches, not just the ones found today.
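A sketch of how themed lists cover whole categories of queries. The list names and entries below are illustrative, not the audited account's actual negatives, and the substring check is a simplification: real Google Ads negative keywords match whole words and phrases, not arbitrary substrings.

```python
# Themed negative keyword lists (illustrative entries). Each short list
# blocks dozens of future query variations, not just the terms found today.
negative_lists = {
    "careers_and_education": ["law school", "salary", "how to become",
                              "certification", "paralegal", "degree"],
    "competitors": ["smith and associates", "jones legal group"],
}

def blocked_by(term):
    """Return the name of the themed list that would block this term, if any."""
    t = term.lower()
    for name, entries in negative_lists.items():
        if any(entry in t for entry in entries):
            return name
    return None

print(blocked_by("legal assistant salary"))          # blocked: careers_and_education
print(blocked_by("smith and associates law firm"))   # blocked: competitors
print(blocked_by("commercial real estate attorney")) # None: relevant, keep it
```

Running every non-converting term through a function like this shows the coverage of each themed list before you commit it to the account.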
Layer four: the waste that never shows up in standard reports
The first three layers are visible if you know where to look. This layer requires pulling reports that most teams never open.
Geographic waste. The location report for this account showed 22% of spend coming from five states that produced zero conversions over 90 days. The business operated in 12 states. The remaining spend was geographic waste, showing ads to people in markets the company could not serve. The fix: exclude the non-serviceable states outright in location settings, since location bid adjustments cannot go all the way to zero.
Device waste. This was a B2B account where the buying process happens on desktop. Mobile clicks cost roughly the same as desktop clicks, but mobile converted at 40% of the desktop rate while generating 55% of total clicks. A 40% negative bid adjustment on mobile reduced that imbalance. In some accounts, segmenting campaigns by device gives even more control.
Time of day waste. The hour of day report showed 18% of spend occurring between 8pm and 6am. For a B2B service, nobody is answering phones or reviewing form submissions at midnight. Leads submitted at 2am sit untouched until 9am the next day. By then they have gone cold or contacted a competitor who responded faster. An ad schedule limiting spend to business hours eliminated this entirely.
Weekend waste. Saturday and Sunday represented 14% of spend. Conversion rate on weekends was 60% lower than weekdays. Form submissions sat untouched for 48 hours. Reducing weekend bids by 50% cut the waste while preserving the small percentage of weekend conversions that did come through.
Each of these leaks is easy to dismiss on its own. Combined, they represented 18% of total spend in this account. The reason they persist is that no standard dashboard surfaces them. You have to pull the geographic report, the device report, the hour of day report, and the day of week report separately. Most teams never do.
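All four reports reduce to the same test: segment-level CPA against target. The rows below are hypothetical illustrations of what those pulls return, not the audited account's exact figures:

```python
# Hypothetical rows from the four reports most teams never open:
# geographic, device, hour of day, day of week.
TARGET_CPA = 75

segments = [
    {"report": "geo",    "segment": "non-serviceable states", "spend": 5500, "conversions": 0},
    {"report": "device", "segment": "mobile",                 "spend": 9000, "conversions": 45},
    {"report": "device", "segment": "desktop",                "spend": 7500, "conversions": 150},
    {"report": "hour",   "segment": "8pm-6am",                "spend": 4500, "conversions": 5},
]

results = {}
for s in segments:
    # Zero conversions means infinite CPA; anything above target gets flagged.
    cpa = s["spend"] / s["conversions"] if s["conversions"] else float("inf")
    results[s["segment"]] = "ok" if cpa <= TARGET_CPA else "review"
    print(f'{s["report"]:6} {s["segment"]:24} -> {results[s["segment"]]}')
```

One loop, four reports, and the invisible segments become a short review list.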
How waste compounds: the math most teams never run
The $25,000 per month account had $14,000 in identified waste across all four layers. That is $168,000 per year.
But waste does more than burn budget. It trains Google’s algorithm on bad data.
Every form fill from a display placement teaches the algorithm that display placements produce conversions. Every click from someone in the wrong state reinforces geographic signals that do not align with the business. Every broad match query from someone looking for a competitor tells the algorithm that those queries are worth chasing.
Over six months, the algorithm progressively tilts spend toward the patterns that produce the cheapest clicks, not the best leads. CPA drifts upward. Lead quality drifts downward. The team responds by increasing budget to maintain volume, which feeds more money into the same broken patterns.
Here is the comparison. The $25,000 per month account with waste eliminated. The remaining $11,000 in productive spend generates nearly the same number of conversions, because the waste was producing almost nothing anyway, so CPA on that spend falls from $72 to roughly $32. Impression share on the winning campaigns increases because budget is no longer consumed elsewhere. The account can scale with confidence instead of scaling waste alongside growth.
The reallocation: turning waste into growth
Identifying wasted spend is only half the framework. The other half is redirecting that budget to what is already working.
After eliminating waste in this account, $14,000 per month was freed. Here is the reallocation process.
Check impression share on the top performers. The three campaigns producing $40 CPA conversions were only showing for 35% of eligible searches. That means 65% of the time someone searched for exactly what this business sells, the ad did not appear. Budget was being consumed by waste elsewhere in the account.
Calculate the headroom. Those campaigns convert at $40 CPA with 35% impression share. Increasing their budget by $8,000 could push impression share toward 70%. At the same conversion rate, that would produce roughly 200 additional conversions per month at a CPA the business had already validated.
Fund the winners. Move the freed budget to the proven campaigns. Do not create new campaigns, test new keywords, or expand targeting. The first reallocation should always be more of what already works.
Monitor for two to four weeks. As impression share increases, CPA may rise slightly because the marginal auctions are more competitive. If CPA stays below the target, continue scaling. If it creeps above, you have found the ceiling for that keyword set and can hold the budget there.
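The headroom step above is one line of arithmetic, shown here with the article's figures:

```python
# Headroom math for the winning campaigns, using the figures cited above.
current_impression_share = 0.35   # winners showing on 35% of eligible searches
validated_cpa = 40                # proven CPA on those campaigns
freed_budget = 8000               # the slice of freed spend moved to them

extra_conversions = freed_budget / validated_cpa
print(f"~{extra_conversions:.0f} additional conversions/month if the CPA holds")
# Treat this as an upper bound, not a forecast: impression share cannot pass
# 100%, and the marginal auctions get more expensive as share climbs.
```

That ceiling is exactly what the two-to-four-week monitoring window tests in practice.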
The sequence is always the same. Kill the waste first. Then fund the winners. Trying to scale before eliminating waste amplifies both the good and the bad.
Making the diagnostic process repeatable
This framework is not a one time cleanup. The same process should run on a regular cycle.
Weekly: search term review. Thirty to sixty minutes. Waste compounds fastest here because broad match and smart bidding introduce new terms constantly. A monthly review lets four weeks of waste accumulate before anyone catches it.
Monthly: keyword and ad group review. Flag anything above the three times CPA threshold. Check quality scores. Pull the geographic and device reports.
Quarterly: full campaign level diagnostic. Settings audit, bidding strategy evaluation, account structure review, and a fresh pass through all four layers.
The accounts that maintain strong performance over time are not the ones that found waste once. They are the ones that have a process for finding it continuously. I built an AI workflow that runs this diagnostic with the same rigor on every account, every time.
This is the framework I apply to every account I manage. The free audit produces a complete waste analysis across all four layers with prioritized actions and dollar impact estimates. For operators who want to build this diagnostic process into their own weekly rhythm, coaching is where that work happens.