The question every marketer should be able to answer
Here's the uncomfortable truth: most can't. The CMO Survey found that 64% of CMOs cannot prove the ROI of their marketing spend. That's two-thirds of marketing leaders flying blind on their largest investment.
This isn't a data problem. Most brands have access to more data than ever—platform dashboards, CRM systems, sales records. The problem is that the data tells conflicting stories. Facebook says it drove the sale. Google says it drove the sale. Your media agency says awareness drove the sale. Everyone has a vested interest in looking good, and everyone's got numbers to back them up.
Without an independent measurement system, you're making budget decisions on a biased foundation. And the further your brand moves away from direct-response performance metrics, the worse this problem becomes.
Why platform reporting doesn't answer this
Every digital platform reports conversions through its own attribution model, typically last-click or a variation of it. Facebook claims the conversion if the customer's last interaction before purchase was on Facebook; Google claims it if the last touch was a Google ad. This creates a fundamental problem: the same sale gets counted multiple times across different platforms.
The math doesn't add up. If the platforms' attribution claims total 120% of your actual sales, something is clearly wrong—yet this is the norm rather than the exception. Everyone credits themselves with more than their true contribution.
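To see the double counting concretely, here's a toy sketch (all figures hypothetical): sum each platform's self-reported conversions and compare the total against actual sales.

```python
# Hypothetical month: 1,000 actual sales, but each platform's
# attribution model claims credit for every conversion it touched.
platform_claims = {"Facebook": 450, "Google": 520, "TikTok": 230}
actual_sales = 1000

claimed_total = sum(platform_claims.values())
overclaim_pct = 100 * claimed_total / actual_sales

# Platforms collectively claim 120% of sales — 200 sales are
# being credited at least twice.
print(f"Platforms claim {overclaim_pct:.0f}% of actual sales")
```

Each individual dashboard looks plausible in isolation; only the sum reveals the problem.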
Offline channels make it worse. How do you attribute a sale to radio or out-of-home? You can't. So most brands either ignore them or use a broad heuristic ("brand awareness probably contributed 20%"). This leaves your biggest budget allocation—overall mix—decided by guesswork.
The core issue: attribution and incrementality are not the same thing. A customer might have seen your ad, but they would have purchased anyway. You're measuring who was the last touchpoint, not what caused the decision.
What MMM tells you about channel ROI
Marketing Mix Modelling takes a completely different approach. Instead of tracking individual customer journeys, it looks at the total picture: weekly spend across all channels, total sales that week, and patterns over time.
The model then isolates the incremental contribution of each channel—meaning the actual sales lift caused by each pound spent—while controlling for everything else. Seasonality, pricing changes, promotions, competitor activity, distribution, all of it. You get a true picture of what each channel returns.
This works because of variation over time. You need enough historical movement in spend and sales to separate the causal patterns, but once you have it, the estimation is straightforward. Display spend went up in March and so did sales—was that the advertising, or is March always a strong month? MMM controls for the seasonal baseline and isolates the true effect.
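A minimal sketch of the core idea: ordinary least squares on simulated weekly data, with a seasonality control alongside channel spend. All figures and channel effects here are invented for illustration; production MMMs add adstock, saturation curves, pricing, promotions, and far more history.

```python
import numpy as np

weeks = 52
rng = np.random.default_rng(7)

# Hypothetical weekly spend (£k) and a seasonal cycle
search = rng.uniform(10, 30, weeks)
display = rng.uniform(5, 20, weeks)
season = np.sin(np.arange(weeks) * 2 * np.pi / 52)

# Simulate sales with "true" incremental effects (unknown in practice):
# £3.50 per £ on search, £0.80 per £ on display, plus seasonality and noise
sales = 100 + 3.5 * search + 0.8 * display + 40 * season + rng.normal(0, 5, weeks)

# Fit OLS; the coefficients recover each channel's incremental £ per £ spent
# while the seasonality term absorbs the baseline cycle
X = np.column_stack([np.ones(weeks), search, display, season])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"search ≈ £{coef[1]:.2f}, display ≈ £{coef[2]:.2f} per £ spent")
```

Because the seasonal pattern is modelled explicitly, a March spike in both spend and sales doesn't get wrongly credited to the channel.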
The output is simple: each channel gets an ROI number. Not an attribution percentage. An actual return on investment. Search might deliver £3.50 for every pound spent. Display might deliver £0.80. Radio might deliver £2.10. Now you have something actionable.
What the output looks like
A typical MMM results set presents a simple ranking table:
- Channel — Search, Display, Social, Radio, OOH, Email, TV
- Annual Spend — Total budget allocated
- Incremental Revenue — Actual sales lift driven by this channel
- ROI — Incremental revenue divided by spend
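As a sketch, that ranking table boils down to a few lines of arithmetic. The spend and revenue figures below are hypothetical, chosen to match the ROI examples in this article:

```python
# Hypothetical MMM output: annual spend and incremental revenue per channel
results = {
    "Email":   {"spend": 40_000,  "incremental_revenue": 220_000},
    "Search":  {"spend": 300_000, "incremental_revenue": 1_050_000},
    "Radio":   {"spend": 150_000, "incremental_revenue": 315_000},
    "Display": {"spend": 200_000, "incremental_revenue": 160_000},
}

# ROI = incremental revenue / spend, ranked best to worst
ranking = sorted(
    ((ch, r["incremental_revenue"] / r["spend"]) for ch, r in results.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for channel, roi in ranking:
    print(f"{channel:8s} £{roi:.2f} per £ spent")
```

With these illustrative numbers, email tops the table at £5.50 per pound and display sits at the bottom at £0.80.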
What surprises most brands:
- Digital display rarely looks good. Most brands find display ROI between £0.50 and £1.50 per pound spent. It's often a brand-building channel, which is useful to know, but it's expensive relative to its incremental impact.
- Radio and OOH typically outperform expectations. They're often allocated as "brand awareness" channels with flexible targets. When measured properly, they frequently deliver better ROI than search, and certainly better than display.
- Email is almost always the winner. It's low-cost and drives immediate action. ROI often exceeds £5 per pound spent for e-commerce and direct businesses.
- TV is nuanced. If your brand is big enough to make TV efficient, it builds awareness at scale. If you're below a certain spending threshold, it looks weak. Most smaller brands would do better reallocating TV budget elsewhere.
The specific pattern depends on your category, audience, and how your market works. But the structure is always the same: some channels work harder than others.
Figure: example channel ROI ranking, with email and search outperforming traditional and social channels.
Figure: diminishing returns, with ROI efficiency decreasing as spend on a channel increases.
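Diminishing returns can be sketched with a saturating response curve. The exponential form and parameters below are illustrative assumptions, not a claim about any real channel:

```python
import math

def revenue(spend, scale=500_000, k=100_000):
    """Hypothetical saturating response: revenue flattens as spend grows."""
    return scale * (1 - math.exp(-spend / k))

# Marginal ROI = extra revenue per extra £1,000 at each spend level
marginals = []
for spend in (50_000, 100_000, 200_000):
    marginal = (revenue(spend + 1_000) - revenue(spend)) / 1_000
    marginals.append(marginal)
    print(f"at £{spend:,}: avg ROI £{revenue(spend)/spend:.2f}, "
          f"marginal ROI £{marginal:.2f}")
```

The average ROI can still look healthy while the marginal pound earns far less—which is exactly why the next pound is often better spent elsewhere.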
What to do with the results
This isn't about killing underperforming channels. That's a common misunderstanding. You run an MMM to rebalance, not to eliminate.
Here's the reality: a typical reallocation of 10-15% of your budget based on MMM findings delivers meaningful uplift. You move money from below-average channels to above-average ones. The CFO loves this because it's an efficiency play—you're not asking for more budget, you're asking to spend your existing budget smarter.
The process works like this:
- Identify your lowest-ROI channels
- Identify your highest-ROI channels with capacity to absorb more spend
- Run scenario modelling: what if we move £50k from display to email?
- Build the case to stakeholders: better performance, same budget
- Test and monitor the reallocation
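The scenario-modelling step above can be sketched like this, assuming each channel follows a hypothetical saturating response curve (shapes and parameters invented for illustration):

```python
import math

# Assumed response curves per channel: display is near saturation,
# email still has headroom. All parameters are illustrative.
curves = {
    "display": {"scale": 300_000, "k": 250_000},
    "email":   {"scale": 400_000, "k": 60_000},
}
spend = {"display": 200_000, "email": 40_000}

def revenue(channel, s):
    c = curves[channel]
    return c["scale"] * (1 - math.exp(-s / c["k"]))

def total(plan):
    return sum(revenue(ch, s) for ch, s in plan.items())

before = total(spend)

# Scenario: move £50k from display to email, same total budget
scenario = {"display": 150_000, "email": 90_000}
after = total(scenario)
print(f"uplift from reallocation: £{after - before:,.0f}")
```

Because the budget is unchanged, any uplift the model predicts is pure efficiency—which is the case you take to the CFO, and the prediction you then validate with a live test.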
Conservative reallocation limits: most brands move 5-20% of budget. More than that becomes risky because you're extrapolating beyond the spend levels the model has actually observed, and you risk losing channel synergies or brand coverage.
The beauty is that this is a low-risk way to test the model's recommendations. If MMM says moving £30k from display to radio will increase ROI, you can run that experiment, measure the results, and validate the model.
Most brands we work with discover that 20-30% of their media budget is delivering below-average returns. The opportunity isn't about spending more—it's about spending better.
The catch: data quality and refresh
MMM needs good data, but not as much as you'd think. The minimum is: weekly spend by channel and a weekly or monthly sales figure. That's it. If you have that for 12+ months, you can build a model.
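That minimum dataset can be as simple as one CSV row per week. The column names and figures below are illustrative:

```python
import csv
import io

# Minimal MMM input: one row per week, spend by channel plus total sales.
# Channel columns and values are hypothetical.
raw = """week,search_spend,display_spend,radio_spend,sales
2024-01-01,12000,8000,5000,310000
2024-01-08,14000,7500,5000,335000
2024-01-15,11000,9000,0,298000
"""

rows = list(csv.DictReader(io.StringIO(raw)))
print(f"{len(rows)} weeks, columns: {list(rows[0])}")
```

Fifty-two or more rows in this shape—spend by channel plus a sales column—is enough to start modelling.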
The catch is freshness. An MMM built on 2023 data isn't useful in 2026. The market changes, consumer behaviour shifts, new channels emerge, competitor activity evolves. A model that was accurate last year might be pointing you in the wrong direction now.
This is why the trend is toward quarterly or semi-annual MMM refreshes. You don't build the model once and shelve it; you update it regularly to stay aligned with what's working right now. This is where AI-assisted MMM has a real advantage: the cost and timeline of each refresh become manageable rather than a major project.
The investment in setting up clean data pipelines and automation pays dividends here. The first model costs more to build. Each refresh is faster.