National ecommerce grocery · market expansion

Real-time ad performance by market and channel, so the marketing team could optimize spend on launch day, not in the next monthly report.

12+
Live markets on one ad performance dashboard
20-25%
Improvement in overall media efficiency from spend reallocation
Daily
Marketing decision cadence (was post-campaign only)
5
Channels normalized into one cross-channel performance model
Days
Time from campaign launch to first read on performance (was weeks)
Client context

A national ecommerce-native grocery retailer investing significant paid media behind each new store launch as it expanded into new metro markets. Marketing leadership needed to know what was actually driving awareness and foot traffic, not just what they were spending.

The problem

They were putting real money into multi-channel campaigns (digital, out-of-home, and more) tied to new store rollouts, but the connection between spend and store performance was unclear. They had the spend data and they had the sales numbers, but nothing tied them together by market or channel. Which creative was working in Phoenix and falling flat in Denver? Was out-of-home pulling its weight against digital? Was a market saturating, or just slow to respond? Nobody on the marketing team could answer those questions with anything more rigorous than instinct.

Every post-mortem happened weeks after a campaign wrapped, which meant the learnings came too late to change the current quarter's decisions. New market launches were running on the previous market's playbook regardless of whether that playbook had actually worked. The marketing leader couldn't walk into the next operating review with a defensible answer to "where did our last $5M of media spend go, and what did it produce?"

What we built
01

Foundation: ad-platform ingestion

Custom data pipelines pulled spend, impressions, clicks, and creative-level performance from each ad platform on a daily cadence: Google, Meta, the OOH partners, and the in-house promo channels. Every channel's quirks (attribution windows, geo granularity, currency) got normalized at ingest so downstream models compared apples to apples.
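The normalization step can be sketched as mapping each platform's raw records onto one shared schema. This is an illustrative Python sketch, not the client's pipeline: the field names, channels, and sample values are all hypothetical stand-ins for what each ad platform's API actually returns.

```python
# Hypothetical raw records, roughly shaped like what two ad platforms
# might return. Field names and values are illustrative only.
RAW = [
    {"source": "google", "cost_micros": 1_250_000_000,
     "geo": "US-AZ-Phoenix", "clicks": 4200},
    {"source": "meta", "spend": 980.50,
     "region": "Phoenix, AZ", "link_clicks": 3100},
]

def normalize(record: dict) -> dict:
    """Map one platform-specific record onto a shared schema:
    spend in dollars, a lowercase market key, and a single clicks field."""
    if record["source"] == "google":
        return {
            "channel": "google",
            "spend_usd": record["cost_micros"] / 1_000_000,  # micros -> dollars
            "market": record["geo"].split("-")[-1].lower(),
            "clicks": record["clicks"],
        }
    if record["source"] == "meta":
        return {
            "channel": "meta",
            "spend_usd": record["spend"],
            "market": record["region"].split(",")[0].lower(),
            "clicks": record["link_clicks"],
        }
    raise ValueError(f"unknown source: {record['source']}")

rows = [normalize(r) for r in RAW]
```

Once every channel lands in this shape, downstream models can group by `market` and `channel` without caring which platform a row came from.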

02

Modeling: market-level attribution

A dbt project that joined ad spend to store-level performance by market and time window, with the attribution logic layered on top: spend lag windows by channel, demographic-adjusted market sizes, and a shared definition of incremental sales tied to launch periods. Definitions ran through a small marketing-finance working group so both sides agreed before models shipped.
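The lag-window idea can be shown with a small Python sketch: when attributing store performance on a given day, each channel looks back over its own window of prior spend. The window lengths, channel names, and sample figures here are assumptions for illustration, not the tuned values from the engagement.

```python
from datetime import date, timedelta

# Channel-specific lag windows in days. These are illustrative assumptions:
# out-of-home typically takes longer to convert than paid search.
LAG_DAYS = {"google": 3, "ooh": 14}

def spend_in_window(spend_rows: list[dict], market: str,
                    channel: str, as_of: date) -> float:
    """Sum spend for a market/channel over that channel's lag window
    ending on `as_of` (inclusive)."""
    window_start = as_of - timedelta(days=LAG_DAYS[channel])
    return sum(
        r["spend_usd"]
        for r in spend_rows
        if r["market"] == market
        and r["channel"] == channel
        and window_start <= r["day"] <= as_of
    )

# Hypothetical daily spend rows for one market.
spend = [
    {"market": "phoenix", "channel": "google", "day": date(2024, 5, 1), "spend_usd": 500.0},
    {"market": "phoenix", "channel": "google", "day": date(2024, 5, 6), "spend_usd": 300.0},
    {"market": "phoenix", "channel": "ooh",    "day": date(2024, 5, 1), "spend_usd": 2000.0},
]

# Google's 3-day window ending May 6 catches only the May 6 buy;
# OOH's 14-day window still includes the May 1 buy.
g = spend_in_window(spend, "phoenix", "google", date(2024, 5, 6))
o = spend_in_window(spend, "phoenix", "ooh", date(2024, 5, 6))
```

In the actual project this logic lived in dbt models joining spend to store-level sales, but the principle is the same: the window a channel gets credit over is part of the shared definition, not an analyst's ad-hoc choice.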

03

Activation: in-flight Tableau dashboards

Tableau dashboards the marketing team opened daily: ad performance by market, by channel, by creative, with drill-paths into the campaigns that were and weren't working. Refresh on the dashboards moved from "when someone exports CSVs" to a daily morning load, so the marketing team could make Tuesday-afternoon decisions with Tuesday-morning data.

How we worked

The Blueprint opened with one week of platform audit: every ad account setup, every attribution window, every campaign tagging convention. Half of what we audited turned out to be inconsistent with the rest. The other two weeks went to scoping the model with marketing and finance together, since the answer to "did this market launch work" lives at the intersection of both teams' definitions.

Through the four-month Build, we paired with the in-house performance marketing analyst, who became the day-two owner of the system. Twice-weekly working sessions on the model, weekly review with marketing leadership, monthly steering with finance to keep the attribution definitions consistent with how the CFO talks about marketing investment.

Knowledge transfer was less about training and more about pairing. The performance marketing analyst was committing dashboard changes by month two, and by month four she was running the marketing analytics review herself. We delivered runbooks for the pipelines and a decision framework for when to retune the attribution windows, but the day-to-day operation was on her team from the start.

Results
  • Live ad performance visibility across 12+ markets during expansion
  • Marketing team shifted from post-campaign analysis to in-flight decisions
  • Spend reallocation between channels drove an estimated 20-25% improvement in overall media efficiency
  • New market launches had a feedback loop instead of flying blind
  • Channel mix decisions got made with data instead of gut
Before, we'd find out a campaign in Phoenix was wasted spend three weeks after it ended. Now we know by Tuesday and can move budget to the markets that are actually responding while the launch is still live.
Director of Performance Marketing, National grocery client

Outcomes start with a Blueprint. We plan, build and run from there.

Thirty minutes with an 829 Analytics partner. You leave with a prioritized view of what to build first, what's worth waiting on, and the business metric anchoring each move. Whether or not we end up working together.