At a large ecommerce brand with a marketplace app at the center of its business, leadership could see the symptoms before they could clearly define the problem:

  • Growth had lost momentum
  • Marketing was not contributing the way the business needed it to

The bigger issue was structural: the business needed better alignment between leadership’s expectations and the metrics, systems, and decisions guiding the actual work. That meant they needed an approach that went beyond simply tuning channels or improving campaign execution around the edges.

So what began as an effort to improve marketing performance turned into something broader, centered on better measurement, stronger alignment, sharper execution, and a more durable base for growth.

Was this a typical agency engagement? No.

But this is how we do some of our best work.

Connecting the dots between growth mandates and the work being done

Although the company’s challenges appeared at first to be about channel performance, the deeper issue sat further upstream.

The business had a growth mandate, and the marketing organization had executional work in motion. What was missing, however, was the strategic execution layer that connects those two things, including:

  • a clear operating model
  • a shared definition of success
  • a way to turn leadership priorities into decisions teams could actually act on

Without those pillars, team members were defaulting to the metrics in front of them, even though those metrics were only loosely tied to the outcomes the business actually wanted.  

For example, the team was working against familiar performance markers like app downloads and customer acquisition because those were visible, reportable, and operationally usable. But those markers were not always pointing the organization toward the same destination leadership had in mind. 

Situations like this aren’t uncommon. Performance does not usually stall because nobody is doing the work. It stalls because the business has not built enough connective tissue between strategy, measurement, and execution. 

Once that became clear, the assignment expanded to creating the structure, language, and measurement approach the team needed in order to pursue the right growth objective in a consistent way. 

Developing a shared definition of growth

We knew our first step was aligning their teams on a shared definition of growth, and this became one of the first real turning points in our work together.

Without a shared growth metric, teams often do exactly what they are set up to do: optimize toward what is visible, available, and easy to act on. 

So the business needed a single metric that could anchor decision-making across the program, but getting there took work. It required sorting through which measurements actually reflected meaningful progress for the business, rather than defaulting to the ones that were easiest to report on or most familiar inside the team. 

Once that decision was made, the strategy became more coherent and internal confidence grew:

  • Measurements could be built around the same outcome the business cared about
  • Channel choices could be evaluated against that outcome
  • Optimization could become more disciplined because teams were no longer chasing a cluster of adjacent signals that pointed in slightly different directions

The next step, then, was making that objective operational, not just agreeing on it. That meant working closely with the data team to surface the right inputs and giving the marketing organization something concrete it could use in the platforms and in day-to-day decision-making.

Once the goal was clear and the data was usable, execution had something real to build from.

In between diagnosis and outcomes

When the challenges are both complex and organizational, the “messy middle” is where the most important work gets done. 

In this case, the work was technical and operational, which some might call “unglamorous” but which we consider essential. We reworked account structure, improved campaign design, brought better data back into the platforms, and put in place some of the day-to-day systems the team needed in order to make better decisions consistently.

One tactic didn’t change everything. Performance improved when the underlying mechanics were optimized across the board.

Paid search is a perfect example of this

There was meaningful spend going to areas that were not producing much value, whether that showed up in weak placements, lower-quality keywords, or ad formats that were not contributing in a meaningful way. 

So the approach was not complicated, but it did require discipline: cut back what was wasting money, reallocate toward what was actually working, and rebuild the program on a more intentional foundation.

Paid social had its own set of issues

The company had enormous inventory (millions of SKUs) and plenty of product volume to work with, but the program leaned heavily on catalog ads assembled from marketplace listings. 

Did that create scale? Yes, but it also created inconsistency at scale. 

The quality of the creative varied widely. Some assets were visually weak. Some did a poor job of showcasing the product. Some elevated items that were not closely tied to the parts of the business driving the most value.

There was also a broader creative constraint

Without enough static and video assets in the mix, the team was limiting its own options across Meta. This was a media issue, in addition to a creative quality issue. The program had less flexibility, reduced reach, and fewer ways to improve efficiency because it was not giving the platform enough to work with.

Again, the problem was not inactivity or lack of effort. It was that the program had not been built with enough structure around what mattered most, where spend should go, and what kinds of inputs would actually help the channels perform better.

SEO gaps were part of the larger pattern

The business had been built around product needs first, while acquisition needs came later. That approach shaped everything from site structure to page content to how categories were organized. 

It may have worked from a platform management standpoint, but it left clear gaps from a search and growth standpoint. For instance, their site lacked many of the foundational elements a stronger SEO program depends on, such as clearer category architecture, more intentional content structure, dedicated engineering support, and a tighter connection between SEO work and the marketing function. 

In key categories, competitors had built those foundations more effectively, and it showed in the rankings. 

This is why SEO became an early priority. We helped their team see where the structural gaps were, map what needed to be built, and create enough implementation discipline to start closing them. That included page and content recommendations, structural improvements, and ongoing prioritization to secure the engineering resources required to actually make progress.

Building an operating system around measurement

The company’s existing setup leaned heavily on last-touch attribution, which tends to over-credit the channels closest to conversion. Search often benefits from that because it captures intent at the moment someone is ready to act.

But that does not mean search created all of that demand. 

Instead, it usually means search is the easiest place to assign credit. A more developed measurement approach makes it easier to see the longer path to conversion and gives other channels a fairer role in the picture.  
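To make the difference concrete, here is a minimal illustrative sketch, not the client’s actual model: it compares last-touch credit with a simple linear multi-touch rule for a single hypothetical conversion path. The channel names and the path itself are invented for illustration.

```python
# Illustrative only: last-touch vs. linear multi-touch credit
# for one conversion path. Channel names are hypothetical.

def last_touch(path):
    """Assign 100% of the conversion credit to the final touchpoint."""
    return {path[-1]: 1.0}

def linear_multi_touch(path):
    """Split the credit evenly across every touchpoint on the path."""
    share = 1.0 / len(path)
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# A hypothetical journey: social introduces the brand, organic search
# builds consideration, paid search captures the final click.
path = ["paid_social", "seo", "paid_search"]
print(last_touch(path))          # all credit lands on paid_search
print(linear_multi_touch(path))  # credit spread across all three
```

Even this toy version shows why last-touch flatters the closing channel: the upstream touches that created the demand receive no credit at all.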

Of course, this challenge only becomes more apparent in an app environment. Measurement is more fragmented, privacy constraints create gaps, and it is harder to maintain a clean line between paid media activity and what happens after someone installs or converts. 

So yes, their team was working with data, but not always the kind of data that gave them enough confidence they were optimizing toward the right business outcome. 

That’s why we worked with their team to build a stronger framework for understanding performance, one built around three pieces working together:

  • multi-touch attribution for day-to-day optimization
  • media mix modeling for a broader view of channel contribution
  • incrementality testing to validate whether specific investments were creating real lift
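As a sketch of the third piece, an incrementality read can be as simple as comparing a test group’s conversion rate against a holdout’s. This is a simplified illustration with made-up numbers, not the client’s actual testing setup:

```python
# Illustrative only: a minimal relative-lift read from a holdout test.
# The group sizes and conversion counts below are hypothetical.

def incremental_lift(test_conv, test_n, holdout_conv, holdout_n):
    """Relative lift = (test rate - holdout rate) / holdout rate."""
    test_rate = test_conv / test_n
    holdout_rate = holdout_conv / holdout_n
    return (test_rate - holdout_rate) / holdout_rate

# Exposed group converts at 2.4%, holdout at 2.0%.
lift = incremental_lift(240, 10_000, 200, 10_000)
print(f"{lift:.0%}")  # 20%
```

A real program layers statistical significance and test design on top of this, but the core question stays the same: did the investment create conversions that would not have happened anyway?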

Instead of treating any one of those pieces as a cure-all, the value of this framework was how the pieces told a complete measurement story in combination. Ultimately, this gave the team a better basis for decision-making.

It also changed the quality of optimization itself. The business could acquire users, but the harder question was whether it was acquiring the right users and whether those users turned into stronger long-term customers.

Without better signals going back into the platforms, optimization tends to drift toward whatever looks cheapest in the short term. Better measurement made it possible to push past that and optimize against outcomes that were more closely tied to real growth.

Early impact came from tightening the fundamentals

We knew we were on the right track when we saw noticeable gains before every measurement piece was in place.

We saw performance improvements as a result of cleaning up campaign structure, cutting waste, improving how the platforms were being fed, and bringing more discipline to execution across search and social, even as we were building out their new broader measurement framework. 

On top of that, the team was tracking a projected annual revenue lift of roughly $75 million, based on improved efficiency and current run rate. 

While that figure was based on an evolving model that was updated on a regular basis, it was still meaningful enough to be shared consistently with the CEO as a way to show the scale of impact.

Over time, creative emerged as another gap

As the work progressed, it became clear that performance media was only part of the picture. 

Once the core marketing foundation got stronger, the next set of constraints became clear: creative and brand. This touched on product, merchandising, partnerships, positioning, and the overall user experience across the app, site, and marketing channels.

The broader challenge was that the brand did not yet have enough creative output, enough range, or enough consistency to support the kind of social performance the business needed. That started to shift over time, with more creative being developed, more direction around what was needed, and stronger support for the internal team.

That is part of what made this engagement more consequential than a standard optimization project. 

The work not only improved campaign performance, it also surfaced bigger questions about how the business was operating, where resources needed to go, and what had to improve in order for marketing to work the way leadership wanted it to.

We did more than optimize campaign performance for a client

This story goes deeper than the traditional “an agency came in, improved campaign performance, and tripled our ROI” narrative you see, and we’re proud of that.

Because what happened inside this engagement was much more fundamental, and more impactful:

  • A business with a visible growth problem got clearer on what was really holding it back.
  • The work uncovered a disconnect between leadership goals and the systems, signals, and decisions shaping execution. 
  • Once that disconnect started to close, the impact was broader than campaign performance alone. 
  • The company had a better way to make decisions, a clearer growth lens, and a stronger base to build from.

Yes, there were real weak spots in their original program. 

Some parts of the structure were underdeveloped, and certain priorities were misaligned. Creative support was not where it needed to be. The line between different kinds of marketing work had also blurred in ways that made sharper execution harder. 

But that wasn’t due to a lack of commitment within the team. In reality, they had been working inside a system that was not giving them enough clarity, enough support, or enough connection to the outcomes leadership actually wanted. 

Ultimately, that’s the difference between optimization and transformation. 

Optimization improves the performance of the system you already have. 

Transformation forces you to confront whether that system is built around the right goals, metrics, and ways of working in the first place. 

In this case, once the business got clear on what was missing, what was broken, and where their greatest levers of opportunity existed, the path forward opened up before them with better measurement, tighter alignment, stronger execution, and a marketing function better built to support growth over time.
