
Getting Transparent about Measurement: Bridging the Art and Science of Marketing Performance

by Jon Lorenzini

July 27, 2023


All measurement solutions have a dirty little secret, and it lies in the reconciliation between data and reality. Here are some common examples:

  • Missing and biased data gets projected up (iOS 14, GDPR, etc.) - if a significant portion of users opt out of being tracked, marketers may not have complete data. They might have to estimate or 'project up' the behavior of this non-consenting group, which can introduce bias.

  • Additional economic variables are added to lower error but might not be relevant - for instance, GDP may not be a relevant variable and could just add noise to the model if the product being sold is a necessity.

  • Coefficients get tweaked based on believability, not driven by data - an analyst who believes social media engagement should impact sales more than the data suggests might increase that coefficient beyond what the data supports.

  • Days get normalized or flagged out for valid or invalid reasons - for instance, deciding whether to normalize or flag out Black Friday sales (illustrated below).

  • Part of the sample gets removed (inconvenient vs. explainable anomaly) - should a week with a sudden uptick in sales be removed because there is no promotion to explain it?

These treatments aren’t wrong or bad; the issue is whether this data massaging is transparent to the business. Statisticians or modelers may make these adjustments without market context or clear disclosure, or worse yet, the adjustment is a “Keleven” (a nod to “The Office” fans out there). None of this bodes well for trust in measurement solutions.
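
To make one of these treatments concrete, here is a minimal sketch of the Black Friday case - in Python, with made-up numbers, and not any particular vendor's model. Instead of silently dropping or smoothing the day, an explicit flag variable makes the adjustment visible to anyone reading the model specification.

```python
# Illustrative only: hypothetical spend/sales data, not a production mix model.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2023-11-01", periods=60, freq="D")
spend = rng.uniform(800, 1200, size=len(days))               # daily media spend
sales = 5_000 + 2.5 * spend + rng.normal(0, 500, len(days))  # hypothetical response
sales[days.get_loc("2023-11-24")] += 20_000                  # Black Friday spike

X = pd.DataFrame({
    "intercept": 1.0,
    "spend": spend,
    "black_friday": (days == "2023-11-24").astype(float),    # explicit, documented flag
}, index=days)

coefs, *_ = np.linalg.lstsq(X.values, sales, rcond=None)
print(dict(zip(X.columns, coefs.round(2))))
# Without the flag, the spike gets forced into the spend coefficient or the residuals;
# with it, the adjustment is part of the model spec rather than a hidden "Keleven".
```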


The knowledge gap between data scientists and business-focused marketers can be bridged by bringing modeling closer to stakeholders in a transparent way that represents the business. Data scientists can learn business dynamics through this process, while marketers can better understand the model output. The believability hurdle between parties must be overcome to establish trust in the model output. Otherwise, the measurement becomes a secondary KPI, an additional check, or a nice-to-have indicator that adds irrelevant KPIs to marketing goals.

So how can we trust the outputs of a model, when the inputs are low-quality data?


The answer is that we can’t.


A high-signal source does not need any distortion, but a low-signal source may require many additives or distortions. At a certain point, even a Ph.D. is reduced to a not-so-data-driven best guess. For example, if you spend the same amount on television every day for several years, how would you know what is driven by TV and what is your customer baseline? There is no way to untangle the two with correlative models.
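
Here is a rough sketch of that identification problem, using illustrative numbers rather than real media data: when spend never varies, the spend column in a regression is just a multiple of the intercept, so the design matrix is rank-deficient and no amount of modeling can separate the TV effect from the baseline.

```python
# Illustrative only: constant TV spend makes the TV effect unidentifiable.
import numpy as np

n_days = 365 * 3
tv_spend = np.full(n_days, 10_000.0)               # the same TV spend every single day
rng = np.random.default_rng(1)
sales = 50_000 + 1.8 * tv_spend + rng.normal(0, 2_000, n_days)

X = np.column_stack([np.ones(n_days), tv_spend])   # [baseline intercept, tv_spend]
print("design matrix rank:", np.linalg.matrix_rank(X))   # prints 1, not 2
# Rank 1 means infinitely many (baseline, tv_coefficient) pairs fit the data equally
# well: the model literally cannot tell baseline demand from TV without spend variation.
```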


Modelers, and the solutions they employ, need to be accountable for sharing the quality of the signal so that marketers are not making big decisions off weak signals. Without that, a marketer has to apply a secondary validation, measurement triangulation, or their gut (the “big data” of their own experience). This process is not about the measurement itself, but how well it passes the general believability test that the business side is intimately familiar with. Without trust, the measurement becomes a secondary KPI, an additional check, or, at worst, a nice-to-have indicator. This increases costs, slows decision-making, and adds irrelevant KPIs to marketers’ goals. Only after trust is established can action occur.


Our Approach

At LiftLab, we have been working hard on solving this trust problem. When we have weak signals for channels in our Agile Mix Model, we let our clients know and recommend running additional tests to create better data. Our geo-experiments create variance, and with a control group, we can provide stronger causal signals that feed into (and refine) our clients’ Agile Mix Models. This powerful reinforcement adds clarity and a more comprehensive view of investment decisions. Because our clients know their business best, when massaging the data is required, we strive to put all the art in the client’s hands and bake all the science into the platform. Using a consistent methodology across all channels, tactics, and campaigns, we now have a robust way to compare performance and, ultimately, take action to drive the results you care about.
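
For readers who want to see the shape of the calculation, here is a minimal difference-in-differences sketch of how a geo experiment with a control group yields a causal lift estimate that can then inform a mix model. The groups, figures, and arithmetic are purely illustrative; this is not LiftLab's actual methodology.

```python
# Illustrative only: a toy geo-experiment read using difference-in-differences.
import pandas as pd

# One pre/test observation per group; incremental spend only goes to the treated geos.
geo = pd.DataFrame({
    "group":  ["treatment", "treatment", "control", "control"],
    "period": ["pre", "test", "pre", "test"],
    "sales":  [100_000, 118_000, 95_000, 99_000],
    "spend":  [0, 40_000, 0, 0],
})

pivot = geo.pivot_table(index="group", columns="period", values="sales")
lift = (pivot.loc["treatment", "test"] - pivot.loc["treatment", "pre"]) - \
       (pivot.loc["control", "test"] - pivot.loc["control", "pre"])
incremental_spend = geo.loc[geo["group"] == "treatment", "spend"].sum()
print(f"incremental sales: {lift:,.0f}")                        # 14,000 vs. control trend
print(f"cost per incremental sale: {incremental_spend / lift:.2f}")
```

The control geos absorb seasonality and baseline drift, so the remaining difference is a stronger causal signal than anything a purely correlative model could produce on its own.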


To summarize:

  • Transparency builds trust

  • Trust leads to action

  • Correct actions lead to better performance

  • Better performance drives marketing effectiveness

Many measurement solutions get to the fourth rung at best, but trust can be established by empowering marketers with transparent, high-quality data that leads to correct actions. I shared this hierarchy at an IAB conference in April.


I’ll explore how we achieve greater transparency and marketing effectiveness in future blogs. In the meantime, it’s time to shed the ‘Keleven,’ get to greater transparency, and bring the art and science of marketing closer together. If you want more details, schedule some time with us here:



