“How do we know this worked?” In the history of humanity, nobody has enjoyed hearing that question. Why? Because it contains seven assumptions – six of which have negative foundations! As companies evaluate their 2023 marketing spend (after likely reducing spending in 2022), the wise marketer remembers their north star: customer engagement. By embracing proven techniques and addressing these seven assumptions, you will expand the impact of your digital, physical and omnichannel marketing.
- “Our customers would have engaged anyway.”
Solution: A/B Testing.
Odds are you know the solution: A/B (or its nerdier, younger sibling – multi-armed bandit) testing. What you may have missed is the rise of easily adopted tools that attach targeted digital ad spend to in-store traffic with *gasp* coupons. All too often, marketers forget that couponing isn’t intended to be a traffic driver – it is meant to trace engagement. Once you’ve proven your correlation, stop the coupon incentive (but keep it in your toolbox for when you need to reprove your answer).
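If you want to sanity-check a coupon-traced test yourself, a two-proportion z-test is one standard way to do it. Here is a minimal Python sketch – the redemption counts, group sizes and 0.05 threshold are all made-up illustration numbers, not figures from this article:

```python
# Minimal A/B significance check for coupon-traced store traffic.
# All counts below are hypothetical examples.
from math import sqrt, erfc

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in redemption rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                  # two-sided p-value
    return z, p_value

# Group A saw the targeted digital ad; Group B is the holdout.
z, p = two_proportion_z(conv_a=180, n_a=2000, conv_b=140, n_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("The ad-exposed group redeemed significantly more coupons.")
```

A dedicated measurement tool will do this for you, but knowing the math underneath keeps you honest about when a lift is real and when it is noise.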
- “Other campaigns probably skewed these results.”
Solution: Advanced Calculus.
Real Solution: 9th Grade Math.
This answer requires a bit of math and discipline – but nothing too intense. Remember learning “Combinations” in math class? Probably not – but a quick refresher is just a search engine away. “Combinations” tells you how many overlapping campaigns you can run with your trusted sample sizes. Let’s get specific. If you have 5 stores and it takes 3 stores for you to feel confident, then you can run 10 concurrent campaigns and know what worked and what didn’t (type “5 choose 3” into your friendly neighborhood search engine to see for yourself).
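The “5 choose 3” arithmetic takes only a couple of lines of Python to verify – the store names here are placeholders:

```python
# Count the distinct 3-store test groups available from 5 stores.
from itertools import combinations
from math import comb

stores = ["A", "B", "C", "D", "E"]                 # placeholder store names
groups = list(combinations(stores, 3))             # every distinct 3-store group
print(len(groups), comb(5, 3))                     # both are 10
```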
SIDENOTE: You probably didn’t expect ancient Greek mathematics in a conversation about omnichannel. Honestly, I didn’t want to include it – but the unique pairing of “what’s new” and “what’s trusted” sets the upper echelon of CX professionals apart.
- “Did we really need to spend so much?”
Solution: Unbiased, third-party competitor analysis.
At family reunions, if you’d like to answer “What do you do?” with “I’m a modern-day Sherlock Holmes,” then this question is for you. Rather than looking at what you’ve spent, research what your competition spends. Be skeptical of data provided by ad platforms – whether social, search or billboard – as they want to create an arms race with your competitors. When you find “here’s what people are spending” answers from independent sources, you’ll have evidence of the effectiveness of your budgets.
- “Why didn’t this drive more results?”
Solution: MTFR (Minimize Time to First Result)
In the same vein of logic as “why the second piece of cake isn’t as delicious as the first,” understanding the law of diminishing returns is in everyone’s best interest. Turn the dial up in #1 until the effectiveness drops. Unbiased sources measuring ad spend effectiveness allow you to trust the findings and minimize the MTFR.
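The “turn the dial up until effectiveness drops” routine can be sketched as a simple loop: raise spend in fixed steps and stop once the marginal lift per extra dollar falls below a floor you choose. The response curve, step size and floor below are hypothetical stand-ins, not a real model:

```python
# Illustrative diminishing-returns dial: keep raising spend until the
# marginal lift per extra dollar drops below a chosen floor.
from math import log1p

def observed_lift(spend):
    """Hypothetical concave response: lift keeps growing, but ever more slowly."""
    return 250 * log1p(spend / 1000)

floor = 0.02          # minimum acceptable lift per extra dollar (assumed)
step = 500            # spend increment per test cycle (assumed)
spend, lift = 0, 0.0
while True:
    new_lift = observed_lift(spend + step)
    marginal = (new_lift - lift) / step
    if marginal < floor:
        break         # the next dollar no longer pays for itself
    spend, lift = spend + step, new_lift

print(f"dial stops at ${spend}")
```

In practice the `observed_lift` numbers come from your unbiased measurement source, not a formula – the loop just formalizes when to stop turning the dial.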
- “How can we trust these numbers?”
Solution: Customer Traffic Heat Maps.
I think we stopped using the word “heatmaps” when the world went mobile, but brick-and-mortar should still be the first data point we seek. For those unfamiliar with this technology, search “Customer Traffic Heat Maps”. Although other answers emphasize “Quantitative Data,” a visualized heat map of in-store customer traffic reminds everyone that “Qualitative Data” is both beautiful and informative.
- “Are you sure this will work next time?”
Solution: Agile Methodology.
No, but that is OK. Each customer engagement test is designed to be brief, so lessons are quickly learned and leveraged. In the software development world, this “agile” mentality has been around for decades – it involves early delivery and continual improvement, which is exactly what we want! There are great solutions that let you define starting expectations with explicitly defined KPIs, measure results at fixed-time intervals, and blamelessly review what worked and what did not. By accepting that 100% success is impossible, each iteration improves our engagements because learn-from-failure is baked into our processes.
- “Why didn’t we do more of it?”
Solution: Trust (Earned Through Transparency)
If you had the luxury of saying “take four minutes and read the previous six answers,” you wouldn’t need a solution for this – but c’est la vie. Instead, leverage a wiki to showcase the transparency and accountability of your experiments. The initial practice will feel like navel-gazing, but over time that searchable, living document becomes an asset that captures lessons learned. This treasure trove proves your decisions are founded on expertise and the ability to evolve with the marketplace.
The platforms, metrics, customers and expectations always change. With some simple research, you can find some great service providers offering tools for each problem. By separating “this tool” from “this technique,” you’ll be empowered to measure customer engagement – across every channel – and evolve the conversation from “How do we know this worked?” to “What do you want to do next?”
Eric Caron is the Senior Director of Digital Experience at Caribou Coffee, and before that was at Best Buy and a few startups. He’s a Minnesota native and firmly believes that any concept (from EBITDA to quantum mechanics) can be explained in 30 seconds. When he isn’t talking about customer engagement, digital marketing, agile methods, websites, or personas, you’ll find him with his family in Eden Prairie (attempting to teach his three boys the difference between a French press, Aeropress & ChemEx).