Ahh, the end of the year is here.

Before you stuff those holiday stockings, it’s time to pull up your own socks and take a long, hard look at the mobile marketing campaigns you ran in 2015.

It’s time to ask yourself, which ones were really worth running?

And by that we mean, which ones truly impacted your bottom-line business goals?

These are tough questions that you just cannot answer with vanity campaign metrics alone.

Why Vanity Campaign Metrics Don’t Cut It

Vanity campaign metrics are click rates and conversion rates that are specific to each offer or message you run.

Click rate is a nice measure of immediate spikes in engagement, but it cannot tell you whether that engagement lasts over time. Conversion rate helps you understand the percentage of people who complete a certain action once, but it can’t tell you how many times.

These are surface-level, short-term measures of success; they cannot shed light on:

  • How did this marketing campaign affect long-term engagement?
  • Did this marketing campaign increase in-app spend over time?
  • How many times did these users convert?
  • Are my campaigns turning off users and causing them to abandon my app?
  • Am I actually better off when I run these campaigns?

You cannot gauge the ROI of messaging campaigns by looking at clicks and conversion rates – you need to see lasting changes in users’ lifetime value and behavior. You cannot decide if Campaign A is better than Campaign B by simply looking at which performed better – you need to figure out if users would still convert at the same rate even if they didn’t receive these messages (or worse, would they convert at an even higher rate?).

Vanity campaign metrics are early indicators of success, but you need to adopt a more sophisticated analytical methodology to answer this million-dollar question:

Are my mobile marketing campaigns worth the investment and effort because they are actually moving the needle on retention, engagement, and revenue?

And the only way you can answer this question with ironclad confidence is by using lift analysis.

What is Lift Analysis?

Lift analysis is a way to measure how a campaign impacts a key metric. In mobile marketing, you could measure lift in engagement, in-app spend, or conversion frequency. Lift is calculated as the percent increase or decrease in each metric for users who received a campaign versus a control group.
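In code, that calculation is a one-liner. Here is a minimal sketch (the function name and sample rates are illustrative, not from any particular analytics tool):

```python
def lift(treatment_rate, control_rate):
    """Percent increase or decrease in a metric for users who
    received a campaign versus the control group."""
    return (treatment_rate - control_rate) / control_rate * 100

# e.g. a 10% conversion rate measured against a 4% control baseline
# yields a 150% lift over control:
print(lift(0.10, 0.04))
```

A negative result means the campaign group performed worse than users who received no message at all.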

If that was a mouthful, think of it like this: Lift analysis means comparing users who receive a campaign to a group of users who do not receive the campaign (i.e. the control group) to see which group is better off. By the way, a control group is a neutral segment of your users that do not get any special message, which makes them a good baseline for benchmarking against.

When a control group is enabled, you can see the “lift” in each of the metrics mentioned above and make solid app marketing decisions.
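Under the hood, enabling a control group is just a random hold-out split of your audience. A hypothetical sketch of how that might look (the 10% hold-out size and fixed seed are assumptions for illustration):

```python
import random

def split_control_group(user_ids, holdout=0.10, seed=42):
    """Randomly hold out a fraction of users who will receive no
    message; everyone else is eligible for the campaign."""
    rng = random.Random(seed)
    shuffled = list(user_ids)
    rng.shuffle(shuffled)
    cutoff = int(len(shuffled) * holdout)
    return shuffled[:cutoff], shuffled[cutoff:]  # (control, treatment)

control, treatment = split_control_group(range(1000))
print(len(control), len(treatment))  # 100 900
```

The key property is that assignment is random, so the control group behaves like the campaign group would have without the message.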

Validating the Value of Lift Analysis in Improving Messaging Campaigns

Data doesn’t lie.

In the first month of using lift analysis to measure the impact of push and in-app messaging campaigns, brands saw an average 32% lift in engagement (sessions per user) and an average conversion lift of 20%.

It’s clear from these statistics that push and in-app messaging are successful channels for driving app growth, but not all strategies are created equal. In our research, we also discovered that some campaigns actually decreased engagement and conversion.

That’s why lift analysis is so important: it enables marketers to quickly identify and cut the campaigns that aren’t working, and continue to optimize the ones that are, to drive the best results over time. Using a control group to calculate lift is the only way to truly gauge impact.

Let’s walk through an example that proves this.

A Powerful Example: Clear Insight from a Control Group

Suppose you have an eCommerce app and you decide to run a push messaging campaign to offer users 20% off with a control group in place. The control group doesn’t receive the 20% off promo.

Then, you A/B test two push messages.

Message A is:

“FLASH SALE: All our cozy wintertime sweaters are 20% off with code SHOP20. Get the perfect present 🎁 or the perfect look 👱‍♀️. Shop now.”

Message B is:

“TODAY ONLY: Grab all of our bestselling accessories for 20% off with code MERRY20. Hurry, before everyone else does! Come on, treat yourself.”  

[Image: Lift analysis example comparing conversion rates for the control group, Message A, and Message B]

In this example, users who receive no promotional message (the control group) convert (i.e. check out) at a rate of 4%. This gives you a baseline for measuring the impact of your discount campaign.

Message A drives a 10% conversion rate, enabling you to calculate a lift of 150% over the control group. It’s clear that Message A has a positive impact on conversion vs. no message. 

On the other hand, Message B has a 3.5% conversion rate, representing 12.5% fewer conversions vs. the control group. Message B actually has a negative impact on conversion vs. no message.

In this situation, the smart marketer would continue running Message A, and cut Message B.
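Plugging the scenario’s numbers into the lift formula confirms both results (the rates come straight from the example above):

```python
control_rate = 0.04  # control group checks out at 4%
rate_a = 0.10        # Message A conversion rate
rate_b = 0.035       # Message B conversion rate

def lift(treatment_rate, control_rate):
    return (treatment_rate - control_rate) / control_rate * 100

lift_a = lift(rate_a, control_rate)
lift_b = lift(rate_b, control_rate)
print(f"Message A lift: {lift_a:+.1f}%")  # Message A lift: +150.0%
print(f"Message B lift: {lift_b:+.1f}%")  # Message B lift: -12.5%
```

Without the 4% baseline from the control group, Message B’s 3.5% conversion rate might have looked like a win.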

Key Takeaway: Always activate a control group. Otherwise, you’ll never know if your messages actually improve or harm conversions that would have occurred naturally.

A Powerful Example: Looking at Long-term Effects of Campaigns

Now, let’s consider another example. Suppose you own a media app and you decide to run an in-app messaging campaign to offer users a free 7-day trial of your premium version, with a control group in place. The control group doesn’t receive the free trial offer.

Then, you A/B test two in-app messages.

Message A is:

“Listen offline. Skip ads. And get unlimited songs. Click here to try our premium version FOR FREE for one week. No credit card required – just a body that’s ready to boogie. Try Now.”

Message B is:

“Special Offer: FREE 7-day trial of our premium version! Click here to upgrade your music experience and take your tunes anywhere. No strings attached – only unlimited songs await.”

[Image: Lift analysis example comparing songs listened per user over time for the control group, Message A, and Message B]

In this example, we think of a key conversion event as “listened to a song.” As you can see, the early results show that users who received Message B listened to more songs one day later (150% lift over control). But this spike tapers off and the average number of conversions per user returns to the baseline level.

However, if you expand your time frame and look at the long-term effects of each campaign, users who received Message A actually listened to more songs over the course of the 7-day free trial (400% lift over control).
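The long-term comparison boils down to computing lift over different time windows. A sketch with hypothetical per-user song counts, chosen only to mirror the example’s day-one and seven-day lifts:

```python
def lift(treatment, control):
    return (treatment - control) / control * 100

# Average "listened to a song" conversions per user (made-up numbers):
windows = {
    "day 1":  {"control": 2.0,  "msg_a": 3.0,  "msg_b": 5.0},
    "7 days": {"control": 10.0, "msg_a": 50.0, "msg_b": 10.5},
}

for window, counts in windows.items():
    a = lift(counts["msg_a"], counts["control"])
    b = lift(counts["msg_b"], counts["control"])
    print(f"{window}: Message A {a:+.0f}%, Message B {b:+.0f}%")
```

At day one, Message B looks like the clear winner (+150% vs. +50%), but over the full trial Message A pulls far ahead (+400%) while B’s spike fades to near baseline.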

In this situation, the smart marketer would continue running Message A, and cut Message B.

Key Takeaway: Don’t solely focus on immediate success of your messages. Look at which campaign shows the highest lift in important metrics over time.

As you can see, lift analysis gives you insight into whether or not your campaigns are really making a difference – and by how much. Without lift analysis, you can’t see the full picture.

Lift Analysis Demystifies True Impact

Lift analysis is the only way you can quantify the true impact of push and in-app messaging campaigns. Lift analysis allows you to:

  • Directly attribute revenue to marketing actions
  • Make smarter optimization decisions
  • Save time by cutting campaigns where users merely click a message without completing the action you want
  • Track repeat conversions, not just one-offs
  • Understand whether your messages drive short-term spikes in engagement or long-term increases in loyalty and app use

Put simply, lift analysis is the only way you can measure your mobile marketing campaigns in terms of business value – not clicks.

And as you gear up and plan for a great 2016, don’t make investment decisions without incorporating it into your testing strategy.