Preventing churn in non-paying users with machine learning

Ever wonder how those special offers from mobile telecom providers are determined?  Almost certainly, some aspect of these marketing offers involves a machine learning algorithm identifying which customers are likely to churn.  Game companies are beginning to take a cue from these industry leaders.  A previous study we discussed identified and effectively addressed churn among high-value spenders in a game.  Today we get a chance to learn how a leading game company, IMVU, uses machine learning to address churn in non-paying users.  What follows is a condensed exchange between Nick Lim and Donnie Kajikawa, Senior CRM Manager at IMVU.

Hi Donnie.  In February 2021 you gave a presentation at the Mobile Growth Summit Virtual 3.0 titled "How Churn Prediction Models Can Improve Retention".  Can you tell us a little more about IMVU and why you wanted to give that talk?

First off Nick, some introduction to IMVU.  We were just rebranded as Together Labs, and we are the world’s largest avatar-based social networking app.  As game marketers, we spend a lot of time and energy optimizing user acquisition.  But we tend to ignore the tail end of the funnel, that is, churn.  In fact, if you can decrease churn by 10%, it's like improving your UA by 10%.  And far more of the factors that affect churn are within your control than with UA.  For an average mobile app, 94% of users churn within 28 days, so there's a lot of low-hanging fruit.  Hence we wanted to share our efforts on churn reduction with the community.

So how did you start this churn prevention journey and what was the thinking behind it?

Some background here is that we had already built an intelligent machine for UA.  This UA machine would ingest a bunch of data points and continually try to identify the best acquisition channels, pricing, etc.  So we thought it might make sense to apply the same methodology to churn.  This was in late 2019.  First we would consolidate all the insights we have about our users; then we would use an intelligent machine to identify where each user is in her IMVU player experience.  The machine would sift the signal from the noise and tag each player with her likelihood to churn.  Then it would be up to us to figure out how to retain each user.  We were willing to try a bunch of things to see what worked.

What type of information does the machine use and what do the outputs look like?

Well, the machine looks at the user’s activity within the app, such as shopping or browsing.  It also looks at the device the user has and how she came to install IMVU.  What comes out of the machine is a propensity score, which gives the probability that the user will churn in the next few days.
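Conceptually, a propensity score like this can be sketched as a logistic model over per-user features.  The feature names and weights below are purely hypothetical illustrations, not IMVU's actual model, which would learn its weights from historical player data:

```python
import math

def churn_propensity(features, weights, bias):
    """Logistic model: estimated probability the user churns in the next few days."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights; a real model would fit these to historical churn outcomes.
WEIGHTS = {
    "days_since_last_session": 0.30,   # inactivity raises churn risk
    "shop_visits_last_7d": -0.20,      # shopping activity lowers it
    "sessions_last_7d": -0.15,         # general engagement lowers it
    "installed_via_paid_ua": 0.10,     # how the user came to install the app
}
BIAS = -1.0

user = {
    "days_since_last_session": 5,
    "shop_visits_last_7d": 1,
    "sessions_last_7d": 2,
    "installed_via_paid_ua": 1,
}
score = churn_propensity(user, WEIGHTS, BIAS)  # a value between 0 and 1
```

The output is a probability, which is what makes the downstream segmentation by risk band possible.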

So what did you do with the propensity score?

Before we get into the nuts and bolts of it, it’s important to think of churn reduction as a program.  This means it’s not a one-off campaign you run, but rather a system with several parts, including audience analysis, test plans, communications plans, results review and automation.  The churn propensity score is just one part of the program.

Can you elaborate on this program?

Sure.  For the audience analysis, we had to segment users to identify not only the likely churners, but also which geos and languages would make the most sense.  Understanding the big categories of user activity also helps in brainstorming the communications plans.  The test plans covered the different offers or retention incentives to try and the metrics we wanted to measure and compare.  The communications plan had to include in-app communications as well as out-of-app channels.  Automation is critical for increasing the cadence of testing and for getting ready for evergreen status.

How did you work through the test plan?

So the test plan was a series of things we wanted to try.  Now that we knew the probability of churn for each user, we could experiment with different retention offers, such as how many credits to provide; we could also experiment with different groups of users, for example those who were more than 90% likely to churn or between 60% and 70% likely to churn.
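A minimal sketch of this kind of test-cell assignment, using the two risk bands Donnie mentions and a random holdout for the control group (the thresholds match the interview, but the function and cell names are hypothetical):

```python
import random

def assign_test_cells(scored_users, holdout=0.5, seed=42):
    """Split scored users into the two propensity bands from the test plan,
    then randomly hold out a control group within each band so that
    reward vs. no-reward outcomes can be compared."""
    rng = random.Random(seed)
    cells = {"high_risk_reward": [], "high_risk_control": [],
             "mid_risk_reward": [], "mid_risk_control": []}
    for user_id, p_churn in scored_users:
        if p_churn >= 0.90:              # more than 90% likely to churn
            band = "high_risk"
        elif 0.60 <= p_churn < 0.70:     # between 60% and 70% likely
            band = "mid_risk"
        else:
            continue                     # outside the bands this test targets
        arm = "control" if rng.random() < holdout else "reward"
        cells[f"{band}_{arm}"].append(user_id)
    return cells

scored = [("u1", 0.95), ("u2", 0.65), ("u3", 0.40), ("u4", 0.92), ("u5", 0.63)]
cells = assign_test_cells(scored)
```

Keeping a randomized control arm inside each band is what later makes the DAU and ARPU comparisons meaningful.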

What about the communications plan? 

We could try different channels, in-app and out-of-app, to reach the users.  For each channel, different message types were also tested.  Additionally, we varied message frequencies and triggers.  Here are the push notification and email examples that we showed during the conference.

Figure 1.  Screenshots of the retention push notification and email sent to users likely to churn.

 

What’s the automation part?

I’ve been running programs for a long time, and one of the hardest things is coordinating among different departments and tools.  Many ML projects fail because it’s hard to get the ML output to the different places where it can be used.  So we focused on automation.  For example, Sonamine can analyze the raw data without needing any internal IMVU data resources.  Sonamine also updates each user’s ChurnSoon score inside our CRM tool, Leanplum.  This allows us to focus on the testing and communications plans.
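To illustrate the kind of handoff this automation removes, a score sync into a CRM user profile amounts to building an attribute update like the one below.  The payload shape here is hypothetical and only for illustration; Leanplum's actual API differs:

```python
import json

def build_score_update(user_id, churn_score):
    """Build a hypothetical CRM user-attribute update carrying the
    latest churn score, so campaigns can target on it directly."""
    return json.dumps({
        "userId": user_id,
        "attributes": {"ChurnSoon": round(churn_score, 3)},
    })

payload = build_score_update("u1", 0.8731)
```

Once the score lives on the CRM user profile, campaign targeting and triggers can reference it like any other user attribute, with no manual export step.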

Yes, I hear a lot about ML projects that are well-hyped but have difficulty getting into production.  How did you fare?

Within 4 weeks we had our first test in-app campaign running!  It’s a testament to the team effort.  And we have been sprinting ever since.  We are now even running conversion campaigns using the same automation paradigm.

As for the churn reduction, when we target the users who are likely to churn, and we give them some retention rewards, we find that their DAU level is 10% higher than a control group that does not get them.  And interestingly, this elevated DAU level is sustained even 60 days after the initial reward!   See this chart of our ChurnSoon campaigns aggregated for the first 30 days after the initial reward.  The red line shows that the rewarded group consistently had better DAU than the control group.
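The lifts cited here and later in the interview are simple relative differences between the rewarded group and the control group; a small sketch with made-up counts:

```python
def relative_lift(treated, control):
    """Percent lift of the treated group's metric over the control group's."""
    return (treated - control) / control * 100.0

# Hypothetical values chosen only to illustrate the arithmetic.
dau_lift = relative_lift(treated=5500, control=5000)   # a 10% DAU lift
arpu_lift = relative_lift(treated=1.23, control=1.00)  # a 23% ARPU lift
```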

 

Figure 2.  Test group with retention bonus shows consistently higher DAU many days after the initial retention reward.

 

So you would target rewards to smaller groups of users, thereby not giving away credits to users who were going to stay anyway?

Exactly.  Our communications are personalized to each player’s current spot in the IMVU experience.

Did giving away retention rewards affect the ARPU of these users?

I think that’s a common objection to giving rewards to users who are likely to churn.  In our case, the ARPU of the rewards group dips below the control group in the first 1 to 2 weeks.  But by week 4, the rewards group’s ARPU overtakes the control group’s.  Currently, in the aggregate of all our campaigns, the rewards group has a 23% higher ARPU than the control group!

So what’s next for the churn reduction program?

We have been running these ChurnSoon campaigns twice a month for more than a year now.  We hope to increase the cadence here by additional automation.

What were some key takeaways in the program?

One decision was whether to build or buy the machine learning part.  We have a great data science team in-house.  But like all data science teams, they have more work than resources.  So we decided to leverage Sonamine’s services, both as an extension of the internal team and to get to market faster.  If you are just starting out, we recommend working with a partner first.  You can always bring it in-house later.

Also, you need to continually have a control group as things may change.  It’s important to know there isn’t one single holy grail answer that works forever.  You may find some tactics that work for now, but you may need to make changes over time.  

Thanks for chatting Donnie, that was very informative.

Thanks for having me.  Feel free to reach out with further questions.

 

To learn more about Sonamine or to contact us, please click here.