How balancing user feedback with a data-driven approach powered Mini Golf MatchUp to success

Less of a science, more of an art

Antony Blackett is the co-founder and managing director of New Zealand indie developer Rocket Jump Games.

Along with our publishing partner Scopely, we recently launched Mini Golf MatchUp on both iOS and Android.

Our first game, Major Mayhem, had been well received, but Mini Golf MatchUp achieved a level of success beyond our highest expectations, ranking #1 overall in 28 countries on the first day of release and driving over 10 million downloads in the first month alone.

Following this recent success, we wanted to take a look back at the game's development and global launch, sharing our experiences regarding the opportunities and decisions that led to these results.

Choosing the right partner

It's a decision that every independent developer grapples with: to self-publish or work with a partner?

While there are benefits to both approaches, we ultimately partnered with Scopely because their value proposition was clear and their style of communication was direct.

A lot of developers are either blindly determined to self-publish or under the assumption that they need a publisher no matter what - even if they don't fully understand what a publisher will do.

In contrast, we remained open to both options, and we asked potential partners specific questions about how a relationship would work. For example, we asked questions like:

  • What's your method of calculating lifetime value (LTV)?
  • What's your ad click/install tracking solution?
  • What's the name of your contact at Google Play?

These types of questions cut to the core of what a partner could provide, and whether or not they were thinking about building the business around our game in a sophisticated way.

Ultimately, we recognized that Scopely would be an excellent partner because they had clear and direct answers to our questions, and they made sure that we understood the value they could provide.


More specifically, we were excited that Scopely planned to build a heavily customized distribution plan around our game.

Scopely’s social infrastructure was also very attractive to us, since we lacked server-side and multiplayer experience. For instance, Scopely’s login and matchmaking tools, while they seem simple to the end-user, would have been very difficult to build on our own.

Scopely also provided us with APIs that supported more complex features like daily events, tournaments, and achievements, and we worked with their engineering team to implement these tools into the game.

Finally, we realized that Scopely could provide data-driven insights in a way that we could not ourselves - for example, we had no idea how to compare the lifetime value of users acquired from different marketing channels - and the fact that they planned to assign dedicated product managers to our game was a resource we valued highly.

After analyzing our other publishing options, including self-publishing, we realized that working with Scopely provided significantly more upside while mitigating our risk. And since we found partners with a wealth of expertise and tools that we lacked, we were able to focus on the core gameplay and presentation - our specialties - while Scopely built the business around the game.

Big changes during beta

Staying open to new ideas and being able to change design direction quickly was very important to our development process.

We took advantage of Android's open platform to test game elements like our control mechanics and new user flow on over 50,000 real people all over the world, and the feedback we gained from these players was invaluable to the game's eventual success on both Android and iOS.

We made use of a custom analytics system that Scopely provided, and we were able to understand high-level metrics like session length and retention as well as more granular information like which holes were too difficult or what courses were the most popular.

As we tested the game, we noticed that many players weren't completing the tutorial, and we dug into our data and analytics to understand what specific parts of the tutorial were losing users. It became clear that there was something wrong with the first hole - users who played through this first hole in the tutorial generally completed the five-hole course, but we were losing over half of our users on this initial hole.

By fine-tuning the first hole and testing several iterations until we found one that was easy enough but still fun and exciting, we were able to increase the tutorial completion rate from 23 percent to 83 percent, while increasing our player registrations from 13 percent to 65 percent.
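The kind of funnel analysis that exposed the first-hole problem can be sketched in a few lines. This is a minimal illustration, not Scopely's actual analytics system; the event names and sample data are hypothetical.

```python
# Hypothetical funnel analysis: for each player, record the furthest
# tutorial step reached, then compute how many players survive to
# each step. Step names are illustrative, not a real event schema.
from collections import Counter

STEPS = ["start", "hole_1", "hole_2", "hole_3", "hole_4", "hole_5", "registered"]

def funnel(events):
    """events: (player_id, furthest_step) pairs -> per-step survival."""
    furthest = {pid: step for pid, step in events}  # last write wins
    counts = Counter(furthest.values())
    reached = []
    remaining = len(furthest)
    for step in STEPS:
        # Everyone whose furthest step is at or beyond this one reached it.
        reached.append((step, remaining))
        remaining -= counts.get(step, 0)
    total = reached[0][1] or 1
    return [(step, n, n / total) for step, n in reached]

sample = [(1, "hole_1"), (2, "registered"), (3, "start"), (4, "hole_5")]
for step, n, share in funnel(sample):
    print(f"{step:10s} {n:3d}  {share:.0%}")
```

A sharp drop between two adjacent steps - like the one Rocket Jump saw between "start" and the first hole - points directly at the content that needs redesign.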


While we didn't have a tremendous amount of experience analyzing the data that we acquired through testing, we quickly learned in working with the Scopely team which metrics were most important (e.g. retention, session length, number of multiplayer invites sent) and how to be more data-driven in our design changes.

We also learned how more granular information can be extremely important to solving a specific problem. Because of this kind of analysis, instead of guessing as to why we had such a low registration rate, we were able to trace the problem back to the first hole in our tutorial. This detailed understanding of our game allowed us to focus our design efforts, and when we improved the beginning of the tutorial, our registration rate spiked.

Additionally, we noticed a trend: many players were forfeiting or abandoning games before they were completed.

By analyzing the specific games that weren't being completed, it became clear that our scoring system allowed blowouts to happen too easily, and our games were lasting too long - it was often obvious who would emerge victorious halfway through the game.

As a result, we shortened our games from 9 holes to 5 holes, and we tuned our scoring system so that games would remain more competitive throughout. After making these adjustments, we reduced forfeits by 80 percent and increased game completions by 240 percent.

Since we made changes quickly, releasing new builds in rapid succession on Android and engaging in constant A/B testing, we were able to see to what extent our changes were improving the game. This process of rapid iteration and testing was a great way of working that resulted in significant improvements prior to the worldwide launch.
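For A/B tests like these, the question is always whether a difference between two builds is real or just noise. A standard two-proportion z-test answers that; the sketch below uses made-up numbers in the spirit of the tutorial figures above, not Rocket Jump's actual data.

```python
# Compare completion rates between two build variants with a
# two-proportion z-test. All sample sizes here are illustrative.
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# e.g. old tutorial: 230 of 1,000 testers completed;
#      new tutorial: 830 of 1,000 testers completed
z = two_proportion_z(230, 1000, 830, 1000)
print(f"z = {z:.1f}")  # |z| > 1.96 means significant at the 5% level
```

With tens of thousands of beta testers, even small improvements clear the significance bar quickly, which is what makes rapid build-over-build iteration viable.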

Get user feedback early and often

Quantitative data and analytics were key to many of the changes we made before launch, but we also worked to qualitatively analyze problems in the game.

As a specific example, the game initially made use of a flick system, in which the ball would inherit the momentum of a user's finger along the screen. However, by conducting a great deal of play testing and paying close attention to user reviews, we learned that it was difficult for some people to grasp the flick mechanic.

Players were frustrated by a lack of accuracy, and this became clear as we ran frequent usability tests with players who had never seen the game before.

Not only did we play test with our friends and family, but we went so far as to post Craigslist ads and approach random strangers on the street to bring in a wide range of users. We wanted to make sure that we had a variety of people playing our game - old, young, male, female, hardcore gamers, and more casual players - because we didn’t want our analysis to be biased by a small sample size of users.

We recorded videos of our play tests, and watching these video recordings gave us clues as to what was really going on as people played the game.

As a result, we radically redesigned the input method from a flick system to a more easily controlled slingshot mechanic. Without this qualitative analysis of a potential problem, we may have released a game that many people would have struggled to play.
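The difference between the two input schemes is easy to see in code. A flick reads the finger's instantaneous velocity, which is noisy and hard to aim; a slingshot derives the whole shot from a single drag-back vector. The sketch below is a hypothetical illustration of that distinction; the function names and tuning constants are ours, not the game's.

```python
# Illustrative contrast between flick and slingshot input.
# All names and constants here are hypothetical.

def flick_velocity(samples, dt):
    """Velocity from the last two touch samples - sensitive to
    sampling noise, so small jitters change the shot a lot."""
    (x0, y0), (x1, y1) = samples[-2], samples[-1]
    return ((x1 - x0) / dt, (y1 - y0) / dt)

def slingshot_velocity(ball, finger, power=3.0, max_speed=40.0):
    """Shot vector opposite the drag, clamped to a maximum speed.
    The player can adjust the aim before releasing, which is what
    made this scheme easier to control."""
    vx = (ball[0] - finger[0]) * power
    vy = (ball[1] - finger[1]) * power
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > max_speed:
        vx, vy = vx * max_speed / speed, vy * max_speed / speed
    return (vx, vy)

# Dragging the finger to (3, 4) from a ball at the origin fires
# the ball in the opposite direction.
print(slingshot_velocity((0, 0), (3, 4)))
```

Because the slingshot shot is a pure function of the final drag position, players can correct their aim for as long as they like before releasing - exactly the accuracy the flick system lacked.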

After we made this change, it was very exciting for us to see day-one retention rise from 42.8 percent to 54.8 percent, with day-seven retention increasing by over 500 percent.

Overall, the beta testing process was data-driven, but we also listened to users and trusted our creative instincts when making design decisions. If a design change didn't feel natural to the fundamental game design, it didn't make it into the final game.

The balancing act between data-driven and user feedback-based decisions wasn’t a science, but more of an art, and we made sure to pay attention to both the metrics and verbal feedback before we made any changes to the game.

What's next?

While Mini Golf MatchUp has already exceeded our expectations, we’re still exploring how we can build onto what we've created.

One of the most exciting elements of a mobile game is that development doesn't end when the game is released. Rather, users are always clamoring for more levels, more challenges, more gameplay features, and it's been an inspiring experience to see how our players interact with the game.

As a concrete example of how we've improved Mini Golf MatchUp post-launch, when we implemented tournaments, our revenue jumped by 80 percent.

We knew tournaments would enhance the overall experience - they were designed to improve monetization by requiring players to spend virtual currency to enter and by encouraging the use of 'do-overs' through a more competitive style of play. But it was incredibly exciting to see how a new social feature could drive retention and monetization to such an extent.

We're continuing to work hand in hand with Scopely to kick around new ideas for power-ups, scoring changes, item collection mechanics, social feature integrations, and even cooperative gameplay.

There's still more to come, and as each tweak we make brings about concrete results, we're more excited than ever about the game!