Solving the Engagement Problem in Mental Health Apps through Gamification

B2C
Health Tech
2023

Overview

As part of a freelance project for a Lithuanian client, I was tasked with designing and validating a mobile application aimed at preventing mental health problems among young users, with a specific focus on the Lithuanian market. This case study examines the user-centered design process I followed, which integrated gamification to enhance user engagement and adherence.

Company
Private Client
Role
UX/UI Designer
Scope
End-to-End Product Design
Product Strategy & Vision
User Research
Prototyping
Design Systems

The Problem

Globally, and particularly in Lithuania, young people are facing a significant rise in mental health challenges such as anxiety, stress, and depression. While mobile health (mHealth) applications offer an accessible and anonymous way to seek support, they suffer from critical flaws. Market analysis and user research reveal two primary issues:

Problem no. 1

Low User Engagement and Retention

Many existing mental health apps fail to keep users engaged over the long term. They often lack personalization, content variety, and motivating features, leading to high abandonment rates.

Problem no. 2

Poor Usability

A common complaint among users is that mental health apps are not intuitive. Complicated interfaces, bugs, and difficult navigation create a frustrating user experience, especially for individuals already struggling with their mental state.

The core challenge:

To design a mobile application for teenagers and young adults (ages 14-25) that is not only functional but also genuinely engaging and easy to use, thereby encouraging consistent, proactive self-care for mental well-being.

The Approach

To tackle this challenge, I adopted a rigorous user-centered design (UCD) methodology. The core philosophy was to move beyond assumptions and ground every design decision in the real needs, behaviors, and motivations of the target audience.

My approach was built on three key pillars:

1
Foundational User Research: The Interview

The project was grounded in direct qualitative data gathered through semi-structured group interviews with 20 young adults from the target Lithuanian market. The interviews were designed to uncover deep insights and were structured in three parts:

  • Exploring past mental wellness experiences.
  • Collaboratively defining app requirements on an interactive whiteboard.
  • Evaluating the motivational appeal of specific gamification mechanics.

The research surfaced several key findings: users wanted ease of use, privacy, and personalized content, and they preferred non-competitive gamification mechanics such as quests and points. These findings formed the bedrock of the entire design strategy.

2
Synthesis and Ideation

The qualitative data from the interviews was synthesized to identify core themes and pain points. This research was then used to build two detailed user personas, which served as a constant reference point throughout the project. These personas informed the information architecture and the feature set. Gamification was not treated as an afterthought but was woven into the core design to drive engagement, create a sense of accomplishment, and make self-care feel less like a chore.

3
Mixed-Methods Validation

Every stage of the design was validated with real users. To test the high-fidelity interactive prototype, I employed two methods. First, empirical usability testing was conducted, where participants were asked to complete specific tasks to identify any friction points in the user flow.

Second, the design was evaluated quantitatively using the System Usability Scale (SUS), an industry-standard questionnaire for measuring perceived ease of use. This mixed-methods approach provided both qualitative and quantitative evidence of the design's effectiveness.

The Solution

The project culminated in the development of a high-fidelity, interactive prototype for a gamified mobile app. Guided by the user-centered design process, the solution was crafted to be intuitive, engaging, and directly responsive to the needs uncovered during the initial research phase. The design focused on creating a supportive and motivating environment where users could proactively manage their mental well-being through a set of personalized tools and positive reinforcement.

The results

Key Results & Impact

The app's design was validated through two methods of usability testing with members of the target audience before the final handoff to the client.

All users successfully completed tasks

Empirical Usability Testing

Participants were given a set of tasks to complete within the interactive prototype. While all users successfully completed every task, the testing revealed minor usability issues, such as suboptimal button placement. This provided clear, actionable feedback that was incorporated into the final design recommendations for the client.

Average SUS score 78.6

System Usability Scale

The prototype was evaluated using the industry-standard SUS questionnaire, a reliable measure of perceived usability. The app achieved an average SUS score of 78.6, which is considered "Good" and falls above the industry average of 68. This score indicates that users found the app to be intuitive, well-integrated, and easy to learn.
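For readers unfamiliar with how a SUS score like 78.6 is derived, the standard scoring arithmetic is simple and worth spelling out. The sketch below follows Brooke's published scoring rules; the function name is mine, not part of any project tooling:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0,2,4,... = items 1,3,5,...
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A participant answering every item neutrally (3) lands exactly mid-scale:
print(sus_score([3] * 10))  # 50.0
```

A study-level score is simply the mean of each participant's individual score, which is how an average such as 78.6 is reported against the commonly cited benchmark of 68.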

Next steps

Next Steps & Recommendations

Following the successful validation of the prototype, the final deliverable was handed off to the client. As the project concluded at the design phase and did not proceed to development on my end, it was not possible to track live metrics.
Therefore, this section outlines the key recommendations I provided to the client during the project handover for post-launch monitoring. I advised them to focus on two primary areas to measure the app's real-world performance and guide future iterations:

Engagement

To measure engagement, the primary goal was to answer the core question: "How much value are users getting from the app, how often do they return in the short term, and how deeply do they interact with its features?" The recommended metrics to track this include Daily and Monthly Active Users (DAU/MAU), session length and frequency, and the specific features used per session, such as the number of daily quests completed or mood entries logged.
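To make the DAU/MAU recommendation concrete, the ratio of the two (often called "stickiness") can be computed from a plain activity log. This is an illustrative sketch assuming a simple (user_id, date) event representation; the helper name and sample data are hypothetical:

```python
from datetime import date, timedelta

def stickiness(events, day):
    """DAU/MAU ratio from (user_id, activity_date) events.

    DAU: distinct users active on `day`.
    MAU: distinct users active in the 30-day window ending on `day`.
    """
    window_start = day - timedelta(days=29)
    dau = {u for u, d in events if d == day}
    mau = {u for u, d in events if window_start <= d <= day}
    return len(dau) / len(mau) if mau else 0.0

events = [
    ("ana",    date(2023, 9, 30)),
    ("ana",    date(2023, 9, 12)),
    ("jonas",  date(2023, 9, 30)),
    ("ruta",   date(2023, 9, 5)),
    ("marius", date(2023, 9, 20)),
]
# 2 of the 4 users active this month were also active today:
print(stickiness(events, date(2023, 9, 30)))  # 0.5
```

A rising stickiness ratio would indicate that monthly users are returning more frequently, which is exactly the short-term engagement signal described above.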

Retention

For retention, the key question to answer was: "Do users come back after the first use?" To measure this long-term loyalty, I recommended tracking Day 1, Day 7, and Day 30 retention rates, the churn rate, and performing cohort analysis to understand the behavior of specific user groups over time.
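The Day-N retention rates mentioned above reduce to straightforward arithmetic over signup and activity dates. A minimal sketch, assuming a signup-date map and a set of (user_id, activity_date) events (both representations and the function name are illustrative):

```python
from datetime import date, timedelta

def day_n_retention(signups, events, n):
    """Fraction of signed-up users who were active exactly N days after signup.

    signups: {user_id: signup_date}
    events:  set of (user_id, activity_date) tuples
    """
    if not signups:
        return 0.0
    retained = sum(
        1 for user, signed in signups.items()
        if (user, signed + timedelta(days=n)) in events
    )
    return retained / len(signups)

signups = {"ana": date(2023, 9, 1), "jonas": date(2023, 9, 1)}
events = {
    ("ana",   date(2023, 9, 2)),   # back on Day 1
    ("ana",   date(2023, 9, 8)),   # back on Day 7
    ("jonas", date(2023, 9, 8)),   # back on Day 7 only
}
print(day_n_retention(signups, events, 1))  # 0.5
print(day_n_retention(signups, events, 7))  # 1.0
```

Grouping `signups` by signup week or month before applying the same calculation yields the cohort analysis recommended in the handover.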