Enhancing User Experience on an E-Learning Platform
GOAL: Identify key usability issues on the learning platform to inform and guide iterative product redesign
SITUATION: As the sole researcher, I collaborated with the design and technology teams at an early-stage, bootstrapped startup with 7,000+ users
IMPACT: Design and technology teams reworked product timelines to prioritise and implement my recommendations
- Increased NPS from <10% to 38%
- Increased course completion rate from 5% to 86%
Company: X Billion Skills Lab
Methods: Industry & literature review, Interviews, Moderated Usability Test
Role: User Researcher and Product Strategist
Team: 3 Designers, 2 Developers, CEO
Duration: 6 weeks
BACKGROUND AND PROBLEM
As the Learning Experience and Curriculum Architect at X Billion Skills Lab, I had presented the generative insights and recommendations that the learning platform’s features were built on. For this case study, I’ll focus on the next, evaluative phase: testing our platform!
Despite all our efforts, the course completion rate had plateaued at the industry average of ~5%. We wanted to identify the key usability issues that prevented learners from sticking with their learning plan and completing the course. Product development at our startup was moving rapidly, so if I wanted my recommendations to be implemented, I had to be quick. This project demanded balancing research rigor with resource constraints!
ROLE AND CONTRIBUTION
I was the sole user researcher on the team and at the company, single-handedly in charge of:
- Planning the research (designing usability tests, building research plan, organizing the equipment for the usability tests)
- Screening and recruiting participants
- Conducting the moderated usability tests
- Analyzing the data
- Presenting insight-based recommendations to the team
- Ensuring the recommendations were implemented while staying true to the needs of the users
As a team of one on a tight deadline, I had to prioritize, identify workarounds, and adapt to the constantly evolving situation.
To implement my recommendations, I collaborated closely with the design team (Senior Design Architect and 2 graphic designers) and the technology team (Senior Technology Architect and a senior developer).
Research Question
How might we ensure our learning platform is intuitive, encourages consistency, and makes users want to continue learning?
Research goals:
- Identify tasks that the users can perform smoothly, and those they cannot
- Understand why users struggle with certain tasks (points of friction and/or confusion)
PROCESS
Phase 1
Identifying users
First, I identified who my user groups were going to be by building directly off our platform’s two target audience segments.
Students
- Currently in their last year of college (20-21 years old)
- Attending tier 2 or 3 colleges
- Studying engineering, commerce, science, or arts
Young professionals
- In their first two years of their career (21-23 years old)
- Attended tier 2 or 3 colleges
- Working in business development, sales, customer service, marketing, and other client-facing or cross-functional roles
Guerrilla Testing
I knew I didn’t have resources to waste, so I designed a “trial run” for myself. I created mock tasks, grabbed some laptops, and staked out cafes near colleges and offices that I expected target users to visit.
Challenge: I struggled to convince people to participate in the study since resource constraints meant I couldn’t provide monetary incentives. Ultimately, I tackled this by engaging with the larger group: making the usability study an event with social incentives. Treating participants’ friends as an extension of the participant pool also provided a spectrum of insights!
Phase 2
Moderated Usability Testing
Reflecting on and learning from my guerrilla testing, I altered my tasks to remove vague terminology, add specificity, and keep them emotionally neutral.
Measurements of Success
- Success rate of the task (1 for completing, 0 for not completing)
- Time the task required (in seconds)
- Satisfaction score (“How would you rate your experience completing this task on a scale of 1-5?” where 1 was very difficult and 5 was very easy)
I included and emphasized the satisfaction score because if users found completing a task frustrating, they would view the experience negatively and be less likely to engage with the platform, even if they accomplished the task.
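To make the analysis concrete, here is a minimal sketch of how these three measures could be rolled up for a single task. The session data, field names, and participant labels are hypothetical, not from the actual study:

```python
from statistics import mean, median

# Hypothetical per-participant results for one task; values are
# illustrative only, not data from the actual study.
sessions = [
    {"participant": "P1", "completed": 1, "seconds": 95, "satisfaction": 4},
    {"participant": "P2", "completed": 0, "seconds": 240, "satisfaction": 2},
    {"participant": "P3", "completed": 1, "seconds": 130, "satisfaction": 5},
]

# Success rate: mean of the 1/0 completion flags.
success_rate = mean(s["completed"] for s in sessions)

# Time on task: the median is more robust than the mean for small samples.
time_on_task = median(s["seconds"] for s in sessions)

# Satisfaction: mean of the 1-5 self-reported scores.
satisfaction = mean(s["satisfaction"] for s in sessions)

print(f"Success rate: {success_rate:.0%}")         # Success rate: 67%
print(f"Median time on task: {time_on_task}s")     # Median time on task: 130s
print(f"Mean satisfaction: {satisfaction:.1f}/5")  # Mean satisfaction: 3.7/5
```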
Learning: Taking detailed notes was hindering my ability to moderate effectively, so I established a workaround: video. I recorded the participants so I could focus on moderation and, if something interesting happened, I noted the timestamp so I knew exactly where in the video to go back to.
KEY INSIGHTS AND RECOMMENDATIONS
Using the measurements of success to analyze the usability tests, I identified two key insights and proposed a recommendation to address each.
1
Observation
When users were stuck or had a question about the content, they didn’t access the “help” feature. Instead, they stayed on the content tab, posted their questions in the “forum” section, and kept checking for a response.
Insight
Users struggled to identify where to go when they had questions, preferring not to move away from the content. They got frustrated when they didn’t get prompt answers.
Recommendation
Make the “help” feature more accessible: integrate it into the content page, next to the “forum” feature. This ensures users ask the right people and get a timely response.
2
Observation
Some users, mainly those identifying as female, looked visibly taken aback when the platform first loaded. They also used words like “manly”, “dark”, and “too game-like” to describe the experience.
Insight
Some users found the platform’s overall look and feel too masculine. The association with video games reduced the platform’s perceived seriousness and hindered the learning experience.
Recommendation
Since some users appreciated the current color scheme, add an option to toggle between this “dark” mode and a new “light” mode.
Getting buy-in from stakeholders
I knew that implementing these recommendations would require the tech and design teams to rework priorities and alter product timelines. Therefore, I presented them using my go-to strategies for getting buy-in not only from team leaders, but also from the people actually executing the recommendations.
Team, not just heads
Involve the developers and designers who will execute the work in the decision-making process
Videos to show, not just tell
Use images, videos, or audio clips to show the users behind the insights
Common end goal
Establish how the recommendations serve the shared end goal before presenting them
IMPACT
Both of my recommendations were implemented, drastically improving user feedback in the short run and, together with subsequent iterations, course completion rates in the long run:
- NPS jumped from <10% to 38%
- Course completion rates increased from ~5% (industry average) to 86%
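For readers unfamiliar with the metric, NPS is derived from a 0-10 “how likely are you to recommend us?” question: the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6). A minimal sketch with made-up ratings, just to show the arithmetic:

```python
def nps(ratings: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Made-up responses to "How likely are you to recommend us?" (0-10).
print(nps([10, 9, 9, 8, 7, 6, 4, 10, 9, 3]))  # 20.0
```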
REFLECTION
What I learnt
- Second time better. I’m always going to be doing something for the first time: how do I do it such that the second time is always better? Giving myself the space to try, fail, and learn from my pilot test made me more confident and improved the quality of my research.
Next time…
- Leverage tools. My approach was scrappy given the lack of resources. Next time, I’d love to explore running this process with tools like heatmaps and platforms like dscout or UserTesting!