Feedback after singing

Simply Sing is an application that takes users on a journey where they learn to sing with fun songs, receive feedback, and improve their skills. This case study covers my work on the feedback users receive after singing.

The challenge

After users finish singing a song, a summary screen appears with an option to review their singing. We suspected that the feedback feature was not as impactful and valuable as we wanted it to be, or as users expected. As a result, the app is perceived more as a karaoke app than as a learning tool.

The hypothesis

With a better singing feedback feature, users will practice the same song again and get more value from the app.

Background

The old review flow

The research

Checking the hypothesis

In my research, I focused on answering these questions:

Are users interested in feedback on their singing?

Do users find our "review" feature valuable?

How well do users trust our feedback?

How much effort and time are they willing to put into it?

Collecting the Data

First, I examined the data on current interactions with the feature, focusing on user behaviors and key metrics. What did I ask?

For understanding current usage:
• How many users have ever pressed the review button?
• How many users have pressed the review button more than once?
• Does entering the 'review' screen influence 'Day 2 retention' (our main KPI)?
• How much time do users spend on the review page and on the 'song complete' screen?

For choosing users to talk to:
• Recurring users of the 'review' feature.
• Users who never clicked review, or who clicked and left immediately.
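As an illustration, the usage questions above could be answered from an analytics event log. The sketch below uses an invented schema (event names like "review_open" and the sample data are hypothetical, not Simply Sing's real analytics):

```python
from collections import Counter

# Hypothetical event log: one dict per app event. Event names and
# fields are illustrative only.
events = [
    {"user": "u1", "event": "song_complete"},
    {"user": "u1", "event": "review_open"},
    {"user": "u1", "event": "review_open"},
    {"user": "u2", "event": "song_complete"},
    {"user": "u3", "event": "song_complete"},
    {"user": "u3", "event": "review_open"},
]

def review_usage(events):
    """Share of users who ever opened review, and who opened it more than once."""
    all_users = {e["user"] for e in events}
    opens = Counter(e["user"] for e in events if e["event"] == "review_open")
    ever = len(opens) / len(all_users)
    repeat = sum(1 for n in opens.values() if n > 1) / len(all_users)
    return ever, repeat

ever, repeat = review_usage(events)
```

The same grouping-by-user approach extends to the retention and time-on-screen questions, given timestamps and a retention flag per user.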

Learning from talking with users

What and how did I ask them?

Conclusions & insights

Actionable feedback & guidance

The current singing review is too brief and not actionable. Users want to be guided through feedback.

Personalised tips

Can we create a perception of personalisation by building a bank of tips and adjusting them so they fit most cases?
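One way this tip-bank idea could work, sketched with made-up data: tag each tip with the mistake types it addresses, then surface the tip that best matches the mistakes detected in a given performance. All tip texts and mistake labels below are hypothetical.

```python
# Minimal sketch of a tip bank, assuming the app can label detected mistakes
# (e.g. "pitch_low", "timing_late"). Tips and labels are invented for illustration.
TIP_BANK = [
    {"tags": {"pitch_low", "pitch_high"}, "text": "Hum the melody first to lock in the pitch."},
    {"tags": {"timing_late"}, "text": "Clap the rhythm before singing this part."},
    {"tags": {"breath"}, "text": "Take a breath at the end of the previous phrase."},
]

def pick_tip(detected_mistakes, bank=TIP_BANK):
    """Return the tip whose tags overlap most with the detected mistakes."""
    best = max(bank, key=lambda tip: len(tip["tags"] & set(detected_mistakes)))
    if not best["tags"] & set(detected_mistakes):
        return None  # nothing relevant; better to show no tip than a wrong one
    return best["text"]

tip = pick_tip(["pitch_low"])
```

Because the tip is selected from the user's own detected mistakes, it can feel personal even though the text is pre-written.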

Current review button

Current usage: only 42% of users click it, and only 12% click it more than once. The feature offers little value because the interface is too detailed, making it hard to know what to focus on.

Practical tips

Users expect practical tips that will actually help them improve their performance of the song.

Expanding our feedback

Session with vocal development teacher

During the session, we provided participants with live feedback and suggestions for improving their singing. Based on this, I mapped out the feedback we can offer users and the technical requirements needed to support it.

The design process

Ideation

I explored diverse approaches, each responding differently to the need for feedback while having its own unique requirements.

Actionable feedback & guidance

Guide the user step by step through actionable feedback on their singing.

Self feedback

Users give feedback to themselves: "How would you rate your singing?"

Real teacher feedback

Users get feedback from a real teacher.

Over time statistics

Statistics about achievements, song improvement, and records.

Highlights feedback

Users see highlights of the parts they need to work on, with the option to hear those parts and then practice them.

Improving the prototype

I conducted several user testing sessions, which helped me improve the product based on their insights:

1. Actionable feedback: Users appreciated feedback that was direct and actionable, enabling them to easily understand and apply improvements to their performance.
2. Unified interface: Users found value in accessing all needed information on a single screen, enhancing usability and minimizing distractions.
3. Overwhelming tips: Seeing all tips at once confused and overwhelmed users; a more stepped approach is recommended.
4. Minimal screen changes constraint: Due to the screen's complexity, it's essential to implement as few changes as possible (dev request).

The solution

End song screen

‘Feedback & tips’ card (instead of the review button):
• Users can briefly view their mistakes, which encourages them to explore the full feedback page.
• Users can still choose to move on to another song if they aren't interested in feedback.

Feedback screen

The user hears themselves sing

• Highlighting the user’s main mistakes.
• Hearing the problematic parts.

The user reviews the feedback they received

• Actionable feedback for each mistake.
• Practicing the parts with mistakes again.

Practice screen

• Practicing the problematic part several times.
• Slowing down the speed.
• Viewing a personalised tip.

What have I learned?

User perception is everything

Even though technological limitations prevented giving perfect feedback, I designed the experience to ensure users felt the feedback was personal and meaningful. User perception played a crucial role in this process, illustrating the importance of user belief in the value of our product.

Refining Through User Testing

User testing provided valuable feedback, particularly given the physical, vocal interaction users have with Simply Sing. By observing real use cases directly, we iteratively improved usability and user satisfaction.

Layered Information Design

I learned to cater to diverse user needs by layering information: critical details first, with options for deeper exploration. This enables a flexible user experience, offering a quick overview for some users and more depth for others.