
Net Promoter Score
COMPANY
Naked Wines
COLLABORATORS
Product Manager
Full Stack Developer
Head of Customer Insights
RESPONSIBILITIES
Research & Analysis
UX Design
UI Design
OVERVIEW
The project focused on improving the Net Promoter Score (NPS) survey to increase participation and gather more meaningful feedback for Naked Wines. The existing process relied on a desktop-focused email and a simple onsite prompt that only asked, ‘Why did you give this score?’ This approach provided limited actionable insights. By introducing onsite follow-up questions and redesigning the email, the new experience created a more cohesive, branded, and engaging way for customers to share their feedback.
NPS is determined by asking people to provide an answer, on a scale from 0–10, to the question: 'How likely are you to recommend this website/product/service to a friend or relative?' The answers are grouped into three categories:
DETRACTORS
0–6
indicate dissatisfaction and likely criticism
PASSIVES
7–8
indicate moderate satisfaction, but low likelihood of recommendation
PROMOTERS
9–10
indicate high satisfaction and strong likelihood of recommendation
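The score itself is the percentage of promoters minus the percentage of detractors, which can be sketched in a few lines (the function name and rounding choice here are illustrative, not Naked Wines' implementation):

```typescript
// Compute Net Promoter Score from a list of 0–10 ratings.
// NPS = % promoters (9–10) minus % detractors (0–6);
// passives (7–8) count toward the total but cancel out.
function netPromoterScore(ratings: number[]): number {
  if (ratings.length === 0) return 0;
  const promoters = ratings.filter((r) => r >= 9).length;
  const detractors = ratings.filter((r) => r <= 6).length;
  return Math.round(((promoters - detractors) / ratings.length) * 100);
}
```

For example, five responses of [10, 10, 9, 7, 5] give three promoters and one detractor, so the score is 40. The result ranges from −100 (all detractors) to +100 (all promoters).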

How do we encourage customers to complete the survey?
PAIN POINTS
Surveys are often seen as annoying and tedious, especially when they are optional, lack rewards, or aren’t optimised for mobile devices, making them frustrating to complete and leading to high abandonment rates.
USER NEEDS
To encourage completion, users need clear and simple questions, appreciation for their time and feedback, and an experience optimised for mobile first.
70%
of Naked Wines customers access the website on a mobile device
72%
of Naked Wines customers use Gmail as their email service
OPPORTUNITIES
Analytics show that over two-thirds of Naked Wines customers access the website via mobile, with most using Gmail. This presents an opportunity to optimise the survey design and email experience for mobile and Gmail. Given that phones are often on hand, a mobile-first approach ensures the survey can be completed at a convenient time, while a playful design can boost participation.
Refine survey user experience: The stakeholder shared follow-up questions that used two different rating systems. To reduce cognitive load and minimise the risk of survey abandonment, I suggested using only one rating system.

Usability
and Readability
MATRIX VS BUTTONS
A traditional Likert scale in a matrix format is less mobile-friendly, requiring horizontal scrolling as not all options are visible, increasing effort and the risk of drop-offs.
To improve usability, we replaced the matrix with circular buttons, making it easier to tap, providing visual feedback, and creating a more interactive, faster, and engaging experience that reduces abandonment.

Mobile Layout: Users said it was easier to read in rows

Desktop Layout: Users said it was easier to read in columns

Final designs: reducing visual clutter by displaying only the endpoints
LESS IS MORE
Mobile users preferred the legend in rows, whilst desktop users favoured columns. Recognising that a one-size-fits-all solution wouldn't work, we challenged our approach and simplified the legend to show only the endpoints, relying on users' familiarity with Likert scales. This proved intuitive and effective, prompting us to amend the design based on user feedback.

TIME CONSTRAINTS AND RESOURCE ALLOCATION
The design process went through multiple iterations, including displaying each question on its own page and exploring features like scroll-snap and active question highlighting. However, we needed to adapt when facing time and resource constraints. We agreed to simplify the design to accelerate development. As a result, these features were ultimately deprioritised.
Error Handling: if mandatory questions are left unanswered, the form auto-scrolls to the first missing response
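The auto-scroll behaviour boils down to finding the first mandatory question without an answer and scrolling it into view. A minimal sketch, assuming hypothetical question ids and an answers map (not the actual Naked Wines code):

```typescript
// Each survey question has an id and a mandatory flag; ids are hypothetical.
interface Question {
  id: string;
  mandatory: boolean;
}

// On submit, return the id of the first mandatory question that has no
// answer, or null if the survey is complete.
function firstUnanswered(
  questions: Question[],
  answers: Record<string, number | undefined>
): string | null {
  const missing = questions.find(
    (q) => q.mandatory && answers[q.id] === undefined
  );
  return missing ? missing.id : null;
}

// In the browser, the returned id drives the scroll, e.g.:
// document.getElementById(id)?.scrollIntoView({ behavior: "smooth" });
```

Scrolling to the first gap, rather than showing a generic error banner, keeps the correction effort low on small screens, which matters given the mobile-first audience.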
UNBIASED DATA
The previous email design used red for negative and green for positive feedback. In the new design, we opted for a neutral colour palette to collect unbiased data.
The updated design, now with onsite follow-up questions, is sent to customers who made a purchase within the last 7–30 days, instead of up to 12 months, ensuring feedback is tied to a more recent and memorable order.
Email before and after (desktop designs shown, as the original email lacked mobile-specific designs)
OUTCOME
Although the old design was much shorter, asking only 'Why did you give this score?', the completion rate rose to 26% with the new design. This demonstrates the effectiveness of the improved approach: despite being longer, it engaged users more successfully and collected richer, more meaningful feedback.