
Net Promoter Score
NPS is determined by asking people to answer, on a scale of 0–10, the question: 'How likely are you to recommend this website/product/service to a friend or relative?' The answers are grouped into three categories:
DETRACTORS
0–6
indicate dissatisfaction and likely criticism
PASSIVES
7–8
indicate moderate satisfaction, but low likelihood of recommendation
PROMOTERS
9–10
indicate high satisfaction and strong likelihood of recommendation
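The score itself is the percentage of promoters minus the percentage of detractors, giving a value between −100 and +100. A minimal sketch of that calculation, using the thresholds listed above (the function below is purely illustrative, not part of the project code):

```typescript
// Sketch of the NPS calculation: % promoters minus % detractors.
// Thresholds follow the categories above: 0–6 detractors, 7–8 passives, 9–10 promoters.
function netPromoterScore(ratings: number[]): number {
  if (ratings.length === 0) return 0;
  const promoters = ratings.filter((r) => r >= 9).length;
  const detractors = ratings.filter((r) => r <= 6).length;
  return Math.round(((promoters - detractors) / ratings.length) * 100);
}

// Example: 5 promoters, 3 passives, 2 detractors out of 10 responses → NPS of 30.
console.log(netPromoterScore([9, 10, 9, 10, 9, 7, 8, 7, 3, 6]));
```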

How do we encourage customers to complete the survey?
PAIN POINTS
Surveys are often seen as annoying and tedious, especially when they are optional, lack rewards, or aren’t optimised for mobile devices, making them frustrating to complete and leading to high abandonment rates.
USER NEEDS
To encourage completion, users need clear and simple questions, appreciation for their time and feedback, and a mobile-first optimised experience.
70%
of Naked Wines customers access the website on a mobile device
72%
of Naked Wines customers use Gmail as email service
OPPORTUNITIES
Analytics show that over two-thirds of Naked Wines customers access the website via mobile, with most using Gmail. This presents an opportunity to optimise the survey design and email experience for mobile and Gmail. Given that phones are often on hand, a mobile-first approach ensures the survey can be completed at a convenient time, while a playful design can boost participation.
Refine survey user experience: The stakeholder shared follow-up questions with two rating systems. To reduce cognitive load and minimise the risk of survey abandonment, I suggested using only one rating system.

Usability and Readability
MATRIX VS BUTTONS
A traditional Likert scale in a matrix format is less mobile-friendly, requiring horizontal scrolling as not all options are visible, increasing effort and the risk of drop-offs.
To improve usability, we replaced the matrix with circular buttons, which are easier to tap, provide clear visual feedback, and create a faster, more interactive and engaging experience that reduces abandonment.

Mobile Layout: Users said it was easier to read in rows

Desktop Layout: Users said it was easier to read in columns

Final designs: reducing visual clutter by displaying only the endpoints
LESS IS MORE
Mobile users preferred the legend in rows, whilst desktop users favoured columns. Recognising that a one-size-fits-all solution wouldn't work, we challenged our approach and simplified the legend to show only the endpoints, relying on users' familiarity with Likert scales. This proved intuitive and effective, prompting us to amend the design based on user feedback.

TIME CONSTRAINTS AND RESOURCE ALLOCATION
The design process went through multiple iterations, including displaying each question on its own page and exploring features like scroll-snap and active question highlighting. However, facing time and resource constraints, we agreed to simplify the design to accelerate development, and these features were ultimately deprioritised.
Error Handling: if mandatory questions are left unanswered, the survey auto-scrolls to the first missing response
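A rough sketch of this behaviour (the data attributes, selectors, and class names below are hypothetical, not taken from the production build):

```typescript
// On submit, find the first mandatory question without an answer,
// scroll it into view, and block submission until it is completed.
function scrollToFirstMissingResponse(form: HTMLFormElement): boolean {
  const mandatory = form.querySelectorAll<HTMLElement>('[data-required="true"]');
  for (const question of Array.from(mandatory)) {
    const answered = question.querySelector('input[type="radio"]:checked');
    if (!answered) {
      question.scrollIntoView({ behavior: 'smooth', block: 'center' });
      question.classList.add('question--error'); // highlight the unanswered question
      return false; // block submission
    }
  }
  return true; // all mandatory questions answered
}
```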
UNBIASED DATA
The previous email design used red for negative and green for positive feedback. In the new design, we opted for a neutral colour palette to collect unbiased data.
The updated email, now with onsite follow-up questions, is sent to customers who made a purchase within the last 7–30 days, instead of up to 12 months, ensuring feedback is tied to a more recent and memorable order.
Email before and after (desktop designs shown, as the original email lacked mobile-specific designs)
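An illustrative sketch of the 7–30 day targeting window described above (the customer record shape and field names are assumptions, not Naked Wines' actual data model):

```typescript
// Only customers whose most recent purchase falls within the
// 7–30 day window receive the survey email.
interface Customer {
  email: string;
  lastPurchaseDate: Date;
}

const MS_PER_DAY = 24 * 60 * 60 * 1000;

function isEligibleForSurvey(customer: Customer, now: Date = new Date()): boolean {
  const daysSincePurchase =
    (now.getTime() - customer.lastPurchaseDate.getTime()) / MS_PER_DAY;
  return daysSincePurchase >= 7 && daysSincePurchase <= 30;
}
```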
OUTCOME
Despite the old design being much shorter (asking only 'Why did you give this score?'), the completion rate increased to 26% with the new design. This demonstrates the effectiveness of the improved approach: although longer, it engaged users more successfully and collected richer, more meaningful feedback.