TL;DR 🚀
The NPS measures how likely your users are to recommend your product.
Single question to ask: “On a scale from 0 to 10, how likely are you to recommend Edusign to a friend or colleague?”
Promoters: 9–10
Passives: 7–8
Detractors: 0–6
Formula: NPS = % Promoters − % Detractors → score between –100 and +100.
Interpret the NPS over time and by segment, not as an isolated figure.
(Here is an example of an NPS question, asked directly from the Edusign app!)
Why we use it (and why you should too)
At Edusign, we love numbers that bring us closer to clients. The NPS helps you to:
Take the pulse: quick read on loyalty and word-of-mouth. 🩺
Prioritize: know what to address first (product, experience, support). 🎯
Measure the impact: check if an initiative (new feature, onboarding, message) really changes the perception. 📈
Edusign Tip 💡: always accompany the NPS with an open-ended question “Why?” to turn a score into concrete actions.
How to calculate it (mini-example) 🧮
Classify each response into one of the 3 groups (0–6 / 7–8 / 9–10).
Calculate the percentages of Promoters and Detractors (Passives count only toward the total, not in the formula).
Apply the formula.
Quick example:
30 responses → 12 Promoters, 3 Detractors, 15 Passives.
% Promoters = 12/30 = 40%
% Detractors = 3/30 = 10%
NPS = 40 − 10 = 30
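The three steps above can be sketched in a few lines of Python (an illustrative snippet, not Edusign's actual code):

```python
# Minimal NPS calculation: classify each 0-10 rating into
# Promoters (9-10), Passives (7-8), Detractors (0-6),
# then apply NPS = %Promoters - %Detractors.
def nps(scores: list[int]) -> int:
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    total = len(scores)
    # Passives only contribute to the total; the final score
    # is rounded to a whole number, as recommended below.
    return round(100 * (promoters - detractors) / total)

# Mini-example from above: 12 Promoters, 15 Passives, 3 Detractors.
responses = [9] * 12 + [7] * 15 + [3] * 3
print(nps(responses))  # → 30
```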
Read your NPS (pragmatic interpretation) 🔍
< 0: red alert; more detractors than promoters.
0 to 30: acceptable, large scope for improvement.
30 to 50: good; recommendation becomes an advantage.
50 to 70: excellent; you provide a remarkable experience.
70+: exceptional (rare!).
⚠️ Thresholds vary by sector and context. The most relevant comparisons: yourself over time (the trend) and within homogeneous segments (e.g., institution, message, adoption level).
To give more context, recent research estimates that the average Net Promoter Score in higher education is around 32, a “favorable” zone but improvable compared to broader education benchmarks.
Best practices for useful insights ✅
Always add an open-ended question: “What is the main reason for your rating?”
Timing: send close enough to the experience for fresh feedback.
Sample size: avoid hasty conclusions with few responses.
Cadence: keep it reasonable (transactional surveys tied to an event, relational surveys once per quarter).
Channel & tone: email, in-app, SMS… or via the Edusign app! Simple, human, on-brand.
Rounding: display the NPS in whole numbers (and the average /10 to 1 decimal if useful).
From measuring to action (the real topic) 🎯
Detractors (0–6): contact them quickly (< 48 h), thank them, understand, correct, and keep them informed. 🧯
Passives (7–8): ask “What is missing to reach 9–10?” → often quick wins.
Promoters (9–10): say thank you, share your roadmap, invite to referrals or a public review. 🌟
Edusign Tip 💡: link each drop in NPS to an improvement ticket (owner + deadline). Follow-up matters more than the score — that's how we aim for continuous improvement!
Common pitfalls & limits 🧱
Sampling bias: don’t survey only your biggest fans (or least satisfied).
Channel effect: phone, email, in-app → scores vary by channel.
NPS ≠ average: the average /10 cannot replace the NPS.
Fragile comparisons: avoid “NPS leagues” without context (size, market, model).
NPS isn’t everything: combine it with CSAT (moment satisfaction), CES (effort), product usage, churn, etc.
FAQ ❓
How many responses are needed?
No magic number; aim for at least a few dozen responses per segment to reduce noise.
Should you encourage responses?
Gentle reminders are OK, but rewards can bias the results; if you offer them, be transparent and consistent over time.
Can the score have decimals?
We calculate with percentages, but display as whole numbers (clearer and standard).
What to do if the NPS suddenly drops?
Check the volume and filters.
Read the verbatims.
Identify 1–3 probable causes.
Open actions with owner + deadline.
Communicate the correction plan to affected respondents.
Ready-to-use templates 🧩
Main question:
“On a scale from 0 to 10, how likely are you to recommend [Product Name] to a friend or colleague?”
Open-ended question:
“What is the main reason for your score?” (one or two sentences suffice)
Thank You Message (Promoters):
“Thank you for your 9/10! It’s for you that we love building [Product Name]. Would you agree to leave a public review or test our upcoming features in advance?”
Catch-up Message (Detractors):
“Thank you for taking the time to respond. We want to understand and correct. Could you tell us a bit more? A team member will contact you within 24–48 h.”
Fun facts (to shine in meetings) ✨
The NPS was popularized in 2003 by Fred Reichheld (“The One Number You Need to Grow” in Harvard Business Review), with Bain & Company and Satmetrix.
The score ranges from –100 (100% detractors) to +100 (100% promoters) — theoretically possible, but extremely rare.
The NPS is often nicknamed “The Ultimate Question” (title of Reichheld’s book) because it boils down... to one question.
To go further
Practical guide on our dashboard: how to read each graph and apply filters.
Context article: best sending practices (when, to whom, through which channel).
Benchmarks: handle with care — focus first on your trend and your segments.
In a nutshell 🧭
The NPS is a trigger for useful conversations. Measure it regularly, listen to the verbatims, act quickly, then watch whether the curve moves back in the right direction.
At Edusign, it’s a simple compass to stay close to you — and continuously improve. 💙