TL;DR 🚀
NPS measures how likely your users are to recommend your product.
Ask one simple question: “On a scale from 0 to 10, how likely are you to recommend Edusign to a friend or colleague?”
Promoters: 9–10
Passives: 7–8
Detractors: 0–6
Formula: NPS = %Promoters − %Detractors → score between –100 and +100.
Interpret NPS mostly over time and by segment, not as an isolated number.
(Here’s an example of an NPS question, asked directly from the Edusign app!)
Why we use it (and why you should too)
At Edusign, we love numbers that bring us closer to customers. NPS helps you:
Take the pulse: a quick read on loyalty and word-of-mouth. 🩺
Prioritize: know what to tackle first (product, experience, support). 🎯
Measure impact: check whether an initiative (new feature, onboarding, message) truly shifts perception. 📈
Edusign tip 💡: always pair NPS with the open question “Why?” to turn a score into concrete actions.
How to calculate it (with a mini example) 🧮
Sort each response into one of the 3 buckets (0–6 / 7–8 / 9–10).
Compute the percentages of Promoters and Detractors (Passives count only in the total).
Apply the formula.
Speedy example
30 responses → 12 Promoters, 3 Detractors, 15 Passives.
%Promoters = 12/30 = 40%
%Detractors = 3/30 = 10%
NPS = 40 − 10 = 30
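The three steps above can be sketched as a small function. This is a minimal illustration, not Edusign's implementation; the function name and sample data are made up for the example.

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 ratings."""
    if not scores:
        raise ValueError("no responses")
    # Step 1: sort responses into buckets.
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6
    # Passives (7-8) count only in the total, per the formula.
    # Steps 2-3: percentages, then %Promoters - %Detractors.
    return round(100 * (promoters - detractors) / len(scores))

# The mini example: 12 promoters, 3 detractors, 15 passives.
sample = [9] * 12 + [3] * 3 + [7] * 15
print(nps(sample))  # 40 - 10 = 30
```

Note the result is rounded to a whole number, matching the display convention recommended later in this article.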
Reading your NPS (pragmatic interpretation) 🔍
< 0: red flag; more detractors than promoters.
0 to 30: acceptable, clear room for improvement.
30 to 50: good; recommendation becomes a strength.
50 to 70: excellent; you’re delivering a standout experience.
70+: exceptional (rare!).
⚠️ Thresholds vary by industry and context. The most relevant comparisons are to yourself over time (trend) and within homogeneous segments (e.g., school, message, adoption level).
For context, recent research estimates that the average Net Promoter Score in higher education is around 32, which sits in the “favorable” range but leaves room for improvement compared to broader education benchmarks.
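The reading guide above can be expressed as a simple mapping. The band labels and cutoffs are the ones from this article, not industry standards; where a score sits exactly on a boundary (e.g., 30), the assignment below is an arbitrary choice.

```python
def nps_band(score):
    """Map an NPS score to the indicative reading bands above."""
    if not -100 <= score <= 100:
        raise ValueError("NPS must be between -100 and +100")
    if score < 0:
        return "red flag"       # more detractors than promoters
    if score < 30:
        return "acceptable"     # clear room for improvement
    if score < 50:
        return "good"           # recommendation becomes a strength
    if score < 70:
        return "excellent"      # standout experience
    return "exceptional"        # rare!

print(nps_band(32))  # "good" (the higher-education average cited above)
```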
Best practices for useful insights ✅
Always add an open question: “What’s the main reason for your score?”
Timing: send the survey soon after the experience, while feedback is still fresh.
Sample size: avoid hasty conclusions if responses are few.
Cadence: keep it reasonable (e.g., transactional at the event, relational once per quarter).
Channel & tone: email, in-app, SMS… or via the Edusign app! Keep it simple, human, and on-brand.
Rounding: display NPS as a whole number (and the /10 average to 1 decimal if useful).
From measurement to action (the real game) 🎯
Detractors (0–6): reach out quickly (ideally < 48h), say thanks, understand, fix, and keep them updated. 🧯
Passives (7–8): ask “what’s missing to get to 9–10?” → often quick wins.
Promoters (9–10): say thank you, share your roadmap, invite referrals or a public review. 🌟
Edusign tip 💡: we link every NPS dip to an improvement ticket (owner + due date). Follow-through matters more than the score itself — that’s how you keep improving.
Common pitfalls & limits 🧱
Sampling bias: don’t ping only your biggest fans (or your most upset users).
Channel effect: phone, email, in-app → scores can vary by channel.
NPS ≠ average: the /10 average does not replace NPS.
Shaky comparisons: avoid “NPS leagues” without context (size, market, model).
NPS isn’t everything: combine it with CSAT (momentary satisfaction), CES (effort), product usage, churn, etc.
FAQ ❓
How many responses do we need?
No magic number; aim for at least a few dozen responses per segment to reduce noise.
Should we incentivize responses?
Gentle reminders are fine, but rewards can bias results; if you offer them, be transparent and consistent over time.
Can the score have decimals?
You calculate with percentages, but display a whole number (clearer and standard).
What if NPS drops suddenly?
Check volume and filters.
Read the verbatims.
Identify 1–3 likely causes.
Open actions with owner + deadline.
Communicate the correction plan to affected respondents.
Ready-to-use templates 🧩
Main question
“On a scale from 0 to 10, how likely are you to recommend [Product Name] to a friend or colleague?”
Open question
“What is the main reason for your score? (one or two sentences is enough)”
Thank-you message (Promoters)
“Thanks for your 9/10! You’re the reason we love building [Product Name]. Would you like to share a public review or try upcoming features in early access?”
Recovery message (Detractors)
“Thanks for taking the time to respond. We want to understand and fix things. Could you tell us a bit more? A team member will get back to you within 24–48 hours.”
Fun facts (to shine in meetings ✨)
NPS was popularized in 2003 by Fred Reichheld (“The One Number You Need to Grow” in Harvard Business Review), with Bain & Company and Satmetrix.
The score ranges from –100 (100% detractors) to +100 (100% promoters) — theoretically possible, but extremely rare.
NPS is often nicknamed “The Ultimate Question” (title of Reichheld’s book) because it boils down to… one question.
Go further
Practical guide on our dashboard: how to read each chart and apply filters.
Context article: sending best practices (when, to whom, through which channel).
Benchmarks: use with caution — focus on your trend and segments above all.
In a word 🧭
NPS is a starter for useful conversations. Measure it regularly, listen to verbatims, act fast, then watch whether the trend moves in the right direction.
At Edusign, it’s our simple compass to stay close to you — and to keep getting better, again and again. 💙