Summary
@adboio and @fivestarspicy did a lunch & learn session at Hogpatch on surprising data about what makes people respond to in-app user surveys.
I propose we write this as a guest article on someone else's newsletter as a way to grow our reach. Potential places:
- Lenny's Newsletter: More of a PM, product, and growth audience
- Growth Unhinged: Growth newsletter
- Department of Product: Another product newsletter
- First Round Review: They focus a lot on PMF but could be a harder sell; their content seems a bit more long-form and higher-investment
This one's promising for external channels because:
- It's full of data that's unique to us and you can't find anywhere else
- It's a bit more specific/drilled-down than our typical Product for Engineers newsletter topics. (Ours are usually one level more general, like "An engineer's guide to talking to users")
- But it's still in the direction of what we're known for (how to understand your users)
Headline options
- You're doing surveys wrong. We did the math.
- We have the data. You're doing surveys wrong.
- 5 myths about user surveys (that our data proves wrong)
- Everything you know about user surveys is wrong
- Forget everything you know about surveys
- We analyzed 1,855 surveys. Here's what actually works.
- Why no one answers your surveys (and what to do about it)
Outline
Everyone knows what an in-app user survey is. The challenge is getting people to actually fill them out.
- A lot of advice out there is super generic and unactionable. Write like a human, avoid jargon, be concise, etc.
- We analyzed 1,855 surveys from the last 6 months (from TBD*), and it turns out a lot of conventional wisdom is wrong. Here's everything we learned, plus the data to back it up
Myth 1: Survey placement matters
- Reality: Timing matters way more than location
- Event-triggered surveys (shown after a user does something) get a 14% response rate vs. 7% for surveys that just appear on a page. The magic is showing the survey right after the user does something, not just "somewhere on the site" (see the sketch below)
- The highest performers hijack moments where users are already waiting or engaged, like "Before we start, have you crocheted before?" → 96.6%
- URL targeting alone barely moves the needle (8.6% vs 8.4%)*
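If the article keeps this myth, a tiny sketch could make the contrast concrete for an engineering audience. This is a minimal illustration only, assuming a hypothetical showSurvey() helper and a hypothetical "#start-lesson" button (not our actual surveys API): the survey fires on the user's action, not on page load.
```typescript
// Minimal sketch, assuming a hypothetical showSurvey() helper and a
// hypothetical "#start-lesson" button — not our real surveys API.

interface Survey {
  name: string;
  questions: { question: string; choices?: string[] }[];
}

// Render the survey however the app already does it (modal, inline widget, etc.)
function showSurvey(survey: Survey): void {
  console.log(`Showing survey: ${survey.name}`, survey.questions);
}

const preLessonSurvey: Survey = {
  name: 'pre-lesson check-in',
  questions: [
    { question: 'Before we start, have you crocheted before?', choices: ['Yes', 'No'] },
  ],
};

// Page placement (the ~7% bucket): show the survey whenever a matching page loads.
// window.addEventListener('load', () => showSurvey(preLessonSurvey));

// Event-triggered (the ~14% bucket): show it right after the user's action,
// in a moment where they're already engaged and briefly waiting.
document.querySelector('#start-lesson')?.addEventListener('click', () => {
  showSurvey(preLessonSurvey);
});
```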
Myth 2: The PMF question "How would you feel if you could no longer use this product?" is king.
- Reality: It gets an 11.5% response rate, near the bottom of all survey types
- Sean Ellis coined the famous PMF survey question with answers ranging from "Very disappointed" to "Not disappointed"
- The conventional wisdom is if 40%+ say "very disappointed," you've hit product-market fit
- Low response rate = it's not actually that great* (TODO more research for this argument)
Myth 3: No one responds to open-ended questions
- Reality: Open-ended questions can hit a 75%+ response rate.
- What doesn't work = "How can we improve?", "Any feedback?", "What do you think?"
- What does work = offering value ("Want a certificate for the tests you passed?"), being playful, or, as we said earlier, hijacking dead time and triggering on a user's action
- TL;DR - Make it about them!!!!!
Myth 4: Don't risk upsetting a user who's already leaving
- Reality: Exit surveys get a 30.9% response rate, the highest of any intent type*
- People get scared to ask users who are already leaving. If they're unhappy, a survey can be an annoying popup that tips them over the edge
- The data says that people want to tell you why they're leaving (vs. PMF surveys at 11.5% or generic satisfaction surveys at 13.2%)
- Plus, you're catching them at the moment they've made a decision so the feedback is fresh, specific, and actionable.
- If you're only going to run one survey, make it an exit survey - it helps you improve the most
- *n=8 for this in our data, so I'm not sure this is a strong case yet
Myth 5: Question order doesn't really matter*
- Reality: Surveys that start with a single-choice question get a 15.8% average response rate vs 3% for those that start with an open-ended one
- Something here about starting with low-effort tasks that get them hooked? (as opposed to opening with a really mentally taxing open-ended question)
- Starting with fixed-response formats is best; if you really need more open-ended stuff, save it for later in the survey, or use the survey for in-depth research recruiting instead, which works at 17.4% (see the sketch below)
- *This myth is sort of a straw man, need a better framing
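If it helps the framing, we could also show what "fixed-response first, open-ended later" looks like in practice. This is a minimal sketch with a made-up question schema (the `kind` field, question copy, and `exitSurvey` name are illustrative, not a real config):
```typescript
// Hypothetical question schema — field names and copy are illustrative only.
type Question =
  | { kind: 'single_choice'; question: string; choices: string[] }
  | { kind: 'open_ended'; question: string };

// Low-effort single-choice question first (~15.8% average response when it leads),
// with the open-ended follow-up saved for later (~3% when an open-ended leads).
const exitSurvey: Question[] = [
  {
    kind: 'single_choice',
    question: 'Why are you cancelling?',
    choices: ['Too expensive', 'Missing a feature', 'Switching tools', 'Other'],
  },
  { kind: 'open_ended', question: 'Anything specific we could have done better?' },
];

console.log(exitSurvey.map((q) => q.kind)); // ['single_choice', 'open_ended']
```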
What (if any) keywords are we targeting?
user surveys
in-app surveys
in-app user surveys
Existing inspo
Ahrefs - Our Top 5 Blog Posts of 2025 (And What Made Them Work)