Are your user research questions secretly destroying your product decisions?


I've seen many product people (PMs, founders, designers — including me when I was starting out) come back from user tests or interviews with what felt like strong user feedback... but was in fact biased, misleading feedback.

Not because they didn't listen, but because the questions they asked guaranteed they'd get those misleading answers.

There's one particular type of question that does this. And it's so easy, so tempting, to ask without even realizing it.

It sounds something like this:

  • "Would you use this feature?"
  • "Would you pay for something like this?"
  • "If we built X, would that solve your problem?"

These feel like good research questions. You're not pitching your product or trying to sell them on it; you're asking questions about them, and you're giving them something concrete to react to.

But the problem is, you're asking people to predict their own future behavior. And we humans are spectacularly bad at that.

This is called projection bias in research, and it's one of the most common ways to collect confident, specific, but incredibly wrong data. As a consequence, you get overconfident, build and ship the product or feature... and then watch almost nobody use it.

"The problem with market research is that people don't think about how they feel, they don't say what they think, and they don't do what they say."

— David Ogilvy, the legendary founder of creative agency Ogilvy

When you ask someone "would you use this?", you're not getting a data point. You're getting a social interaction.

People want to be helpful. They see you've worked hard on something, so they don't want to crush you. They want to be nice, to say what they believe you want to hear. So they say yes (or at least a softened version of yes) even when the honest answer is "I have no idea" or "probably not."

It's not malicious. It's just human. We're simply not reliable narrators of our own future behavior.

"So what should I ask instead?", you may be thinking.

The fix isn't hard. You just have to stop asking about the future and start asking about the past.

Instead of "Would you use a tool that did X?", ask "Tell me about the last time you had to deal with X. What did you do?"

Instead of "Would you pay for this?", ask "What are you currently paying for to solve this? How do you find it?"

Instead of "Would this feature be useful to you?", ask "Walk me through what you actually did the last time this situation came up."

You're not asking them to imagine. You're asking them to remember. Memory is imperfect, but it's far more reliable than speculation.

The goal isn't to hear "yes, I would use this". It is to understand the behavior that already exists, so you can design something that fits into it.

A small but important mindset shift

We often talk about validating our ideas with users. But I prefer to use the word "test".

Because when you go into a user call trying to validate something, you're already primed to hear confirmation. You'll unconsciously emphasize the moments where they agree and discount the moments where they hesitate. That's confirmation bias in action.

But when you go in to test, to genuinely observe and understand, you stay open. You're curious about what actually happens, not what you hoped would happen.

That's not just semantics. It's the difference between research that puts you on the right track and research that sets you up for failure.

Here's what projection bias actually does to a product:

You run 5 user calls. 4 people say "yes, I'd definitely use that". You interpret this as strong signal. You build the feature. You ship it. Almost nobody uses it.

You're confused. The research said people wanted this. 🤷‍♀️

But the research didn't say that. It said 4 people were polite when asked a hypothetical question. That's not the same thing.

Better questions don't take more time. They just take a small shift in what you're actually trying to find out.

3 questions worth bookmarking

If you're doing some user calls this week, try these:

  1. "Walk me through the last time you had to deal with [problem]. What did you actually do?" (reveals real behavior, not imagined behavior.)
  2. "What did you try before landing on your current solution?" (proves they're actively looking to solve their problem/achieve their goal, and shows the competitive landscape from their perspective.)
  3. "What almost stopped you from [making a recent relevant decision]?" — surfaces the objections that users won't mention unless you ask.

None of these ask what someone would do. They all ask what someone did, thought, or felt: things they can actually answer accurately.

That's the shift. Small, but it changes everything downstream.

Hope this was helpful.

François Simitchiev

Senior Product Designer • Activation/Onboarding Specialist
Helping B2B SaaS founders activate, convert and retain more users

Let's talk → LinkedIn | fsimitchiev.com
