At some point, all teams who buy into the value of customer research go overboard.
They start realizing that they could ask people anything. And then they want to do just that.
It’s a good problem to have. I’d much rather have an over-eager product manager who keeps sending me potential survey questions or trying to add questions to an interview script than one who neither values nor trusts research.
But it’s still a problem. There’s a limit to the amount of time we can interact with a customer and there’s a limit to the number of interviews/usability tests/etc. that a researcher can do in a day. Some questions aren’t great questions, and they need to be ruthlessly culled.
Here’s what I recommend:
- Repeat the question back to the asker
- Ask, “What would we do if we had an answer to that question?”
  - Might we get an answer that would change a decision we’ve already made/are about to make?
  - Would we shuffle priorities of what to work on next, based on the content of the answer?
  - Do we think we’re likely to be surprised by the potential answers we get?
- If the answer to these sub-bullet questions is “no”, then it’s rarely worth asking the question.
This isn’t a free pass to skip research.
But challenging questions before we start asking them does two things: it helps us admit that some questions/answers don’t impact our decisions, and it forces us to re-work questions to more directly get at what we really want to know.
For example, most decisions at Yammer are made based on usage data — if a new or changed feature drives people to use Yammer more, we keep it. So questions like “How do customers feel about ____?” or “Do customers understand how _____ works?” are irrelevant. Knowing the answer does nothing more than assuage our curiosity. If a customer tells us they have no idea how Feature X works, but our data reveals that they are regularly using Feature X, we go with the data.
Other questions are a proxy for a different thing we want to know. A colleague from another team once suggested that we put an annual income question on a survey. It seemed out of place (and could potentially make website visitors feel uncomfortable), so I dug in to ask why she wanted that particular question.
“If we know annual income, and it’s on the higher side, then that person is more likely to be a decision-maker at their company,” was the explanation. I said, “Why don’t we just ask that directly — ‘Do you make software buying decisions for your company?’ with yes/no radio buttons?”
So the next time you’re prepping to do research, or you’re in an endlessly long discussion meeting where people are coming up with all the questions they could potentially ask, be ready with: “What would we do if we had an answer to that question?”