Depending on the context and what we’re trying to learn, a handful of well-crafted in-depth interviews with the right people can be enough to give a product team some direction to move forward, reinforce (or call into question) previous decisions, or uncover usability or language issues.
Sometimes we make recommendations based on interviews with just five or six people. Meanwhile, the products we’re studying might have millions of users. Our companies have access to detailed usage data that can show exactly what actions people are taking in these products. In an environment where high-quality quantitative data is plentiful, qualitative research is sometimes met with skepticism.
Others have dedicated plenty of ink and pixels explaining why, when, and how to conduct qualitative research. I’m not going to rehash those topics here. Instead, I want to focus on writing. How can you write up your qualitative findings responsibly, earning respect from your more quant-focused colleagues?
The skeptics aren’t wrong
Folks who are skeptical of qualitative research worry that qualitative data will be used incorrectly. They worry that people will make product decisions based on a small number of interviews, when quantitative data should really inform those decisions. And the skeptics are correct: there are many questions that are inappropriate for qualitative methods.
For example, you shouldn’t use qualitative methods to predict what percentage of your user base is likely to use a new feature. You need representative numbers to confidently answer that kind of question. But there are many research questions where a few solid interviews can bring great clarity, or at least point the way towards the next round of inquiry.
When writing up your qualitative findings, you can help ease skeptics’ fears and instill more confidence in your findings by:
- Providing additional context for your work
- Using language carefully and precisely
- Being transparent about what your work does and doesn’t mean
Provide the necessary context
Be clear about your goals and hypotheses
What exactly were you trying to learn with this research? What research questions were you looking to answer? State these explicitly at the beginning of your report. This sets clear expectations about what readers will find, and it will help them better understand qualitative methods generally. If you had hypotheses going into your study, state those explicitly as well.
Contextualize your project within the product development cycle
You don’t need to restate your team’s entire product strategy in your report. But make sure that you link out to the right documents so any reader can understand the bigger picture of this work if they choose to.
What other related research have you already done? Be sure to link to it and reference it directly within your report when applicable. Maybe you’ve already run a survey or A/B test related to this project, or maybe you have plans to. If so, include links to those other studies, demonstrating how this qual work adds to the quant work that’s already been done, or will be done.
Be explicit about next steps in your report. Reassure your readers that you have a solid plan that makes the right use of the right methodologies. Those next steps often include additional research activities. Calling these out will help allay any fears that your qual data will be used in isolation to make the wrong types of decisions.
Triangulate with other data sources
Your qualitative findings will carry more weight if you can relate them to other sources of information. The classic example is using qualitative work to explain the “why” behind the “what” that your team sees in behavioral data. But you can also use other sources of information to reinforce the findings and recommendations from your qualitative study. Work with your data scientists or product analysts to connect what you heard in your research with what they see in behavioral data. Connect your findings to other academic studies. And, of course, correlate your study with other relevant internal research.
Provide the details
Make sure you include or link to the details that your readers need to understand your work, and follow up if they have questions. This one may seem obvious, but people often forget one or more of these. Remember to include:
- Participant details: How many people did you talk to? What were your screening criteria?
- Links to other documents: discussion guide, research plan, screener, other related research, etc.
- Testing artifacts: screenshots of what you showed participants, and/or a link to your prototype. If you’re linking out to your prototype, be sure to preserve a version of it that matches what your participants saw.
- Date: Seriously, include the date you did the research. This gets overlooked surprisingly often.
- Contact info: Include your name and contact info as the researcher, but also include info for your designer, product manager, and anyone else who worked on the research. If you go off on that much-needed sabbatical and can’t be reached, this gives your colleagues a better chance of getting their questions answered.
Support busy readers
Not all of your readers will have the time or patience to read every detail of your study. While it’s important to include the details, you also want to enable folks who have very little time to benefit from your work. Include a “Key takeaways” section at the very top of your doc that summarizes the most important bullet points from your findings.
Use precise language
Write for a future hire
It’s easy to focus on your immediate product team when writing up your findings. They’re usually the most eager to read it, after all. Your immediate team already has a lot of context about the product and about your research, but they aren’t your only readers. Hopefully, your work will also be read and leveraged by members of other teams, members of leadership, or future members of your immediate team.
Once you’ve finished your writeup, give it another read, specifically with a new team member in mind. What additional context would they need to understand your findings? Again, you don’t have to spell out every detail of your product strategy, but be sure to at least link out to the most important docs.
Expect your writing to be taken out of context
At Dropbox, we write almost everything in Dropbox Paper. Paper makes it very easy to facilitate discussion within docs and to discover content within the company that you weren’t aware of. Paper fosters a culture of collaboration and transparency. As a result, you can assume that someone is likely to copy something from your research and paste it into their own document to support their own argument. Whether your company has a similar culture or not, writing with an eye towards excerpting will help you maintain discipline in your writing.
“Participants” not “users” or “customers”
When describing the people you spoke with, always refer to them as “participants” not “users” or “customers.” If you call them “users,” then readers will be tempted to apply your findings to your general user base. And if someone excerpts your work out of context, then your conversations with six users might be used to represent your entire user base without you even knowing it.
Findings, recommendations, considerations, and how might we’s
Be clear about what your writing means. Sometimes we can talk to a handful of users and be pretty confident about what we learned and what the product team should do about it. Other times, the qual research might be sufficient to identify a problem but doesn’t give us enough information to know exactly how to fix it. I like to use different labels to communicate the level of conviction I have around different elements of a report.
- Findings: I use “findings” to indicate observations. These are the things I saw people do and/or heard people say. A finding is simply a report of what happened, without any explicit call to action. Example: “Most participants missed the escape hatch link to set up SmartSync preferences later.”
- Recommendations: I only use “recommendation” when I’m pretty confident that I know what we need to do to solve a problem. I’m confident that I have enough information from this research and other data sources to recommend a specific action. Example: “Recommendation: Increase the prominence of the escape hatch link.”
- Considerations: Sometimes the problem is clear, but I want to highlight multiple ways that we might fix the problem, or open up a larger discussion about how we might solve it. Example: “Consideration: Consider whether we should include an escape hatch in this instance. Will users ever remember to set up SmartSync preferences later? Would users be better served by a more opinionated experience that forces them to set up preferences before moving forward?”
- How might we’s: Sometimes the findings point toward a bigger issue that might not be solvable in the next iteration of a feature. Maybe your product team doesn’t have a clear mandate to solve the issue because it will require action from other teams. Or maybe the findings point to a fundamental issue that requires more thought from a wider audience to address. In these cases I like to use “How might we” statements. Example: “How might we increase user confidence in SmartSync?” or “How might we develop a sense of urgency around SmartSync settings, so that users are more likely to address them at this point in the experience?”
Don’t presume to know what users will do in the real world
Focus instead on what you heard and observed. Imagine that you’ve run a study in which most or all participants ignored a certain area of the screen. You might feel pretty confident writing, “This area wasn’t relevant to our participants” or “Users will ignore this area.” But unless you questioned participants to understand why they ignored the area, you should simply report what they did: “None of our participants engaged with this area.” Stick to the facts.
Sticking to the facts keeps your colleagues focused on user needs and leaves room for their unique expertise. Imagine a scenario where you’ve seen participants overwhelmed with too much information on a screen, or perhaps they struggled to differentiate between multiple types of information. It would be tempting to simply recommend that the team use a tabbed approach instead of a single long page.
But instead, try something like: “Participants were overwhelmed by the different types of information on this screen. Consider ways to help users differentiate and focus on the data that is most important to them at any specific time.” Stating the consideration this way keeps your team focused on the user need, and also respects your product design partner’s expertise. Leave it to them to figure out the best way to meet that user need. Tabs may seem like the obvious answer, but your designer might have other ideas that would work better.
Express quantities carefully
Even in qualitative work we sometimes need to express what happened in terms of quantities. When you do, do so carefully. Here are a couple of ways to describe quantities responsibly, and one way you should avoid.
- Use imprecise language to match imprecise methods. When you’ve only talked to six people, the exact number who said or did a particular thing is simply not that important. Whether two of them or four of them said a specific thing, we don’t know with confidence that it holds for any particular portion of our user base. Our methods are imprecise, so we should use imprecise language to match. Words like “some,” “most,” or “about half” are imprecise, but they accurately reflect what we know. Example: “Some participants were unfamiliar with SmartSync, but most of them understood it to be valuable.”
- Place specific numbers in relation to the sample size. Another approach is simply to be completely transparent about your sample size every time you mention a quantity. Example: “2 of 6 participants hadn’t heard of SmartSync, but 4 of 6 understood it to be valuable.”
- Don’t use percentages. Never use percentages to express anything about qualitative findings. Percentages encourage readers to apply your findings to a wider audience, and that is rarely appropriate. Particularly if your work is quoted out of context, it can lead to false confidence in what a finding means and who it represents. Example: “33% of participants hadn’t heard of SmartSync, but 67% believed it to be valuable.” ← Don’t do this!
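To make these rules concrete, here is a minimal sketch, in Python, of how you might mechanically translate a raw count into the hedged “n of N” phrasing above instead of a percentage. The function name and the thresholds for “some,” “about half,” and “most” are illustrative assumptions of my own, not a standard:

```python
def hedged_quantity(count: int, sample_size: int) -> str:
    """Render a qualitative count as hedged language plus a transparent "n of N".

    Never emits a percentage, which would invite readers to generalize
    from a tiny sample. Thresholds below are illustrative assumptions.
    """
    if count == 0:
        phrase = "none of the participants"
    elif count == sample_size:
        phrase = "all participants"
    elif 0.4 <= count / sample_size <= 0.6:
        phrase = "about half of participants"
    elif count / sample_size > 0.6:
        phrase = "most participants"
    else:
        phrase = "some participants"
    return f"{phrase} ({count} of {sample_size})"


print(hedged_quantity(2, 6))  # → some participants (2 of 6)
print(hedged_quantity(4, 6))  # → most participants (4 of 6)
```

However you phrase it, the point is the same: the hedged word and the explicit sample size travel together, so the finding stays honest even if it is excerpted.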
Use participant quotes often, and feel free to edit them (for clarity only)
You may be thinking, “Edit quotes?! What kind of deceptive sorcery are you selling here?!” But relax. I’m not suggesting that you should misrepresent what your users tell you. But a few choice edits for clarity can amplify your participants’ message.
Hearing a user talk about your product in their own words is research gold. Verbatim quotes can be very impactful. And your readers will often copy them from your report to use in their own presentations.
But participants are usually giving their feedback verbally. They’re thinking on their feet and aren’t speaking with the expectation of being quoted verbatim. They speak imprecisely and use filler words like “um” and “like” and “I guess that.” Sometimes they change their minds in the middle of a sentence. If you write down exactly what someone said, they may sound unreliable or inarticulate. Your readers may be tempted to dismiss their feedback because it sounds unsophisticated.
A simple edit or two can increase the impact of your participants’ words. You have a responsibility to represent participants’ sentiments as accurately as you can. No edit you make should change the meaning of their statement. But sometimes you can more accurately express their sentiments by making some thoughtful, respectful edits to their words.
- Before: “Because it is unusual for this person to be editing files, and is a higher level member of the team. If they are editing, its probably for good reason, and good for us to know he made the change because it could impact all of our workflow.”
- After: “It’s unusual for this person to be editing files, and he is a higher level member of the team. If he is editing, it’s probably for good reason, and good for us to know he made the change because it could impact our workflow.”
It also helps to give participant quotes a distinct visual treatment. Italics, pull quotes, or some other visual distinction calls attention to your participants’ words and makes clear what they said vs. what you said.
Be transparent about the limitations of your work
Don’t overstate your findings or recommendations. If you’ve only talked to a handful of people, don’t pretend that what you’ve heard represents all of your users. Qualitative research is a valuable tool to aid decision making, but it won’t provide all the answers. Overconfidence is exactly what the skeptics worry about; acknowledging the limitations of your work, with explicit humility, will increase your audience’s confidence in the assertions you do make.
Include recommendations for additional research and other ways to learn
Recommending other methods to complement your qualitative work can also reassure your skeptics. Sometimes the most fruitful thing that comes out of qualitative work is pointing the way toward the next step in data gathering. Qualitative work can help jump-start your other data-gathering activities, giving your team confidence that the next round of inquiry will focus on the right problem areas.
Consider using Dropbox Paper to publish your findings
At Dropbox, we tend to use Paper for everything, including final reports. It’s very easy for your readers to comment and ask clarifying questions in Paper. When you publish your findings in Paper, your report itself facilitates ongoing conversation about your research. You’re able to maintain a dialog with your readers, and that dialog is visible to all of your readers. If a skeptical reader challenges your methodology or conclusions, you’re able to justify your work in this open forum. That discussion is healthy for you and your stakeholders, and it can be instructional for all of your readers.
Instill confidence in your readers with careful writing
Qualitative research, when used correctly, can be a very efficient and effective way to help your teams understand the experience of users. Sometimes our more quant-minded colleagues focus on the limitations of qualitative work, and they may need a little help understanding and trusting the significant benefits of this kind of work. Try using some of the tips in this article to win over your more skeptical colleagues, increase their confidence in your work, and more effectively advocate for user value in your products.
We hope that these insights will help you win over skeptics of qualitative research, and amplify the voice of your users in the product development process. If you have any questions or additional wisdom to share, reach out to us on Twitter @dropboxdesign.