Summary

Human insight tests and surveys together can answer questions that neither could alone, and also make your findings more persuasive. So it’s surprising that companies often fail to combine the two methods. We’ll show you how to mix them to make your work more productive and compelling.

Quantitative and qualitative testing: Two different mindsets

If you go to a market research conference, you may notice a subtle division among the attendees: the “qualies” versus the “quanties.” You can stereotype the quanties, who specialize in statistically rigorous quantitative research, as precise and intense people, perhaps a little nerdy, never far from the calculator app on their phones. For the qualies, who focus on non-statistical qualitative research, picture the host and guests of a television talk show, drinks in hand, trading stories. The two groups eye each other uneasily during the breaks, and when you attend a session you always want to check whether it’s qual- or quant-focused so you can be sure you’re attending the right church.

Those are just stereotypes, of course. Each individual is unique, and many researchers recognize that there’s value in both methods. But we think it’s fair to say that qual and quant represent two different schools of thought that often mix awkwardly, if at all.

Combining the two methods can feel as difficult as mixing oil and water. But in your kitchen that’s what makes a good salad dressing, and in research, mixing quant and qual methods can make for great business decisions. If you do it right, you get the precision of quant research with the emotional understanding of qual, giving you a much deeper understanding of markets and customers. How do you bring them together? We’ve talked with companies that do it, and tried it ourselves, and we’ve found two recipes that work reliably.

How to mix oil and water

The most common use of mixed methods we see is the combination of a quantitative survey with human insight tests. There are two specific ways they’re most often used together:

  • A human insight test is used to verify a quant survey design before it’s launched.
  • A survey and human insight test are used in sequence to get deeper insights on the market.

Verify a survey before it’s launched

A good quantitative survey is expensive. Depending on the size of the sample and the type of people you need, you can easily spend many tens of thousands of dollars just purchasing responses. Add in the time needed to analyze and communicate the results, and the cost of a do-over is immense. You need to be sure that the survey is right the first time.

It’s surprisingly difficult to design an effective survey on your own. Companies often develop their own jargon that differs from the language of the outside world, and that jargon can creep into survey questions, creating confusion. It’s also disturbingly easy for survey participants to misunderstand a question, or to be offended by it; either can skew the results.

A very easy way to avoid these problems is to run a usability test on the survey itself. The steps are simple:

  • Create the online survey using your survey tool.
    • For now, disable any screeners in the survey. You want every participant to take the survey so you can get feedback on it.
  • Generate a URL that people can use to take the survey.
  • Create a human insight test (set it up as a self-guided web test).
  • Write an introduction to the test explaining what you’re doing. Here’s some sample text:
    • We’re going to take you to a survey. Please fill it out, following the prompts on screen. When you see each question, read it out loud. Is anything about the question or the answers confusing or offensive? As you fill out the survey, continue to think out loud. Explain why you’re giving your answers.
  • Put the survey URL into the human insight test, the same as you would a web page that you wanted to test.
  • Insert a couple of follow-up questions at the end of the human insight test, after participants finish the survey, asking again whether anything was confusing or uncomfortable.
  • When you launch the human insight test, it’s best to target it at the same demographics you’ll be using in the actual survey.

Your human insight system will send you back a video of someone taking your test as they explain what they are thinking. A single participant is usually enough to identify problems, but you can use two or three if you want to be especially careful.

Use a survey and human insight test in sequence

This approach works well when you’re trying to get a deep understanding of a market change or a rapidly evolving issue. When things are changing quickly, you usually need to understand two things:

  • How and why people’s attitudes are changing, and
  • How many people hold those attitudes

A survey can tell you how many people feel a particular way, while human insight tests can tell you why they feel that way and what they’re thinking. Only both methods together can give you a full picture of the trend.

At UserTesting, we used this approach during the Covid-19 pandemic to understand how consumer attitudes were changing. We started with a quant survey, and then used human insight tests to probe for the ideas and attitudes that drove the survey results. By doing the survey first, we were able to focus the human insight tests on the survey results that were most surprising.

Here’s what we did:

  • Talking with our own customers, we identified the Covid-related issues that they were worrying about the most. Effects on purchasing behavior and feelings about working at home were two hot topics.
  • We created a quantitative survey asking people about their purchasing behaviors (things like “how willing are you to fly on a plane?” and “how likely are you to shop for groceries in a store?”), and about their working behaviors (“if you had a choice, how many days a week would you work in the office versus working from home?”).
  • We pre-tested the survey with a couple of people in a usability test, as described above.
  • When we were satisfied that the survey was working well, we launched it with a thousand participants.
  • Once the survey results came in, we looked for the most surprising or extreme results. For example, at the peak of Covid most people were very nervous about air travel, and most wanted to work from home part time.
  • We built those questions into a new, shorter survey. We could have had people take the entire original survey in a human insight test, but it had about 30 questions, which is far too long when you’re asking people to spend a couple of minutes talking about each question. So we picked about 10 questions from the survey and probed reactions to them. (One way to shortlist questions like this is sketched just after this list.)
  • In the human insight tests, we took people to the shortened survey and had them respond to the questions and explain their answers: why they gave those responses, how they felt about the situation, and so on. We tweaked the survey questions to remind participants to think out loud. (Otherwise, people tend to fall into survey-taking mode and read everything silently.) At the end of this article, we’ll give you a sample of the test plan we used.
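
If your survey tool can export raw responses, the “look for the most surprising or extreme results” step can also be done programmatically. Here’s a minimal sketch in Python (using pandas); the file name, column layout, and scoring rule are hypothetical illustrations, not a description of how we actually did it.

```python
# Minimal sketch: rank survey questions by how lopsided their answers are.
# Assumes a hypothetical export, survey_responses.csv, with one column per
# question and numeric Likert-scale answers (e.g., 1 = not at all, 5 = very).
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

extremity = {}
for question in responses.columns:
    answers = responses[question].dropna()
    # Share of respondents who chose the lowest or highest observed answer;
    # a large share suggests opinion is concentrated at the extremes.
    extremity[question] = ((answers == answers.min()) | (answers == answers.max())).mean()

# The ten most lopsided questions become candidates for the follow-up human insight test.
shortlist = sorted(extremity, key=extremity.get, reverse=True)[:10]
print(shortlist)
```

However you do it, the goal is simply a defensible shortlist; for a 30-question survey, eyeballing the charts works just as well.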

The results were incredibly insightful. To give you two examples:

  • At the peak of the pandemic, the scariest part of air travel for many people wasn’t being on the plane, it was being in the terminal before and after the flight. People did not trust that the terminal would be clean, or that passengers in the terminal would be properly screened for disease. This had huge implications for airlines that wanted to entice people back into flying.
  • People who wanted to work from home part time had very detailed, specific ideas about how work should be organized. They wanted a couple of days in the office every week for networking and cooperative projects, and a couple of days at home every week for focused personal productivity. Contrary to the stereotyped debate that pits full-time office work against full-time home work, most people wanted an optimal mix of the two. This dynamic is still playing itself out in the post-Covid economy.

When we presented our findings, mixing charts from the survey with video of people explaining their responses was extremely persuasive. We were able to show motivations and emotions along with exactly how many people felt a particular way. You can see some of the study results here: 

A note on open-ended responses in surveys: To get insights on motivations and attitudes within surveys, it’s common practice to use open-ended text questions. Those are better than nothing, but when we’ve compared them to the responses in human insight tests, the contrast is striking. People taking an online survey are usually trying to move fast, so you’ll typically get only a few words in the survey’s text field. This is especially true for people taking a survey on a smartphone. There’s also often a difference in the tone of replies. People often seem to use open-ended survey answers as a way to vent, so we see a lot of angry or even abusive answers. In a human insight test, where participants know they’ll be evaluated on the completeness of their answers, they often go out of their way to explain themselves. You’ll still see emotional responses, but they don’t usually have the casual, dismissive tone we see in a lot of surveys. You can decide for yourself whether that’s a bias or a source of strength, but it’s definitely different.

To learn more…

Boutique research firm Harvest Insights wrote a good article on the quant-versus-qual culture clash: https://www.harvest-insights.com/blog/market-research/blurred-lines-the-qualquant-edition

Dovetail described several mixed methods techniques: https://dovetailapp.com/blog/mixed-methods-research/

Sample test plan

Here’s a test plan we used for our Covid-related tests. (Note that because we planned to show the results to our customers outside of UserTesting, we needed to ask permission to show participants’ faces.)

Screener

In this test, we’ll be asking you to turn on your phone’s camera to show your face. Are you willing to do this?

Yes [Accept]

No [Reject]

Video clips from this test may be used by UserTesting in an online report. Your name will not be used. Do you consent to this use of the video?

Yes [Accept]

No [Reject]

Introduction

This is not a usability test. We’re doing research on the Covid-19 pandemic and want to learn more about your reactions to it. We’re going to show you a survey about the pandemic. There are about 10 questions, and you should spend 1-2 minutes answering each one.

Tasks

Click on the link below to go to the survey. Please follow the instructions there, and remember to THINK OUT LOUD.

<insert survey link here>


The opinions expressed in this publication are those of the authors. They do not necessarily reflect the opinions or views of UserTesting or its affiliates.