If you work in customer education, you know that centering the customer’s experience is a critical, and challenging, part of building any successful training. We might learn about our customers from internal field teams, or, if we’re lucky, get to speak directly with customers to understand their barriers to usage and adoption. While we’re familiar and comfortable with iterative and rapid content design approaches, we might not have access to the most efficient tools to get fast customer feedback on our content. So, if the customer’s voice is absent or obscured in what we design, how can we ensure our educational experiences truly engage them?

Conversations with other customer education professionals have taught me that more customer education teams could benefit from a human insights solution like UserTesting. I don’t mean to sound like a salesperson here. In fact, I was mistaken for one several times while discussing the company I work for at the Customer Education Management Association (better known as CEdMA) conference in 2022. In my three years at UserTesting, our on-demand customer feedback platform has become a necessary part of how my team and I build content for our customers. From exploring what our customers need, to gathering insights on ideas-in-progress, to improving programs that miss the mark, UserTesting compels us to approach customer education like a product that serves and supports our audience.

If you’re a customer education professional reading this, you might be wondering how you can integrate a powerful tool like UserTesting into your team’s workflow. After all, it’s not a simple survey you send out to participants; there are multiple capabilities you can adopt for different use cases, such as testing how learners search for content or conducting discovery interviews on learners’ feelings and mindsets toward virtual instructor-led classes. This article pulls from my own experiences on the UserTesting University team, where I’ve practiced the (very meta) exercise of using UserTesting to create training content for UserTesting.

Approaching customer education like a product

Before we jump into the platform, let’s level-set on what customer education teams should know before launching studies. It starts with approaching customer education like a product.

Like any customer education team at a software-as-a-service (SaaS) company, we’re familiar with creating content that educates customers on the new and updated feature offerings that product and marketing teams develop and bring to market. From conception to public release, the product follows what’s called the product development lifecycle. Here’s a simplified version that’s appropriate for content development:

A circular chart of the product development lifecycle shows three phases: learn and empathize, ideate and explore, and execute and improve.

This process is something that customer education teams can follow whenever they’re building a new course, help center article, certification program, or whatever it might be. There are three stages in this lifecycle:

  1. Learn and Empathize Stage
    What are your learners’ frustrations with the subject matter, and how can your educational content solve their pain points? At this stage, the objective is to learn about your customers, or rather, your learners. Gather as much information as you can about your learners through discovery interviews, then organize it in a way that helps you determine the right path forward. When designing for learning, I recommend sketching learner personas, conducting a user needs analysis, or creating an empathy map.
  2. Ideate and Explore Stage
    After learning about your learners, it’s time to generate initial ideas and gather early feedback on those concepts. Depending on the solution, you might benefit from creating a learning map to identify skill gaps and plan a seamless experience. You could also start crafting the learning objectives to pinpoint what you want learners to get out of your content.
  3. Execute and Improve Stage
    Once you have a design or concept fleshed out, and have feedback to validate your direction, the next stage is to build and iterate. This is where rapid prototyping comes into play as you create and gather feedback on different versions of your learning experience. You might also have certain success metrics you want to measure, like course completion, CSAT scores, or Time-to-Value (a quick sketch of computing metrics like these follows this list).
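
To make those success metrics concrete, here’s a minimal sketch in Python with made-up enrollment data, showing how a team might compute a course completion rate and a CSAT score. The five-point scale, and counting 4s and 5s as “satisfied,” are common conventions, not anything prescribed here.

    # Hypothetical end-of-course data: completion flags plus 1-5
    # satisfaction ratings from an exit survey (None = no response).
    enrollments = [
        {"learner": "a", "completed": True, "csat": 5},
        {"learner": "b", "completed": True, "csat": 4},
        {"learner": "c", "completed": False, "csat": None},
        {"learner": "d", "completed": True, "csat": 2},
    ]

    completion_rate = sum(e["completed"] for e in enrollments) / len(enrollments)

    # CSAT is commonly reported as the share of respondents rating 4 or 5.
    ratings = [e["csat"] for e in enrollments if e["csat"] is not None]
    csat = sum(1 for r in ratings if r >= 4) / len(ratings)

    print(f"Completion rate: {completion_rate:.0%}")  # 75%
    print(f"CSAT: {csat:.0%}")  # 67%

Tracking even two or three numbers like these per course gives you a baseline to compare against after each iteration.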

With these three stages in mind, customer education teams can ensure they are keeping UX (user experience) principles at the forefront of the digital learning experiences they design and improve. At the same time, these stages might feel overwhelming and time-consuming. That’s where UserTesting steps in. Let’s get into how my team and I have used UserTesting to accelerate these stages and elevate the quality of feedback in customer-centric learning.

Learn and empathize

It’s easy to make assumptions about our customers right as we begin building content. But what if there are surprises that could alter the path we end up taking in our design? This is why it’s critical to ask customers upfront about their likes, dislikes, expectations, and experiences with learning how to use your product or service.

At UserTesting, an unmoderated study can be the quickest way to get that information. An unmoderated study, as opposed to a moderated study, just means that you are not present while the participant completes your test. Often, an unmoderated study is the fastest way to get feedback because participants are self-guided, and they can take the test right away if they qualify. If you want to ask individualized follow-up questions in real-time, then UserTesting also allows you to schedule what’s called a Live Conversation with the participant.

When some of our newer on-demand courses had low enrollment, even months after they were released, I decided to run an unmoderated study to learn why. At the time, my team was in the early stages of planning a redesign of how we structure our on-demand courses. We wanted to better position and promote courses to meet customer needs, as well as determine if course length was deterring learners. These circumstances prompted two questions:

  • How are customers discovering on-demand courses?
  • How do customers decide which on-demand courses to take?

Here is the process I followed:

  1. Set my audience requirements: The UserTesting Contributor Network is a vast, global pool of participants that users can filter based on their desired audience criteria. For my study, I set my demographic criteria to search for a variety of current UserTesting customers who worked as marketers, product managers, designers, and UX researchers. I set my location criteria to “anywhere” to ensure my test could fill quickly.
  2. Build and launch my test: Whenever I need test plan inspiration, I like to go to UserTesting’s Template Gallery, which has over 100 different test plans, built by our UX experts, that you can use as-is or customize to fit your needs. Since I was running an unmoderated discovery test, I filtered for discovery templates and reworked the Customer journey template (a type of test to understand how customers engage with your brand or product across multiple touchpoints). Here’s how I then customized the tasks and questions:

    Multiple choice: How long have you been using UserTesting?
    Multiple choice: Have you ever enrolled in training content (e.g., on-demand courses, live-instructor courses) to help you use the UserTesting Platform?
    Verbal response: If you have enrolled in UserTesting training content, explain your process for how you usually discover on-demand courses.
    Launch URL: You will be taken to a slide deck displaying UserTesting University course landing pages. Once the new page fully loads, move on to the next step.
    Task: Spend 1-2 minutes browsing the course landing pages in the slide deck. Remember to speak your observations out loud. When you’re ready, move on to the next task.
    Verbal response: What, if anything, did you like about the course landing pages?
    Verbal response: What, if anything, did you dislike about the course landing pages?
    Verbal response: How unlikely or likely are you to enroll in a UserTesting course in the future? Explain your answer.
    Written response: If you could change anything about the course landing page or description, what would it be?

    Next, I launched my test to one participant, as part of UserTesting’s recommended practice of piloting your study to make sure everything is running as expected. I then made a couple of edits before launching it to nine more participants.
  3. Analyze my results: After an hour or so, all ten of my participants had completed my test. I typically like to watch the entirety of the recorded video sessions of my results, but if I’m in a hurry, I’ll increase the video speed or just review the Metrics page for at-a-glance charts and summaries. Here’s what I discovered from our customers:

    Customers either don’t search for courses or only learn about them via direct contact: Seven out of ten participants said they had not searched for training content. Those who had taken a course discovered the training through their Customer Success Manager, an email, or an in-app notification. Four out of ten participants mentioned that the platform is easy and self-explanatory, so they did not feel the need to pursue additional training (not a bad thing, I suppose!).

    Customers want shorter courses: Seven out of ten participants mentioned time duration as a factor when considering a course. As one participant said, “Spending 20 minutes to learn a feature seems like a waste.”

    Customers care about course ratings: Seven out of ten participants said that they would like to know how others rated a course. One even mentioned, “I’m not sure I would take this course since it has not been rated.”

    Customers want to quickly know how the course fits their needs: The course landing page is critical for learners to determine how the content will solve a roadblock or grow their skills.

    Customers don’t enroll in courses because they’re not aware of them: About 90% said they would enroll in a course in the future after reviewing the landing pages in the test. Short course duration and use-case relevance would be major motivating factors.
  4. Share my results: I presented the findings to my team and highlighted possible solutions, including shortening courses, templatizing course landing pages, and reevaluating how we promote courses. 

Since then, we’ve started to introduce shorter microlearning videos in our ungated (no login) Knowledge Base, implement standardized templates for courses, and create customer-facing slides about new and featured content so that Customer Success Managers and other customer marketing programs can more easily promote our content. So far, our efforts have yielded 1,074 total plays on 12 new microlearning videos for 2023. While we are continuing to optimize our courses, the ability to easily adopt an iterative testing strategy through UserTesting has allowed us to validate shifts in our plan as we respond to company-wide changes.

Ideate and explore

In this stage, it can be challenging to get timely feedback on a course outline or a new interactive learning design you’re building, especially if you’re up against a tight deadline. This is where UserTesting has made it possible to get quick customer feedback while I’m creating content.

I’ve used preference testing and concept testing methods to help me validate whether a new learning design will resonate with customers. Here’s how I used a concept testing approach to build an in-app diagnostic quiz that shows customers what they should test and helps them discover other ways to use our platform.

My team was thinking through how our content could cater to customers with a diverse range of professional backgrounds. UserTesting caters to more than just UX researchers; we also see people from marketing, product, design, and even executive management use UserTesting to solve their business challenges. But with these varied audiences comes the predicament of creating content for each specific audience, because personalizing our existing generalized content takes a lot of time and effort.

So, I came up with this problem statement: How can we create and scale learning experiences for customers from different job and industry backgrounds?

From there, I let that statement guide my process for coming up with possible solutions:

  1. Brainstorm and Discovery: A brainstorming exercise led me to the idea of an online diagnostic quiz that users take to understand their “UserTesting learner profile” and which help content they should use. I decided to keep the quiz “job agnostic”: during a discovery interview, I learned that a marketing professional might want to run a prototype test because they want to better understand how people are using the product. This stuck with me and forced me to think about how our training content needs to strike a balance between job-specific learning content and purpose-specific learning content.
  2. Concept Testing: I focused my concept testing around a quiz that tries to get to the heart of why customers are using UserTesting. First, I created a prototype using a free online quiz tool: five questions that led to eight different outcomes, each consisting of testing best practices for common business challenges and scenarios. Then, using the First impressions test template from the UserTesting Template Gallery, I tested with three different customers to get their initial takeaways. The results revealed that the customers liked the ease of the quiz and the recommended educational resources in the outcomes. However, the outcome page felt generic and flat, and because I was using a free quiz tool, I was unable to embed links, which definitely hindered the experience.
  3. Internal Feedback: After trying a couple of other quiz tool options, I got buy-in from our Customer Success team to help amplify this project. We chose a subscription with Outgrow, which allowed for more control over the look and feel of the quiz as well as the branching logic that determines the different paths customers can take through the quiz (a simplified sketch of this idea follows this list). Next, we launched an Invite Network test, a UserTesting method of recruiting anyone with a direct link, to our internal subject matter experts, who helped us streamline the quiz structure. The quiz went from five questions to three, and the outcomes went from eight to 23; this would prevent quiz-taking fatigue while also giving customers many chances to retake the quiz and get different outcomes depending on where they are in their project.
  4. Customer Feedback: We then ran a few more user tests with customers to nail down some details, such as the name of the quiz (we landed on “What should you test?”) and the overall impression of the outcome page, which now included a more visually appealing design, an actionable “three things you can do right now” section, and a downloadable PDF.
  5. Go Live: Once we got the quiz looking and functioning the way we wanted, we pushed it out into our platform for customers to take.
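
For a sense of how that branching logic works, here’s a minimal sketch in Python. The questions, answers, and outcomes below are hypothetical placeholders, not our actual quiz or anything specific to Outgrow; the idea is simply that each answer routes the customer to a different next question or final outcome.

    # Hypothetical branching quiz: each node is either a question whose
    # answers point to the next node, or a terminal outcome.
    QUIZ = {
        "goal": {
            "question": "What are you trying to learn right now?",
            "answers": {
                "Reactions to an early idea": "stage",
                "Usability of a live product": "outcome_live_site_test",
            },
        },
        "stage": {
            "question": "How far along is your design?",
            "answers": {
                "Sketch or mockup": "outcome_prototype_test",
                "Just a concept": "outcome_discovery_interview",
            },
        },
        # Terminal outcomes map to recommended resources.
        "outcome_live_site_test": "Try a live website test.",
        "outcome_prototype_test": "Try a prototype test.",
        "outcome_discovery_interview": "Try discovery interviews.",
    }

    def run_quiz(answers):
        """Walk the quiz graph using a dict of node -> chosen answer."""
        node = "goal"
        while isinstance(QUIZ[node], dict):
            node = QUIZ[node]["answers"][answers[node]]
        return QUIZ[node]

    print(run_quiz({"goal": "Reactions to an early idea",
                    "stage": "Sketch or mockup"}))
    # -> Try a prototype test.

Viewed this way, the quiz is a small decision graph, so offering more outcomes is mostly a matter of adding branches rather than adding questions.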

Since its launch in February 2023, 272 customers have interacted with the quiz a total of 391 times. Using Pendo, a product analytics app, we were able to track that 74.8% of users who took the quiz launched a test on the platform. Being able to routinely test my idea and get it right before building it led to a quick, engaging solution that helped drive adoption while customers were logged into the platform.
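
To make that funnel arithmetic concrete, here’s a tiny sketch with hypothetical event data (not Pendo’s actual API or our real numbers): the conversion rate is just the share of quiz takers who later appear in the test-launch events.

    # Hypothetical product analytics events: user IDs seen per event type.
    quiz_completed = {"u01", "u02", "u03", "u04", "u05", "u06", "u07", "u08"}
    test_launched = {"u01", "u02", "u03", "u04", "u05", "u06", "u99"}

    # Conversion: share of quiz takers who went on to launch a test.
    converted = quiz_completed & test_launched
    rate = len(converted) / len(quiz_completed)
    print(f"{rate:.1%} of quiz takers launched a test")  # 75.0%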

Execute and improve

For most customer education teams, this final stage is often dominated by end-of-course surveys that ask customers how we can improve the learning experience for them. While survey feedback can certainly be helpful, we still run into the issue that not everyone responds to the survey, or provides the most helpful feedback.

With UserTesting, my team has been able to dig much deeper than one can with survey feedback, as well as run tests periodically to better understand how we can improve our training offerings. We had often heard from our field teams that customers wanted to know if certain features were available on their subscription plans and that our Knowledge Base content was lacking this information. So, we decided to create a banner to visually represent whether certain plans included the features mentioned in the articles.

Image of the design, with buttons for each option arranged in two rows, one for each plan type.

We then ran an internal user test; the feedback mentioned that the design was hard to read and that it would be better not to rely on a visual to convey plan information, especially for accessibility reasons.

I ran an external test to confirm that users were indeed struggling with the graphic. In the test, I asked them to review a couple of articles and determine if the product feature was available on a certain plan. After viewing five recorded sessions, I noticed that it took participants two to four minutes to realize that the information they were looking for was in the graphic. The suggestion I heard was to produce something more direct.

I returned to the drawing board. Instead of a visually dynamic graphic that confused rather than clarified, I developed a simple “at-a-glance” box to appear at the top of the article.

A text box with a bulleted list of the plans that include this feature.

I ran another test with this concept, asking participants to determine if the feature mentioned in the article was on a particular plan or not. I learned that participants either easily found the information or were very confused and took three to five minutes to locate it. They wanted the information to stand out more (e.g., color, bolding, a table, checkmarks) because the text about the plan blended in with the rest of the text-heavy article.

It almost felt like I was being pointed backward to the initial design. However, I resisted and instead explored one participant’s suggestion about a table.

I next came up with this design:

A text box with a table that has checkmarks next to the plans that include this feature.

After running a third test, I was relieved to see that participants were able to find the information in the table in under a minute. One suggestion was to turn the row headers into columns and add links to the subscriptions so people could learn more about them. We then arrived at our final design: clear wording introducing a vertically organized table with links to the different plans and checkmarks next to the plans that include the feature.

A text box with a simplified table.

With rapid prototype testing, UserTesting helped my team avoid moving forward with a potentially confusing graphic that would have appeared in nearly every Knowledge Base article.
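
If you want to quantify that kind of improvement across iterations, here’s a minimal sketch (with hypothetical time-on-task numbers, not our actual session data) comparing how long participants took to find the plan information in each design round:

    # Hypothetical time-on-task (seconds) per participant, per design round.
    rounds = {
        "banner graphic": [150, 180, 240, 130, 200],
        "at-a-glance box": [45, 300, 60, 280, 50],
        "table with checkmarks": [40, 35, 55, 30, 48],
    }

    for name, times in rounds.items():
        avg = sum(times) / len(times)
        print(f"{name}: avg {avg:.0f}s, worst {max(times)}s")

A simple average hides the split we saw in round two (some participants found the box instantly, others not at all), which is why it’s worth reporting the worst case, or watching the sessions themselves, alongside the average.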

The takeaways

Every customer education team can benefit from fast, quality feedback that helps them build engaging learning experiences that lead to increased usage, adoption, and renewal. We need agile practices to train around customers’ pain points and use cases. UserTesting is but one solution that can make the process more efficient and scalable, not just for the customer education team, but for any stakeholder invested in its training output.

So, where should your team get started? 

  1. Create a plan to collect insights for every project. Determine where in your timeline you should be soliciting feedback from participants. Hint: It’s in the three stages mentioned in this article!
  2. Determine your primary questions and objectives. What do you want to learn? Is there a problem you’re trying to solve or better understand? This will help you frame your study questions and make sense of the results you get back.
  3. Choose a testing approach that fits your needs and resources. You don’t have to be an expert in UX research methodologies. UserTesting provides guidance on different approaches to take and even offers more than 100 test plan templates in our Template Gallery if you need inspiration.

By treating customer education like a product, teams can more easily plan when and how to collect the insights they need to educate their customers successfully.

Photo by Kenny Eliason on Unsplash

The opinions expressed in this publication are those of the authors. They do not necessarily reflect the opinions or views of UserTesting or its affiliates.