Validating Product Ideas During COVID-19 Period: A Case Study, Part 2

Shirley, Wang Xinling
IxD Stories

--

Recap

In the last episode, we shared the background and motivation for conducting an online test and user survey for a design-driven product idea on our content quality check platform. We crafted the online test and survey to find out:

  • If the proposed customization feature is a go-to option for our users (by analyzing the satisfaction scores)
  • If all options provided are necessary (by analyzing whether the chosen options diverge enough)
  • If any usability flaws went undiscovered in the previous user studies (by asking open questions and reviewing the responses)
An illustration of a pyramid with three layers, from bottom to top: usefulness of the proposal, scope of the proposal, usability
The three layers of answers we sought

In this article, we will dive into the details of how the online test and survey were conducted. Topics to be covered:

  • How to prepare the prototype for the user test
  • The welcome message of a user survey
  • Tips for composing the body of the survey (with Google Forms)
  • How we collected participants’ contact information

Disclaimer

This article is co-authored by Xinling Wang and Anne Hwarng for an internal project we completed in late 2020 at Shopee. To comply with the company’s NDA, this article does not cover any sensitive content or exact numbers. All views are our own.

How to prepare the prototype for the test

To further cut down prep time for this test, we used static prototypes. Anne joined me in preparing the prototype and crafting the questions.

An illustration of the test workflow: feature walkthrough, evaluate the feature, blue-sky suggestions.
The test workflow

We used Figma at this stage, as it is one of the best tools for digital prototyping and sharing: access is managed with just one link. Below are tips for preparing the static prototype.

Provide adequate guidance for how to experience the prototype

In moderated user tests, tasks should be prepared thoroughly and presented to participants before they interact with the prototype. In our case, where static prototypes were used, we did not assume that participants could easily understand the workflow solely through the interfaces. We strongly suggest giving participants proper visual guidance, with at least a briefing of the scenario and goals (even if the goal is not specific). We also presented the prototypes in well-organized groups, with titles, arrows and notes.

A screenshot of the prototype canvas we used during the remote test
Provide participants with visual guidance to navigate through the design proposal

Communication is vital: once the results are collected and broadcast, any miscommunication is amplified and can greatly affect how well our design idea is accepted by our tech and product teams.

Test the prototype link

Sometimes a link that is accessible to you is blocked for others by permission layers. Try to conduct at least one trial run with a potential participant (from the same user group as the actual participants) to ensure the links work. To check whether a link is public, you can also open it in your browser in incognito mode.

A demonstration of how to use incognito mode to test a prototype link’s accessibility
Open the browser in incognito mode to verify the link works for external parties
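
For a quick programmatic sanity check on top of the incognito test, a short script can fetch the link the way an anonymous visitor would. This is a minimal sketch of our own (the URL is a placeholder, not a real prototype); note that some prototyping tools return a normal page and only render the access-denied message in the browser, so the incognito check remains the most reliable.

```python
import requests

# Hypothetical share link; replace with your own prototype URL.
PROTOTYPE_URL = "https://www.figma.com/proto/EXAMPLE/our-prototype"

def looks_publicly_accessible(url: str) -> bool:
    """Fetch the link without cookies or login, like an anonymous visitor."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    # Being bounced to a sign-in page is the usual sign of a permission wall.
    landed_on_login = any(token in response.url.lower() for token in ("login", "signin"))
    return response.ok and not landed_on_login

if __name__ == "__main__":
    print("Link looks public:", looks_publicly_accessible(PROTOTYPE_URL))
```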

Walking through the user survey

When the prototype was completed, it was embedded into the survey. In the following chapters, we’ll share how our survey was put together, along with tips and lessons from doing so.

Welcome Message

The welcome message, as the first touchpoint of the survey, should clearly convey its objectives. To us, this is a golden chance to connect our objective with the participants’ goals, so that they feel they can benefit from participating and stay more focused on the tasks ahead. It also positions participants as co-creators in our design process, since we take their input as a key factor for improvement.

Embed the prototype into the welcome message

Although the rule of thumb for an effective survey opening is to keep it as short as possible, we dedicated the entire opening page of the survey to the welcome message, a link to the prototype, instructions with images and bullet points, and critical troubleshooting topics. This was to ensure the survey procedure could be followed correctly.

An illustration of a questionnaire’s welcome message

The body of the questionnaire

This section introduces the body of the questionnaire. We’ll share a template of the questions designed to evaluate the usability of the feature/website, then list some tips we found useful while composing the questionnaire.

Questions to evaluate the usability of the feature/website

The first collection of questions is feature-specific. It is designed to evaluate:

  • If all options provided are necessary (by analyzing whether the chosen options diverge enough)

The second part, usability evaluation questions adapted from Arnie Lund’s Usefulness, Satisfaction, and Ease of Use (USE) framework, is designed to find out:

  • If the proposed customization feature is a go-to option for our users (by analyzing the satisfaction scores)
  • If any usability flaws went undiscovered in the previous user studies (by asking open questions and reviewing the responses)

In our case, the evaluation is based on the prototype provided. Below is the template of the question list.

To download the list of usability evaluation questions, click here
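
As a concrete illustration of how the satisfaction scores from such a question list might later be analyzed, here is a minimal sketch of our own; the question IDs, dimension mapping and responses below are all made up for the example.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical mapping of question IDs to USE dimensions.
QUESTION_DIMENSION = {
    "q1": "usefulness",
    "q2": "usefulness",
    "q3": "satisfaction",
    "q4": "ease of use",
}

# Made-up responses: one dict of {question_id: 1-5 rating} per participant.
responses = [
    {"q1": 4, "q2": 5, "q3": 4, "q4": 3},
    {"q1": 5, "q2": 4, "q3": 3, "q4": 4},
]

# Collect every rating under its dimension, then average per dimension.
scores_by_dimension = defaultdict(list)
for answer in responses:
    for question_id, rating in answer.items():
        scores_by_dimension[QUESTION_DIMENSION[question_id]].append(rating)

for dimension, ratings in scores_by_dimension.items():
    print(f"{dimension}: {mean(ratings):.2f} (n={len(ratings)})")
```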

Use sections and question filters

Grouping questions that fall into the same category or are logically related reduces the cognitive load on readers. Sections are also the main mechanism in Google Forms for filtering out questions that do not apply to certain groups of participants.

A demonstration of Google Forms’ section-based question setup.
Google Forms supports branching questions based on the user’s selection, but every branch leads to a new page regardless of what the user selects

Note: the only limitation is that Google Forms requires each section to occupy an entire page, which means that even if you use a single question to filter people into different pathways, that question will occupy a page by itself.

Utilize multiple-choice grid layout for a list of scale questions

Use this layout for Likert questions that share the same scale, so that readers can quickly scan through the questions and indicate their preferences without their gaze drifting around the interface.

Compare below how the two layouts affect where the focus falls:

A comparison of screenshots, showing how multiple-choice grid questions surpass individual scale questions
Comparing linear scale questions vs. grid multiple-choice questions
A screenshot of how to set up a multiple-choice grid in Google Forms
Set up a multiple-choice grid layout for a list of scale questions

Stepladder the questions

Questions in user surveys can be categorized into closed and open questions. Closed questions are easy to answer and analyze, whereas open questions, though requiring effort from participants, often bring valuable qualitative responses.

We used a mixed methodology, but still arranged the questions deliberately. To onboard participants smoothly, in each section we placed closed questions before open ones. A typical sequence in our questionnaire looks like this:

  1. A series of Likert scales (asking participants to rate their satisfaction on a scale of 1 to 5; closed questions)
  2. A paragraph question (asking participants to elaborate on why they gave a poor rating; skippable; open question)
The usability evaluation questions placed before the open questions also serve as cues for them

Recognition, rather than recall

The survey contains questions specific to certain sections of the prototype. We are aware that, especially in tests of new features, participants are not as familiar with the names and terms as we designers are. To avoid ambiguity, we appended screenshots to each question so participants could recognize each section at a glance instead of recalling it from memory.

Write in the readers’ language

Our business operates in 8 regions whose languages include Bahasa Indonesia, Thai, Vietnamese, Traditional Chinese, English and Portuguese. To avoid confusion, we composed the questionnaire in plain English and handed it over to native speakers of each target region for translation.

Furthermore, we set up three requirements for the translation process:

  1. Questionnaires should undergo translation and then proofreading
  2. The two steps must be done by two individuals
  3. Both individuals should be familiar with the scope of business or have experience working on the system we are designing for.

Make user feedback a long-term thing

Collecting user feedback is not a one-time effort. We have a Google Form dedicated to collecting long-term feedback, and we appended its link at the end of this questionnaire too.

A screenshot of how to set a confirmation message in Google Forms
Consider using the confirmation message to share the link to your long-term feedback form

Contact information

We wished to collect detailed job titles from our participants for the sake of a weighted analysis. (A weighted analysis assigns different weights to answers based on participants’ seniority, so that leaders’ responses weigh more heavily.)
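
As a rough sketch of what we mean, a weighted average of satisfaction scores might be computed as below; the roles and weights are purely illustrative, not the values we actually used.

```python
# Illustrative seniority weights; not our actual values.
ROLE_WEIGHTS = {"operator": 1.0, "senior operator": 1.5, "team lead": 2.0}

# Made-up responses: (job title, satisfaction rating on a 1-5 scale).
responses = [
    ("operator", 4),
    ("senior operator", 5),
    ("team lead", 3),
]

# Weighted mean: each rating counts in proportion to the respondent's weight.
total_weight = sum(ROLE_WEIGHTS[role] for role, _ in responses)
weighted_sum = sum(ROLE_WEIGHTS[role] * score for role, score in responses)
print(f"Weighted satisfaction: {weighted_sum / total_weight:.2f}")
```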

We also wished to collect email addresses from the participants so that we could follow up on any odd or ambiguous responses if the need arose. This is possible by limiting responses to our organization (Google sign-in is required before viewing the survey).

A screenshot of Google Form’s settings page.
If “Collect email addresses” is ticked, emails will be collected (Google sign-in required)

In the end, we decided not to adopt sign-in, as we were afraid this measure would affect the respondents’ answers: it is a well-known phenomenon that users answer less honestly and are more conscious of their answers when they know their identity is not anonymized. Instead, we told our participants why their contact information was needed and how it would be used, and let them choose whether to share it.

References

  • Tom Tullis and Bill Albert, Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics, Chapter 6, “Self-Reported Metrics”

In the next episode, we will walk through the online test itself and share the tricks that helped us administer a successful remote test and survey.

About the authors

What’s on your mind?

💭 Comment and let us know your thoughts, doubts, feedback!
👋 Connect with us on LinkedIn: Xinling’s Homepage · Anne’s Homepage
