July 6, 2020
Events of recent months have prompted a global shift in research activities. For those who conduct face-to-face research, this has required adapting typical research approaches to support remote testing. Conducting any user research virtually presents challenges, and usability testing in particular, which typically requires participants to interact directly with a product or other stimuli, raises unique considerations around how to facilitate realistic interactions with the test articles.
A dynamic virtual testing platform that accommodates complex interactions is key for capturing robust, valid data, but this alone is not enough; participants must feel at ease navigating the simulated environment. If participants are uncomfortable or lack confidence interacting with the virtual testing platform, their focus can be drawn away from the test activities, resulting in performance data that does not truly represent how they would interact with the same product in real life.
By enabling participant comfort in a few key areas, you can reduce the risk that remote usability test data will be confounded by participants’ uncertainty or unease with the test environment.
Build rapport between moderator and participant
Whether testing in-person or remotely, forging a strong, genuine connection between the moderator and participant is essential. If participants do not feel comfortable communicating with the moderator, they might be hesitant to provide candid feedback and could withhold valuable insights.
Setting aside a couple of minutes for small talk when the participant first joins the virtual testing room helps orient the participant to how you will interact with each other throughout the session. It also enables the participant to get their bearings in the new environment before diving into hands-on activities. Sharing video is another good way to build rapport: rather than being a "faceless" voice, you come across as more human when the participant can actually see you, even if you are still only present through a screen. Ensure your word choice and tone convey warmth when using video, as body language can be more difficult to interpret virtually.
That said, displaying a continual video feed could also add stress, as it makes participants acutely aware of being watched. Consider your research activities when determining whether to share video. Early-stage prototype exploration or tasks requiring frequent back-and-forth between the participant and moderator are well-suited for continuous video sharing. Human factors (HF) validation testing or tasks that participants will complete largely independently might call for a more nuanced approach. Only sharing video during the introduction and debrief portion of the session can be a good compromise for these latter cases.
Clearly outline session structure and expectations
Remote test participants generally have less access to contextual information than they might during in-person tests, as they can only see what you display on the screen. Giving a general overview of the session structure, touching on details such as how you will present stimuli and task prompts and who might be dialing in to observe, can help ease participants' anxiety about what will occur during the session. Offering this insight and guidance early on frees more of participants' mental energy to concentrate on completing tasks to the best of their ability. It also enables participants to approach each task with greater certainty about what is expected of them, lowering the chance that their task performance will be affected by confusion about the overall session structure.
This general overview also provides an opportunity to highlight any “unnatural” or unfamiliar interactions that might be necessary for the session to run as intended. For instance, if participants must show their phone screen to the moderator via webcam before proceeding to the next step in a task workflow, this process should be made clear at the start of the session to mitigate delays or lost data resulting from having to remind participants to take this action while partway through a task.
Onboard participant to remote testing platform interactions
Interacting with remote testing platform features can be challenging, particularly for less tech-savvy participants. If participants are hesitant or confused about navigating the platform, they could encounter difficulties during hands-on tasks that stem largely from unfamiliarity with platform features rather than from true usability issues with the product itself.
To mitigate this, make sure that participants are adequately onboarded to the remote testing platform before introducing hands-on tasks. Walk through key platform controls such as screen sharing, allow participants to freely explore the platform for a few moments, or present a straightforward sample task so they can familiarize themselves with the controls used during actual tasks. You might also provide a contingency plan for handling technical difficulties, sparing participants the worry (and the resulting time and energy) of troubleshooting technical issues independently.
Consider sending participants brief setup instructions in advance, so they can orient themselves to the platform beforehand. Alternatively, or in addition to these instructions, call participants shortly before their session to assist with platform setup and initial troubleshooting (e.g., downloading and logging into the remote meeting), and do a trial run of any platform tools that will be used during the session. Providing this real-time setup assistance limits delays and interruptions during the session and reduces the risk of having to modify or skip hands-on tasks due to time constraints.
In conclusion, remote usability testing is a powerful tool, but without consideration of how participants will experience the virtual testing space, data collection and validity can suffer. By ensuring participant comfort in these virtual environments, we can reduce much of the friction that impacts task performance, resulting in higher-quality data that reflects how participants interact with a product in real life. And just as important, participants will likely have an enjoyable, fulfilling testing experience, which is a win all around.
Tess Forton is User Researcher at Emergo by UL's Human Factors Research & Design division.
Learn more about usability testing and related human factors engineering issues at Emergo by UL:
- Human factors engineering (HFE) user research for medical devices, IVDs and combination products
- Medical device, IVD and combination product usability training and consulting