
Common Use-related Risk Analysis Pitfalls

A use-related risk analysis (URRA) is an important cornerstone activity in your human factors engineering (HFE) process that is expected by regulators.


December 5, 2022

A complete and well-considered URRA helps ensure that all use-related risks are considered and mitigated to an acceptable level throughout the development process. Our previous blog, 10 steps to conducting a use-related risk analysis as part of your Human Factors Engineering process, details how to conduct URRAs.

In this blog, we will highlight a number of commonly observed URRA pitfalls, along with our recommendations for overcoming or avoiding each pitfall altogether.

Risk ID

  • Risk IDs are assigned to tasks rather than uniquely to risks. Assigning risk IDs to tasks can cause confusion when a task contains multiple use errors and harms, and thus multiple risks.
    • Recommendation: Assign a unique risk ID to every risk (i.e., every single line item which might contain a different use error, hazard, hazardous situation, or harm).
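To illustrate the unique-ID approach, here is a minimal sketch of a URRA line item as a data record. The field names, ID format, and example rows are hypothetical illustrations, not taken from any particular standard or template:

```python
from dataclasses import dataclass, field
from itertools import count

# Hypothetical ID generator: every line item (each distinct combination of
# use error, hazard, hazardous situation, and harm) gets its own unique ID.
_risk_counter = count(1)

@dataclass
class RiskLineItem:
    task: str
    use_error: str
    hazard: str
    hazardous_situation: str
    harm: str
    risk_id: str = field(default="")

    def __post_init__(self):
        # Assign the unique ID to the risk line item, not to the task.
        if not self.risk_id:
            self.risk_id = f"URRA-{next(_risk_counter):03d}"

# One task, two distinct risks -> two distinct risk IDs.
r1 = RiskLineItem(
    task="Remove needle cap",
    use_error="User touches exposed needle",
    hazard="Needle",
    hazardous_situation="User's finger contacts the contaminated needle tip",
    harm="Needlestick injury",
)
r2 = RiskLineItem(
    task="Remove needle cap",
    use_error="User bends needle while removing cap",
    hazard="Damaged needle",
    hazardous_situation="User injects with a bent needle",
    harm="Tissue damage",
)
print(r1.risk_id, r2.risk_id)  # same task, two different risk IDs
```

Keying rows this way means a single task can safely expand into as many line items as it has distinct use error/hazard/harm combinations, without ambiguity about which risk a mitigation or severity rating refers to.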

Use scenarios and tasks

  • Not all workflow steps are included. It can be easy to overlook workflow steps as part of your URRA. For example, you might be so focused on making sure that you capture all injection steps for your pen-injector that you forget to include the disposal of materials.
    • Recommendation: Conduct a task analysis and leverage the instructions for use (IFU) to ensure you capture all workflow steps. Remember to consider potential troubleshooting steps, especially if failing to resolve these situations might lead to different harms than the situation that caused an alarm/warning to appear in the first place.

Potential use errors

  • Use error is not focused on the user. One common pitfall is to list use errors (or failure modes) that focus on technical failures, rather than on the user.
    • Recommendation: Start each potential use error with “User …” to keep the focus on the user and their (inter)action.
  • Use error is too vague. It can be challenging to draft concise but comprehensive use errors, and many use errors that we observe do not clearly describe the actual use error. Consider the use errors “user receives incorrect dose” and “user does not administer treatment according to IFU.” What did the user do to receive an incorrect dose, and is it an under- or overdose? Similarly, in what way is the user deviating from the IFU? And if users take a deviating approach from the IFU, to what extent is that an issue?
    • Recommendation: Remember that a potential use error is an error of commission (i.e., an action taken) or an error of omission (i.e., an action not taken). Therefore, write the use error in an active voice (i.e., did or did not) and refer to your task analysis when considering what a user might omit or do differently per task.
  • Use error is a hidden potential root cause. For example, a use error might be titled “user does not see button.” However, not being able to see a button might instead be a root cause for why someone did not push a button.
    • Recommendation: Consider adding a separate column to your URRA outlining potential root causes. This will help you keep use errors and root causes separate. In addition, documenting root causes is a good exercise for determining whether your mitigations are sufficiently strong.
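As a sketch of the separate-column idea, the "user does not see button" example can be recorded as a root cause behind an action-focused use error. The field names, ID, and mitigation text below are purely illustrative:

```python
# Hypothetical URRA row keeping the use error and its potential root
# causes in separate fields, so "user does not see button" is captured
# as a root cause rather than as the use error itself.
row = {
    "risk_id": "URRA-014",  # illustrative ID
    "use_error": "User does not press the start button",
    "root_causes": [
        "Button is not visible in low lighting",
        "Button label does not convey its function",
    ],
    "mitigation": "Backlit start button with an action-oriented label",
}

# The use error stays user-focused and actionable...
print(row["use_error"])
# ...while each root cause can be checked against the mitigation.
for cause in row["root_causes"]:
    print("-", cause)
```

Listing root causes explicitly makes it easy to review whether each one is actually addressed by the corresponding mitigation.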

Hazards

  • Inconsistent hazard presentation. A hazard is a potential source of harm. Out of the different URRA contents, we notice that filling out the hazards is one of the most challenging activities. Specifically, we often observe that different URRA elements get combined into the hazard, such as partial hazardous situation descriptions (e.g., “cannot use device anymore”).
    • Recommendation: Choose a consistent approach when presenting your hazards, such as presenting hazards as nouns (e.g., needle, extension cord, water leak, unassembled device), and stick to that approach.

Hazardous situations

  • Hazardous situation does not provide clear connection between the use error and hazard, and the resulting harm. Consider the hazardous situation “device is damaged due to excessive force, user uses damaged device to apply treatment” and the following harm “burn wounds.” You might be left wondering how the damaged device led to a burn wound.
    • Recommendation: Avoid surprises in your harms. The hazardous situation should build a bridge from the use error and hazard to the harm. Always consider: What is the result of your use error combined with the hazard? And how does that result lead to the harm?

Harms

  • Harms are too high level. When harms are too high level (e.g., “injury”), they do not provide the reader/reviewer with sufficient information to determine how severe the harm is. This could also lead to inconsistent severity ratings. For example, hypoglycemia might have a severity rating of 2 in one place and of 3 in another. Notably, severity of use-related harms plays a big role in the HFE process because it determines the scope of the HFE process and the validation test in particular.
    • Recommendation: Consult a clinical specialist. They will be able to provide you with more information regarding the severity of the harm, such as whether the harm might be a first-, second-, or third-degree burn, or how severe a one-time versus repeated underdose might be. In addition, ensure that all harms and severity levels throughout your URRA align and are consistent. If a difference in severity levels between different risks is required, add more detail to your harms clarifying the difference between them (e.g., injury: cut, blister – severity level 2 vs. injury: broken bone – severity level 3).

Mitigations

  • Not taking credit for existing design mitigations. You might think that a risk is no longer applicable due to an “inherently safe by design” mitigation. Or perhaps you consider the design simply part of your product, and therefore only list labeling mitigations (e.g., training, IFU).
    • Recommendation: Take credit for your design mitigations, even if you feel they eliminate the risk (e.g., an automated needle shield preventing users from touching the needle).

Petra Boeree is Senior Human Factors Specialist at Emergo by UL’s Human Factors Research & Design division.