Published on July 2, 2020

Bob Moll, Principal UX Architect, Orthogonal
Beth Lester, UX Researcher, Bold Insight

In last week’s blog, we looked at implementing frequent releases on an organizational level and touched on a few key approaches we recommend when integrating and introducing frequent releases into your organization’s design and development process for connected mobile medical devices and Software as a Medical Device. Though we talk about release cycles as “build-measure-learn,” the reality is that they are actually “build-learn-build-measure-learn” cycles.

Today, we’re delving into techniques for learning early in the process, learning quickly throughout, and measuring efficiently once possible. Based on many years of real-world experience, these are strategies we recommend integrating into each release cycle: if you can build faster but cannot learn and measure faster, you will not be able to release faster. In other words, it’s great if you can make your race car go faster and get better mileage, but only if your car is pointed toward the finish line!

Technique 1. Employ a lean UX approach

Testing your product with a small sample size on a regular, recurring basis ensures you continue to learn about your device from users as the design grows and changes. The Lean UX practice of “Three User Thursdays,” as they’re known at Orthogonal and other organizations such as Meetup, employs shorter, more frequent sessions with end users for both user experience design and formative human factors feedback. Whether weekly or biweekly, bring in a handful of users for short sessions to test changes and features that your team has made and added to the software. The frequent recurrence of fast, small-sample testing ensures feedback is gathered constantly throughout the design and development process.

  1. Identify your KPIs. To make the most of learning opportunities and to maximize the efficiency of build-measure-learn cycles, begin by identifying key performance indicators (KPIs) through an event model taxonomy or failure modes and effects analysis (FMEA). Identifying KPIs early will help prioritize research efforts and will ultimately lead to a more effective launch.
  2. Establish UX and human factors questions. Next, we recommend setting up a backlog of UX and human factors questions that you would like to ask users. By keeping and updating a backlog of questions, and referring back to your KPIs, you can prioritize what you want to test for the next iteration. Use these small sample, rapid sessions as an opportunity to test your highest priority questions, such as product design implementations or instructions for use. You may also want to test for the purpose of defining and refining the risk assessment and/or intended user group(s). Whether the questions on your list are task specific or more exploratory in nature, the goal is to always have an artifact to test against.
  3. Use a prototype. Instead of asking users questions about a theoretical situation, give them something to interact with. Remember: your test artifact should not be too fancy. Use low-fidelity materials that are cheap and easy to modify based on user feedback. Then, implement design changes based on user feedback and retest the design at your next “Three User Thursday.” Waiting until you have a polished, shiny prototype is counterproductive to lean testing, as it will drastically decrease the frequency with which you can test your designs. WeWork’s recent implosion is a cautionary tale here: after taking billions of dollars of investment without outside validation, the company revealed shockingly bad financials in its IPO filing, built on the assumption that investors would buy into some unusual financial measures.
  4. Engage the right players. Lean UX utilizes an iterative process of cycling through 1) user research, 2) the design and testing of interventions, and 3) user analytics as shown below.  Note that this cycle is likely to have smaller cycles within it where you move between user research and the design and testing of interventions multiple times before moving onto user analytics. As a result, implementing lean testing methodology requires an appropriate level of staffing of UX and human factors experts, as well as a recruiting process that ensures sufficient unique users are available in a timely fashion to support sessions.

User Research -> Design and Test Intervention -> User Analytics

Source: Orthogonal
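To make step 1 above more concrete, here is a minimal sketch of an event-model taxonomy that maps logged user events to candidate KPIs and flags which ones trace back to risk analysis (e.g., an FMEA). All event and KPI names here are illustrative assumptions, not a prescribed schema:

```python
# Hypothetical sketch: a minimal event-model taxonomy for a connected
# medication-support app. Event names, KPI names, and the risk_linked
# flag are assumptions for illustration only.

EVENT_TAXONOMY = {
    "onboarding_completed": {"kpi": "activation_rate",   "risk_linked": False},
    "dose_logged":          {"kpi": "adherence_rate",    "risk_linked": True},
    "dose_log_abandoned":   {"kpi": "task_failure_rate", "risk_linked": True},
    "ifu_opened":           {"kpi": "help_seeking_rate", "risk_linked": False},
}

def research_priorities(taxonomy):
    """Order events so risk-linked ones (from the FMEA) are tested first."""
    # False sorts before True, so negate the flag to put risk-linked first.
    return sorted(taxonomy.items(), key=lambda kv: not kv[1]["risk_linked"])

for event, meta in research_priorities(EVENT_TAXONOMY):
    print(event, "->", meta["kpi"])
```

A prioritized list like this doubles as the seed for the UX and human factors question backlog in step 2: the risk-linked events at the top suggest which tasks to probe first in a “Three User Thursday” session.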

 

Technique 2. Measure real-world results

Lean UX methodology should not end or stall just because an iteration of the product has gone to launch. Design changes and new features can be qualitatively tested with users, then developed and tested in the real world. Once a release is in production and out “in the wild,” seize the opportunity to gather real-world feedback and understand what’s happening through user analytics.
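As a sketch of what “what’s happening” analytics can look like, the snippet below derives a task success rate KPI from a stream of raw analytics events. The event names, session structure, and pipeline shape are assumptions for illustration, not any particular analytics product’s API:

```python
# Hypothetical sketch: computing a task success rate from raw
# (session_id, event_name) pairs as they might arrive from an
# analytics pipeline. Event names are illustrative assumptions.
from collections import defaultdict

events = [
    ("s1", "dose_log_started"), ("s1", "dose_log_completed"),
    ("s2", "dose_log_started"),                      # abandoned task
    ("s3", "dose_log_started"), ("s3", "dose_log_completed"),
]

def task_success_rate(events, start="dose_log_started", done="dose_log_completed"):
    """Fraction of sessions that started the task and also completed it."""
    sessions = defaultdict(set)
    for session_id, name in events:
        sessions[session_id].add(name)
    started = [names for names in sessions.values() if start in names]
    completed = [names for names in started if done in names]
    return len(completed) / len(started) if started else 0.0

print(f"task success rate: {task_success_rate(events):.0%}")
```

A dip in a metric like this tells you something is wrong but not why, which is exactly the gap that returning to small-sample qualitative sessions fills.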

What user analytics does not do, however, is give us the opportunity to dig into why a user made a certain decision or acted one way versus another in the real world. Returning to conduct additional lean research sessions, such as “Three User Thursdays,” will answer these “why” questions as you continue through the iterative development process once again.

It’s important to realize that usability and human factors activities are not a one-time event. Once you are in-market, you will have more real-world data to draw on to help improve the product. This matters if you are going to continue to improve the product and submit regulatory filings in the future. Remember that the FDA is increasingly emphasizing real-world data, and has reorganized its entire operations around a total product lifecycle approach to regulating medical devices, with pre-market and post-market functions collaborating.

Next week, in our fourth and final blog, we’ll discuss more tips for how you can incorporate the strategies we’ve discussed into your organization. Look forward to some interviews with colleagues within our networks as we talk through some of the change management strategies that have worked to implement frequent releases for SaMD and connected medical devices.

About This Blog Series

Given the multidisciplinary nature of MedTech and connected health, we have the opportunity to work with a wide variety of professionals, including engineers of many stripes, scientists, researchers, designers, experts in regulatory, and quality management. Our collaborations with the talented UX and human factors researchers at Bold Insight have been especially synergistic because Bold Insight’s specialization in user research for regulated medical devices is an excellent complement to Orthogonal’s specialization in the design and development of connected mobile medical devices (CMMD) and Software as a Medical Device (SaMD). But more than working on a common niche, Bold Insight and Orthogonal are fellow travelers on the road to slay the same dragon: we both believe that there is a huge opportunity to move the needle on healthcare costs and outcomes through the effective use of digital health solutions.

In recent years, both of our organizations (and clients!) have recognized how research and development teams can work together in a much more seamless and effective manner when they are aligned with a common focus on user-centered design. This is even more true when the R and the D in R&D jointly leverage methods based on fast feedback loops that generate rapid, iterative learning and progress such as Lean UX and Agile software development.

Given our mutual passion for getting safe, engaging, clinically effective, and comparatively effective medical devices to market faster, we thought we’d use this new series of blogs to share from our experience what that process has looked like.

Blog #1:  5 Tools to Implement Frequent Releases for SaMD and Connected Devices

Our most recommended strategies to avoid surprises, minimize risk, and achieve successful frequent releases, including rethinking the composition of your UX team, open communication, and effectively prioritizing tasks to maintain continuous build-measure-learn cycles.

Blog #2:  Reduce surprises during FDA-required validation testing with a frequent-release process

Increasing release frequency and user testing frequency allows organizations to learn from mistakes in near-real-time and surfaces real-world successes, difficulties, and failures before summative human factors testing occurs, resulting in reduced cost and user risk.

Let us know what you think!

 

About The Authors

Bob Moll is the principal UX Architect at Orthogonal. You can email him at bob@orthogonal.io

Beth Lester is a UX Researcher at Bold Insight. You can email her at beth.lester@boldinsight.com

About Orthogonal

Orthogonal is a software developer for connected mobile medical devices (CMMD) and Software as a Medical Device (SaMD). We work with change agents who are responsible for digital transformation at medical device and diagnostics manufacturers. These leaders and pioneers need to accelerate their pipeline of product innovation to modernize patient care and gain competitive advantage.

Orthogonal applies deep experience in CMMD/SaMD and the power of fast feedback loops to rapidly develop, successfully launch, and continuously improve connected, compliant products—and we collaborate with our clients to build their own rapid CMMD/SaMD development engines. Over the last eight years, we’ve helped a wide variety of firms develop and bring their regulated/connected devices to market.

About Bold Insight

Bold Insight is a user experience and human factors research agency. Designing, managing, and executing projects that span the product development life cycle, we conduct everything from user research informing early product design to global human factors validation. Working with digital, next-generation technology, from medical devices to mobile apps, in-car systems to websites, and back-office systems to end-to-end customer journeys, we specialize in large-scale and global research.