
Friday, December 13, 2019

Submission of the Human Engineering File to the FDA and Other Regulatory Bodies, Section 8: Part VI

This is the easiest section for me to cover, largely because the requirements for the validation section are clearly spelled out in detail.

8. Details of human factors validation testing
  • Rationale for test type selected (i.e., simulated use, actual use or clinical study)
  • Test environment and conditions of use
  • Number and type of test participants
  • Training provided to test participants and how it corresponded to real-world training levels
  • Critical tasks and use scenarios included in testing
  • Definition of successful performance of each test task 
  • Description of data to be collected and methods for documenting observations and interview responses
  • Test results: Observations of task performance and occurrences of use errors, close calls, and use problems 
  • Test results: Feedback from interviews with test participants regarding device use, critical tasks, use errors, and problems (as applicable)  
  • Description and analysis of all use errors and difficulties that could cause harm, root causes of the problems, and implications for additional risk elimination or reduction 
These requirements are largely self-explanatory. However, I would like to make a few comments and additions.

  • Validation testing -- including verification testing -- is often performed by outside consulting firms. Thus it is extremely important that you spell out how your testing should be performed and which measurements are to be collected and reported. I've noted that oftentimes the consulting company is asked to write both the protocol and the testing script. This is a mistake. The organization that performed the work up to the validation testing stage should be responsible for creating the protocol and the script, because it is this organization that will be responsible for the submission of the HE file to the FDA and/or other regulatory bodies. The organization responsible for the research and development, as well as the submission, should be in full control of what takes place during the validation step.
  • Verification and validation testing. Verification testing takes place under laboratory conditions, with test participants drawn from the targeted user population. It is an additional check on the usability of the system or device. Validation testing takes place under actual or simulated-actual conditions -- with all the distractions and problems that users will likely encounter.
  • The rationale for the type of testing performed and the conditions chosen for validation testing can be extremely important, especially if you have chosen a testing procedure less rigorous than performance testing under real or simulated-real conditions. Consult IEC 62366 and AAMI HE75 for guidance.
  • The testing procedure should ensure that critical tasks are fully tested and likely to be performed repeatedly by test participants.
  • Suggested additional measurement: If your system or device has error trapping and redirecting capabilities, be sure to report how often these capabilities were triggered and whether they enabled the test participant to successfully complete the task. This could be labeled as: task successfully completed, close call. A system or device with the capability to protect against use errors is worth pointing out; a sketch of how such a tally might be recorded follows.
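
A minimal sketch of how such a tally might be recorded and summarized, assuming a hypothetical observation log (the record layout and task names are invented for illustration):

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Observation:
    """One observed attempt at a critical task (hypothetical record layout)."""
    task_id: str
    trap_triggered: bool   # did the device's error trapping fire?
    completed: bool        # did the participant finish the task?

def summarize_error_traps(observations):
    """Tally how often error trapping fired and whether the task was rescued."""
    summary = Counter()
    for obs in observations:
        if obs.trap_triggered:
            outcome = ("trap fired, task completed (close call)"
                       if obs.completed else "trap fired, task failed")
            summary[outcome] += 1
    return summary

# Example: two observed attempts at the same (invented) critical task
log = [Observation("set_infusion_rate", trap_triggered=True, completed=True),
       Observation("set_infusion_rate", trap_triggered=True, completed=False)]
print(summarize_error_traps(log))
```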

What to Include in Your Narrative?


Include the abstract or abstracts of your validation testing in your narrative. 

If you haven't included any significant issues or root cause analysis in your abstract, be sure to include this in your narrative. Surface all issues or concerns in your narrative; if you don't, it could appear to a reviewer that you're trying to hide problems that you encountered. Even the appearance of hiding problems could jeopardize approval of your system or device.


Wednesday, November 20, 2019

Submission of the Human Engineering File to the FDA and Other Regulatory Bodies: Part IV

Section 5: Analysis of hazards and risks associated with use of the device


This section is one of the most important sections of your submission. Your narrative should both summarize your findings and highlight any notable findings from your risk analysis, including any important steps taken to mitigate risks.
5. Analysis of hazards and risks associated with use of the device
  • Potential use errors
  • Potential harm and severity of harm that could result from each use error
  • Risk management measures implemented to eliminate or reduce the risk
  • Evidence of effectiveness of each risk management measure
Risk assessment and management in human engineering for medical devices focuses on the identification and management of use errors, as defined in IEC 62366 (Part 1). It is valuable for representatives from human engineering to be part of the risk management team because of the additional insights that experienced human engineering professionals can bring to the process of identifying, managing and mitigating risks. However, the specific focus of the human engineering file with respect to risk management is on use errors.
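
For concreteness, here is a minimal sketch (with hypothetical field names and an invented example) of what one row of such a use-related risk table might look like as a data structure. Nothing here is prescribed by the guidance itself; it simply mirrors the four bullets above.

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):      # hypothetical severity scale
    NEGLIGIBLE = 1
    MINOR = 2
    SERIOUS = 3
    CRITICAL = 4

@dataclass
class UseErrorRecord:
    """One row of a use-related risk table."""
    use_error: str                     # the potential use error
    harm: str                          # potential harm that could result
    severity: Severity                 # severity of that harm
    mitigations: list = field(default_factory=list)  # risk management measures
    evidence: str = ""                 # pointer to effectiveness data

row = UseErrorRecord(
    use_error="User enters dose in mg instead of mcg",
    harm="Overdose",
    severity=Severity.CRITICAL,
    mitigations=["unit pre-selected from drug library",
                 "hard limit on dose range"],
    evidence="Validation study VAL-012, tasks 4-6",   # invented reference
)
```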

The narrative should be both a summary and a means to highlight events and areas of specific importance. This section can be relatively brief, but it should include the following information.


Potential Use Errors, Identifying Harm and Severity of Harm


Identify the number of use errors discovered, categorized according to their level of harm/severity. Any use errors categorized as high or critical should be highlighted. Be certain to include the method or methods used to identify and categorize potential use errors: these could include general system analysis, scenarios, field data, etc.


Risk Management Measures Implemented to Eliminate or Reduce the Risk


Once you've identified the potential use errors, discuss what was done to mitigate or eliminate them. (Note: Root cause analysis is a particularly effective method for understanding use errors and correcting them.) In the narrative you can make general statements regarding what was done to manage use errors of medium risk and lower. For high- and severe-risk use errors, the narrative should include the specifics of how these risks were managed. Risk management can include anything from changes in design, system error trapping and error prevention measures to updates to instructions.
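
As an illustration of the "error trapping and error prevention" category, here is a minimal sketch of one such measure: a hard range check plus a confirmation step for unusually high values. The limits, units and names are invented for the example, not taken from any standard or guidance.

```python
SOFT_LIMIT_ML = 10.0   # values above this demand explicit confirmation
HARD_LIMIT_ML = 50.0   # values above this are rejected outright

def accept_dose(dose_ml: float, confirm) -> bool:
    """Return True only if the entered dose passes the device's guards.

    `confirm` is a callable that asks the user to verify an unusual value.
    """
    if dose_ml <= 0 or dose_ml > HARD_LIMIT_ML:
        return False   # trap: impossible or dangerous value, reject outright
    if dose_ml > SOFT_LIMIT_ML:
        return confirm(f"Dose {dose_ml} mL is unusually high. Proceed?")
    return True
```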


Evidence of Effectiveness of Each Risk Management Measure


Finally, you will need empirical data to demonstrate that the use errors identified have been properly addressed. That will require testing with members of the targeted user population. Your narrative should include a summary (abstract) of the study or studies that you performed, and be sure to provide clear references to the documentation included in your submission.
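
As a trivial illustration of what such evidence often reduces to, a before-and-after use-error rate for a mitigated task might be computed as follows (the counts are invented):

```python
def error_rate(errors_observed: int, attempts: int) -> float:
    """Fraction of task attempts in which the use error occurred."""
    return errors_observed / attempts

before = error_rate(6, 15)   # formative round, pre-mitigation
after = error_rate(0, 15)    # validation round, post-mitigation
print(f"use-error rate: {before:.0%} -> {after:.0%}")  # 40% -> 0%
```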

Note: if your targeted user population is particularly narrow and tightly specified, consider including members from a wider group as well. For example, if the targeted group consists of ICU nurses, consider including general hospital RNs in your user testing group. All too often, medical devices intended for one highly trained group end up being used by others who may be technically astute but lack that group's specific training.

A Word to the Wise ...

You should show that your human engineering program assessed risks, identified use errors and mitigated them long before reaching verification and validation testing. This should have been addressed in your formative research and by periodic testing before reaching the verification and validation (V&V) stage. V&V should not be the place to uncover use errors; it should be the place that validates all of the work you've done up to that point. It's likely that V&V will uncover some areas of concern, but these should be relatively minor and relatively easily addressed. If you have discovered numerous problems at the point of V&V, then you have a problem with your human engineering process, and revamping that process should become a major focus of your organization.






Saturday, December 1, 2018

Commentary: HE-75 and IEC 62366 and Cleaning Up the Messes

I received a reminder recently, when I was made aware of the International Consortium of Investigative Journalists' database of medical device recalls, of what human factors professionals working in the area of human engineering for medical devices are often called on to do: clean up the mess created by a design process that somehow failed to incorporate research. (Note that medical device development isn't the only domain where this kind of failure occurs; however, medical device failures can often result in fatalities.) The persons responsible for designing an awful, unusable and in some cases useless user interface expect the usability expert to come in, take one look and create a beautiful user interface. This is absurd!

Writing from my own perspective, there is nothing that a usability professional likes to do less than correct a failed design that resulted from a failed design process. This week I was asked to save a group of programmers and user interface designers from the monstrosities that they had created. What was particularly strange was that the leader of the project thought that I could just redesign something by looking at what they had created. It was bizarre. Unfortunately, I had to deliver several harsh messages regarding the design process and the design that were not well received. (Nevertheless, that is my job.)

Here is the point I want to make clear to anyone who reads this: process and the resulting design should be considered two sides of the same coin. A good design process generally results in a good design. A nonexistent or poor design process oftentimes leads to a poor design -- and a design that gets worse with each iteration as attempts are made to fix problems or incorporate enhancements.

The processes and design direction provided by HE-75 and IEC 62366 can serve as a foundation for researching and designing systems with user impacts in nearly any industry, particularly in those industries where the potential for harm is significant.

Saturday, April 4, 2015

UK Perspective Regarding FDA Regulatory Requirements

A LinkedIn colleague posted a link to this article. I read it and found it interesting enough to post the link and comment on it. It's from a UK publication and discusses the FDA regulatory process as it relates to human engineering requirements for device approval for commercialization.

Here's the link:
http://www.emdt.co.uk/daily-buzz/what-are-fda-usability-testing-requirements-device-approval?cid=nl.qmed01.20150325

In addition, I provide my own perspective on the article in the "Commentary" section below. I do not critique the article. I only attempt to expand on a few points from it.

But first, a brief summary of the article.

Article Summary


Medical errors have become an increasing concern of the FDA. I became interested in medical errors when I was a consultant at St. Jude Medical Cardiac Rhythm Division in Sylmar, CA. During my time at St. Jude (2009-2010), deaths by medical error were being reported at 100,000 to 120,000 per year. Last year, I posted links to two articles that stated that deaths by medical errors could be closer to 400,000 per year. (http://medicalremoteprogramming.blogspot.com/2014/07/rip-death-by-medical-error-400000-year.html)

The FDA has noted that a large proportion of medical errors can be attributed to poorly designed medical device user interfaces. Since a fundamental mission of the FDA is increasing patient safety and reducing injuries and fatalities in the practice of medicine, the FDA has begun placing greater emphasis on improving the usability of medical device user interfaces.

The article measures the FDA's increasing emphasis on usability and human factors by counting how frequently companies seeking medical device clearance for the US market mention the terms "usability" and "human factors." Figure 1 from the article clearly shows the increasing usage of these terms in company filings.



The focus should be on the trends, not the absolute numbers, because not all filing documents have been included in the count. But the trend clearly shows that companies increasingly use the terms "usability" and "human factors" in their filings with the FDA. The two figures that follow suggest the degree to which companies have incorporated the FDA-prescribed human factors engineering process and design guidance documentation.

The documents listed below are specifically targeted at defining and supporting the human factors engineering process and the development of the Human Engineering File that's included as part of a company's filing to the FDA.


  • IEC 62366, Medical Devices - Application of Usability Engineering to Medical Devices
  • AAMI / ANSI HE75:2009, Human Factors Engineering - Design of Medical Devices (General)

I'll discuss the documents above in greater detail and describe how they're intended to fit within the human factors engineering process when developing medical devices.


  • IEC 60601-1-6 Medical electrical equipment - Part 1-6 General requirements for Safety - Collateral standard: Usability
  • IEC 60601-1-8 Ed. 1, Medical Electrical Equipment - Part 1-8: General Requirements for Safety - Collateral Standard: Alarm Systems - Requirements, Tests and Guidance - General Requirements and Guidelines for Alarm Systems in Medical Equipment (General)

The two documents above are engineering standards. They're engineering specifications that medical devices must meet. They are technical and specific.

I show Figure 3 from the article before showing Figure 2. 



The increasing references to 60601-1-8 are not surprising given the increased emphasis on safety. My real interest is in the significant increase in references to IEC 62366. As mentioned, this is a process standard that lays out how human factors engineering should be engaged to reduce "use errors." The emphasis in this standard is on the reduction of risk. Risk management is extremely well embedded in the medical device design and engineering process. It would seem that, from a cultural perspective, IEC 62366 fits with the medical device engineering process.

I want to contrast the dramatic increase in references to IEC 62366 with the references to AAMI/ANSI HE75 shown in Figure 2 below.



References to AAMI/ANSI HE75 rise and fall from 2010 to 2013 instead of showing the steady upward trend that you see with IEC 62366 in Figure 3. I would like to emphasize that IEC 62366 and AAMI/ANSI HE75 should be considered companion documents. (I'll expand on this in the Commentary section below.)

Commentary


The article does support the contention that the FDA and the companies it regulates are paying increasing attention to usability and human factors. Whether they're paying enough attention is another matter entirely. As new medical devices are introduced, we should see two things. First, the use error rate for newly introduced medical devices (once users have adapted to them) should decline relative to other similar devices currently in use. Second, we should see the number of deaths and injuries per year from medical errors begin to decline over time. This will take time to detect.

Without a doubt, the push by the FDA to define a human engineering process for the design and testing of medical devices, and to press for testing under actual or simulated conditions, is needed. In many ways the FDA is mirroring processes that have already been adopted by the US Department of Defense (DoD) in the area of human engineering. Admittedly, the DoD doesn't always get it right, but there is an understanding within the DoD that it is important ... life-saving, battle-winning important ... to ensure that those at the controls can do their jobs quickly, effectively and with as few errors as possible. So from that standpoint, the FDA has adopted processes from programs that have proven effective. But the FDA has just passed the starting line, and much more will be required going forward.

IEC 62366 vs. AAMI/ANSI HE75

As I mentioned earlier, IEC 62366 and AAMI/ANSI HE75 should be considered complementary, or companion, documents. HE75 is a much larger document than 62366 and includes a significant amount of device design guidance and guidelines. 62366 is almost entirely a process document, devoted to directing how to manage the research and development process of a medical device. In addition, the focus of 62366 is managing risks -- specifically, risks in the realm of use errors.

I found it interesting that references to HE75 were not increasing at the same rate as references to 62366. I would have expected Figures 2 and 3 to have a similar appearance with respect to 62366 and HE75, in large part because the documents significantly overlap. In fact, I might reasonably have expected references to HE75 to outpace 62366, because HE75 additionally includes design-specific guidelines.

One possible reason that references to HE75 have not accelerated the way references to 62366 have may be that the European Union has not adopted HE75, so it's not required for medical devices that will be marketed in the EU (CE marking). (I am currently unaware of the regulatory requirements of other countries on this matter.) Medical device companies are international companies, and the documents that they file in one country are generally the same in each country. Thus, since the EU hasn't adopted HE75, references to HE75, and HE75's use as a foundational process and design document, may be lower.

DESIGN RATIONALE

I'm not sure that this is true at this point in time, but I am certain that it will be true at some point in the future. I believe that the FDA will hold companies to account for their user interface designs. I believe that the FDA will demand that companies clearly define how they came up with their user interface designs and show that those designs are well-grounded in empirical evidence.

This is what I mean: the FDA will demand that the design choices for medical device user interfaces ... including controls, placement of controls, number of controls, actions performed by controls, the way each control responds, methods for interacting with the device (e.g., touch screen, buttons, mouse), size of the display, etc. ... be grounded in empirical data.

Commercial websites are often designed by graphic artists. Oftentimes the design of web pages reflects the artist's aesthetic sensibilities. Layouts appear the way they do because they look good.

I believe that the FDA will require that user interface designs for medical devices have an empirically grounded design rationale. Companies will be required to point to specific research findings to justify the design and the design choices that they made. Furthermore, as the design of the user interface evolves with each iteration of testing, the FDA will require that changes to the design be based on research findings.

Finally, I believe that soon if it is not occurring already, that the FDA will require:

  1. That companies submit documentation to show in detail the full evolutionary design process beginning from product inception, including ...
  2. Detailed pre-design research ... population(s), method(s), research questions and rationale, etc. ... as well as the findings and what they suggest for the design of the user interface
  3. A design accompanied by a full discussion of the design rationale ... why it was designed the way it was ...
  4. A detailed description of the evolution of the design that includes full and clear justification(s) for each change in the design ... and a requirement that changes be grounded in empirical data
  5. A full description of the pre-commercialization testing process and methods ... with a clear justification for why this testing meets FDA testing requirements
  6. And a complete and clear analysis of the testing data.
What I'm suggesting above is that the process of designing and testing a medical device user interface should be more than going through the prescribed steps, collecting the data, doing the tests, etc. There should be a clear thread that ties all the steps together. At any subsequent step, one should be able to point back to previous steps for the rationale that explains why the user interface was designed to appear and operate the way it does to that point.

As near as I can tell, what I described above is more rigorous than is currently required by the FDA. However, I believe that it would be in any company's best interest to follow what I've suggested, because there may come a time when the FDA's enforcement becomes more rigorous.

Another reason may be lawsuits. If a company can show that it went beyond the FDA's regulatory requirements at the time, those suing would likely have less chance of collecting damages, and if damages were awarded, they would likely be lower. Also, if the company went beyond the FDA requirements, it is likely that fewer people would be injured, which should itself lower damages.

FINALLY

This article has been a springboard for me to discuss a number of topics related to human engineering for medical device user interfaces. This topic will remain a central part of this blog. I'll return to it within a week or two and discuss in depth other topics related to the human engineering process for medical device user interfaces.




Thursday, June 30, 2011

Are Electronic Prescription Systems Failing to Trap Errors?

A Brief Introduction

Before I jump into the topic of electronic prescription systems, I want to explain how I came across the article I am about to post. I am creating a website that includes a substantial portion of the human factors related work I have produced over the years. That website also includes postings on its home page related specifically to human factors -- including articles related to medical errors: a topic of interest to me.

The new human factors website is not yet ready for viewing. I have just created a usable home page. The bulk of the work is to come. I'll post the address when it's reached a usable state.


What's Going on with Electronic Prescription Systems?


Bloomberg News recently reported the results of a study indicating that prescription errors are just as frequent whether prescriptions are handwritten or entered through an electronic prescription system. Here is the address of the Bloomberg article:
http://www.bloomberg.com/news/2011-06-29/errors-occur-in-12-of-electronic-drug-prescriptions-matching-handwritten.html

I have not yet had the opportunity to read the study itself. However, I shall, and I'll continue to update this blog on this topic based on what I find.


With respect to the Bloomberg article, this quote caught my eye:


"The most common error was the omission of key information, such as the dose of medicine and how long or how many times a day it should be taken, the researchers said. Other issues included improper abbreviations, conflicting information about how or when to take the drug and clinical errors in the choice or use of the treatment, the researchers said."


I have been a human factors professional for a long time, and as I read the quote above my jaw dropped. The errors described are some of the most fundamental, easily trappable and correctable errors there are. It seems beyond belief that an electronic prescription system would allow a user to make such errors. In the environments where I have worked, I have designed and installed subsystems to ensure that users do not make the kinds of errors described in the Bloomberg article. When I have a chance to read the report, I'll cover specific errors, their detection and correction, and the means to ensure that patients are not harmed.
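
The omissions described in the quote are exactly the kind of thing that simple field-level checks can trap before a prescription is ever transmitted. Here is a minimal sketch, with hypothetical field names and an illustrative (not exhaustive) list of error-prone abbreviations:

```python
REQUIRED_FIELDS = ("drug", "dose", "dose_unit", "frequency", "duration")

# Abbreviations commonly banned because they are easily misread;
# this list is illustrative, not exhaustive.
BANNED_ABBREVIATIONS = {"qd", "qod", "u"}

def trap_prescription_errors(rx: dict) -> list:
    """Return a list of problems; an empty list means the script may be sent."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if not rx.get(f)]
    for word in str(rx.get("directions", "")).lower().split():
        if word in BANNED_ABBREVIATIONS:
            problems.append(f"ambiguous abbreviation: {word}")
    return problems

rx = {"drug": "amoxicillin", "dose": 500, "dose_unit": "mg",
      "frequency": "3x daily"}           # duration omitted
print(trap_prescription_errors(rx))      # ['missing field: duration']
```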


Here's a link to another publication that reported on the same study:


http://www.eurekalert.org/pub_releases/2011-06/bmj-oep062811.php


Sunday, August 1, 2010

HE-75 Topic: Cleaning Up the Mess

I received a reminder this week of what usability professionals are often called on to do – cleaning up the mess created by a failed process. Somehow, the persons responsible for designing an awful, unusable and in some cases useless user interface expect the usability expert to come in, take one look and create a beautiful user interface. This is absurd!  It was the "nightmare" come true - something related to one of my other postings: HE-75 topic: Design first and ask questions later

Writing from my own perspective, there is nothing that a usability professional likes to do less than correct a failed design that resulted from a failed design process. This week I was asked to save a group of programmers and user interface designers from the monstrosities that they had created. What was particularly strange was that the leader of the project thought that I could just redesign something by looking at what they had created. It was bizarre. Unfortunately, I had to deliver several harsh messages regarding the design process and the design that were not well received. (Nevertheless, that is my job.)

Here is the point I want to make to anyone who reads this: process and the resulting design should be considered two sides of the same coin. A good design process nearly always results in a good design. A nonexistent or poor design process leads to a poor design. HE-75's design process can serve as a foundational process for designing user interfaces in nearly any industry, particularly in those industries where the potential harm is particularly severe. Where I am currently working, I plan to use HE-75 as one of the foundation documents to set user interface design standards. And as I mentioned, I am not currently working in the medical or medical device industry. However, I have come to believe that even in my current industry, the level of harm can be significant. Thus, I shall incorporate HE-75.
 
Next time, I'll review some of the literature that might be of use to the community.

Saturday, July 24, 2010

Useful Versus Usable

This is a discussion that you are not likely to read in usability texts – the topic of useful versus usable, and the value of each. Just recently I had a discussion with someone on just this topic. Furthermore, I have had numerous discussions with others, and each time it surprises me that people often do not know the difference between the two or the value of each.

Useful and Usable

If you go to the dictionary, you will discover that “useful” means “to be of serviceable value, beneficial” and, in addition, “of practical use.” Pretty straightforward.

On the other hand, the definition of “usable” is “capable of being used” and “convenient and viable for use.” Also a straightforward definition.

However, if you probe more deeply into the definitions, you will note that “useful” is the first, necessary quality of a tool or system. It must be useful – or why use it? Usable is also a quality of a tool or system, but it is not primary relative to the quality of being “useful.” It is secondary. Necessary, yes; nevertheless, it is still secondary.

Usefulness as a quality of a tool or system is not addressed in HE-75, or in any other usability standard that I have encountered. (If anyone knows of a standard where usefulness is addressed, please add a comment to this discussion.) Usefulness is assumed.

However, I have learned that in the real world, the usefulness of any tool or system should not be assumed. It should be tested. Furthermore, with complex systems, the fundamental capabilities of a system or tool are often useful. However, not all of the capabilities of that system may be.

I have direct experience with a remote monitoring system where the primary or fundamental capabilities of the system have clear use. However, with each release of this system, as more capabilities are added, the useless capabilities may be on the verge of outnumbering the useful ones.

Bottom Line


  • Usefulness is always more important than usability. If a tool or system is not useful, it is fundamentally worthless or, at best, excess baggage and a drag on the actual and perceived quality of the tool or system.

  • Usefulness should never be assumed. It should be demonstrated. I know of too many projects where usefulness was not demonstrated. This led to capabilities being developed that waste time and money, and can damage reputations.

Saturday, July 17, 2010

The Return: The Value of Consistency

I have been distracted for a couple of months ... working to find and land another consulting contract.  I have completed that task.  However, it is outside of the medical device industry.  I am not completely happy with the situation; however, having a position outside of the medical device industry does afford some freedom when commenting on it.

Another reason for the significant gap between my last post and this one is that I was working on a long and intricate post regarding hacking or hijacking medical device communications.  The post began to look more like a short story than a commentary.  The more I worked on it, the longer and more convoluted it became.  At some point, I may publish portions of it.

This experience with the article that would never end has led me to change the way I'll be posting articles in the future.  From now on, my articles will be short - two to four paragraphs - and will address a single topic.  I think that some of my posts have been too long and, in some cases, overly intricate.  I still plan to cover difficult topics, but in a format that is more readable and succinct.

Consistency in User Interfaces

When it comes to making a user interface "usable," the two key qualities are 1. performance and 2. consistency.  Performance is obvious: if the interface is slow, unresponsive, sluggish, etc., people will not use it - or those who are stuck with using it will scream.  Consistency is somewhat less obvious and more difficult to describe.  However, when you encounter a user interface that has changed dramatically on an application that you thought you knew, you understand the value of consistency.

Recently, I encountered a newer version of Microsoft Office.  Gone are the pull-down menus, and the organization of the operations and tools has changed dramatically.  Frankly, I hate the new version.  If the newer version of Office had been my first encounter with Office, I know that my reaction would be different.  The new version is inconsistent with the older version, and my ability to transfer my knowledge to the newer version is hindered by the dramatic changes that have been made.

Consistency is about providing your users with the capability to reapply their knowledge about how things work to new and updated systems.  Operations work the same between applications and between older and newer versions.  In the case of the new version of Word, I am grateful that once I have selected a particular operation, such as formatting, it works essentially the same as in the older version.  However, I have tried to use the newer version of PowerPoint and its drawing capabilities.  I have not yet been successful, and am using a drawing tool that I know how to use instead.

Consistency has a side benefit for the development process as well.  When operations, layouts, navigation, etc. become standardized, extending the design of a user interface becomes easier, less risky and less likely to be rejected by users.  The effect of creating consistent user interfaces is similar to having a common language. More on consistency and HE-75 in a later post.

Tuesday, May 4, 2010

HE-75 Topic: Design First and Ask Questions Later?

I was planning on publishing Part 2 of my Medical Implant Issues series.  However, something came up that I could not avoid discussing because it perfectly illustrates the issues regarding defining and understanding your user population.

A Story

I live in the South Loop of Chicago - easy walking distance to the central city ("the Loop").  I do not drive or park a car on the streets in the city of Chicago.  I walk or take public transportation.

One morning I had to run a couple of errands and as I was walking up the street from my home, I saw a man who had parked his car and was staring at the new Chicago Parking Meter machine with dismay.  I'll tell you why a little later.

Depending on how closely you follow the news about Chicago, you may or may not know that Chicago recently sold its street parking revenue rights to a private company.  The company (which, as you might imagine, has political connections) has recently started to replace the traditional parking meters (that is, one space, one meter) with new centralized meters.  Separately painted parking spaces and their meters have been removed.  People park their vehicles in any space on the street where their vehicle fits, go to a centralized meter on the block where they parked and purchase a ticket (or receipt) that is placed on the dashboard of the vehicle.  Printed on the ticket is the end time until which the vehicle is legally parked.  After that time passes, the vehicle can receive a citation for parking illegally.  Many cities have moved to this system.  However, this system is missing something that I have seen on other systems.

Here's a photograph of the meter's interface ...

Chicago Street-Parking Meter

 
I have placed a black ellipse around the credit card reader and a black circle around the coin slot.  Do you see anything wrong in the photo?  ...

Getting back to the man who was staring at the parking meter ... he saw something that was very wrong ... there was no place to insert paper money into the meter.


I was surprised. This was the first time I had ever taken the time to really look at one of these meters.

As street parking goes, this is expensive.  One hour will cost you $2.50.  The maximum time that you can park is 3 hours - translated, that's 30 quarters if you have the change.  You can use a credit card.  However, there are a lot of people in the City of Chicago who don't have credit cards.  This man was one of them; nor did he have 30 quarters.

I have seen similar machines used in other cities and towns, and they have a place for paper money.  Oak Park, the suburb immediately west of Chicago, has similar meters, and they accept paper money to pay for parking.  What gives with this meter?

I let the City of Chicago off the hook for the design of this parking meter.  I don't believe they had anything to do with the design of the meter.  I have parked in city garages over the years (when I was living in the suburbs), and the city garages have some pretty effective means of enabling one to pay for parking - either using cash (paper money) or a credit card.  But I think the city should have been more aware of what the parking meter company was deploying.  I think it failed the public in that regard.

I could take the cynical view and suggest that this is a tactic by the private company to extract more revenue for itself and the city through issuing parking citations.  However, I think it more likely that someone designed the system without any regard for the population that was expected to use it, and the city fell down on its responsibility to oversee what the parking company was doing.

Failure to Include a Necessary Feature

For the purposes of examining the value of usability research - that is, the research to understand your users and their environment - what does this incident teach?  It teaches that failure to perform the research to understand your user population can result in the failure to include a necessary capability - such as a means to pay for your parking with paper money.

What I find interesting (and plausible) is that this parking meter design could have been usability tested and passed the test.  The subjects involved in the usability test could have been provided quarters and credit cards, and under those conditions the subjects would have performed admirably.  However, the parking meter fails the deployment test because the assumptions regarding populace, conditions and environment fail to align with the reality of the needs of the population it should have been designed to serve.

Another Failure: Including the Unnecessary or Unwanted Features   


As I was walking to my destination, I started composing this article.  While thinking about what to include, I remembered what a friend of mine said about a system whose development he was in charge of.  (I have to be careful about how I write this.  He's a friend for whom I have great respect, and defining the set of features that are included in this system is not his responsibility.)


He said that "... we build a system with capabilities that customers neither need nor want."  The process for selecting capabilities to include in a product release at this company is an insular one - more echo chamber than outreach to customers or users.  As a result, this company has failed to understand its customers, its users, their work environment, etc.

Some might suggest that the requirements-gathering process should reduce the likelihood of either failure occurring - failure to include necessary features, or inclusion of unnecessary or unwanted ones.  Again, I know that in the case of my friend's company, requirements gathering takes its direction largely from competitors instead of customers and/or users.  So what often results is the release of a system that fails to include capabilities that customers want and includes capabilities that customers do not want or need.
   
I don't know about you, but I see the process my friend's company engages in as a colossal waste of money and time.  Why would any company use or continue to use such a process?  


Ignorance, Stupidity or Arrogance - Or a combination?



I return to the title of this article, "Design First and Ask Questions Later?", and the question I pose above.  I have seen company after company treat design as an end in itself, failing to understand that creating a successful design requires an effective process that includes research and testing.  Failure to recognize this costs money and time, and possibly customers.  It is not always a good idea to be first in the market with a device or system that includes a trashy user interface.

So why do companies continue to hang on to failing processes?  Is it ignorance, stupidity or arrogance?  Is it a combination?  My personal experience suggests a combination of all three factors, with the addition of two others: delusion and denial.  These are two factors that we saw in operation in the lead-up to the financial crisis of 2008.  I think people will continue to believe that what they're doing is correct right up to the point that the whole thing comes crashing down.

The Chicago parking meter has a user interface with a poor and inconsiderate design ... inconsiderate of those who would use it.  (If I get comments from city officials, it will probably be for that last sentence.)  However, I don't believe that the parking meter company will face any major consequences, such as being forced to redesign and redeploy new meters.  They will have gotten away with creating a poor design.  And they're not alone.  There are lots of poorly designed systems; some of the poor designs can be and have been life-threatening.  Yet there are no major consequences.  For medical devices and systems, I believe this needs to change, and I hope the FDA exerts its oversight authority to ensure that it happens.





Medical Device Design: Reader Suggested Books


One of my readers provided me with the following list of books related to usable medical product design.  I pass this list of three books on to you.  I do not yet have them in my library, but they would be suitable additions.




Saturday, May 1, 2010

HE-75 Topic: Meta Analysis

A "meta-analysis" is an analysis of analyses.  Meta-analyses are often confused with literature searches, although a literature search is often the first step in a meta-analysis.

A meta-analysis is a consolidation of similar studies on a single, well-defined topic.  Each study may have covered a variety of topics, but to be included in the meta-analysis, each must have addressed the common topic in depth and collected data regarding it.

The meta-analysis is a well-respected means of developing broad-based conclusions from a variety of studies.  (I have included a book on the topic at the end of this article.)  If you search the literature, you will note that meta-analyses are often found in the medical literature, particularly in relationship to the effectiveness of, or problems with, medications.

In some quarters, the meta-analysis is not always welcome or respected.  Human factors (human engineering) is rooted in experimental psychology, and meta-analyses are not always respected or well received in this community.  It is work outside of the laboratory.  It is not collecting your own data, but using the data collected by others; thus the tendency has been to consider the meta-analysis as lesser.

However, the meta-analysis has a particular strength in that it provides a richer and wider view than a single study with a single population sample.  It is true that the studies of others often do not directly address all the issues that researchers could study if they performed the research themselves - in other words, the level and types of research controls were chosen by the original researchers, not by the meta-analyst.  But, again, the meta-analysis can provide a richness and numeric depth that a single study cannot.
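
For concreteness, the numeric depth of a meta-analysis typically comes from pooling the individual studies' effect estimates.  The standard fixed-effect, inverse-variance formulation (a textbook form, not something from this post) is:

```latex
\hat{\theta} = \frac{\sum_{i=1}^{k} w_i \, \hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad w_i = \frac{1}{\hat{\sigma}_i^{2}}
```

Here \hat{\theta}_i is the effect estimate from study i and \hat{\sigma}_i^2 is its variance, so larger and more precise studies carry proportionally more weight in the pooled estimate.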

Thus the question is, to use or not to use a meta-analysis when collecting data about a specific population?  Should a meta-analysis be used in lieu of collecting empirical data?  

Answer: there are no easy answers.  Yes, a meta-analysis could be used in lieu of an empirical analysis, but only if enough applicable studies have been performed recently.  However, I would suggest that when moving forward with a study of a specific target population, the first response should be to initiate a literature search and perform some level of meta-analysis.  If the data is not available or is incomplete, then the meta-analysis will not suffice.  But a meta-analysis is always a good first step - and a relatively inexpensive one - even if the decision is made to go forward with an empirical study.  The meta-analysis will aid in the study's design and data analysis, and will act as a guide when drawing conclusions.



Additional Resources

Wednesday, April 21, 2010

HE-75: Collecting Data and Modeling Tasks and Environment

This article expounds on my earlier article related to AAMI HE-75: Know what thy user does and where they do it. 


Collect and Represent the Data


Ideally the first steps in the design process should occur before a design is ever considered.  Unfortunately, in virtually every case I have encountered, a design for the user interface has already been in the works before the steps for collecting user and task related data have been performed.


Nevertheless, if you are one of the people performing the research, do as much as you can to push the design out of your mind and focus on objectively collecting and evaluating the data.  And in your data analysis, follow the data, not your own preconceived notions or those of someone else.


There are a variety of means for collecting data and representing it.  The means for collecting the data will generally involve:
  • Observation - collecting the step-by-step activities as a person under observation performs their tasks.
  • Inquiry - collecting data about a person's cognitive processes.
Once the data has been collected, it requires analysis and representation in a manner that is useful for later steps in the design process.  Data representations can include (a minimal sketch follows the list):
  • Task models - summary process models (with variants and edge cases) of how users perform each task.  A task model differs from a workflow model in that it should include no references to specific tools or systems; it should be abstracted and represented at a level independent of any particular device or system.
  • Workflows - summary process models (with variants and edge cases) similar to task models, but with reference to a particular device or system.  For example, if the user interface consists of a particular web page, there should be a reference to that web page and the action(s) that took place.
  • Cognitive models - a representation of the cognitive activities and processes that take place as the person performs a task.
  • Breadth analysis - I have noted that this is often overlooked.  Breadth analysis organizes the tasks by frequency of use and, if appropriate, order of execution.  This is also the place to represent the tasks that users perform in their work environment but that were not directly part of the data collection process.
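
Here is a minimal sketch of how a task model and a breadth analysis might be represented in code; the field names, tasks and frequencies are invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class TaskStep:
    """One step in an abstract task model: no device- or tool-specific verbs."""
    action: str                                   # e.g., "confirm patient identity"
    variants: list = field(default_factory=list)  # edge cases and branches

@dataclass
class Task:
    name: str
    steps: list
    frequency_per_shift: float = 0.0   # feeds the breadth analysis

def breadth_analysis(tasks):
    """Order tasks by how often users actually perform them."""
    return sorted(tasks, key=lambda t: t.frequency_per_shift, reverse=True)

tasks = [Task("review overnight alerts", [TaskStep("open alert list")], 1.0),
         Task("document vitals", [TaskStep("record reading")], 12.0)]
for t in breadth_analysis(tasks):
    print(t.frequency_per_shift, t.name)   # most frequent task first
```
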
Detailed Instructions


I cannot hope to provide detailed instructions in this blog.  However, I can provide a few pointers.  There are published works by leaders in the field on how to collect, analyze and model the data.

Here are books that I can recommend; several can be found in my library:


User and Task Analysis for Interface Design by  J. Hackos & J. Redish


I highly recommend this book.  I use it frequently.  For those of us experienced in the profession and with task and user analysis, what they discuss will seem familiar - as well it should.  What they provide are clear paths and methods for collecting data from users.  The book is well-structured and extremely useful for practitioners.  I had been using task and user analysis for a decade before this book came out; by owning this book, I could throw away all my notes related to task and user analysis and use it as my reference.


Motion and Time Study: Improving Work Methods and Management by F. Meyers

Motion and Time Study for Lean Manufacturing (3rd Edition) by F. Meyers & J. R. Stewart


Time and motion study is a core part of industrial engineering, used as a means to improve the manufacturing process.  Historically, time and motion studies go back to Frederick Taylor (http://en.wikipedia.org/wiki/Frederick_Winslow_Taylor), who pioneered this work in the latter part of the 19th century and the early part of the 20th.  I have used time and motion studies as a means of uncovering problematic designs.  They can be particularly useful when users are engaged in repetitive activities, as a means of improving efficiency and even of reducing repetitive stress injuries.  The first book is in my library; however, it is a bit old (but very inexpensive), so I also include the second, more recent book by Meyers and Stewart.  The methods of time and motion study can be considered timeless, so a book published in 1992 can still be valuable.

Time and motion studies can produce significant detail regarding the activities that those under observation perform.  However, these studies are time-consuming and as such, expensive.  Nevertheless, they can provide extremely valuable data that can uncover problems and improve efficiency.


Contextual Design: Defining Customer-Centered Systems (Interactive Technologies) by H. Beyer & K. Holtzblatt

Rapid Contextual Design: A How-to Guide to Key Techniques for User-Centered Design (Interactive Technologies) by K. Holtzblatt, J. B. Wendell & S. Wood


The first book is in my library, but not the second.  I used many of the methods described in Contextual Design before the book was published.  The contextual design process is one of the currently "hot" methods for collecting user and task data, and as such, every practitioner should own a copy of this book - at least as a reference.


I believe that what's particularly useful about contextual inquiry is that it collects data about activities that are not directly observed but that affect the users and the tasks they perform.  For example, clinicians engaged in the remote monitoring of patients often have other duties, many of them patient related.  Collecting data exclusively targeting remote monitoring activities (or the activities specific to a targeted device or company) can miss significant activities that impact remote monitoring, and vice versa.


Additional Resources


As a graduate student, I had the privilege of having my education supported by Xerox's Palo Alto Research Center.  I was able to work with luminaries of the profession, Tom Moran and Allen Newell, on a couple of projects.  In addition, I was able to learn the GOMS model.  I have found this model useful in that it nicely blends objectively observed activities with cognitive processes.  However, the modeling process can be arduous and, as such, expensive.
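
For a concrete taste of GOMS-style modeling, the Keystroke-Level Model (the simplest member of the GOMS family) predicts expert task time by summing standard operator times.  The operator values below are the commonly cited Card, Moran and Newell estimates; the task decomposition is an invented example:

```python
# Keystroke-Level Model operator times in seconds (Card, Moran & Newell).
KLM = {
    "K": 0.28,   # keystroke (average typist)
    "P": 1.10,   # point at a target with a mouse
    "H": 0.40,   # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
    "B": 0.10,   # press or release a mouse button
}

def predict_seconds(operators: str) -> float:
    """Sum operator times for a sequence such as 'MPBB' (think, point, click)."""
    return sum(KLM[op] for op in operators)

# Invented task: think, point at a menu item, click, move hands to the
# keyboard, think again, then type a two-digit value.
print(round(predict_seconds("MPBB" + "H" + "M" + "KK"), 2))   # 4.96
```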

Allen Newell and Herbert Simon are particularly well known for their research on chess masters and problem solving.  They were also well known for their research method, protocol analysis.  Protocol analysis has the person under observation verbally express their thoughts while engaged in a particular activity.  This enables the observer to collect data about the subject's thoughts, strategies and goals.  This methodology has been adopted by the authors of contextual inquiry, and it is one that I have often used in my research.


The problem with protocol analysis is that it cannot capture cognitive processes that occur below the level of consciousness, such as perception.  For example, subjects are unable to express how they perceive and identify words, or how they are able to read sentences.  These processes are largely automatic and thus not available to conscious report.  (I shall discuss methods for collecting data that involve automatic processes when I discuss usability testing in a later article.)  However, protocol analysis can provide valuable data regarding a subject's thoughts, particularly when that person reaches a point where confusion sets in or attempts to correct an error condition.

Here's a link from Wikipedia: http://en.wikipedia.org/wiki/GOMS.


Another book that I have in my library by a former Bell Labs human factors researcher, Thomas K. (TK) Landauer, is The Trouble with Computers: Usefulness, Usability, and Productivity.


This is a fun book.  I think it's much more instructive for the professional than Don Norman's book, The Psychology of Everyday Things.  (Nevertheless, I place the link to Amazon just the same.  Norman's is a good book for professionals in the field to give to family members who ask, "What do you do for a living?")

Tom rails against many of the pressures and processes that push products, systems and services into the commercial space before they're ready from a human engineering standpoint.  Although the book is relatively old, many of the points he makes are more relevant today than when the book was first published.  The impulse to design user interfaces without reference or regard for users has been clearly noted by the FDA - hence the need for HE-75.

Thursday, April 8, 2010

More on Knowing Thy Target User Population

Before moving forward into product development, I want to elaborate on the issues in my first two articles. This article focuses on the importance of knowing the target population and ways to gather that information.

I have had some recent experiences that reinforced the importance of defining and clearly understanding the targeted user population, and the importance of fully understanding and documenting what members of that population do and the environment(s) wherein they live and work.

Before proceeding any further, please review my previous article on understanding your target population. The link to the article is below:

http://medicalremoteprogramming.blogspot.com/2010/03/know-thy-target-population.html

HE75 clearly emphasizes the importance of understanding your target population.   The standard instructs that companies who develop medical devices should:
  1. Know their targeted user population
  2. Involve users early and often
  3. Accommodate user characteristics and capabilities. And in order to do this, one must first know what they are.

The information gathered about a target population should enable one to clearly define the qualities and characteristics of that population.  This is particularly important when designing medical devices, especially when those devices are targeted at patients.

I have seen organizations within a company - program management, marketing and engineering - assume that they know the characteristics of the targeted population.  Once the product is deployed, the company comes to a rude awakening and learns that its assumptions were oftentimes false.  Neither the company nor the targeted user population(s) benefit from such a failure.

Methods for Gathering Target Population Data

The target population data is the most elemental data in the product development process.  All the descriptions of the targeted user population - their characteristics, culture and capabilities - originate from this step in the research and development process.

So, how is this crucial data gathered?  First, a confession ... the amount of work I have performed at this stage of the process has been limited.  My training is in cognitive psychology and computer science.  Most often I have been the recipient of information about the targeted user population, using the results of this first step as a means of recruiting subjects for my usability experiments and evaluations.  The training most suited to gathering this kind of data is in anthropology and sociology.  The process of collecting target user population data draws on ethnographic and participant observation research methodologies.  The research can be observational.  It can be based on questionnaires administered orally or in writing.  It can be a structured interview.  It can be participant observation, where the observer participates in the activities of the target population.  It can be a combination of a variety of methods, including methods not listed above.

The objective is the development of a well-grounded description that captures the important, defining characteristics of the target population.  The description can be provided in a variety of ways, verbal or graphic.  It should use the clearest and most appropriate methods available to convey that information to the members of the product development organization.

Interestingly enough, I have used the data-gathering methods listed above.  However, I used those methods to collect data for the second step, knowing what the user does and where they do it - in other words, to gather task and environmental data.

Potential Costs for Failure to Correctly Define the Target User Population

Consider the following scenario: I collect task and environmental data about the wrong population - a population that is not the target population.  What is the value of the results of my research?  What could be the cost to the company of this failure?  And what could be the cost to the target user population of having a device with a user interface unsuited to their needs?

In reality, the cost could be high, even if the product is not a dismal failure.  Given the fact that we are all human, we share a wide variety of characteristics.  However, in the more stringent regulatory environment that is anticipated, such a failure could mean delay and additional research, engineering and product development costs.  If the product is intended to provide a new capability to providers and/or patients, a delay could mean that a competitor is first to market the product.  Thus the company could miss the competitive advantage of being first.

I have recent experience with two products targeted at patients.  In one case the target population was well understood and well defined, and members of that population were used in usability testing.  In the other case, the research and development organization had a limited understanding of the target population, and no member of the target population was involved at any stage of the research and development process or in the development of the user interface.  In the first case, where the target population was well understood and well defined, the user interface research and development process was clear and logical.  On the other hand, the effort that did not have a clear understanding of the target population is struggling; it is learning as it goes.  Each time it learns something new about its target population, the user interface has to be updated.  It has been a costly process, with so many reworks of the user interface that the integrity of the original design has been lost.  The design appears deconstructed.  At some point the entire user interface will have to be redesigned, and that will likely come at the behest of the FDA enforcing HE75.

A Final Thought

HE75 instructs that medical product user interfaces should accommodate a diverse group of users and should be maximally accessible. I see this as a design objective for any user interface: vernacular should be limited as much as possible, and limiting qualities should not be designed in, or should be removed when detected. Not every product can be accessible to all users, but every product should be clearly accessible to its target population.  And I believe that the FDA will insist on this.

Tuesday, March 30, 2010

Know What Thy User Does and Where They Do It

Review ...



Last time, I discussed the importance of knowing your target population and their use environment. That first step identifies and specifies the population for inclusion: it is the means for determining who should be included and who should be excluded, and for specifying the environment where they work. For example, the targeted population for a particular medical product could be surgical nurses who work in hospitals. That target population does not include all nurses, or even all surgical nurses. In addition, the use environment in which the targeted population performs its work needs to be part of the definitional equation.
Thus, the first step in the design process is defining the properties of the target population -- determining who is and who is not part of that population -- and including a complete description of their working environment, the environment where the product or service will be used. Field research will be necessary to establish the target population, its characteristics, and the work environment. When this step is finished, the next step is to perform additional research to establish the details of the work of interest and the environment in which it is performed. (The means for collecting this information and the form of the analytic product will be discussed in a later article.)

Know What Thy User Does and Where They Do It
Knowing what the user does consists of documenting the tasks that the target user population would perform with the product or service that a company plans to provide.

Once the product or service has been conceptually defined, the following information must be collected from the target population (a sketch of one way to record it follows the list):

  1. The preconditions that lead to performing each task,

  2. The steps required to perform a task, and

  3. How frequently each task is performed (in absolute terms and in relation to other tasks).
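
As a concrete illustration of items 1 through 3, each task could be captured in a record like the following. This is a minimal sketch in Python, with names and units of my own invention rather than any standard format:

    from dataclasses import dataclass

    @dataclass
    class TaskRecord:
        """A hypothetical record for one task performed with the product in development."""
        name: str
        preconditions: list[str]  # 1. the conditions that lead to performing the task
        steps: list[str]          # 2. the steps required to perform the task
        times_per_shift: float    # 3a. absolute frequency (unit chosen for illustration)
        frequency_rank: int       # 3b. frequency relative to the other tasks (1 = most frequent)

    # Illustrative example:
    respond_to_alarm = TaskRecord(
        name="respond to infusion alarm",
        preconditions=["pump is running", "alarm has sounded"],
        steps=["silence alarm", "read alarm message", "clear occlusion or restart infusion"],
        times_per_shift=3.0,
        frequency_rank=2,
    )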

The information collected focuses on the actions performed by a user, localized to the product or service in development. The data would have little or no reference to the use environment -- the full set of activities and environmental conditions wherein this product or service will be used. Thus, once the task data specific to the product or service in development has been collected, the next step is placing that product or service within the environment wherein it will be used.

A full and complete description or representation of the use environment may not always be possible. Moreover, there are often multiple use environments, and a description or descriptions may cover only a representative sample. Nevertheless, it can be extremely useful to understand how the product or service in development will be used in context.
The final research products resulting from this step in the product development process are listed below (a structured sketch of one way to organize them follows the list):


  1. Task analyses, pertaining only to the product or service in development. These include:

  • A breadth analysis, consisting of:

    • The number of tasks users would perform using the product or service in development

    • The frequency with which each task is performed

  • A depth analysis, which defines how each task is performed. The level of detail required will vary with the complexity of the task.

  • Each task analysis should include likely errors and the steps required to correct them.

  • There are a variety of means to represent task analyses. The representation method should be agreed on by the affected parties.

  2. Task execution within the wider use environment:

  • The tasks that users will perform with the product or service in development will be performed within a larger context.

  • The tasks within that wider context, and how those tasks relate to each other, require documentation.

  • These other tasks require a breadth analysis. Rarely is a depth analysis required, unless the tasks are intermingled.
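
To make the breadth/depth distinction concrete, here is a minimal sketch of how these two research products might be held together, assuming the TaskRecord-style data sketched earlier. Again, the names are my own invention, not a mandated representation:

    from dataclasses import dataclass, field

    @dataclass
    class ErrorCase:
        error: str                   # a likely use error
        correction_steps: list[str]  # the steps required to correct it

    @dataclass
    class TaskAnalysisDeliverable:
        """A hypothetical container for the two research products described above."""
        # Breadth analysis: how many tasks there are, and how often each is performed.
        task_frequencies: dict[str, float]         # task name -> frequency of performance
        # Depth analysis: how each task is performed; detail varies with complexity.
        task_steps: dict[str, list[str]]           # task name -> ordered steps
        likely_errors: dict[str, list[ErrorCase]]  # task name -> likely errors and corrections
        # Wider use environment: surrounding tasks get breadth treatment only,
        # unless they are intermingled with the product's tasks.
        context_task_frequencies: dict[str, float] = field(default_factory=dict)

Whatever representation the affected parties agree on -- tables, flowcharts, or a structure like the above -- it should make both dimensions visible: the inventory of tasks with their frequencies, and the step-by-step detail of each task with its likely errors.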

I will discuss the benefits of knowing what your user does and where they do it in my next article, with reference to HE75 and what the FDA will likely require from medical product companies.

Where in a company's organizational structure is this work performed?

Knowing 1) your target population and their use environment, and 2) what your users do (task analysis) are the first two steps in the product formation stage of development. They precede the requirements gathering stage of product development. Thus, human factors engineers must work with marketing and the other product- and field-focused organizations engaged at this early stage. St. Jude Medical, with whom I consulted for 15 months, has placed all of its human factors engineers in systems engineering. Placing human factors engineering only in systems engineering means that important data affecting the beginning of product development is either not produced, or is produced at a later stage in the development process where its impact is minimal or nonexistent.

As I shall discuss later, it is important that human factors engineering be involved from concept to deployment; thus human factors engineers should be distributed throughout an organization.

Stirrings in the Regulatory Environment

Medical product producers are regulated by the Federal Government. Anyone who reads this blog is highly likely to know that. Drug companies are acutely aware of governmental regulation in that their products must be “safe and effective,” and they must prove safety and effectiveness through research. Implanted device manufacturers must demonstrate that the implanted devices themselves are safe and effective. Devices and drugs that deliver therapy must prove their safety and effectiveness to the FDA.

But what about the user interfaces of the devices that enable users to change the operation of implanted devices, deliver therapies, provide information about the patient, and so on? Where is the proof, in the form of empirical data, that they are safe and effective?

In the US, more people are killed by medical errors each year than are killed in automobile accidents and in military service combined. Yet the FDA has placed relatively few requirements on the process for designing user interfaces of medical products and services. This is a disgrace, and the FDA knows it. FDA mandates for ensuring the usability of medical devices, products, and services have been merely to determine that a usability process is in place. FDA mandates for usability have not reached the level of the Departments of Defense or Transportation: companies that design and build medical systems have not been required to prove to the FDA that their products are usable in their use environment. Yet it is clear that usability is just as important in medical practice as it is with combat systems and cars -- particularly when one considers the number of injuries and deaths resulting from medical errors. And as more powerful and complicated systems are designed and planned, the need for the FDA to act, and act effectively, in the area of usability grows substantially.

In my personal experience, I saw a device under development that, if used improperly, could injure a patient or, in one case, lead to a patient's death. In fact, I uncovered a condition in which the device, even when used properly, could lead to death -- and it would have been surprisingly easy to do. Of course, I raised my concerns regarding this device and its potential for injuring patients. Nevertheless, in the current regulatory environment, I believe the device could still be approved for use by the FDA, because the FDA has no clear standard requiring the company to provide empirical data demonstrating that the device is usable and safe.

The Department of Defense has been particularly forceful with contractors about ensuring that members of the target population will be able to use systems within the environment of their intended use. It does not take a great deal of contemplation to understand the value of ensuring that a system can be operated effectively by a soldier in a combat environment. A system that could save the lives of fellow soldiers or civilians would be useless if it could not be used effectively by a member of the target population (i.e., a soldier) in combat. If the user interface is too cumbersome or complicated under the stress and difficulties of the combat environment, then all the time and effort taken to create that system has been wasted. NASA and the FAA have taken a stance similar to the DoD's. The Department of Transportation, in conjunction with Congress, has taken a strong stance with respect to the design of vehicle user interfaces.

I believe that the FDA will begin to take a strong stance, similar to that of other regulatory bodies of the Federal Government, toward ensuring that the user interfaces of medical devices meet specific usability standards, and toward requiring that the meeting of these standards be demonstrated experimentally with empirical data. I think one of the first steps in this process of ever-increasing regulation of medical product user interfaces will be the adoption of HE75 by the FDA. This would start the process toward mandating that companies prove their products meet user performance standards.