Study finds huge variability in task completion efficiency and accuracy

WASHINGTON, D.C. - A new study of clinicians using electronic health records (EHRs) to perform common tasks provides compelling evidence that the design, development, and implementation of these systems need to be improved to make them easier for clinicians to use and, ultimately, safer for patients.

The study, titled “A usability and safety analysis of electronic health records: a multi-center study,” was published July 2 by the Journal of the American Medical Informatics Association. It was conducted by researchers with MedStar Health’s National Center for Human Factors in Healthcare, the American Medical Association (AMA), and others, and was funded by the AMA. Researchers focused on the two largest EHR vendors, Epic and Cerner, which together account for more than 50 percent of the market. The study was conducted at two sites per vendor, for a total of four health systems.

Twelve to 15 emergency physicians per site were given common tasks mimicking real patient cases: placing orders for medical imaging, lab tests, and medications. Researchers collected data on the time and number of clicks required to complete each task, as well as the accuracy of each order. The findings showed huge variability in performance across the sites. For example, the time to complete an imaging order varied from 25 seconds at one site to more than a minute at another. Placing an imaging order required an average of eight clicks at one site, while the same task at a different site averaged 31. For a medication order, one site recorded no errors while another had a 30 percent error rate.

“While there are many benefits to using EHRs, there are also usability and safety challenges that can lead to patient harm,” said the study’s lead author Raj Ratwani, PhD, center director and scientific director of the MedStar Human Factors Center, and a leading researcher of EHR safety and usability.

To have their products certified, EHR vendors are required by the federal government to employ user-centered design as they develop their systems and to conduct usability testing near the end of the process. But EHR usability challenges persist for several reasons outlined in the study. One major factor is that EHRs are configured and customized by the vendor and health system during implementation, so the resulting product may differ significantly from the one the vendor tested to meet government usability and safety requirements.

“Our findings reaffirm the importance of considering patient care and physician input in the development and implementation of EHRs,” said study co-author Michael Hodgkins, MD, chief medical information officer of the AMA. “There are multiple variables impacting the end user experience that contribute to physician burnout, a diminished patient-physician relationship, and unrealized cost savings. While design can be an important factor, so too can implementation choices made onsite. Increased collaboration between vendors, information technology purchasers and physicians is needed to optimize experiences and address current needs.”

The study’s authors conclude, “Our results suggest that basic performance standards for all implemented EHRs should be considered in order to ensure usable and safe systems. Both EHR vendors and providers should work together to ensure that usable and safe products are implemented and used.”

Study authors are: Raj M. Ratwani, PhD, Erica Savage, MS, Amy Will, Ryan Arnold, MD, Saif Khairat, PhD, Kristen Miller, DrPH, Rollin J. “Terry” Fairbanks, MD, Michael Hodgkins, MD, and A. Zachary Hettinger, MD.

# # #

Media Contacts:
Robert J. Mills
AMA Media & Editorial
(312) 464-5970
[email protected]

Erin Cunningham
MedStar Health Public Relations & Communications
(410) 772-6557
[email protected]

About the American Medical Association
The American Medical Association is the powerful ally and unifying voice for America’s physicians, the patients they serve, and the promise of a healthier nation. The AMA attacks the dysfunction in health care by removing obstacles and burdens that interfere with patient care. It reimagines medical education, training, and lifelong learning for the digital age to help physicians grow at every stage of their careers, and it improves the health of the nation by confronting the increasing chronic disease burden.

About the National Center for Human Factors in Healthcare
The National Center for Human Factors in Healthcare occupies a unique position in the United States as a large human factors program embedded within a healthcare system. It brings together human factors scientists, systems safety engineers, health services researchers, and clinicians to conduct safety science and applied research in medicine to improve safety, quality, efficiency, and reliability. The center is part of the MedStar Institute for Innovation and is also affiliated with the MedStar Health Research Institute. MedStar Health, the parent organization, is the largest not-for-profit healthcare provider in the Maryland and Washington, D.C., region, with 10 hospitals and an extensive ambulatory services network, and is the medical education and clinical partner of Georgetown University. For more information on the Human Factors Center, visit medstarhealth.org.


Saif Khairat was the site PI at the UNC Emergency Medicine Department and is a recent recipient of an NC TraCS $2K grant: Teleconsent to Overcome Barriers and Gaps in Clinical Trial Enrollment.

Originally published at ama-assn.org.
