AI-enabled apps top ECRI’s list of health tech hazards in 2025

Dive Brief:

  • ECRI placed artificial intelligence-enabled applications at the top of its list of health technology hazards in 2025.
  • ECRI, which released the list last week, said biases in AI training data “can lead to disparate health outcomes or inappropriate responses,” adding that the technology can produce false or misleading outputs. These issues led the patient safety group to warn that placing too much trust in AI can result in inappropriate care decisions.
  • Other top health technology hazards for 2025 include the unmet technology support needs of home care patients, cybersecurity threats and substandard or fraudulent medical devices.

Dive Insight:

ECRI publishes its top 10 health technology hazards each year. In 2024, the nonprofit placed “insufficient governance of AI used in medical technologies” fifth on its list of the top hazards. ECRI reiterated some of its concerns with AI in the 2025 report and made the technology its top health hazard for the coming year.

The concerns are underpinned by the potential for bias, false or misleading responses, changes in how AI performs over time and the inability of models to adapt when confronted with new conditions. ECRI also said models “can yield disappointing results if organizations have unrealistic expectations, fail to define goals, provide insufficient governance and oversight or don’t adequately prepare their data.”

ECRI warned users against placing too much trust in an AI model and failing to appropriately scrutinize its output. However, if users take precautions to avoid inappropriate patient care decisions, the nonprofit sees a role for AI in increasing the efficiency and precision of medical diagnoses and treatments. 

“AI offers tremendous potential value as an advanced tool to assist clinicians and healthcare staff, but only if human decision-making remains at the core of the care process,” the nonprofit said in the report. “Preventing harm requires careful consideration when incorporating any AI solution into healthcare operations or clinical practice.”

ECRI named unmet technology support needs for home care patients as the second biggest hazard. The risks of home care are a recurring theme across ECRI’s recent reports. The misuse or malfunction of medical devices poorly adapted to home environments and users was the top hazard in 2024, and home-use devices also topped the list in 2023.

The nonprofit reiterated these concerns in the 2025 report, pointing to “numerous examples of patient harm from improper setup of or lack of familiarity with medical devices used in the home setting” to make its case. ECRI said patients need support to operate, maintain and troubleshoot their devices.

Vulnerable technology vendors and cybersecurity threats placed third on the list of hazards. ECRI said regulatory agencies should “move away from ‘punish but not protect’ approaches to cybersecurity challenges and third-party risks and toward fostering a collective approach to cybercrime and vendor risk.”

Substandard or fraudulent medical devices and fire risks in areas where supplemental oxygen is in use rounded out the top five hazards for 2025. The rest of the list covered dangerously low default alarm limits on anesthesia units, mishandled temporary holds on medication orders, poorly managed infusion lines, skin injuries from medical adhesive products and incomplete investigations of infusion incidents.