Clinicians clacking away at hospital workstations know what the ones and zeroes humming in the background are up to, right?
In fact, doctors and health systems often don’t know important details about the algorithms they rely on for purposes like predicting the onset of dangerous medical conditions. But in what advocates call a step forward, federal regulators now require electronic health record (EHR) companies to disclose to customers a broad swath of information about artificial intelligence tools in their software.
Since the beginning of January, clinicians should be able to view a model card or “nutrition label” detailing what variables go into a prediction, whether a tool was tested in the real world, what the tool’s developers did to address potential bias, warnings about improper use, and more.