Advamed backs FDA’s misinformation draft, calls for updates on AI and deep fakes


Dive Brief:

  • Advamed has voiced support for the Food and Drug Administration’s plans to update guidance on addressing misinformation about medical devices and prescription drugs.
  • The feedback, which Advamed sent on Sept. 9, covers draft guidance the FDA published in July. The regulator created the new draft in light of feedback, including comments on an earlier version it released for public consultation in 2014. 
  • Advamed said the updated draft “better reflects the wide scope of internet-based content seen in today’s information age” and provides a practical approach to addressing misinformation. The trade group also called for changes including an increased focus on artificial intelligence.

Dive Insight:

The FDA published the 2014 draft guidance to explain how companies should respond to online misinformation about their products. However, the agency never finalized that guidance. Comments on the 2014 draft and other feedback informed the creation of a new draft that proposed two sets of responses to misinformation. 

One type of response would allow companies to voluntarily address specific misinformation about their products in online posts. The FDA stated that companies should not use tailored responses to counter posts in which a person voices their individual experience, opinion or value judgments. The second type of response supports the use of existing communication channels, including TV and radio advertisements, to implicitly or explicitly correct misinformation.

Advamed supported the draft, telling the FDA that explicit recognition of the options for responding to online posts “is an important step to help stem independent third-party misinformation and protect patients.” The trade group shared recommendations on how to improve the text, including by clarifying how to address misinformation or exaggeration aggregated by AI and not necessarily traceable to one person.

“AI is not sufficiently addressed in the draft guidance,” the trade group said. “As AI and its adoption expand, it becomes increasingly difficult – or impossible – to address the source that is generating certain information that is exaggerated, inaccurate or misinformation based on the sources used.”

Advamed also wants the FDA to provide additional guidance on how to respond to misinformation that is spread by accounts that purport to be from the device manufacturer or use deep fakes to impersonate a leader or employee of the company. Misinformation spread by an account impersonating a CEO may carry greater credibility and therefore pose a greater risk of harm, the trade group said. 

The potential for technology to continue to drive new forms of misinformation led Advamed to ask the FDA to establish an email address or other contact mechanism for companies that encounter new issues. Advamed cited the FDA’s creation of an email address for comments or questions about medical device sterilization as an example of how the agency has facilitated outreach in other evolving areas.