One Step Forward, Two Steps Back: FDA’s Final Guidance on Clinical Decision Support Software Raises More Questions Than Answers

Recently, the U.S. Food and Drug Administration (FDA) published a suite of guidance documents relating to software, automation, and artificial intelligence1. One guidance document in particular, addressing clinical decision support (CDS) software, may signal a tightening of FDA’s oversight of software tools that use artificial intelligence and machine learning (AI/ML), a shift that could introduce confusion and frustrate innovation in this important, fast-developing area. On October 18, 2022, FDA held a webinar to provide additional clarification on the final guidance2.

The line between device CDS functions and nondevice CDS functions has long been unclear. In 2016, the 21st Century Cures Act (Cures Act) amended the Federal Food, Drug, and Cosmetic Act (FDCA) to carve out certain CDS software functions from FDA regulation as medical devices. The Cures Act exempts certain software functions from the statutory definition of “medical device” under FDCA Section 201(h)(1) “unless the function is intended to acquire, process, or analyze a medical image or a signal from an in vitro diagnostic device or a pattern or signal from a signal acquisition system, for the purpose of — (i) displaying, analyzing, or printing medical information about a patient or other medical information; (ii) supporting or providing recommendations to a health care professional about prevention, diagnosis, or treatment of a disease or condition; and (iii) enabling such health care professional to independently review the basis for such recommendations that such software presents so that it is not the intent that such health care professional rely primarily on any of such recommendations to make a clinical diagnosis or treatment decision regarding an individual patient.”3

To interpret the Section 520(o)(1)(E) carveouts, FDA’s draft CDS guidance published in 2019 adopted a “risk-based” framework in accordance with the recommendations of the International Medical Device Regulators Forum (IMDRF)4. The goal of the framework, FDA said, was to “prioritize patient safety while also recognizing that overregulation could stifle advancements in medical software and clinical support.”5 The final guidance departs from this approach in favor of a new interpretation of the four criteria laid out in Section 520(o)(1)(E)6.

FDA’s shift in policy could create confusion for software developers, particularly those developing products that use AI/ML. First, the agency’s final guidance diverges from a previously established, internationally recognized risk-based approach. In the draft CDS guidance, FDA described a software function’s level of risk as the linchpin of the device status determination and devoted 15 pages to explaining its framework7. In the final guidance, however, that entire discussion was reduced to a single, short paragraph8, 9.

FDA’s revised framework for CDS software functions may also harm innovation. The final guidance appears to add a major limitation on top of Section 520(o)(1)(E)(ii), which provides a broad exemption from the device definition: CDS software is not a “device” if it is intended for the purpose of “supporting or providing recommendations” to doctors about “prevention, diagnosis, or treatment” of a disease or condition. The final guidance posits that to be considered nondevice software, a CDS function cannot provide a “specific” preventive, diagnostic, or treatment output. This “specific” limitation, the agency explained, is intended to avoid automation bias, the propensity of humans to over-rely on a suggestion from an automated system. Of course, Congress could not have believed automation bias was disqualifying when it enacted the Cures Act, because all CDS necessarily involves automation. The agency further explained that this bias is more likely to occur when software provides a user with a single, specific, selected output rather than a list of options. However, there could be more harm than benefit in padding the output with unlikely options when the CDS concludes that one particular outcome is by far the most probable. FDA also clarified in its webinar that an output to “do nothing” constitutes a “specific” recommendation under the final guidance10.

Another major limitation is “time-criticality.” Automation bias increases, the agency explains, if there is not sufficient time for the user to adequately consider other information11. What remains unclear, however, is exactly where the bar sits and what level of automation is too much. The underlying purpose of leveraging medical software, especially AI/ML, is precisely to automate certain aspects of medical decision-making that are better delegated to a machine, whether because doing so saves time or because the machine performs the task better12. In this sense, all intended users of medical software must weigh the benefits and problems that automation introduces, and it is the intended users, not the machine, who determine the level of reliance. One can imagine situations where time-criticality will call for more use of certain types of AI/ML, not less. Without additional clarification, the convoluted and inconsistent discussion and examples in the final guidance will likely have a chilling effect on innovation in CDS functions, especially those using AI/ML13.

FDA appears to go beyond the carveouts in FDCA Section 520(o)(1)(E) in two other instances:

  • “Medical Information”: The statute says in Section 520(o)(1)(E)(i) that if a product is intended for the purpose of analyzing “medical information,” it is a nondevice if the other three criteria are also met. In the final guidance and the webinar, however, FDA took an extremely narrow interpretation, defining “medical information” based on a “rule of thumb” as only “the type of information that normally is, and generally can be, communicated between healthcare providers (HCPs) in a clinical conversation or between HCPs and patients in the context of a clinical decision, meaning that the relevance of the information to the clinical decision being made is well understood and accepted.”14 It is doubtful that Congress intended such a limited definition; “medical information” is, on its face, a broad concept. In fact, FDA’s interpretation here is in tension with its own thinking in other policy areas. For example, both real-world data15 and electronic health records16 are narrower subsets of medical information, yet FDA has defined each of them far more broadly than it now interprets “medical information” in Section 520(o)(1)(E)(i).
  • “Independently Review the Basis”: Section 520(o)(1)(E)(iii) does require that, to qualify for the exemption from the device definition, a product be intended to enable a practitioner to “independently review the basis for the recommendations….” To satisfy this requirement, however, the final guidance sets an extremely high bar, requiring the labeling to include elements such as a “plain language description of the underlying algorithm development and validation,” a “description of data relied upon so that an HCP can assess whether the data is representative of the patient population,” a “description of the results from clinical studies conducted to validate the algorithm so that HCP can assess the potential performance and limitations when applied to their patients,” and “a summary of the logic or methods relied upon to provide the recommendations.”17 The examples provided in the guidance further illustrate the agency’s extraordinary and unclear labeling demands for software to qualify as a nondevice18. FDA recommends that HCPs “clearly understand how the logic was applied,”19 but it is well known that deep learning and other AI analytic approaches generate insights via methods that cannot be directly observed, and that clinicians sometimes cannot apply the face validity available in more traditional clinical decision tools20. Under the new standard, FDA may have effectively classified much CDS software involving AI/ML as devices. This tightening is not consistent with the intent of the Cures Act, which was “designed to help accelerate medical product development and bring new innovations and advances to patients who need them faster and more efficiently.”21

In certain respects, the new guidance resembles FDA’s earliest approach to software regulation, particularly in its discussion of an HCP’s reliance on and understanding of software for clinical decision-making. FDA’s first guidance on software regulation, drafted in 1987, stated that software would be exempt from device regulation if intended to “involve competent human intervention before any impact on human health occurs,” i.e., where clinical judgment can be used to check and interpret a system’s output22. In the 1980s and ’90s, however, industry complained that the increasing complexity and sophistication of software made it difficult for HCPs to understand completely, and FDA ultimately abandoned this unrealistic approach23. Nonetheless, the recent CDS guidance revives these concepts: the “specific” and “time-criticality” limitations introduced to cabin HCPs’ reliance on software in clinical decision-making, and the extraordinary labeling requirements that, as a practical matter, would demand an HCP’s complete understanding of the logic behind a software function and/or algorithm in a way that is unrealistic24.

CDRH’s key values include promoting innovation and fostering creativity, and in line with these values, CDRH has placed emphasis on digital health25. CDRH’s Digital Health Center of Excellence strives to “accelerate digital health advancement” and “provide efficient and least burdensome oversight while meeting the FDA standards for safe and effective products.”26 None of these aspirations can be achieved without a stable regulatory environment and a cohesive regulatory system for digital health innovations, particularly AI/ML software development. It appears, however, that this final guidance on CDS software represents two steps back from FDA’s years-long effort to create consistency and stimulate innovation in software regulation. Unlike the other guidance documents published on the same date, which set clear expectations and instructions for the regulation of mobile apps27, medical device data systems28, and computer-assisted detection devices29, the new CDS software regulatory framework announced by the final guidance creates confusion and regulatory uncertainty and may stifle digital health advancement.


1 Clinical Decision Support Software: Guidance for Industry; Policy for Device Software Functions and Mobile Medical Applications: Guidance for Industry; Medical Device Data Systems, Medical Image Storage Devices, and Medical Image Communications Devices: Guidance for Industry; Computer-Assisted Detection Devices Applied to Radiology Images and Radiology Device Data – Premarket Notification [510(k)] Submissions: Guidance for Industry; Display Devices for Diagnostic Radiology: Guidance for Industry.

2 FDA website: Webinar – Clinical Decision Support Software Final Guidance.

3 21 U.S.C. § 360j(o)(1)(E).

4 Clinical Decision Support Software: Draft Guidance for Industry at 7 and 13.

5 FDA statement, Statement on new steps to advance digital health policies that encourage innovation and enable efficient and modern regulatory oversight.

6 Clinical Decision Support Software: Guidance for Industry.

7 Clinical Decision Support Software: Draft Guidance for Industry at 17. See also IMDRF document “Software as a Medical Device”: Possible Framework for Risk Categorization and Corresponding Considerations.

8 Clinical Decision Support Software: Guidance for Industry at 13.

9 For example, FDA’s Policy for Device Software Functions and Mobile Medical Applications Guidance (mobile app guidance), finalized on the same day as the CDS guidance, provides that FDA will consider risk when determining whether a mobile app constitutes a medical device. See Policy for Device Software Functions and Mobile Medical Applications: Guidance for Industry. See also Digital Health Innovation Action Plan and Software as a Medical Device (SAMD): Clinical Evaluation: Guidance for Industry.

10 FDA website: Webinar – Clinical Decision Support Software Final Guidance.

11 Clinical Decision Support Software: Guidance for Industry at 10–12.

12 Maddox TM, Rumsfeld JS, Payne PRO. Questions for Artificial Intelligence in Health Care. JAMA. 2019;321(1):31–32. doi:10.1001/jama.2018.18932 (“At its core, AI is a tool. Like all tools, it is better deployed for some tasks than for others. In particular, AI is best used when the primary task is identifying clinically useful patterns in large, high-dimensional data sets”).

13 Matheny ME, Whicher D, Thadaney Israni S. Artificial Intelligence in Health Care: A Report From the National Academy of Medicine. JAMA. 2020;323(6):509–510. doi:10.1001/jama.2019.21579 (“Current data generation greatly exceeds human cognitive capacity to effectively manage information, and AI is likely to have an important and complementary role to human cognition to support delivery of personalized health care. For example, recent innovations in AI have shown high levels of accuracy in imaging and signal detection tasks and are considered among the most mature tools in this domain”).

14 Clinical Decision Support Software: Guidance for Industry at 9.

15 Use of Real-World Evidence to Support Regulatory Decision-Making for Medical Devices: Guidance for Industry at 4.

16 See 45 CFR § 170.102.

17 Clinical Decision Support Software: Guidance for Industry at 14.

18 Id. at 25.

19 Id. at 15.

20 Matheny ME, Whicher D, Thadaney Israni S. Artificial Intelligence in Health Care: A Report From the National Academy of Medicine. JAMA. 2020;323(6):509–510. doi:10.1001/jama.2019.21579.

21 See FDA website: 21st Century Cures Act.

22 FDA Policy for the Regulation of Computer Products: Draft Guidance for Industry, published on Sept. 21, 1987 (devices would not be subject to active regulation if they were “intended to involve competent human intervention before any impact on human health occurs (e.g., where a clinical judgment and experience can be used to check and interpret a system’s output)”).

23 Crumpler, E. Stewart, and Harvey Rudolph. “FDA software policy and regulation of medical device software.” Food & Drug LJ 52 (1997): 511.

24 Clinical Decision Support Software: Guidance for Industry at 11 and 14.

25 See FDA website: CDRH Mission, Vision and Shared Values.

26 See FDA website: Digital Health Center of Excellence.

27 Policy for Device Software Functions and Mobile Medical Applications: Guidance for Industry.

28 Medical Device Data Systems, Medical Image Storage Devices, and Medical Image Communications Devices: Guidance for Industry.

29 Computer-Assisted Detection Devices Applied to Radiology Images and Radiology Device Data – Premarket Notification [510(k)] Submissions: Guidance for Industry.

This post is as of the posting date stated above. Sidley Austin LLP assumes no duty to update this post or post about any subsequent developments having a bearing on this post.