The FDA's Radiology AI Exemption Petition: What Happens If the Gate Opens

Compliance & Security · FDA regulation · radiology AI · premarket notification · 510(k) · AI governance · healthcare AI

In October 2025, Australia-based Harrison.ai filed a 34-page citizen petition with the FDA asking the agency to exempt certain radiology AI tools from premarket notification requirements. The petition, filed by Rubrum Advising on behalf of Harrison.ai, does not seek deregulation. But the practical effect, if approved, would be significant: AI companies with at least one existing cleared device could deploy new, similar tools without first going through a 510(k) review — trading that upfront scrutiny for post-market monitoring obligations instead.

The FDA formally accepted the petition for comment in the December 29, 2025 edition of the Federal Register, with public comments due by February 27, 2026. The FDA must respond within 180 days of the petition's filing, and if it does not deny Harrison.ai's proposal by mid-April 2026, the exemption will go into effect. That is not a long runway.

What the Petition Actually Proposes

The petition asks the FDA to exempt a narrow class of radiology AI from premarket notification. The flexibility would be limited to eligible manufacturers of computer-aided detection (CADe), diagnosis (CADx), combined detection/diagnosis (CADe/x), and triage (CADt) devices. Under the proposal, any manufacturer that already has at least one cleared CAD-type device could launch a similar CAD device without submitting a 510(k) premarket notification to the FDA for review.

It is not a free pass. Manufacturers opting out of premarket review would need to follow certain technical standards and submit a plan for post-market monitoring. Devices would also be limited to facilities with appropriate training programs. Such programs must include a description of the device's "qualified end users": clinicians, such as radiologists, with the qualifications and training to independently interpret medical images without the device's aid.

Harrison.ai argues the current system is actively holding back beneficial technology. The firm contends that the United States' current regulatory pathways are impeding access to this technology, with policies that are "burdensome and ill-suited to the rapidly evolving nature of AI." Their numbers make the case: Harrison.ai has 8 FDA clearances covering 12 chest X-ray and CT brain indications, 3 FDA Breakthrough Designations, and is deployed at 1,000+ customer sites globally, including 40+ NHS Trusts, with its tools having impacted 6.7 million patients' lives to date.

Importantly, this proposed pathway is within the FDA's existing authority to implement — the FDA exercised partial exemption authority in 2018 and has express statutory authority under section 513(a)(1)(B) to issue postmarket surveillance as a special control.

What the Major Commenters Said

The comment period drew responses from some of the most influential bodies in radiology. The positions were not monolithic.

ACR (American College of Radiology): The ACR surveyed its own members on the proposal in January 2026. "Respondents consistently emphasized that patient safety should remain the agency's highest priority" — a message the ACR carried directly into its formal comments. The college also flagged a procedural concern that doesn't get enough attention: if the agency grants the petition, the final order will directly amend the classification regulation without conventional notice-and-comment rulemaking. That means the regulatory change could happen fast, with limited public process, even for a decision this consequential.

RSNA (Radiological Society of North America): The RSNA was the most direct in its opposition. RSNA said it opposes the exemption, seeing it as an "inadequate and potentially harmful solution," and argued that "the regulatory checkpoints currently in place serve a critical function by requiring rigorous review, validation and continued oversight" before tools reach clinical settings.

NCHR (National Center for Health Research): NCHR warned that exempting computer-aided radiology devices from premarket review would weaken safeguards and risk patient health. The organization pointed out a distinction that often gets collapsed in these conversations: not all "AI" radiology tools are the same. Some use machine learning. Others rely on rule-based algorithms, fixed statistical models, or image-processing heuristics. All of them can materially affect how a scan gets read — and all would fall under the scope of the proposed exemption.

Patient Alliance for Safe Imaging also submitted comments, reflecting a broader concern from advocacy groups that the burden of catching errors could shift downstream — from the FDA to hospitals, radiologists, and ultimately patients.

The Real Trade-Off

The honest version of this debate comes down to one question: when does oversight catch real problems versus when does it just slow things down?

The answer, in radiology AI, is that both happen. The 510(k) process has cleared hundreds of AI devices with limited real-world validation data. A 2025 JAMA Network Open systematic review found that the number of AI-enabled tools cleared by the FDA continues to rise, but evidence about clinical generalizability is lacking. So the premarket review as currently designed is not a guarantee of performance — it is a checkpoint. Removing it shifts the weight entirely to post-market monitoring.

Whether that shift is safe depends entirely on whether the post-market monitoring conditions have teeth. The petition's proposed framework — structured surveillance plans, technical standards, qualified-user requirements — is more substantive than the de facto post-market monitoring that exists today, which is largely complaint-driven and reactive. The petition also notes that the EU's Medical Device Regulation (MDR) requires manufacturers to implement continuous and proactive data collection on device safety and performance throughout the entire lifecycle, and the proposed framework aligns closely with those objectives.

But there is a real case study worth sitting with here. RSNA's journal published a clinical scenario involving an FDA-cleared AI triage tool that analyzed a CT brain scan and flagged an intracranial hemorrhage, even though the location was not typical for hemorrhagic stroke. The patient was ultimately diagnosed with ischemic stroke. The algorithm got it wrong. The radiologist caught it. That is exactly how the human-in-the-loop system is supposed to work. The concern is whether it always will — especially in under-resourced settings where the AI may function less as a second opinion and more as a first (and only) read.

If the FDA Approves: What Changes for Whom

For AI Companies

The most immediate effect is speed to market. A 510(k) submission can take anywhere from several months to over a year, depending on FDA workload, the novelty of the device, and the back-and-forth that often follows an initial submission. Exempting qualified manufacturers from that process doesn't remove oversight — it moves it. Companies would still need to build and maintain surveillance infrastructure, hit technical standards, and restrict deployment to appropriately trained facilities.

For established players like Harrison.ai, Aidoc, or any vendor with existing clearances, this is good news. For new entrants without a cleared device, nothing changes — they still go through 510(k). The petition effectively creates a two-tier system: a faster lane for companies that have already cleared the baseline bar, and the existing process for everyone else.

There is also a conflict-of-interest issue worth naming. At the same time the FDA is considering the petition, the agency appointed a former senior executive from a Harrison.ai subsidiary to direct the office that informs agency policy on AI. That appointment does not make the petition wrong, but it is the kind of thing that will — and should — draw scrutiny.

For Healthcare Providers

Hospitals and health systems absorb the most practical risk under this model. If a new AI tool ships without FDA premarket review, the institution deploying it becomes a more active participant in its validation. The petition's requirement for facility-level training programs and qualified end users puts real operational expectations on health system IT, clinical informatics, and radiology leadership.

For large academic medical centers with mature AI governance programs, this is manageable. They already have vendor evaluation processes, shadow-deployment protocols, and radiologist oversight workflows. For smaller community hospitals or rural health systems — where AI tools are arguably most needed and radiologist coverage is thinnest — the institutional capacity to catch AI errors may be far more limited.

This is not a hypothetical. Many of the settings where radiology AI creates the most value are exactly the settings least equipped to provide the scrutiny that FDA premarket review would otherwise supply.

For Patients

Patients get faster access to tools that, on average, perform well. They also carry more of the risk if those tools don't. The 510(k) process, flawed as it is, functions as a documented accountability checkpoint. When a device clears premarket review, there is a paper trail: the predicate device, the test data, the labeling review, the FDA's own assessment.

Under the proposed exemption, that trail starts later — after deployment, through post-market surveillance. If a tool is biased toward one patient population, underperforms on a specific imaging protocol, or misses a class of finding at a higher rate than disclosed, the detection mechanism is now a combination of manufacturer self-reporting, facility-level tracking, and adverse event submissions. The track record of self-reported post-market surveillance across medical devices is not uniformly encouraging.

Patients don't get a say in which AI tools their hospital deploys. They rarely know these tools exist. Making informed consent or patient notification a meaningful part of AI deployment — not just a line in a training program description — is something the petition does not address, and the FDA should.

The Likely Outcome

The political and regulatory environment in early 2026 tilts toward approval. The Trump administration has promised to reduce the barriers between health AI developers and patients. The petition is legally grounded, narrowly scoped to a specific device class, and not without precedent — the FDA has used partial exemption authority before. Publication in the Federal Register itself signals the agency views the request as credible.

That does not mean approval is certain. The RSNA's strong opposition, the ACR's patient-safety framing, and public pressure from groups like the Patient Alliance for Safe Imaging give the FDA political cover to add conditions, narrow the eligible device types, or require more rigorous surveillance commitments before granting the exemption.

The most likely outcome is a modified approval: exemption granted, but with stronger post-market monitoring requirements than currently proposed, explicit enforcement mechanisms, and possibly a phased rollout that lets the FDA observe how the framework performs before opening it to the full CAD device category.

The gate is probably going to open. The question is how wide, and who's watching on the other side.