The United States government is preparing to launch a pilot program that will use artificial intelligence to determine whether certain Medicare treatments should be approved or denied. The initiative, named WISeR, aims to reduce wasteful spending but has generated significant concern among lawmakers, medical professionals, and patient advocates about potential barriers to necessary care.
Key Takeaways
- A new federal pilot program, WISeR, will use an AI algorithm for Medicare prior authorization decisions starting January 1.
- The program will run through 2031 and affect Medicare patients in Arizona, Ohio, Oklahoma, New Jersey, Texas, and Washington.
- The goal is to reduce spending on services deemed "low-value," but critics fear it will improperly deny necessary medical care.
- The initiative has drawn bipartisan criticism for expanding a practice the administration has simultaneously criticized in the private sector.
Medicare to Pilot AI for Cost Reduction
The federal government will introduce a new program called "Wasteful and Inappropriate Service Reduction" (WISeR) on January 1. This long-term pilot, scheduled to run through 2031, will test the effectiveness of an AI algorithm in managing Medicare costs by implementing prior authorization for specific medical services.
Prior authorization is a process requiring healthcare providers to get pre-approval from an insurer before a patient can receive certain tests, procedures, or medications. While common in private health insurance, it has been used sparingly in traditional Medicare.
The WISeR program will initially apply to Medicare beneficiaries in six states: Arizona, Ohio, Oklahoma, New Jersey, Texas, and Washington. The AI will assess the necessity of treatments such as skin and tissue substitutes, electrical nerve stimulator implants, and knee arthroscopy. The government has indicated that more procedures could be added to this list over time.
What is Prior Authorization?
Prior authorization is a cost-control measure used by health insurance companies. It requires doctors or hospitals to justify a proposed treatment to the insurer before it is performed. Insurers argue it prevents unnecessary procedures and fraud, while critics contend it creates delays and can lead to the denial of essential care.
Concerns Over Mixed Messages and Patient Access
The timing of the WISeR announcement has been met with confusion and criticism from various quarters. It was revealed just days after the Trump administration publicly urged private health insurers to reform and scale back their own use of prior authorization, a process that Mehmet Oz, the administrator of the Centers for Medicare & Medicaid Services (CMS), said causes significant care delays and "erodes public trust."
This apparent contradiction has been highlighted by policymakers. "Administration officials are talking out of both sides of their mouth," said Representative Suzan DelBene, a Democrat from Washington. "It’s hugely concerning."
Doctors and patient advocates share these concerns, framing prior authorization as a tactic that can slow down or block access to treatment, sometimes with severe consequences. Rep. Greg Murphy, a Republican from North Carolina and a urologist, criticized the private insurance model, stating, "Insurance companies have put it in their mantra that they will take patients’ money and then do their damnedest to deny giving it to the people who deliver care."
Public Opinion on Prior Authorization
According to a July poll from KFF, a health information nonprofit, nearly 75% of respondents believe prior authorization is a "major" problem within the U.S. healthcare system. This widespread dissatisfaction reflects public frustration with insurance denials and treatment delays.
How the AI-Powered System Will Work
The WISeR program is designed to identify and prevent spending on services the government considers vulnerable to "fraud, waste, and abuse." However, the specifics of how the AI algorithm will make its determinations remain unclear, leading to questions about transparency and accountability.
CMS spokesperson Alexx Pons has stated that safeguards will be in place. According to Pons, no Medicare request will be denied without a review by a "qualified human clinician." He also said that vendors operating the AI system are prohibited from compensation arrangements that are directly tied to denial rates.
Despite these assurances, some experts are skeptical. The government has confirmed that vendors will be rewarded based on savings achieved, a model known as a "shared savings arrangement." Jennifer Brackeen of the Washington State Hospital Association noted that such structures can create a strong incentive to deny care. "Shared savings arrangements mean that vendors financially benefit when less care is delivered," she explained.
"I’m not sure they know, even, how they’re going to figure out whether this is helping or hurting patients," she added.
The Role of Human Oversight
A key debate surrounding the use of AI in healthcare is the effectiveness of human review. While CMS insists that a clinician will review denials, past incidents in the private sector have raised doubts about how thorough this oversight truly is.
A 2023 ProPublica report found that doctors at Cigna reviewing payment requests spent an average of just 1.2 seconds on each case over a two-month period. While Cigna stated the report referenced a simple software process and not AI-driven prior authorization, the findings have fueled concerns about what constitutes "meaningful human review."
Furthermore, class-action lawsuits against major insurers have alleged that AI models are used to systematically deny claims based on flawed algorithms that ignore individual patient needs and override physician recommendations.
A Regulatory Blind Spot
The expansion of AI in health insurance decisions is moving faster than regulatory frameworks can adapt, creating what some experts call a "regulatory blind spot." Jennifer Oliva, a law professor at Indiana University-Bloomington, has studied how these algorithms function.
Her research suggests that algorithms may be programmed to deny high-cost care, particularly for patients with a poor prognosis. "The more expensive it is, the more likely it is to be denied," Oliva said. She explained that the lengthy appeals process that follows a denial increases the chance a patient may pass away before the insurer has to pay the claim.
Carmel Shachar, a faculty director at Harvard Law School, described the WISeR pilot as an "interesting step" but stressed that the lack of detail makes it difficult to assess its potential impact on patients. The uncertainty has prompted a bipartisan response in Congress, with some House members supporting a measure to block funding for the program.
While acknowledging the potential benefits of AI, Rep. Murphy emphasized the importance of physician judgment. "This is a pilot, and I’m open to see what’s going to happen with this," he stated, "but I will always, always err on the side that doctors know what’s best for their patients."