
Utah is allowing an AI system to prescribe psychiatric medications without a doctor. It is only the second time that the state, and the country, has granted this kind of clinical authority to an AI. State officials say it could cut costs and alleviate shortages in care, but doctors warn that the system is opaque and risky, and is unlikely to expand mental health care for those who need it.
The one-year pilot, announced last week, will allow Legion Health’s chatbot to renew certain prescriptions for psychiatric medications, in some cases. The San Francisco startup promises Utah-based patients “quick and simple refills” with a $19-a-month subscription. The program begins sometime in April, though the company is currently only running a waitlist.
The pilot’s scope is deliberately narrow, limited both in the drugs it covers and the conditions patients must meet to qualify. According to Legion’s agreement with Utah’s Office of Artificial Intelligence Policy, the chatbot can renew 15 low-risk maintenance medications that have already been prescribed by a doctor. These include fluoxetine (Prozac), sertraline (Zoloft), bupropion (Wellbutrin), mirtazapine, and hydroxyzine, medications commonly used to treat anxiety and depression. Patients must also be considered stable: anyone with a recent dose or medication change, or a psychiatric hospital admission in the past year, is excluded, and patients must see their health care provider every 10 refills or after six months, whichever comes first.
The system cannot issue new prescriptions or handle medications that require close clinical supervision, including those that must be monitored with blood tests. Controlled substances are also off-limits, which rules out many ADHD medications as well as the benzodiazepines used for anxiety. Antipsychotics, used for conditions such as schizophrenia and bipolar disorder, and lithium, widely considered the gold-standard treatment for bipolar disorder, are excluded too, leaving many more complex psychiatric conditions outside the pilot.
To use the system, patients must sign up, verify their identity, and prove they already have a prescription, such as by providing a photo of the label or pill bottle. They are then asked about their symptoms, as well as side effects and how well the medication is working, and screened with questions about suicidal thoughts, self-harm, severe reactions, and pregnancy to check for red flags. If any answers fall outside the pilot program’s low-risk criteria, cases are supposed to be escalated to a physician before any refills are issued. Patients and pharmacists can also request human review.
“By safely automating the renewal process for maintenance medications, we allow patients to get the care they need more quickly and affordably,” state officials said when the pilot was announced. Over time, they said, the program could free health care providers “to focus their time on the needs of more complex and critical patients” and help address a shortage that has left 500,000 Utahns without access to mental health care. Yash Patel, Legion’s co-founder and CEO, put the program in grander terms, describing it as a global first that will dramatically expand access to health care and “the beginning of something much bigger than refills.”
Psychiatrists are less convinced. Brent Kious, a psychiatrist and professor at the University of Utah School of Medicine, told The Verge he believes “the advantages of an AI-based refill system may be overstated.” He suspects the tool “will not increase access for those most in need of care,” since eligible patients must already be engaged in a treatment plan with a prescriber to use the service.
Kious points out that automation could contribute to what he calls an “epidemic of overmedication” in psychiatry, where some patients stay on medication longer than they need to. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center and a professor of psychiatry at Harvard Medical School, raised related concerns, noting that some people benefit from continuing psychiatric medications long-term, while others may benefit from tapering or stopping them. “It requires more active management, changes and careful study,” he said, something that is difficult to do if refills are outsourced to chatbot check-ins.
The biggest concern is whether a chatbot can safely automate even the most routine parts of psychiatric care. Prescribing medication involves more than just checking for drug interactions, Torous said, and he wondered whether any AI system today “can understand the unique context and factors that go into a person’s medication plan.” Kious made a similar point: “This is something that could be safe in principle, but it all depends on the details.” These concerns are compounded by how new these systems are, and how opaque they are to outsiders. “It’s a bit like alchemy right now,” he said. “It would be better if there was more transparency, more science, and more rigorous testing before people were required to use this.”
There are more immediate safety concerns as well. A chatbot might miss something during a screening: it might not ask the right questions, or a patient might not recognize a side effect or might answer inaccurately, Kious said. Some may simply tell the system what it wants to hear in order to speed up the process. He stressed that this is not unique to chatbots; much of psychiatry relies on self-report. But human doctors usually have access to other information as well, he said, adding that when he sees patients, he pays attention not only to what they say but also to what they don’t say and how they present themselves. While patients can mislead human providers too, Kious said a chatbot system could make it easier for them to adjust their answers until they get the outcome they want.
There are more overt safety risks as well, which will be familiar to anyone who follows the real-world performance of chatbots, Torous said. Legion’s chatbot is Utah’s second experiment in AI prescribing, joining an ongoing, broader primary care-focused pilot with Doctronic, which launched last December. Within weeks of going live, security researchers managed to get Doctronic’s system to spread vaccine conspiracy theories, generate instructions for cooking methamphetamine, and triple a patient’s opioid dose. State officials say the more focused Legion program is specifically designed to target the state’s “mental health shortages.”
Legion says the pilot operates under tight guardrails. In addition to what it calls “conservative eligibility gates,” its agreement with Utah requires it to submit detailed monthly reports and to have human physicians closely review the first 1,250 cases, with periodic sampling of about 5 to 10 percent of cases after that.
Legion co-founder and president Arthur MacWaters told The Verge that “risks exist in any remote care model, whether AI-powered or entirely human-led,” stressing that the company’s workflow “does not rely on a single self-reported answer to unlock treatment.” Key safeguards include the trial’s narrow restrictions on medications and patient eligibility, built-in AI safety screens, pharmacist involvement, and the ability to escalate to a physician, he said. “We see this as critical to expanding access to the hundreds of thousands of people in Utah who live in areas with mental health shortages, as well as being an important proving ground for AI in medicine.”
MacWaters did not comment on additional use cases, medications, or expansion into other states, but said the company is “excited about what the future holds.” He didn’t provide a timeline for Legion’s expansion plans either, though both MacWaters and Legion have publicly signaled broader ambitions beyond Utah: Legion’s refill site says the service will be available “nationwide in 2026,” and MacWaters has suggested “it’s going to be in every state very, very quickly.”
For the psychiatrists I spoke to, all of this raises a fairly basic question: what problem is Legion actually solving? Established patients often don’t even need an appointment to get a refill, Kious said, explaining that most psychiatrists would probably be “happy to refill prescriptions for free and without an appointment” unless they’re concerned about the patient or the medication carries too much risk. Those are the very cases Legion’s AI is barred from handling.
“I personally would avoid it for now,” Torous said, adding that if you’ve found a treatment plan that works for you, it’s best to stick with your doctor.