The government is tossing around a new idea that has some folks scratching their heads: a bill called H.R. 238, the Healthy Technology Act of 2025, which aims to let artificial intelligence and machine learning write drug prescriptions. As Medscape Oncology lays it out, the bill would amend the Federal Food, Drug, and Cosmetic Act so that AI could qualify as a "practitioner" eligible to prescribe, provided a state authorizes it and the FDA has approved, cleared, or authorized the technology. It's a wild swing, computers prescribing pills, and it's already hitting a wall hard and fast.
The bill is trying to stretch the rules: H.R. 238 would let AI join the prescribing game alongside physicians, pharmacists, and nurses, if states greenlight it and the FDA has already signed off on the technology. Right now only licensed human practitioners can legally sign off on a script; AI is locked out because software cannot hold a medical license. The FDA has been clearing AI for diagnostics for years, such as tools that flag suspected tumors on scans, but prescribing is a leap nobody has made yet. This isn't just tech stepping up; it's an attempt to hand machines a job that has always been human, and it's stirring trouble already.
Prescribing is no simple trick; it isn't just scribbling a name on a pad and calling it a day. Patients need watching: side effects can hit hard, misuse happens, compliance falls apart, and the law says a licensed practitioner has to oversee all of it. H.R. 238's push to treat AI as a "practitioner" means software could churn out scripts, but who's checking whether the patient is doubled over from a bad reaction? Chatter on X is already buzzing, with commenters warning that someone could feed an AI a fake patient profile and pull scripts for anything from oxycodone to Adderall with no human in the loop. That isn't a tight system; it's a hole big enough to drive a truck through, and the law isn't built for it.
The details stack up messy. The FDA has cleared hundreds of AI-enabled devices over the past decade, mostly for diagnostics and none for prescribing, because prescribing is a legal no-go. States like California demand a human medical license; AI can't sit for the boards and can't swear an oath. Medscape notes that AI is fast, with tools like ChatGPT spitting out drug lists in seconds, but it's blind to a patient's sweat or shakes. Machines can crunch data; this job needs eyes, and H.R. 238 is pretending code can see what it can't.
Look at the flip side. Supporters say AI could ease physician shortages, with roughly 3.2 million scripts written daily in the U.S., per IQVIA, and doctors stretched thin. But the law is firm: only humans hold licenses, only humans supervise, and an AI duped by fake patients could flood the streets with pills; the DEA tracked some 15 million opioid doses misused in 2023. This isn't help; it's a floodgate, and the bill skips the real fix for a flashy gimmick.
H.R. 238 is a stretch. The law says only licensed practitioners can prescribe, and patients need more than a script from a screen, yet this bill wants machines prescribing on the strength of state approvals and FDA paperwork. This is a misfire: tech is useful, but this isn't medicine; it's a loophole waiting to bust open. Time's ticking, and Congress had better rethink this before pills are everywhere.
Sources:
https://www.govtrack.us/congress/bills/119/hr238