CRBC News

Congress Directs VA to Expand AI Tools to Help Prevent Veteran Suicides

Congress has directed the VA to expand use of AI and machine learning to better identify veterans at risk of suicide, as part of the FY26 appropriations bill that provides over $115 billion for veterans’ health care and about $698 million for suicide prevention. Committees urged wider application of real‑time analytics and REACH VET, the VA’s predictive tool that flags the highest‑risk patients. Officials stress AI should augment clinical care — not replace clinicians — while improving early detection, training and personalized treatment.

Congress’s recently enacted FY26 Military Construction and Veterans Affairs bill directs the Department of Veterans Affairs to broaden its use of artificial intelligence and machine‑learning tools to identify veterans at risk of suicide. The appropriations measure, signed into law on Nov. 12, provides more than $115 billion for veterans’ health care and designates roughly $698 million for the VA’s suicide‑prevention efforts.

Lawmakers approved separate House and Senate spending plans months earlier, but the consolidated bill received final congressional approval only after a government shutdown ended. The legislation explicitly encourages the VA to pursue "further innovative tools," including real‑time analytics and omnichannel technologies that can flag veterans showing elevated suicidal ideation.

Why AI is being considered

Reducing veteran suicide has been a persistent challenge. VA reports estimate about 6,500 veterans die by suicide each year — roughly 17 deaths per day — a rate that has changed little since 2008. In recent years the VA has already piloted AI-based approaches, and the FY26 funding gives the department more latitude to scale and refine those efforts.

Committee recommendations and REACH VET

A House Appropriations Committee report commended the VA’s current prevention efforts but recommended expanding early detection using artificial intelligence and machine learning to improve operational efficiency and outcomes.

"There is a significant need to improve early suicide indicators and detection using artificial intelligence and machine learning technologies that improve operational efficiency and effectiveness throughout veteran service delivery."

Senators also urged broader use of REACH VET, the VA’s machine‑learning program launched in 2017 that scans medical records and flags patients in the top 0.1 percent for suicide risk. The program has been updated to include additional red‑flag indicators — such as military sexual trauma and spousal abuse — while removing race and ethnicity as predictive inputs.
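
The article does not describe REACH VET's internals. Purely to illustrate the kind of top‑0.1‑percent flagging it mentions, the sketch below (Python, using a hypothetical, already‑trained scikit‑learn‑style classifier and made‑up column names, none of which come from the VA) scores patient records with a predictive model and marks the highest‑scoring 0.1 percent for outreach.

```python
# Illustrative sketch only -- not the VA's actual REACH VET implementation.
# Assumes a hypothetical trained scikit-learn-style classifier `risk_model`
# and a pandas DataFrame `patients` whose columns include the model's input
# features. All names are invented for illustration.
import pandas as pd


def flag_highest_risk(patients: pd.DataFrame, risk_model, feature_cols,
                      top_fraction: float = 0.001) -> pd.DataFrame:
    """Score every patient and flag the top `top_fraction` (0.1%) by predicted risk."""
    # Probability of the positive ("high risk") class for each patient record.
    scores = risk_model.predict_proba(patients[feature_cols])[:, 1]
    scored = patients.assign(risk_score=scores)
    # Score at the 99.9th percentile; records at or above it are flagged.
    cutoff = scored["risk_score"].quantile(1.0 - top_fraction)
    scored["flagged_for_outreach"] = scored["risk_score"] >= cutoff
    return scored.sort_values("risk_score", ascending=False)
```

In a real deployment, a flag like this would surface in a clinician's workflow for human follow‑up rather than trigger any automated action, consistent with officials' insistence that these tools support rather than replace staff.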

How AI would be used — and limits

Reports accompanying the bill recommend leveraging veterans' interactions across government service channels, using omnichannel capabilities and real‑time analytics, so that insights can be generated quickly and resources deployed decisively. Lawmakers and VA officials emphasize that automated tools are meant to support, not replace, human clinicians.

"AI‑enabled resources are used as supplementary tools to support therapists, help train early intervention, quickly connect to peers, and facilitate connections with VHA’s caring staff — not replace them," said VA Press Secretary Pete Kasperowicz. "Across the VA, personal relationships are the primary pathway for restoring hope and healing."

Proponents say AI can improve early detection, train crisis responders, integrate alerts into clinical workflows, and provide decision support to clinicians so they can intervene faster. But some veterans and advocates worry about overreliance on automation — especially amid federal workforce reductions — and stress the need for clear safeguards, human oversight, and transparency around how predictive models are used.

Next steps

With congressional backing and targeted funding, the VA plans to expand its use of predictive analytics and machine learning across the system while continuing collaboration with external researchers and nonprofit partners. Officials say the goal is to enhance predictive models, improve personalized care, and strengthen early intervention without diminishing human contact or clinical judgment.
