At the Department of Veterans Affairs, the goal of increasing AI use cases means better health outcomes for veterans and greater job satisfaction for the VA employees who serve them. Through programs like the Stratification Tool for Opioid Risk Mitigation (STORM) and Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment (REACH VET), data is analyzed to proactively identify veterans who might benefit from a specific intervention. Following the executive order on artificial intelligence, VA identified a few dozen use cases for AI. Radiology is particularly leading the way, with more than 75% of FDA-approved AI devices coming from radiology, but the department notes that AI can improve functions throughout its operations.
“In general, we see the potential for AI to contribute in the near term across really a wide range of areas at VA. And I think I see that in kind of three main categories. So one category is in reducing health care provider burnout. And in that vein we have actively these AI tech sprints which are part of the AI executive order. We’re running two of them right now. One is around assisting with documenting clinical encounters, and the other is extracting information from paper medical records,” Kimberly McManus, deputy chief technology officer of artificial intelligence at the Department of Veterans Affairs, said on Federal Monthly Insights – Operationalizing AI. “Another area similar to that is broadly improving veteran and staff experience. So this is around how can we augment our current staff by helping with these tedious tasks and reducing administrative burden… And then the third area is better care for patients. And so that’s where I see a lot of these FDA approved medical devices, such as the ones in radiology. But there’s new ones also in pathology and dermatology.”
One way VA is addressing staff experience is by using optical character recognition (OCR) to convert handwritten notes into machine-readable text. Once those notes are converted, AI can be used to compile and extract the information a caseworker may be looking for.
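The article does not name the tools behind that workflow, and handwriting recognition in practice usually calls for specialized models. As a minimal sketch of the general pattern only, assuming a Tesseract-based OCR step via the pytesseract library and a hypothetical scanned-note path, the digitization half might look like this:

```python
# Minimal OCR sketch (assumptions: Pillow and pytesseract are installed,
# the Tesseract binary is on PATH, and the image path is hypothetical).
from PIL import Image
import pytesseract


def digitize_note(image_path: str) -> str:
    """Convert a scanned handwritten note into machine-readable text."""
    scanned_page = Image.open(image_path)
    # image_to_string runs Tesseract OCR on the image and returns the text it recognizes.
    return pytesseract.image_to_string(scanned_page)


if __name__ == "__main__":
    # Hypothetical file name, for illustration only.
    text = digitize_note("scanned_records/progress_note_001.png")
    print(text)
```

Once notes exist as plain text, downstream tools can index, search and summarize them.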
“We can potentially use generative technology to better summarize and identify information … and that kind of search and summarization is really applicable across the VA enterprise as well as across many other companies, whether it’s our health care providers trying to find information in their electronic health records or whether it’s our benefits adjudicators trying to identify information when they are processing claims,” McManus said.
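VA has not described its search stack here. Purely as an illustration of the "search" half of that workflow, a keyword-relevance sketch using scikit-learn's TF-IDF vectorizer (an assumption for illustration, not VA's actual tooling, with made-up notes and query) could rank digitized notes against a caseworker's question:

```python
# Toy relevance-ranking sketch over digitized notes (assumption: scikit-learn
# is available; the notes and the query are hypothetical).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

notes = [
    "Patient reports knee pain following service-connected injury.",
    "Discussed housing assistance options and scheduled a follow-up appointment.",
    "Medication review completed; no changes to current prescriptions.",
]
query = "housing assistance"

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(notes)   # vectorize the notes
query_vector = vectorizer.transform([query])    # vectorize the caseworker's query
scores = cosine_similarity(query_vector, doc_vectors)[0]

# Print the notes ranked by relevance to the query, most relevant first.
for score, note in sorted(zip(scores, notes), reverse=True):
    print(f"{score:.2f}  {note}")
```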
A top priority for VA is veteran suicide prevention, and AI has a major role in those efforts. The REACH VET program uses an algorithm that stratifies veterans into high-risk and low-risk groups for suicide and identifies those in need of intervention.
“We have programs that can reach out to and provide supportive services to veterans in one area, and we are looking at a variety of areas for the future,” McManus said. “Broadly, it involves using factors that we know are clinical risk factors for suicide that are already in their electronic health record, and then putting those together to identify a risk score.”
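VA has not published REACH VET's model details in this article. As a loose, hedged illustration of what "putting clinical risk factors together to identify a risk score" can mean in general, the sketch below combines hypothetical factor indicators and hypothetical weights through a logistic function; it is not the REACH VET algorithm.

```python
# Illustrative risk-score sketch (NOT the REACH VET model): combine
# hypothetical 0/1 clinical risk-factor indicators with hypothetical
# weights into a single probability-like score via a logistic function.
import math

# Hypothetical factor weights, chosen only for illustration.
WEIGHTS = {
    "prior_suicide_attempt": 2.0,
    "recent_inpatient_mental_health_stay": 1.2,
    "new_opioid_prescription": 0.6,
}
INTERCEPT = -4.0  # hypothetical baseline log-odds


def risk_score(record: dict) -> float:
    """Map a patient's risk-factor indicators (0 or 1) to a score in (0, 1)."""
    linear = INTERCEPT + sum(WEIGHTS[f] * record.get(f, 0) for f in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-linear))


example = {"prior_suicide_attempt": 1, "new_opioid_prescription": 1}
# Higher scores would flag a record for clinician review and outreach.
print(f"risk score: {risk_score(example):.3f}")
```

In practice such scores feed a human workflow: clinicians review flagged veterans and decide on outreach, which is consistent with the human-in-the-loop approach VA describes below.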
With the health and well-being of veterans at stake, and the large volume of their data on hand, VA puts the highest level of importance on the security of veteran information. The department has developed six basic principles for the trustworthy use of data and AI, both for itself and for its partners:
- Is it purposeful?
- Is it effective and safe?
- Is it secure and private?
- Is it fair and equitable?
- Is it transparent and explainable?
- Is it accountable and monitored?
“We have an enterprise data platform that’s called Summit that contains much of our EHR electronic health record data as well as other types of data,” McManus told the Federal Drive with Tom Temin. “We have much of our data in our healthcare, in our cloud platforms that we keep on our network. We also work with other organizations such as Oak Ridge National Labs, who does some of our more deep R&D related to data science and AI.”
As VA introduces more uses for AI, concerns about drift or bias become a larger issue for the organization. VA’s approach is to keep humans in the loop and to design use cases around the human end user. The human factor also allows the agency to evaluate when AI is the proper solution.
“The number one key to any success of AI, machine learning and AI algorithm is that starting from the beginning, really having those end users, those health care clinical experts, the workflow experts, all really at the table. As data scientists, we understand the math and the computers, but how an algorithm will actually fit into a workflow, actually impact end users, that really requires everyone to be at the table from the beginning,” McManus said. “I’m really excited and optimistic about this space. We definitely are keeping a very strong focus on trustworthy AI and safety. And there are just so many areas that AI and ML and new technology has the potential to positively impact our mission to care for veterans. So I am optimistic, and I think we’ve made a lot of progress. We have a long way to go, but, I’m excited.”