Agencies are seeing opportunities for artificial intelligence tools to help federal employees pore over vast quantities of data.
The Pandemic Response Accountability Committee, for example, sees AI as a valuable tool to flag potential fraud in pandemic spending data.
PRAC Chief Data Officer Brien Lorenze told Federal News Network in a recent panel discussion that pandemic oversight is a “target-rich environment” for AI, and that the technology can highlight when fraudsters are using synthetic identities or shell corporations to exploit federal benefits programs.
“We’re really interested in the phenomena of fraud — the way schemes are changing,” Lorenze said on Aug. 2 during an ATARC AI conference. “I think AI, if properly set up, can allow us to work a lot faster.”
The PRAC holds about 95 data use agreements with agencies and manages about a billion transaction records from eight different agencies.
“[It’s] a lot of data to parse through, and all of that data has different rules and governance over what we can use the data to do,” Lorenze said.
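The patterns Lorenze describes, such as many distinct applicants sharing a single bank account or mailing address, lend themselves to simple screens over transaction data before any heavier machine learning is applied. The sketch below is a minimal, hypothetical illustration of that idea; the field names, records and threshold are invented for the example and are not PRAC's actual rules or data.

```python
# Hypothetical illustration (not PRAC's actual rules or data): flag benefit
# applications whose bank account is shared by several distinct applicants,
# a pattern associated with synthetic identities and shell corporations.
import pandas as pd

applications = pd.DataFrame([
    {"app_id": 1, "applicant": "Acme LLC",  "bank_account": "111", "address": "10 Main St"},
    {"app_id": 2, "applicant": "Beta Corp", "bank_account": "111", "address": "10 Main St"},
    {"app_id": 3, "applicant": "Gamma Inc", "bank_account": "111", "address": "22 Oak Ave"},
    {"app_id": 4, "applicant": "Delta Co",  "bank_account": "222", "address": "5 Pine Rd"},
])

def flag_shared_attribute(df: pd.DataFrame, attr: str, threshold: int = 3) -> pd.DataFrame:
    """Return rows whose value of `attr` is shared by at least `threshold` distinct applicants."""
    counts = df.groupby(attr)["applicant"].nunique()
    suspicious = counts[counts >= threshold].index
    return df[df[attr].isin(suspicious)]

# Three different applicants paying into the same account get flagged for review.
print(flag_shared_attribute(applications, "bank_account"))
```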
The Biden administration is taking steps to ensure the ethical use of AI tools across government, but Lorenze said agencies are also increasingly having to prepare for fraud schemes accelerated by malicious large language models like FraudGPT and WormGPT.
These large language models can help fraudsters craft malware and phishing attacks more quickly, and at a much greater volume.
“There’s a lot of burgeoning tools in the fraud space to actually enable fraud,” Lorenze said. “And they have no compunction around ethics whatsoever.”
“You have to put yourself in the mind of the fraudster to be able to properly counteract the fraudster. And so, we want to experiment with the tools that they’re using, just so we can understand what to recognize and how to recognize it,” he added.
Heather Martin, the acting director of the Office of Plans, Programs and Strategies within the National Geospatial-Intelligence Agency’s Directorate for Data and Digital Innovation, said AI tools are helping intelligence community analysts to more easily track down objects when sorting through a growing volume of satellite imagery.
“We’re going to have more data at our fingertips, which is great for the kinds of missions that all of us are doing. And we aren’t going to have enough people to look at all this data and work with this data,” Martin said.
“We really are turning to the adoption of AI and computer vision to help our analysts find those things they care about in large amounts of imagery, so they can focus their time on other priority things, and really do more in-depth analysis,” she added.
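As a rough illustration of the computer-vision workflow Martin describes, the sketch below runs a generic pretrained object detector over a single image tile and keeps only high-confidence detections for an analyst to review. The model, weights and score threshold are stand-ins chosen for the example; a production system would use models trained on overhead imagery, not a COCO-pretrained detector.

```python
# Illustrative sketch, not NGA tooling: a COCO-pretrained detector applied to an
# image tile, surfacing only high-confidence detections for analyst review.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

tile = torch.rand(3, 512, 512)  # placeholder for one satellite image tile, values in [0, 1]

with torch.no_grad():
    detections = model([tile])[0]  # dict with 'boxes', 'labels', 'scores'

keep = detections["scores"] > 0.8  # only confident candidates reach the analyst
print(detections["boxes"][keep], detections["labels"][keep])
```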
Damian Kostiuk, deputy CDO for U.S. Citizenship and Immigration Services, said the agency sees potential for AI, but is taking cautious steps toward implementation.
“The initial use cases are wonderfully boring because nobody wants to apply it right off the bat. There’s a lot of laws, and I think there’s definitely going to be regulation against it,” Kostiuk said.
Kostiuk said AI, for example, is helping USCIS pre-draft documents, but the agency still requires a human in the loop to read over and redraft those documents.
“Can we use this to synthesize and summarize thousands of pages of information that we’re getting from people? Yes,” he said. “At least that’s a starting point — and then it still has to pass by the eyeballs of an analyst before doing any kind of reporting.”
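The "pre-draft, then human review" pattern Kostiuk describes can be sketched in a few lines. The example below uses an off-the-shelf summarization model purely as a stand-in, and the review step is reduced to a console prompt; the model choice, input text and workflow are illustrative assumptions, not USCIS's actual system.

```python
# Sketch of the "pre-draft, then human review" pattern; the summarization model
# is an off-the-shelf stand-in, not whatever USCIS actually runs.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

def pre_draft(document: str) -> str:
    """Machine-generated first draft; never the final product."""
    return summarizer(document, max_length=60, min_length=15)[0]["summary_text"]

def analyst_review(draft: str) -> str:
    """Human in the loop: the analyst reads, edits and approves before any reporting."""
    print("DRAFT FOR REVIEW:\n" + draft)
    edited = input("Edit the draft (press Enter to approve as-is): ")
    return edited or draft

case_file = (
    "The applicant submitted employment records covering 2015 through 2023, "
    "two prior petitions, and supporting statements from three employers. "
    "The records describe continuous residence and the applicant's eligibility "
    "under the relevant category."
)
final_summary = analyst_review(pre_draft(case_file))
```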
Kostiuk said AI needs huge volumes of data to train on, but that much of the federal government lacks the type of data environment to train AI models.
“A lot of federal agencies are still very much into siloed and turfed territory for data, and some groups are loath to share. Well, that actually inhibits our ability to actually get this type of technology up and running — and actually make it running with high quality — if we’re missing holes in that knowledge gap,” Kostiuk said.
To address these systemic data challenges, Chakib Chraibi, the chief data scientist at the National Technical Information Service, said the Commerce Department is wrapping up a department-wide data maturity assessment. Chraibi said the department expects to complete the assessment by the end of September.
“Each agency will look at how they are governing the data, how they manage the data, what are the gaps, how they can improve data, and when we were talking about data in terms of quality, in terms of governance, in terms of accessibility, in terms of sharing that data, and how to best leverage the data,” Chraibi said.