Developing models and training them with data is a way of life for the National Oceanic and Atmospheric Administration. Across its line offices, the agency provides weather forecasting and guidance information necessary to prepare local communities for major storms.
To do this, NOAA needs to efficiently manage all of the observational data it receives from geostationary and polar satellites, autonomous systems, buoys and aircraft. Jamese Sims, senior science advisor for artificial intelligence, said AI is one of the tools the agency uses in that regard. The sources include robots on the ocean floor and cameras on ships, for example. The advanced technology is collecting not just more data, but higher quality data as well.
“When we launched the current generation of geostationary satellites, back in 2016, when we launched the GOES-R Series satellite, we increased the amount of data that we were receiving by 60 times over the legacy system,” Sims said on Federal Monthly Insights – Artificial Intelligence and Data. “With this new source of data, in addition to the expansion of autonomous systems, we’re talking data within terabytes and larger.”
NOAA's National Centers for Environmental Information houses the data, providing more than 37 petabytes of comprehensive atmospheric, coastal, oceanographic and geophysical data. As NOAA uses more AI, the synergy between that technology and the data grows more important, she said.
“And also, as we continue to migrate into the cloud, these are areas that basically serve, somewhat as a foundational piece for other science and technology areas, as we are evolving in these different fields,” she said on Federal Drive with Tom Temin.
NOAA uses AI to understand the physical parameterization of its models, choosing which physical parameters will give scientists the best forecasts. Sims said the agency also uses AI to identify the errors and biases within those models, so that updating the models when needed can produce a more accurate forecast.
Data formats vary depending on the system bringing the data in and on how it is used. In its conversations with other agencies, NOAA emphasizes making sure data is clean, properly labeled and accessible, Sims said. That makes communication among the workforce of scientists and data analysts crucial. She said diversity and inclusion among that workforce matters, too.
“We talk about training the current workforce, as well as the future workforce, to work within our agency because as you mentioned, it’s interdisciplinary and we need skills. But we also need different perspectives, in order for us to do our job at the best that we can,” she said.
As for storage, NOAA currently relies on its supercomputers, but as technology evolves, the agency is looking more to the cloud. Cloud storage is more cost-effective in the long term than physical high-performance computing, especially for the petabytes of data that AI work produces. Sims acknowledged NOAA is "not there yet" and still in the exploratory phase.
Approximately 200 projects at NOAA use AI, at different levels of readiness, ranging from research to near-operational to fully operational status.
“First of all, we need to better understand what data is actually needed, as we talk about using AI more across the agency, and then being able to store that data for training purposes, and then for the verification and validation of the solutions that we have, and the output,” Sims said. “From that, everything that we do goes through very strenuous processes, before we can declare something to be ready for operations, because our role is to save lives and property.”
Copyright © 2024 Federal News Network. All rights reserved.