Big Data - Federal News Network
https://federalnewsnetwork.com

Ask the CIO: Federal Emergency Management Agency
https://federalnewsnetwork.com/cme-event/federal-insights/ask-the-cio-federal-emergency-management-agency/
Wed, 10 Apr 2024 20:41:23 +0000

How is digital transformation impacting the mission at FEMA?

In this exclusive webinar edition of Ask the CIO, host Jason Miller and his guest, Charlie Armstrong, chief information officer at FEMA, will discuss how digital transformation is supporting the mission at FEMA. In addition, Don Wiggins, senior solutions global architect at Equinix, will provide an industry perspective.

Learning Objectives:

  • Digital transformation at FEMA
  • Shifting FEMA to the cloud
  • Edge computing for the future
  • Employing artificial intelligence
  • Industry analysis

Understanding the data is the first step for NIH, CMS to prepare for AI
https://federalnewsnetwork.com/ask-the-cio/2024/03/nih-cms-finding-a-path-to-better-data-management/
Fri, 29 Mar 2024 19:53:52 +0000

NIH and CMS have several ongoing initiatives to ensure employees and their customers understand the data they are providing as AI and other tools gain traction.

The National Institutes of Health’s BioData Catalyst cloud platform is only just starting to take off despite it being nearly six years old.

It already holds nearly four petabytes of data and is preparing for a major expansion later this year as part of NIH’s goal to democratize health research information.

Sweta Ladwa, the chief of the Scientific Solutions Delivery Branch at NIH, said the BioData Catalyst provides access to clinical and genomic data already and the agency wants to add imaging and other data types in the next few months.

Sweta Ladwa is the chief of the Scientific Solutions Delivery Branch at NIH.

“We’re really looking to provide a free and accessible resource to the research community to be able to really advance scientific outcomes and therapeutics, diagnostics to benefit the public health and outcomes of Americans and really people all over the world,” Ladwa said during a recent panel discussion sponsored by AFCEA Bethesda, an excerpt of which ran on Ask the CIO. “To do this, it takes a lot of different skills, expertise and different entities. It’s a partnership between a lot of different people to make this resource available to the community. We’re also part of the larger NIH data ecosystem. We participate with other NIH institutes and centers that provide cloud resources.”

Ladwa said the expansion of new datasets to the BioData Catalyst platform means NIH also can provide new tools to help mine the information.

“For imaging data, for example, we want to be able to leverage or build in tooling that’s associated with machine learning because that’s what imaging researchers are primarily looking to do is they’re trying to process these images to gain insights. So tooling associated with machine learning, for example, is something we want to be part of the ecosystem which we’re actively actually working to incorporate,” she said. “A lot of tooling is associated with data types, but it also could be workflows, pipelines or applications that help the researchers really meet their use cases. And those use cases are all over the place because there’s just a wealth of data there. There’s so much that can be done.”

For NIH, the users in the research and academic communities are driving both the datasets and associated tools. Ladwa said NIH is trying to make it easier for the communities to gain access.

NIH making cloud storage easier

That is why cloud services have been and will continue to play an integral role in this big data platform and others.

“The NIH in the Office of Data Science Strategy has been negotiating rates with cloud vendors, so that we can provide these cloud storage free of cost to the community and at a discounted rate to the institute. So even if folks are using the services for computational purposes, they’re able to actually leverage and take benefit from the discounts that have been negotiated by the NIH with these cloud vendors,” she said. “We’re really happy to be working with multi-cloud vendors to be able to pass some savings on to really advanced science. We’re really looking to continue that effort and expand the capabilities with some of the newer technologies that have been buzzing this year, like generative artificial intelligence and things like that, and really provide those resources back to the community to advance the science.”

Like NIH, the Centers for Medicare and Medicaid Services is spending a lot of time thinking about its data and how to make it more useful for its customers.

In CMS’s case, however, the data revolves around the federal healthcare marketplace and the tools that make citizens and agency employees more knowledgeable.

Kate Wetherby is the acting director for the Marketplace Innovation and Technology Group at CMS.

Kate Wetherby, the acting director for the Marketplace Innovation and Technology Group at CMS, said the agency is reviewing all of its data sources and data streams to better understand what it has and to make its websites and the overall user experience work better.

“We use that for performance analytics to make sure that while we are doing open enrollment and while we’re doing insurance for people, that our systems are up and running and that there’s access,” she said. “The other thing is that we spend a lot of time using Google Analytics, using different types of testing fields, to make sure that the way that we’re asking questions or how we’re getting information from people makes a ton of sense.”

Wetherby said her office works closely with both the business and policy offices to bring the data together and ensure it’s valuable.

“Really the problem is if you’re not really understanding it at the point of time that you’re getting it, in 10 years from now you’re going to be like, ‘why do I have this data?’ So it’s really being thoughtful about the data at the beginning, and then spending the time year-over-year to see if it’s something you should still be holding or not,” she said.

Understanding the business, policy and technical aspects of the data becomes more important for CMS as it moves more into AI, including generative AI, chatbots and other tools.

CMS creating a data lake

Wetherby said CMS must understand its data before applying these tools.

“We have to understand why we’re asking those questions. What is the relationship between all of that data, and how we can we improve? What does the length of data look like because we have some data that’s a little older and you’ve got to look at that and be like, does that really fit into the use cases and where we want to go with the future work?” she said. “We’ve spent a lot of time, at CMS as a whole, really thinking about our data, and how we’re curating the data, how we know what that’s used for because we all know data can be manipulated in any way that you want. We want it to be really clear. We want it to be really usable. Because when we start talking in the future, and we talk about generative AI, we talk about chatbots or we talk about predictive analytics, it is so easy for a computer if the data is not right, or if the questions aren’t right, to really not get the outcome that you’re looking for.”

Wetherby added that another key part of getting data right is improving the user’s experience and determining how CMS can share that data across the government.

In the buildup to using GenAI and other tools, CMS is creating a data lake to pull information from different centers and offices across the agency.

Wetherby said this way the agency can place the right governance and security around the data, since it spans several types, including clinical and claims information.

DoD Cloud Exchange 2024: USTRANSCOM’s Michael Howard on becoming a more agile-minded organization
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-transcoms-mike-howard-on-becoming-a-more-agile-minded-organization/
Tue, 26 Mar 2024 00:49:43 +0000

The U.S. Transportation Command will kick off a three-year effort to make it easier for users to access data anywhere, anytime, its transformation chief says.

The U.S. Transportation Command isn’t new to cloud services. The command has put workloads and applications in off-premise compute and storage instances since 2016.

Despite its time and experience using cloud, USTRANSCOM continues to take a measured approach in how it expands the use of these capabilities.

Michael Howard, engineering and digital transformation division chief at USTRANSCOM, said about 60% of the command’s working capital fund programs are currently in the cloud, mostly through an infrastructure as a service approach.

Over the next few years, the goal is to combine process with technology to become a more agile organization, Howard said during Federal News Network’s DoD Cloud Exchange.

An important step in that direction is an upcoming update to the agency’s memo guiding IT modernization.

“The memo helps us set the tone for all of our IT efforts. It really shows the strategic level importance of modernization in an organization like USTRANSCOM that really needs to strive at being at the leading edge of the transportation industry,” Howard said. “We’re not only shifting our software development processes to be more agile, but at an organizational level, we’re also shifting to be more agile-minded. The modernization memo really projects some goals over the next 12 to 18 months.”

Improving USTRANSCOM’s IT management

USTRANSCOM also wants to further expand its DevSecOps platform and end the use of the waterfall development methodology, he said. It also wants to create a continuous authorization to operate process that is coupled with the agency’s software architecture and containerized microservices.

“We really will finally get after what we know today is as the way things communicate through ports and protocols, and really adopting application programming interface-based communication,” Howard said. “The key ingredient to it is something we added this year when the chief financial officer joined the chief information officer as a signature authority. We’re synergized from a business perspective, and we want to be cost-informed as we strategically move through this memo, certainly over the next 12 to 18 months.”

With the CFO actively involved with technology planning and implementation, Howard said command leaders can better answer questions about the cost to develop and sustain IT programs as well as the prioritization of these initiatives.

“To quote our CFO, she says, ‘One of my biggest problems is managing our IT portfolio.’ It’s the same problem, I think, across the entire DoD. The fortunate thing for USTRANSCOM is we are a working capital fund, whereas the appropriated combatant commands have to demonstrate a lot more scrutiny. We have some flexibility, but we also know that flexibility needs to have some responsibility,” he said. “Our memo before missed an opportunity where we could be more cost-informed. The other thing that does is it now provides some audit capability of that cost, schedule and performance — and then helps us be good stewards of taxpayer dollars as we maneuver through the cloud.”

New USTRANSCOM cloud initiative

A new initiative called USTRANSCOM Anywhere illustrates this more integrated approach to cloud.

The command wants to use Microsoft Azure for cloud hosting to gain some of the capabilities that come with the disconnected Microsoft Azure stack hosted on premise today. Through the Azure stack, USTRANSCOM would deploy capabilities as microservices so the right person could access data at the right time from anywhere through the unclassified network, Howard said.

“What we realized is that once we achieve a platform as a service capability of microsegmented, data-containerized applications, the next thing would be is how can that microsegmented data exist in a denied, degraded, intermittent or limited — DDIL — environment,” he said. “USTRANSCOM Anywhere has a focus to utilize our current [unclassified network] that we provide today and to segment that capability in a continuous integration, continuous delivery fashion.”

The agency will roll out USTRANSCOM Anywhere in a three-phased approach over the next three years.

Howard said this first year is focused on creating the “landing zone” to determine what services users will need most in the environment.

“That culminates with a beta test. Today, I think we looked at about 25 uses. That might increase as we learn more about the environment. It also culminates with 70 of 91 zero trust target-level activities,” he said. “Phase 2 looks like a deployment phase. We will look at the migration of on-premise services that we provide today and house them in the Azure cloud capability. Then, Phase 3 looks like the test and use cases in the disconnected state. We will look to the Azure stack capability to provide some of the microsegmented data in certain parts geographically to try to get it to as close as to what we would provide the warfighter tomorrow.”

Scalability, reliability of cloud services

Howard acknowledged this would be a major culture shift from how USTRANSCOM operates today and has for decades.

As part of the USTRANSCOM Anywhere initiative, the agency expects to introduce a virtual desktop infrastructure and a disconnected capability through the cloud.

“First and foremost, I suppose it’s an understanding of how this will work for global logistics. We wouldn’t want to rush to that,” Howard said. “Secondly, I would say that in the first year, this is really a proof of concept. We, again, with our CFO, are in line to prove out in that first year that this is really something that we want to do, instead of saying, ‘Yeah, we’re all in, and we have no points of return.’ The phasing really helps us get to some decision points to ensure that this is exactly how we want to proceed forward.”

Through all of these updated memos, new procedures and technology pilots, Howard said one of the most important goals is to improve how the command takes advantage of the scalability and reliability of cloud services to improve logistics, especially in a contested environment.

Getting warfighters the data necessary to make better and faster decisions is the most important metric underlying these efforts, he said.

“What’s nice is, today, with our modernization memo, we’re able to somewhat forecast what the probability is of cost, schedule and performance for an application to migrate, whether lift-and-shift or migrate through a DevSecOps platform. What’s nice about that is we are tied in with our enterprise IT portfolio mission area manager or our chief operating officer that’s listed on our modernization memo. And we give quarterly updates. Those quarterly updates actually go into an update to our CEO,” Howard said.

“We’re describing the benefits of a fully modernized platform as a service, where you have microsegmented containerized applications that exist for business function, have immutable code and are really, for lack of better words, defendable. The end state is truly that capability to be business-focused. It’s not that you do zero trust. It’s how you use it, and this is the same thing: It’s not that we’re doing the cloud. It’s how we’re using it.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

CDAO expanding data integration layer for CJADC2
https://federalnewsnetwork.com/defense-main/2024/03/cdao-expanding-data-integration-layer-for-cjadc2/
Fri, 22 Mar 2024 22:53:19 +0000

The CDAO's minimum viable capability for the Pentagon’s Combined Joint All-Domain Command and Control initiative is up and running.

The Chief Digital and Artificial Intelligence Office’s minimum viable capability for the Pentagon’s Combined Joint All-Domain Command and Control initiative is up and running. The office is now getting ready to bring in more companies.

The CDAO will spend the next three to six months developing a set of requirements that will allow more companies to contribute to the expansion of the data integration layer.

“We’ve been doing this with key industrial partners, mostly Palantir and Anduril. The technology that we’ve built — it’s there, it’s available. That’s why we call it a minimum viable capability. We then need to build out a set of requirements that allow other industrial partners to join in,” CDAO’s chief Craig Martell said during the House Armed Services Cyber, Information Technologies, and Innovation Subcommittee hearing Friday.

CJADC2, more broadly, aims to use data, cloud, artificial intelligence and much more to connect all military assets and enable faster decision-making.

When Martell assumed his role as the CDAO’s chief in 2022, Deputy Defense Secretary Kathleen Hicks tasked the office with launching the Global Information Dominance Experiment, also known as GIDE.

Last year, the CDAO office held several GIDE events. The primary goal was to deliver a joint data integration layer allowing INDOPACOM, CENTCOM, NORTHCOM, their components and international partners to access and exchange data. The last iteration of the experiment resulted in the minimum viable capability for CJADC2.

“We’ve been building out the prototype of what it would mean for the hardware to support the flow of data across combatant commands so combatant commands have a unified picture of what’s going on in the world,” Martell said.

“We do it every 90 days to these guided exercises, as the key learning exercise to understand through Wargaming what combatant commanders would need to see and what all of the components under the combatant commanders would need to see, would need to exchange and how data would need to flow in order for it to go from swivel chair and PowerPoint and email to digital data flows as that information goes across the combatant command and within the combatant commands,” he added.

The Pentagon has been vague about the applications and regions where the minimum viable capability is currently being used.

During the hearing, Martell was also asked about the possibility of his office having direct authority over the military services and their development of CJADC2 solutions.

“I’m not a fan of that. And I’m not a fan of hard authority there. Let me just say, in general, the center should provide oversight and policy and best practices. The edge knows their problems best. Solving the problems from the center and imposing it upon the edge, I think is dangerous. It’s going to create a one-size-fits-all solution,” said Martell.

He said there has to be authority about the interface, which will allow the right data to flow out of the services. The Army-led Project Convergence and the Navy’s Project Overmatch, for example, aim to find solutions for the ways data would flow.

“What we’ve been doing is putting the tech before the policy. To allow the data to flow is going to force the right questions. Then there’s going to be an increased demand for data to flow. And then we can say, ‘Well, that now is a policy issue that we can tackle,’” Martell said.

“The change is not going to my very strong opinion, that change is not going to happen by some large a priori view, a philosophical view of the way the world should be, and then trying to implement that.”

Federal agencies beware: AI is not all it’s cracked up to be – at least not yet
https://federalnewsnetwork.com/commentary/2024/03/federal-agencies-beware-ai-is-not-all-its-cracked-up-to-be-at-least-not-yet/
Fri, 22 Mar 2024 11:30:37 +0000

The general public is talking more about and understanding the possibilities with AI. This buzz is also present inside federal agencies.

Samsung released its new Galaxy S24 phone in January. The big buzz was centered around how Galaxy users discover the world around them by incorporating artificial intelligence into photo manipulation, product searches for e-commerce, instantaneous language translation while abroad, and more. The general public is talking more about and understanding the possibilities with AI. This buzz is also present inside federal agencies.

There’s no doubt that artificial intelligence has begun – and will continue – to transform the way we work and live, and there’s no denying its power and potential. But the fact is, despite what most people want to believe about what AI is and can do, we’re just not there yet.

As the founder of a digital accessibility company who is disabled and has challenges completing many everyday tasks that most people take for granted, there isn’t a bigger fan of AI than me. I have big hopes for it. In time, I know it will succeed. I’m also a realist, and I know that time isn’t now.

AI has made significant strides in recent years, with advancements in machine learning, natural language processing and computer vision. However, despite these breakthroughs, AI still has a long way to go before reaching its full potential. In fact, there are numerous challenges and limitations that hinder the development and deployment of AI systems. Rather than talk about them in great detail, it’s better to move the conversation along by concentrating on viable solutions that will bring us closer to implementing real AI.

Patience is a virtue

The most prudent thing that any federal agency looking to adopt AI can do is sit on the sidelines and wait. Granted, that’s not the popular answer or the response that anyone wants to hear, but it’s the smartest move at this point in time. Federal agencies that take a premature leap into AI will most likely be disappointed, waste time and money, and will probably have more work redoing tasks that didn’t yield the desired outcomes from AI. This technology is in its infant stages. Although it is groundbreaking and exciting, we must walk before we can run.

Better data and bias mitigation

You may have heard the saying, “garbage in, garbage out.” AI systems are data-driven, and if that data is skewed, incorrect or biased, the AI models will inherit and perpetuate those mistakes. Issues related to fairness, transparency and accountability are significant concerns. It’s critical that federal agencies take extreme caution in this area. They must implement measures to identify and mitigate biases and errors in AI algorithms and data sets. This may involve conducting thorough bias assessments during the development phase and ongoing monitoring of AI systems in operation. Agencies must remember that when it comes to data, humans tend to steer things toward the outcome they want instead of what the facts support. Right now, there is such a backlog of corrupt data that current AI models struggle to differentiate between correct information and skewed data. AI can only be as good as the information it is given.
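
To make the idea of a bias assessment concrete, here is a minimal sketch in Python of one check a development team might run: comparing a model's approval rates across demographic groups, sometimes called a demographic parity gap. The groups, records and threshold below are invented for illustration only; a real assessment would use production data and several fairness metrics, not just this one.

from collections import defaultdict

# Hypothetical (group, model_approved) records; a real assessment would pull
# these from the system under test.
records = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

approvals = defaultdict(list)
for group, approved in records:
    approvals[group].append(approved)

# Approval rate per group and the gap between best- and worst-treated groups.
rates = {group: sum(vals) / len(vals) for group, vals in approvals.items()}
gap = max(rates.values()) - min(rates.values())
print(f"approval rates: {rates}, parity gap: {gap:.2f}")

if gap > 0.10:  # threshold chosen arbitrarily for this sketch
    print("Flag for human review: approval rates differ meaningfully across groups")

A gap above the threshold does not prove bias on its own, but it flags the data and model for exactly the kind of human review and ongoing monitoring described above.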

Implementing safeguards

Federal agencies must have safeguards in place when it comes to AI. Promote transparency in AI systems by documenting their development process, data sources, algorithms used, and again, potential biases. Agencies should also establish mechanisms for accountability, such as assigning responsibility for AI system decisions and outcomes. Ensure that AI systems are interpretable and explainable, especially for critical decision-making processes. This involves designing algorithms that produce transparent results and providing explanations for AI-generated decisions when necessary.

Risk management

Even with all these safeguards in place, federal agencies must conduct comprehensive risk assessments to identify potential risks associated with AI implementation, including cybersecurity threats, legal liabilities and unintended consequences. Develop risk mitigation strategies to address these concerns effectively. We’ve already seen what happens when we’re not careful. Take AI facial recognition technology (FRT), for example. The number of innocent people arrested after being misidentified by FRT keeps increasing, wreaking havoc on their lives and bringing about lawsuits against federal agencies. Similarly, federal agencies need to be especially careful when it comes to predictive modeling. A predictive policing algorithm was recently found to be both discriminatory and inaccurate.

Collaboration and knowledge sharing

Federal agencies must take a think-tank approach to AI because we’re all in this together. Foster collaboration and knowledge sharing among federal agencies, industry partners, academic institutions, and other stakeholders. Sharing best practices, lessons learned and research findings can help improve the responsible use of AI across the government. Similarly, federal agencies must establish mechanisms for continuous monitoring and evaluation of AI systems’ performance, effectiveness and impact. This includes soliciting feedback from end-users and stakeholders to identify areas for improvement and address emerging issues promptly.

The takeaway

I can’t wait for the day that AI gets to where it fully enhances the way we live and work, but that day isn’t today. It won’t be next month or next year, either. We’ve only begun to scrape the surface. To celebrate AI as this life-changing technology that’s revolutionizing a new mobile device or anything else is irresponsible and nothing more than marketing hype at this time. Consumers see a new phone with built-in AI and think they need it, yet most couldn’t explain why or tell you how or if it will differ from their current device. It’s no different at the federal level when agencies want a faster and better way to do things. But all in due time.

For AI to truly reach its full potential, researchers, developers, policymakers and ethicists will need to work collaboratively to navigate the complex landscape of AI development, ensuring that it evolves responsibly and ethically. Only through concerted efforts and further development can we pave the way for AI to make a lasting, positive impact on society, the way everyone imagines.

Mark Pound is the founder and CEO of CurbcutOS, a digital accessibility firm making the digital world more user-friendly for people with disabilities.

DISA sets the table for better AI with data management
https://federalnewsnetwork.com/artificial-intelligence/2024/03/disa-sets-the-table-for-better-ai-with-data-management/
Tue, 19 Mar 2024 20:43:53 +0000

Steve Wallace, the director of emerging technology at DISA, said a new tool, called Concierge AI, will reduce the friction to the user to find and analyze data.

The Defense Information Systems Agency’s data strategy is less than two years old, but it’s already ripe for an update.

The next version, under development, will put an even sharper focus on data quality and advanced analytics to improve how DISA uses artificial intelligence and other tools.

Steve Wallace, the director of emerging technology and chief technology officer at DISA, said a new tool, called Concierge AI, embodies the agency’s plans for integrating data with AI today and in the future.

Steve Wallace, the director of emerging technology and chief technology officer at the Defense Information Systems Agency.

“How do we augment our staff, leveraging large language models, and bring the sheer amount of data that we have, whether it be unstructured documents or structured documents, and deliver that to the user in a user friendly sort of manner?” Wallace said in an interview with Federal News Network. “Almost like a chat bot that you’re seeing in many different places, but how do we make that specific around the DISA mission, sometimes focused on the back office features, but then also with an eye on how do we do defense cyber operations (DCO) and help an analyst better do their job to dissect an attack and what have we seen before, based on after action reports and that type of thing.”

The overarching goal is to reduce the friction for the user to find and analyze data to drive better decisions, and to do it all in a way that uses natural language to make it easier on employees, Wallace said.

That idea of reducing friction, making data easier to understand and use, is central to the update to the DISA data strategy.

DISA wrote in a LinkedIn post on March 13 that the agency has made progress around setting up data governance and a data architecture as part of its implementation plan.

“The evolution of quality data and advanced analytics within the DISA community is the sole focus of the chief data office and will empower the agency to harness AI technology and AI situational awareness, predictive intelligence and decision-making agility, thereby enhancing national security and organizational readiness,” DISA wrote.

The updated DISA Data Strategy for 2025-2027 will focus on these mechanisms as part of the modernization and maturation of the agency’s data efforts.

DISA’s lessons in using AI

Concierge AI is part of how DISA is demonstrating the modernization and maturation. Wallace said the initial pilot is giving security folks and users a level of comfort in using the LLMs in a government-cloud Impact Level 5 environment.

“Some of the lessons we’ve learned is really around how do we secure these [LLM] environments? The concept of these vector databases is generally new, how do we secure them? How do we make sure that we’re doing the right thing by the data that we’re ultimately storing,” Wallace said. “I think we’re going to learn a lot as well as we start to ingest a large document set, which we haven’t necessarily done yet in the lab. It’s been very small dribs and drabs, but I’ve been encouraged by what I’ve seen just with the limited amount of what we have been able to do. In the first half of this calendar year, we expect to have something out to the digital workforce to start experimenting with, and from there, we’ll gather information about the user’s experience, and then, potentially, make it go more wide scale.”
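
As an illustration of the pattern Wallace describes, and not DISA’s actual implementation, the Python sketch below shows the retrieval half of a document-aware assistant. The document snippets, the toy bag-of-words scoring and the prompt format are all assumptions; a production system would store embeddings in the kind of vector database Wallace mentions and send the assembled prompt to an LLM hosted in the accredited Impact Level 5 environment.

from collections import Counter

# Stand-in corpus; a real system would ingest after-action reports, policies, etc.
DOCUMENTS = {
    "aar-2023-17": "After-action report: phishing campaign targeting VPN users ...",
    "policy-042": "Data handling policy for unstructured documents ...",
}

def score(query: str, text: str) -> int:
    """Toy relevance score: count of shared lowercase terms."""
    q_terms, d_terms = Counter(query.lower().split()), Counter(text.lower().split())
    return sum(min(q_terms[t], d_terms[t]) for t in q_terms)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the IDs of the k most relevant documents."""
    ranked = sorted(DOCUMENTS, key=lambda doc_id: score(query, DOCUMENTS[doc_id]), reverse=True)
    return ranked[:k]

def build_prompt(query: str) -> str:
    """Assemble the grounded prompt that would be sent to the hosted LLM."""
    context = "\n".join(DOCUMENTS[doc_id] for doc_id in retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

if __name__ == "__main__":
    print(build_prompt("what did we see in the last phishing attack?"))

The point of the pattern is that the model only ever sees the handful of agency documents most relevant to the question, which keeps answers grounded in the ingested material rather than in the model’s general training data.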

One big opportunity to take better advantage of LLMs and AI tools is for defensive cybersecurity actions. Wallace said some of DISA’s cyber analysts already are modeling an attack, decomposing an attack and understanding exactly what happened by applying AI and LLMs.

“Any way that we can augment them by taking the datasets and feeding them into some sort of model to provide some sort of output so that they can have a one stop shop, if you will, to understand the dynamics of things that we’re seeing is probably one of the biggest ones that that’s out there in front of us,” he said.

Other priorities: Quantum, mobile devices

Aside from AI, Wallace is also focused on several other priorities, including quantum encryption and rolling out classified mobile devices.

DISA awarded an Other Transaction Agreement to Sandbox AQ to figure out how to build a quantum resistant infrastructure in 2023. The prototype under development is for quantum resistant cryptography public key infrastructure.

“We’re in the phase right now of doing some crypto discovery. The OTA has, I think, eight different deliverables. We’re approximately halfway through it right now,” Wallace said. “This is about an education for us, and how we’re equipping the workforce to actually understand how some of these things work and the differences and, and watching as all of this evolves a lot more to come.”

The classified mobile device effort is further along. Wallace said he expects DISA to start rolling out the next generation devices in the coming months.

Using end-to-end observability for cyber, CX improvements
https://federalnewsnetwork.com/innovation-in-government/2024/03/using-end-to-end-observability-for-cyber-cx-improvements/
Tue, 19 Mar 2024 02:48:00 +0000

Brian Mikkelsen, the vice president for US public sector at Datadog, said reducing tool complexity helps agencies understand how their systems are working.

The Office of Management and Budget’s 2022 IT Operating Plan highlighted the need to reduce complexity of systems to bring down costs. And, of course, it promoted the idea of using data to better drive decisions.

Over the years, agencies and vendors have made their technology environments a little more complex due to too many bespoke tools and a lack of data integration. With all the challenges that have come up over the last 20-25 years, OMB has pushed agencies toward enterprise services as one way to overcome many of these IT modernization obstacles.

There are other opportunities for agencies to become more efficient, more secure and better at delivering services and making decisions. One way: the use of end-to-end observability tools that can help agencies innovate by consolidating the tools they use, reducing the complexity of those tools and, of course, giving them visibility across the technology stack.

Brian Mikkelsen, the vice president and general manager for US public sector at Datadog, said end-to-end observability gives organizations an opportunity to observe or monitor any application, any infrastructure, anywhere. This includes infrastructure and applications no matter if they are on-premise or in the cloud.

“The three pillars of observability at its core context is infrastructure metrics. This is understanding the health of my operating systems, my virtual machines, my containers, all the way up into cloud native serverless functions,” Mikkelsen said on the discussion Innovation in Government, sponsored by Carahsoft. “It’s infrastructure metrics paired with application traces so now I’m starting to think about on top of that infrastructure, where am I running my applications, whether it’s on-premise or in the cloud, but what can I actually see in terms of how my applications are performing? What are they doing from a memory constraints perspective? What’s their overall performance? How much lag time is there between requests and actions? The third component of that three pillars of observability is logs. So it’s the end-to-end observability part is really this idea that we’re creating context for the users of these systems.”
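
A small, hedged sketch of those three pillars follows, using the vendor-neutral OpenTelemetry Python API rather than any particular product. The service name, endpoint and attributes are made up, and the exporter that would ship these signals to a backend such as Datadog is omitted; without one configured, the calls run as no-ops.

import logging
import time

from opentelemetry import metrics, trace

tracer = trace.get_tracer("benefits.portal")    # pillar: application traces
meter = metrics.get_meter("benefits.portal")    # pillar: infrastructure/app metrics
request_counter = meter.create_counter(
    "portal.requests", description="Requests handled by the portal"
)
logging.basicConfig(level=logging.INFO)         # pillar: logs
log = logging.getLogger("benefits.portal")

def handle_request(user_id: str) -> None:
    """Handle one request, emitting a metric, a trace span and a log record."""
    request_counter.add(1, {"endpoint": "/enroll"})
    with tracer.start_as_current_span("enroll") as span:
        span.set_attribute("user.id", user_id)
        start = time.monotonic()
        # ... business logic would run here ...
        span.set_attribute("duration_ms", (time.monotonic() - start) * 1000)
    log.info("enrollment request completed for user %s", user_id)

if __name__ == "__main__":
    handle_request("demo-user")

Because the counter, the span and the log record all carry the same service and request context, one view can correlate a traffic spike with the slow span and the log line that explains it, which is the end-to-end part of end-to-end observability.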

Reducing time to solve problems

One of the biggest benefits of this approach is reducing the number of tools required to monitor networks, mitigate risks and create context between infrastructure, applications and logs.

“The real benefit is to try and reduce the time to know when I have a problem. And the reduced  time to solving that problem is correlating all that information and not having separate teams working in separate tools, all with a separate perspective,” Mikkelsen said. “One of the key characteristics of a more modern observability and security solution, we talk all the time about the cultural changes of getting people out of individual tools and individual contexts, and giving everybody the same view of the same information. I don’t want to have five tools and five teams looking at it from a different perspective. I want one tool with all the teams in that same tool, folks having the same context so we’re not arguing about what’s happening. We’re observing what’s happening, and we’re solving for it.”

The need to solve problems more quickly is as much about the evolving nature of the cyber threat as it is about meeting the growing expectations of an organization’s customers.

A recent Government Accountability Office report found agencies are struggling to meet the cybersecurity logging requirements as required by President Joe Biden’s May 2021 executive order.

“What it’s really asking you to be able to do is track issues in real time, hold those logs in storage for, I think, a minimum of 12 months in hot storage, and I think 30 months in total,” Mikkelsen said. “The benefit of an end-to-end observability and security company is that we think about logs in multiple perspectives. We can talk about IT infrastructure and application. But here from a cybersecurity perspective, now, we’re really talking about cloud security management.”
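
As one hedged illustration of how that retention pattern might be expressed, the boto3 sketch below attaches an S3 lifecycle rule that keeps logs in readily searchable storage for roughly 12 months, then moves them to cold storage until about 30 months have elapsed. The bucket name, prefix and exact day counts are assumptions for this sketch, not the text of the mandate or any agency’s configuration, and other storage backends would express the same policy differently.

import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-agency-security-logs",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "security-log-retention",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                # After roughly 12 months, move logs out of "hot" storage ...
                "Transitions": [{"Days": 365, "StorageClass": "GLACIER"}],
                # ... and delete them once roughly 30 months have elapsed in total.
                "Expiration": {"Days": 913},
            }
        ]
    },
)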

Solving mission problems

From a customer experience perspective, end-to-end observability also includes tools that provide digital experience monitoring.

Mikkelsen said the tools help organizations understand the user’s experience from login throughout the entire front-end event.

“They can generally understand what’s working and where are the bottlenecks. What are the challenges with that customer’s front end experience?” he said. “If you think about this from a synthetics [data] point of view, what synthetics allows you to do is proactively understand ‘is that system up and is that front end application up and running the way I want it to? Is it handling requests from various operating systems? Is it working with various browsers?’ And we can actually set up proactive tests so even more important than knowing when you have an issue and fixing it is knowing you have it before it’s a real issue, and resolving it before you have a negative customer experience or citizen experience. This all boils down to the real drive for a lot of our IT mission owners across government: They’re in the business of solving for the mission. A lot of times the mission is improving the citizen’s experience with government.”
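
The sketch below shows the bare bones of such a synthetic test in Python, using only the standard library: request a front-end URL, verify the response, and alert if it is down or too slow. The endpoint, latency threshold and print-based alerting are placeholders; a monitoring platform would run variations of this check on a schedule from many locations, browsers and device profiles.

import time
import urllib.request

ENDPOINT = "https://example.gov/login"  # hypothetical front-end URL
TIMEOUT_S = 5
MAX_LATENCY_MS = 800

def run_synthetic_check() -> bool:
    """Return True if the endpoint responds successfully and quickly enough."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=TIMEOUT_S) as resp:
            ok = 200 <= resp.status < 400
    except Exception as exc:  # DNS failure, timeout, connection refused, etc.
        print(f"ALERT: {ENDPOINT} unreachable: {exc}")
        return False
    latency_ms = (time.monotonic() - start) * 1000
    if not ok or latency_ms > MAX_LATENCY_MS:
        print(f"ALERT: {ENDPOINT} degraded (status ok={ok}, {latency_ms:.0f} ms)")
        return False
    print(f"OK: {ENDPOINT} responded in {latency_ms:.0f} ms")
    return True

if __name__ == "__main__":
    run_synthetic_check()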

Mikkelsen said the Agriculture Department’s Digital Infrastructure Services Center (DISC) took advantage of end-to-end observability tools and saw immediate improvements.

“They had one ongoing problem with memory utilization. The way I think about it was it was an executable loop and every time it fired up, it was causing memory depletion. That same or systematic set of tickets had popped up something in the neighborhood of 700 times in a short period of time,” he said. “They’ve taken that memory utilization challenge down from 700 plus tickets down to zero tickets relatively quickly because we were able to show them what the challenge was. On top of that, they were able to bring, I think, 95% of their target infrastructure up and running with monitors and dashboards from an observability point of view within 75 days. I think that includes over 4,000 containers as part of that infrastructure setup.”

Federal News Network’s Industry Exchange Data 2024 https://federalnewsnetwork.com/cme-event/big-data/federal-news-networks-industry-exchange-data-2024/ Mon, 18 Mar 2024 17:00:35 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4929833 Are you taking full advantage of data to drive change in your agency? Join us to discover the latest techniques and technologies to help do just that.

The post Federal News Network’s Industry Exchange Data 2024 first appeared on Federal News Network.

]]>
How are you employing data to modernize your agency? Join us May 6 at 1 p.m. ET for Federal News Network’s Industry Exchange Data event where leading technologists will share insights, tips and technology know-how.

Our 2023 Industry Exchange Data event featured experts from Alation, Commvault, Amazon Web Services, Dataiku, ThunderCat Technology, Dell Technologies and TVAR Solutions.

Register today to save the date on your calendar and receive updates!

The post Federal News Network’s Industry Exchange Data 2024 first appeared on Federal News Network.

]]>
Why DoD needs 5G private networks for training https://federalnewsnetwork.com/federal-insights/2024/03/why-dod-needs-5g-private-networks-for-training/ https://federalnewsnetwork.com/federal-insights/2024/03/why-dod-needs-5g-private-networks-for-training/#respond Fri, 15 Mar 2024 14:53:04 +0000 https://federalnewsnetwork.com/?p=4927249 Private 5G networks allow DoD to replicate real-world environments for training, as well as enable new technologies. But it needs a foundation.

The post Why DoD needs 5G private networks for training first appeared on Federal News Network.

]]>
As the military continues to experiment with and adopt new technologies, it’s becoming increasingly important to ensure that soldiers are training the way they’ll be expected to operate, whether in peacetime or in conflict. But most military training bases are in remote areas that lack the kind of network coverage from public operators needed for things like augmented and virtual reality (AR/VR), drone operations and connected-warfighter training. That’s why it’s important for the services to consider private 5G networks, which can deliver the connectivity and bandwidth needed to replicate real-world environments and meet the requirements of today’s mission.

“Creating high-bandwidth connectivity across sprawling military bases is a challenge,” said Steve Vogelsang, chief technology officer for Nokia Federal Solutions. “That’s where private wireless shines, ensuring the bandwidth and coverage required to enable seamless monitoring of training exercises and the advanced use cases being deployed.”

Before the advent of private cellular technologies, most military communications were handled through push-to-talk two-way radio systems. Those were excellent for voice communications, but today’s mission also requires text, high-speed data, and all the other features of modern communications that most people take for granted. Troops need to be able to use smartphones and tablets, cellular modems and virtual private networks for encrypted communications and data, as well as to transmit video for surveillance and monitoring purposes.

Security concerns

The modern battlefield also requires much higher security standards. In recent years, there have been a number of incidents where warfighters using everyday apps like FitBit, Facebook, TikTok and the beer rating app Untappd over public networks have inadvertently revealed training methods, locations and other data that could compromise operations or reveal classified information.

“So as long as there’s a way for data to get out to the internet, there’s a security concern,” said Robert Justice, chief technology officer for Future Technologies Ventures, LLC. “So by locking all that down and having no outbound internet access, we keep everything closely held. And it removes that ability to get into the public domain where an adversary may be able to inadvertently get that information.”

And the security concerns don’t end with operational security; keeping bad actors locked out is just as important as keeping sensitive data locked in. Warfighters need to learn how to defend against cyberattacks, which are increasingly being integrated into the arsenal of modern conflicts; that became especially apparent as the war in Ukraine unfolded. Warfighters training to defend against these cyberattacks aren’t yet prepared to fend off real-world attacks at the same time, and actual intrusions could muddy and confuse training exercises. In order to mitigate these risks and control the variables, it’s just as important to secure these training networks against inbound traffic as it is to secure outbound data.

Training scenarios

And that’s what these training exercises are all about: replicating real-world communications as closely as possible in a controlled environment.

“When you’ve got a brigade of 5000 troops coming in, part of what they have as a mission is to win the hearts and minds. So they need to be able to communicate with the local population in a realistic way,” Justice said. “So this cell phone network started out as a way to represent realistic communications that troops would see when they come into a theater of operations. So it gives them a way to communicate with all these personnel. It let the troops communicate with the politicians, with the law enforcement, things like that in their training scenarios.”

Since then, it’s evolved to include things like drone and other unmanned vehicle operations, and AR/VR tools to support long-distance training. The high bandwidth and lower latency of 5G networks make these technologies viable, assuming bases have the access to the spectrum they need.

Importance of spectrum

And that’s one of the biggest questions when it comes to laying the foundation for these private 5G networks at training bases, or really any base: What access to spectrum does the base already have, and what does it need to accomplish the mission? Voice communications, text, images and data require less throughput than things like AR/VR or unmanned vehicles, so they require access to different parts of the spectrum. But in recent years, the FCC has been selling off parts of the spectrum that the Defense Department used to own to the private sector, with more such sales planned for the future. That means bases may have to look into leasing agreements, or reconsider what spectrum they plan to give up with regard to potential future needs.

“Having access to spectrum is critical to enable the bandwidth and performance required for private 5G network installations,” Vogelsang said. “The Defense Department has access to some 5G bands, but ultimately will need to find ways to partner with the mobile network operators to utilize their licensed spectrum and find ways to dynamically share spectrum for multiple uses.”

The post Why DoD needs 5G private networks for training first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/03/why-dod-needs-5g-private-networks-for-training/feed/ 0
DoD Cloud Exchange 2024: Qlik’s Andrew Churchill on unifying DoD’s cloud enterprise https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-qliks-andrew-churchill-on-unifying-dods-cloud-enterprise/ https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-qliks-andrew-churchill-on-unifying-dods-cloud-enterprise/#respond Fri, 15 Mar 2024 00:11:20 +0000 https://federalnewsnetwork.com/?p=4926637 As part of its move to the cloud, the Defense Department needs to integrate data and platforms to drive better, faster decisions, Qlik’s federal VP says.

The post DoD Cloud Exchange 2024: Qlik’s Andrew Churchill on unifying DoD’s cloud enterprise first appeared on Federal News Network.

]]>

The Defense Department’s IT modernization journey is as complicated as it is long. There is plenty of progress but even more opportunity to take advantage of cloud services capabilities — current and still to come. In 2024, DoD asked for more than $58 billion for technology and cyber funding, which is $13 billion more than what it asked for in 2023.

Andrew Churchill, vice president of public sector at Qlik, said while the funding is important, the Pentagon still is working to overcome policy hurdles, such as those around cyber authorization to move systems and data outside of its on-premise data centers and networks and to the cloud.

“One of the things that’s really important is now creating an enterprise of enterprises. DoD has made awards to Microsoft, Google, AWS and other cloud providers, and they now need to make sure that those systems and data environments are interconnected and operate just as they did when it was all behind their firewall,” Churchill said on Federal News Network’s DoD Cloud Exchange 2024. “One of the big things that needs to happen is a cultural shift around how they are going to bring all of those people that manage these platforms together and begin to break down some of those silos now that they’re in the cloud so they can take advantage of what the cloud is designed to make possible.”

Churchill, of course, is referring to the data that lives in each of the cloud instances. Military and civilian employees from across the department must share information and communicate in bigger and more immediate ways than ever before.

Available, trusted and ready DoD cloud presences

He said this is why DoD must rationalize the policies and access to those systems to better support coordination — to create agility within the processes that integrate and govern data.

“We reimagined the way that those cloud services were going to be consumed and deployed, and therefore how we architected those systems. What we really see as the potential is the idea that you are not going to simply deliver that same application that you had on-premise. You are going to have a set of services from ServiceNow, AWS and Salesforce, and build a set of capabilities that does benefits enrollment or does personnel readiness in the DoD,” Churchill said. “So how am I going to make those things available, trusted and ready to be able to support what obviously is going to become the most important thing in terms of strategic advantage going forward?”

This integration of different software as a service applications is starting to pick up steam across DoD.

The Navy’s big data platform, Jupiter, and the Army’s enterprise resource planning system, the Enterprise Business Systems – Convergence, are two examples of such one-stop-shop platforms for cloud services.

Churchill said in the end, for both DoD and civilian agencies, these technologies all must lead to improved mission outcomes. In that vein, agencies need to rethink the path they take to IT modernization, he said.

“With low-code, no-code types of capabilities, the level of effort that you previously needed to deliver new capabilities is very different,” Churchill said. “When you start talking about artificial intelligence and analytics, it is more and everywhere. That should be the goal if you’re going to deliver financial management data that belongs everywhere in personnel decisions, supply chain decisions and in tactical execution. It’s about pervasively embedding decision support capability in business processes, in mission process workflows and everywhere you go.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

The post DoD Cloud Exchange 2024: Qlik’s Andrew Churchill on unifying DoD’s cloud enterprise first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-qliks-andrew-churchill-on-unifying-dods-cloud-enterprise/feed/ 0
Multimodal geospatial data – Shrinking time from acquisition to actionable insights https://federalnewsnetwork.com/commentary/2024/03/multimodal-geospatial-data-shrinking-time-from-acquisition-to-actionable-insights/ https://federalnewsnetwork.com/commentary/2024/03/multimodal-geospatial-data-shrinking-time-from-acquisition-to-actionable-insights/#respond Wed, 13 Mar 2024 20:03:52 +0000 https://federalnewsnetwork.com/?p=4924638 Why is geospatial data performance (fast query time) such an obstacle? Various types of geospatial data are often kept in purpose-built data silos.

The post Multimodal geospatial data – Shrinking time from acquisition to actionable insights first appeared on Federal News Network.

]]>
Disputes and conflicts abroad call for rigorous contingency planning on the part of U.S. defense and intelligence operations. Part of this preparation involves having accurate, up-to-the-minute intelligence from multiple geospatial modalities.

Geospatial knowledge has always been core to military intelligence, but its highly dynamic nature makes it vital to collapse the time window between data capture and its analysis and dissemination. Today, this latency is growing more pronounced as data volume, variety and speed grow at a mind-boggling rate. Such cumbersome access to information leads to slower, less accurate decision-making, which can negatively impact geo-intelligence.

Why is geospatial data performance (fast query time) such an obstacle? Various types of geospatial data are often kept in purpose-built data silos. Not having all this data in one place is a major impediment, forcing geospatial analysts to resort to inefficient, time-consuming and cumbersome methods to consolidate and analyze data in aggregate, where it becomes inherently richer.

Any single dataset only gives a finite amount of information that is used for a limited number of purposes. Integrating and linking these datasets is the key to unlocking valuable insights that lead to better decision-making. Spatial overlays – or the joining and viewing together of separate datasets sharing all or part of the same area – are an example. In a defense context, this ability to view data in aggregate via superimposing may translate into increased situational awareness, such as mapping terrain and the movement of people to identify hotspots for illegal border crossings or drug trafficking.
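As a minimal sketch of such an overlay, assuming two polygon layers in hypothetical files and the open-source GeoPandas library, the intersection below joins the attributes of both datasets wherever they cover the same ground. The file names and column names are invented for illustration.

    import geopandas as gpd

    # Hypothetical layers covering the same area; file and column names are invented.
    terrain = gpd.read_file("terrain_zones.geojson")
    movement = gpd.read_file("movement_density.geojson")

    # Align coordinate reference systems before overlaying.
    movement = movement.to_crs(terrain.crs)

    # Intersect the layers: each output polygon carries attributes from both inputs.
    overlay = gpd.overlay(terrain, movement, how="intersection")

    # Example downstream filter: rugged terrain with high observed movement.
    hotspots = overlay[(overlay["ruggedness"] > 0.8) & (overlay["density"] > 100)]
    print(hotspots[["ruggedness", "density", "geometry"]].head())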

Traditionally, data analysts have had to download multiple file formats and develop their own processing pipelines to synthesize and enrich data in this way. Before starting a processing task, an analyst had to search various databases to find the needed data and then download this complex data in multiple formats as inputs into a processing pipeline, with each input requiring its own API. In a defense example, target detection using hyperspectral data requires a custom processing pipeline incorporating aerial imagery for context and possibly point clouds for advanced 3D visualization.

Naturally, this convoluted approach hinders the ability to do rapid data processing spanning multiple sources. There isn’t a single, consolidated place for all geospatial analytics and machine learning, preventing deeper data contextualization.

Rapid processing from multiple data sources is the key to achieving this integrated information richness that supports more informed decision-making. Beyond basic data access and capture, this type of analysis adds even more complexity because heterogeneous tools are used to analyze each data type. For example, currently, advanced imagery analytics require custom tools with limited API integration. Imagine the power that could be unleashed if a single API optimized data access and could integrate all of these tools.

Finally, today’s geospatial analysts face restrictive computing limitations, given that data and compute are largely kept separate. Analysts often have to take the time to spin up clusters, which is outside of their core competency and can slow down time to insights even further. Advances in serverless architectures can eradicate this problem, allowing developers to spin up or down as many applications as they want, as frequently as possible, without concerns about hardware availability.

Any industry relying on geospatial data needs a new approach, one that is capable of delivering insights in minutes as opposed to days or weeks. This can be achieved through:

  • A single platform to support all data types – there needs to be an efficient and unified method to store and analyze all geospatial data and derivative results.
  • Distributed and highly scalable computing that allows geospatial analysts to fully embrace the cloud to run any pipeline at scale without having to initiate and activate clusters.
  • Finally, all of this needs to be accomplished while protecting sensitive information and ensuring data integrity. There should be compliant and isolated on-premises capabilities to ensure compliance with data sovereignty requirements for both your mission and partners.

Geospatial knowledge continues to offer a deep well of insights that are used for the betterment of defense and intelligence operations and, at a higher level, human society. However, the volume, variety, and velocity of this data require a new approach to manage it cohesively since current methods are too fragmented. Doing so will be the key to maximizing the power of geospatial information in the coming years, hopefully transforming data into life-changing intelligence within increasingly short timespans.

Norman Barker is vice president of geospatial at TileDB.

The post Multimodal geospatial data – Shrinking time from acquisition to actionable insights first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/commentary/2024/03/multimodal-geospatial-data-shrinking-time-from-acquisition-to-actionable-insights/feed/ 0
At VA, AI and data optimization hold the promise of better health outcomes, job satisfaction https://federalnewsnetwork.com/federal-insights/2024/03/at-va-ai-and-data-optimization-hold-the-promise-of-better-health-outcomes-and-job-satisfaction/ https://federalnewsnetwork.com/federal-insights/2024/03/at-va-ai-and-data-optimization-hold-the-promise-of-better-health-outcomes-and-job-satisfaction/#respond Tue, 12 Mar 2024 18:37:25 +0000 https://federalnewsnetwork.com/?p=4922598 With the health and well being of veterans at stake, and the large volume of their data on hand, the VA puts the highest level of importance on security.

The post At VA, AI and data optimization hold the promise of better health outcomes, job satisfaction first appeared on Federal News Network.

]]>

At the Department of Veterans Affairs, the goal of increasing AI use cases means better health outcomes for veterans, and greater job satisfaction for the VA employees who serve them. Through programs like the Stratification Tool for Opioid Risk Mitigation (STORM) and Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment (REACH VET), data is being analyzed to proactively identify veterans who might benefit from a specific intervention. Following the executive order on artificial intelligence, VA was able to identify a few dozen use cases for AI. Radiology in particular is leading the way, with more than 75% of FDA-approved AI devices coming from radiology, but the department notes that AI has the ability to improve functions throughout its operations.

“In general, we see the potential for AI to contribute in the near term across really a wide range of areas at VA. And I think I see that in kind of three main categories. So one category is in reducing health care provider burnout. And in that vein we have actively these AI tech sprints which are part of the AI executive order. We’re running two of them right now. One is around assisting with documenting clinical encounters, and the other is extracting information from paper medical records,” Kimberly McManus, deputy chief technology officer of artificial intelligence at the Department of Veterans Affairs, said on Federal Monthly Insights – Operationalizing AI. “Another area similar to that is broadly improving veteran and staff experience. So this is around how can we  augment our current staff by helping with these tedious tasks and reducing administrative burden… And then the third area is better care for patients. And so that’s where I see a lot of these FDA approved medical devices, such as the ones in radiology. But there’s new ones also in pathology and dermatology.”

One way VA is addressing staff experience is by using optical character recognition to convert handwritten notes into machine-readable text. Once those notes are converted, AI can be used to compile and extract the information a caseworker may be looking for.
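The department’s actual tooling isn’t specified, but as a sketch of the general technique, the snippet below runs an off-the-shelf OCR engine over a hypothetical scanned page and then does a trivial keyword scan on the result. Production handwriting recognition would likely need a model tuned for cursive and clinical shorthand; the file path and keywords are placeholders.

    from PIL import Image
    import pytesseract

    # Placeholder path to one scanned page from a paper record.
    page = Image.open("scanned_note_page_01.png")

    # Convert the image to machine-readable text. Stock Tesseract handles print far
    # better than handwriting, so this stands in for a handwriting-tuned model.
    text = pytesseract.image_to_string(page)

    # Once the note is text, simple extraction steps become possible.
    keywords = ["diagnosis", "medication", "follow-up"]
    print({kw: kw in text.lower() for kw in keywords})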

“We can potentially use generative technology to better summarize and identify information … and that kind of search and summarization is really applicable across the VA enterprise as well as across many other companies, whether it’s our health care providers trying to find information in their electronic health records or whether it’s our benefits adjudicators trying to identify information when they are processing claims,” McManus said.

A top priority for VA is veteran suicide prevention, and AI has a huge role in those efforts. The REACH VET program uses an algorithm that stratifies veterans by high risk or low risk of suicide, and identifies those in need of intervention.

“We have programs that can reach out to and provide supportive services to veterans in one area, and we are looking at a variety of areas, for the future,” McManus said. “Broadly, it involves using factors that we know are clinical risk factors for suicide that are already in their electronic health record, and then putting those together to identify risk score.”
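The snippet below is a purely illustrative weighted-sum scorer, not the REACH VET algorithm: the factors, weights and threshold are invented to show the general shape of combining documented risk factors from a health record into a single score used for stratification.

    # Invented factors, weights and threshold -- NOT the REACH VET model.
    RISK_WEIGHTS = {
        "prior_attempt": 3.0,
        "recent_inpatient_stay": 2.0,
        "chronic_pain_diagnosis": 1.0,
        "new_opioid_prescription": 1.5,
    }

    def risk_score(record: dict) -> float:
        """Combine documented clinical risk factors into a single score."""
        return sum(w for factor, w in RISK_WEIGHTS.items() if record.get(factor))

    def stratify(record: dict, threshold: float = 3.0) -> str:
        """Flag records at or above the threshold for proactive outreach."""
        return "high" if risk_score(record) >= threshold else "low"

    example = {"recent_inpatient_stay": True, "new_opioid_prescription": True}
    print(risk_score(example), stratify(example))   # 3.5 high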

With the health and well-being of veterans at stake, and the large volume of their data on hand, the VA puts the highest level of importance on the security of veteran information. The department has developed six basic principles for itself and its partners to follow for trustworthy use of data and AI:

  • Is it purposeful?
  • Is it effective and safe?
  • Is it secure and private?
  • Is it fair and equitable?
  • Is it transparent and explainable?
  • Is it accountable and monitored?

“We have an enterprise data platform that’s called Summit that contains much of our EHR electronic health record data as well as other types of data,” McManus told the Federal Drive with Tom Temin. “We have much of our data in our healthcare, in our cloud platforms that we keep on our network. We also work with other organizations such as Oak Ridge National Labs, who does some of our more deep R&D related to data science and AI.”

As the VA introduces more uses for AI, concerns about drift and bias become a larger issue for the organization. VA’s approach is to keep humans in the loop and to design use cases around the human end user. The human factor also allows the agency to evaluate when AI is the proper solution.

“The number one key to any success of AI, machine learning and AI algorithm is that starting from the beginning, really having those end users, those health care clinical experts, the workflow experts, all really at the table. As data scientists, we understand the math and the computers, but how an algorithm will actually fit into a workflow, actually impact end users, that really requires everyone to be at the table from the beginning,” McManus said. “I’m really excited and optimistic about this space. We definitely are keeping a very strong focus on trustworthy AI and safety. And there are just so many areas that AI and ML and new technology has the potential to positively impact our mission to care for veterans. So I am optimistic, and I think we’ve made a lot of progress. We have a long way to go, but I’m excited.”

The post At VA, AI and data optimization hold the promise of better health outcomes, job satisfaction first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/03/at-va-ai-and-data-optimization-hold-the-promise-of-better-health-outcomes-and-job-satisfaction/feed/ 0
Air Force vice chief needs better solutions for moving data https://federalnewsnetwork.com/air-force/2024/03/air-force-vice-chief-needs-better-solutions-for-moving-data/ https://federalnewsnetwork.com/air-force/2024/03/air-force-vice-chief-needs-better-solutions-for-moving-data/#respond Tue, 05 Mar 2024 23:42:17 +0000 https://federalnewsnetwork.com/?p=4914244 The Air Force collects a lot of valuable data that will "never see the light of day," and the service's new vice chief of staff is looking for solutions.

The post Air Force vice chief needs better solutions for moving data first appeared on Federal News Network.

]]>

The Air Force collects terabytes of data during each mission, but most of it will “never see the light of day.” Gen. James Slife, the service’s newly confirmed vice chief of staff, is making it a priority to improve how the service takes advantage of its own data.

“We’re not at all organized, educated or trained, we don’t have the right policies, we are wholly out of position to be able to take advantage of this,” Slife said at the AFCEA luncheon on Feb. 29. “This is going to be something I’m going to spend the next X number of years being the designated hammer inside the Air Force on this topic. We’ve got a long way to go.”

In his first public speaking engagement since he was sworn in as the service’s No. 2 in December, Slife emphasized the need for the service to address challenges associated with capturing, managing, and more importantly, utilizing the information collected as a strategic asset to enhance its operations.

Every time the F-35 stealth fighter takes off, its various systems and sensors start collecting information. The aircraft’s electronic warfare system, its electro-optical targeting system, the communication suite and cameras provide airmen with a “detailed, cohesive image of everything that [the aircraft] sees and senses.” All the while, it’s recording a massive amount of data, but most of it will most likely be lost.

“There are lessons learned built into that data. There’s the wingman that did the wrong thing. There’s the bad radio call. There is the signal that we’ve never seen before. We need to incorporate that into our future missions to feed our algorithms the truth required for accurate AI models. The problem — there is a high probability that every bit of that valuable data will never ever see the light of day. It’ll all be deleted. And we’ll record over it the very next day,” Slife said.

This data gets deleted because it’s simply too large to be transmitted. Hours of transit time, unbroken horizon video footage of the plane going from point A to point B — all of it takes up a lot of space.

Recorded data needs to be indexed and tagged — a labor-intensive and time-consuming process. When there is no time, that data just gets dumped into a so-called data lake.

“These data lakes have more unusable data than that which is actually usable. These lakes, therefore, become data swamps,” said Slife.

In addition, there is an issue of overclassification. For example, there is data on one of the F-35s airmen want to use for an upcoming exercise. But the tape includes a short conversation about a B-21 taking off somewhere, making the entire recording classified at a top-secret level, even though “99% of what’s going on that sortie is unclassified and could be more readily accessible to the force,” Slife said.

“Our current solutions are sluggish and they’re not totally accurate. And frankly, our own culture of over-classification and protecting data past the point at which we lose the ability for it to become operationally relevant is part of our own problem,” Slife said.

When it comes to the C-17 cargo plane, a wealth of data flows back and forth across a data bus to all the various aircraft systems, but none of the data gets recorded.

“Every one of those 1553 data buses watches that treasure trove of information speed right past it every second. None of it is recorded or analyzed or saved or looked at — none of it,” Slife said.

Given the challenges, Slife said he needs better solutions for moving large quantities of data. The current setup where airmen have to physically transport hard drives between bases is inadequate and unsustainable.

Additionally, he needs automated data processing solutions to filter out irrelevant information and index, tag, and catalog data efficiently. There is also a need for better cross-domain solutions to securely transfer data between different classification levels, putting the right information on the right classification systems quickly, accurately, and most importantly, in an automated fashion.
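As a very small sketch of what “index, tag and catalog” could mean in practice, the snippet below writes a sidecar metadata record next to a recorded file so it can be searched later. Every field name here is hypothetical and unrelated to any actual Air Force system.

    import json
    from pathlib import Path

    def catalog_recording(path: Path, platform: str, sortie_id: str, sensors: list) -> dict:
        """Write a sidecar index record next to a data file so it can be found later."""
        entry = {
            "file": path.name,
            "size_bytes": path.stat().st_size,
            "platform": platform,        # e.g. which airframe produced the data
            "sortie_id": sortie_id,
            "sensors": sensors,          # which systems contributed to the recording
        }
        path.with_suffix(".meta.json").write_text(json.dumps(entry, indent=2))
        return entry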

“I hope this is a bit of a call to action. We need a more holistic approach to this. It can’t just be vendor product A solving problem A,” Slife said. “I need help getting our arms around these problems I’m outlining to you today.”

The post Air Force vice chief needs better solutions for moving data first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/air-force/2024/03/air-force-vice-chief-needs-better-solutions-for-moving-data/feed/ 0
Scaling data safely: Enhancing data security in data storage solutions https://federalnewsnetwork.com/commentary/2024/03/scaling-data-safely-enhancing-data-security-in-data-storage-solutions/ https://federalnewsnetwork.com/commentary/2024/03/scaling-data-safely-enhancing-data-security-in-data-storage-solutions/#respond Mon, 04 Mar 2024 21:09:55 +0000 https://federalnewsnetwork.com/?p=4912540 Businesses are faced with a situation where, if they want to grow in this new era of “big data,” they must invest in data storage.

The post Scaling data safely: Enhancing data security in data storage solutions first appeared on Federal News Network.

]]>
These days, data is increasingly commoditized and is among an organization’s most valuable assets. However, although each individual data point may be tiny, the amount of data a business produces builds up quickly, leaving many organizations with a surplus of data and a number of unique challenges.

As businesses are producing more and more data, there must be somewhere for that data to go. One source reports that around 90% of the world’s data was generated in the last two years alone, showing massive growth in data production year over year.

Businesses are faced with a situation where, if they want to grow in this new era of “big data,” they must invest in data storage. Unfortunately, organizations that hope to store their data in-house face the considerable expense of purchasing and maintaining storage infrastructure, or of paying even larger public cloud storage bills.

Cluster architecture is emerging as a popular data storage solution that allows greater scalability. Unused storage space is one of the biggest sources of waste — both financially and environmentally — in the data industry.

If a growing enterprise purchases more dedicated storage space for each new use case and application, it will be left with low utilization rates. On the other hand, cluster architecture allows organizations to purchase more storage as needed and distribute data across cluster nodes, regardless of application, making it a generally more cost-affordable solution to an increasing breadth of data.

The relationship between scalability and security

Businesses whose data storage needs are growing find that their cybersecurity needs are as well. The more valuable data an organization stores, the more vulnerable it is to becoming a target of hackers and other wrongdoers who hope to access the data for their own nefarious purposes. Thus, while simpler access protection and encryption methods might be sufficient for smaller organizations, their data solutions require several layers of protection and failsafes as they continue to grow.

Ultimately, the main goal of a data storage solution that implements cluster architecture is to maintain access to data even in cases of massive failure. With the proper layers of protection, a user can still access their data — even if a server, rack, or an entire row or data center fails. For example, if a particular server is compromised with a ransomware attack, organizations using a cluster architecture storage system should not lose access to their data, as another copy should be stored in another location to ensure continuous access.

How cluster architecture provides data security

One method that cluster architecture systems often use to protect and preserve access to data is data replication. This is a measure in which multiple copies of the data are written so that one or more copies could be lost or erased and the ability to read the data would not be compromised. Granted, this method can grow expensive, depending on the amount of data an organization needs to store and secure, considering the amount of storage space required to support multiple redundant copies.
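A toy placement rule makes the idea concrete: hash the object name, then write copies to N distinct nodes, so that losing any one node still leaves readable replicas elsewhere. The node names and replication factor below are arbitrary, and real systems use far more sophisticated placement logic.

    import hashlib

    NODES = ["node-a", "node-b", "node-c", "node-d", "node-e"]   # hypothetical cluster
    REPLICATION_FACTOR = 3

    def placement(object_name: str, nodes=NODES, copies=REPLICATION_FACTOR):
        """Pick `copies` distinct nodes to hold replicas of one object."""
        start = int(hashlib.sha256(object_name.encode()).hexdigest(), 16) % len(nodes)
        return [nodes[(start + i) % len(nodes)] for i in range(copies)]

    print(placement("imaging-archive-0042"))   # three distinct nodes hold full copies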

To address the cost challenges of redundancies, many solutions have also begun to implement a process known as erasure coding. The purpose of erasure coding is to use parity to store data so that fewer drives are needed to reconstruct lost data. Although this method is associated with some loss of performance, the cost savings related to erasure coding often make it a desirable alternative for growing organizations with increasingly complex data needs.
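A toy single-parity example shows the principle. With four data chunks and one XOR parity chunk, any single lost chunk can be rebuilt from the survivors while storing only 125% of the raw data, versus 300% for three-way replication; production systems use Reed-Solomon-style codes (for example, four data plus two parity chunks, roughly 150%) to survive multiple simultaneous failures. The chunk contents below are arbitrary.

    def xor_parity(chunks):
        """XOR equal-sized chunks together to produce one parity chunk."""
        parity = bytearray(len(chunks[0]))
        for chunk in chunks:
            for i, byte in enumerate(chunk):
                parity[i] ^= byte
        return bytes(parity)

    data_chunks = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]   # 16 bytes of user data
    parity = xor_parity(data_chunks)                     # 4 extra bytes of parity

    # Rebuild a lost chunk by XOR-ing the parity with the surviving chunks.
    lost = 2
    survivors = [c for i, c in enumerate(data_chunks) if i != lost]
    assert xor_parity(survivors + [parity]) == data_chunks[lost]

    stored = sum(len(c) for c in data_chunks) + len(parity)
    print(stored, "bytes stored for", sum(len(c) for c in data_chunks), "bytes of data")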

Ceph as an ideal cluster storage solution

Of the cluster architecture solutions that have emerged, the most popular is the open-source platform Ceph. Ceph implements many of these features — data replication, erasure coding and proper distribution of data — to ensure users can maintain access to their data without interruption, even in cases where there is what may have otherwise been perceived as “catastrophic failure.”
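Client code doesn’t usually see any of that machinery. Assuming a reachable cluster, a default ceph.conf path and a pool named “archive” that was created with either a replicated or erasure-coded profile, a minimal sketch using Ceph’s python-rados binding might look like this:

    import rados   # python-rados, the librados binding distributed with Ceph

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")   # assumed config path
    cluster.connect()
    try:
        ioctx = cluster.open_ioctx("archive")   # assumed pool name
        try:
            # Replication or erasure coding is applied transparently by the pool's
            # profile; the client simply writes and reads named objects.
            ioctx.write_full("report-2024-03", b"quarterly storage report contents")
            print(ioctx.read("report-2024-03"))
        finally:
            ioctx.close()
    finally:
        cluster.shutdown()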

Because Ceph, along with many other cluster architecture solutions, is an open-source platform, compliance measures are not built-in. However, there are third-party providers designed to streamline the implementation and use of these open-source platforms, which also help manage the compliance of organizations’ data storage. These providers ensure that users’ data has the level of security and privacy required by law and that each Ceph deployment will hold up under the scrutiny of an audit.

For businesses moving forward with a growth-oriented mindset, data storage is likely at the forefront of their concerns, but organizations should also be careful with the security of their growing base of data. Cluster architecture-based storage solutions, such as Ceph, are ideal in that they fulfill a business’s storage needs while providing the level of protection needed for peace of mind.

Martin Verges co-founded Croit GmbH, a company focused on innovative software-defined scale-out storage solutions, in 2017 and is currently its CEO.

The post Scaling data safely: Enhancing data security in data storage solutions first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/commentary/2024/03/scaling-data-safely-enhancing-data-security-in-data-storage-solutions/feed/ 0
Biden EO aims to safeguard sensitive data on fed employees, facilities https://federalnewsnetwork.com/cybersecurity/2024/02/biden-eo-aims-to-safeguard-sensitive-data-on-fed-employees-facilities/ https://federalnewsnetwork.com/cybersecurity/2024/02/biden-eo-aims-to-safeguard-sensitive-data-on-fed-employees-facilities/#respond Wed, 28 Feb 2024 23:28:13 +0000 https://federalnewsnetwork.com/?p=4906456 The new EO will target areas like biometrics, geolocation data, personal health information, and other sensitive data.

The post Biden EO aims to safeguard sensitive data on fed employees, facilities first appeared on Federal News Network.

]]>
The Biden administration, as part of a broader data privacy effort, is attempting to prevent foreign adversaries from gathering sensitive data on federal employees and military service members.

The new initiative comes under an executive order President Joe Biden was expected to sign Wednesday. The EO is aimed at protecting Americans’ sensitive data from being accessed by so-called “countries of concern” through data brokerages and other transactions.

Under the EO, the Justice Department will issue regulations around the bulk sale of sensitive data to unfriendly foreign nations, including China, Russia, Iran, North Korea, Cuba, and Venezuela. The EO will specifically highlight geolocation data, personal health data, personal financial data, and biometrics, among other categories.

“Buying data through data brokers is currently legal in the United States,” a senior Biden administration official told reporters on Tuesday. “And that reflects a gap in our national security toolkit that we’re working to fill with this program.”

In a fact sheet, DoJ also described how its program will regulate the sale of “government related data,” regardless of whether it meets the “bulk” thresholds or not.

DoJ’s regulations will focus on sensitive data marketed “as linked or linkable to current or recent former employees or contractors, or former senior officials, of the federal government, including the intelligence community and military.”

The rulemaking will also address government-related locations by focusing on “geolocation data that is linked or linkable to certain sensitive locations within geofenced areas that the department would specify on a public list,” the department said in its factsheet.

Officials have warned for years that foreign adversaries could use commercially available data as an intelligence tool. In a January 2023 white paper, the MITRE Corporation summarized how advertising technology or “adtech” on mobile phones and other devices could be used to target influential individuals for blackmail and coercion, or even physically map and target sensitive sites.

Brandon Pugh, policy director of the R Street Institute’s cybersecurity and emerging threats team, noted members of the military and intelligence professionals face unique threats in the digital age.

“Adversaries have an interest in identifying them, targeting them for blackmail and disinformation, and tracking their movements to and from government facilities for strategic advantage,” Pugh told Federal News Network. “We have seen this play out in the Russia-Ukraine conflict and there is no doubt that countries like China have similar interests.”

Pugh also noted it will be difficult to track when and how sensitive data reaches countries like China and Russia.

“An area that will be tricky to follow and enforce is when data is ‘re-exported,’ where third parties share the data with countries of concern,” he said.

CISA security requirements for sensitive data

While DoJ will play a lead role under the EO with its advance notice of proposed rulemaking, several other agencies will be involved in advancing the order’s goals to better protect sensitive data.

The Cybersecurity and Infrastructure Security Agency, for instance, will establish security requirements for “restricted transactions” that will be allowed to go forward under the EO, so long as they meet certain stipulations.

“These security requirements will be designed to mitigate the risk of access by countries of concern or covered persons and may include cybersecurity measures such as basic organizational cybersecurity posture requirements, physical and logical access controls, data masking and minimization, and the use of privacy-preserving technologies,” the DoJ fact sheet explains.
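What those requirements will look like in detail is still to be determined, but as a generic sketch of two of the named measures — data minimization and data masking — the snippet below drops fields that aren’t needed and replaces sensitive values with truncated, salted hashes. The field lists, salt and hash length are arbitrary choices for illustration, not anything drawn from the forthcoming rules.

    import hashlib

    ALLOWED_FIELDS = {"agency", "job_series", "region"}              # minimization allow-list
    SENSITIVE_FIELDS = {"ssn", "home_address", "precise_location"}   # values to mask

    def mask(value: str) -> str:
        """Replace a sensitive value with a truncated, salted hash (a pseudonym)."""
        return hashlib.sha256(("demo-salt:" + value).encode()).hexdigest()[:12]

    def minimize_and_mask(record: dict) -> dict:
        out = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
        for field in SENSITIVE_FIELDS & record.keys():
            out[field + "_token"] = mask(record[field])
        return out

    print(minimize_and_mask({"agency": "DoD", "ssn": "123-45-6789", "region": "NCR"}))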

Pugh said it will be notable to watch “how prescriptive these requirements are, how they are assessed and monitored, and how they might evolve over time.”

“Enhancing baseline cybersecurity has been a priority of the federal government even outside of this executive order as the updated NIST cybersecurity framework from this week conveys,” he added.

Meanwhile, the departments of Defense, Health and Human Services, Veterans Affairs, and the National Science Foundation will “consider taking steps to use their existing grant making and contracting authorities to prohibit federal funding that supports, or to otherwise mitigate, the transfer of sensitive health data and human genomic data to countries of concern and covered persons,” the DoJ fact sheet states.

Biden urges Congress on privacy legislation

The White House acknowledged that while the EO is aimed at protecting Americans’ sensitive data, it isn’t a substitute for broader privacy actions. Biden is encouraging the Consumer Financial Protection Bureau to take steps to prevent data brokers from “illegally assembling and selling extremely sensitive data, including that of U.S. military personnel,” the White House said.

And Biden is also urging Congress to pass “comprehensive bipartisan privacy legislation, especially to protect the safety of our children.”

In a statement, Sen. Mark Warner (D-Va.) applauded the forthcoming EO. “While I welcome these steps, today’s action does not assuage the need for comprehensive data privacy legislation,” Warner said. “I urge my colleagues to come together on legislation that finally protects Americans’ privacy online.”

Meanwhile, Sen. Ron Wyden (D-Ore.) also praised the White House’s actions, while calling on the Senate to consider his Protecting Americans’ Data from Foreign Surveillance Act of 2023, which could potentially apply to a much broader set of countries than Biden’s EO.

“Authoritarian dictatorships like Saudi Arabia and UAE cannot be trusted with Americans’ personal data, both because they will likely use it to undermine U.S. national security and target U.S. based dissidents, but also because these countries lack effective privacy laws necessary to stop the data from being sold onwards to China,” Wyden said.

The post Biden EO aims to safeguard sensitive data on fed employees, facilities first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/cybersecurity/2024/02/biden-eo-aims-to-safeguard-sensitive-data-on-fed-employees-facilities/feed/ 0