Ask the CIO: Federal Emergency Management Agency (April 10, 2024)
https://federalnewsnetwork.com/cme-event/federal-insights/ask-the-cio-federal-emergency-management-agency/

How is digital transformation impacting the mission at FEMA?

In this exclusive webinar edition of Ask the CIO, host Jason Miller and his guest, Charlie Armstrong, chief information officer at FEMA, will discuss how digital transformation is supporting the mission at FEMA. In addition, Don Wiggins, senior global solutions architect at Equinix, will provide an industry perspective.

Learning Objectives:

  • Digital transformation at FEMA
  • Shifting FEMA to the cloud
  • Edge computing for the future
  • Employing artificial intelligence
  • Industry analysis

Air Force begins phase 2 of enterprise IT service delivery (April 2, 2024)
https://federalnewsnetwork.com/air-force/2024/04/air-force-begins-phase-2-of-enterprise-it-service-delivery/

The Air Force released a new solicitation and plans to issue another one as part of its overall strategy to centralize many IT modernization efforts.


The Air Force is out with a new multiple award solicitation to modernize all of its base network infrastructure.

The request for proposals uses the phrase "enterprise IT-as-a-service" only a handful of times, but for all intents and purposes, this potentially 10-year contract with a $12.5 billion ceiling is considered Wave 2.

The new RFP calls for a group of large and small businesses to “modernize, operate and maintain the network infrastructure on all Department of the Air Force locations, to include Guard and Reserve bases.”

The Air Force is planning to award at least five contracts to 8(a) firms as well as a minimum of three awards to HUBZone companies, women-owned small businesses, service-disabled veteran-owned small business firms and other small businesses not in a socioeconomic program.

“This effort takes lessons learned from the EITaaS risk reduction effort network-as-a-service effort as well as lessons learned from existing base IT infrastructure modernization efforts to modernize the future base area network (BAN) offering at Air Force bases worldwide,” the RFP states. “This effort intends to modernize the Non-Secure Internet Protocol Router (NIPR) and Secure Internet Protocol Router (SIPR) BAN through an as-a-service model utilizing contractor provided networking services.”

The Air Force says its goal through the base infrastructure modernization (BIM) vehicle is to obtain standardized, innovative and agile IT services, increase integration through a modern, streamlined network and make the vehicle an investment in future mission sets.

Air Force to reduce data centers

Winston Beauchamp, the deputy chief information officer at the Air Force, said the goal is to award the multiple award contract later this spring with the first set of task orders going out before the end of the fiscal year.

Beauchamp said the Wave 2 EITaaS RFP comes as the Wave 1 effort is picking up steam.

“They started by essentially absorbing the bases that were part of our risk reduction experiment originally, that preceded the acquisition, and they are right now delivering common central services that will be applicable to all bases,” Beauchamp said in an interview with Federal News Network after speaking at the AFCEA NOVA Space IT day. “We’re talking about things like centralized helpdesk automation so that folks can do certain things on their own, like resetting passwords, and answering tier zero help desk type questions. Then, also to come, there’s field services: the option for folks to use our contract to put people in the field to support them at the bases, on top of all the centralized security and help desk services.”

The Air Force is using the base infrastructure modernization contract as a key piece of its centralization strategy. Beauchamp said not every IT service needs to be an enterprise service, but there are a wide variety of opportunities for the Air Force to improve how it delivers technology to its users.

For example, across the 185 Air Force and Space Force bases there are about 1,000 data centers running.

Beauchamp said the CIO’s office is making a big push to move applications to the cloud, where it makes sense.

“We fully expect that more and more applications will be moving into our cloud architecture. That’s called CloudOne today, and that contract is up for renewal. It will be recompeted, and we will be calling it CloudOne Next, but the intent is that it will be just the next evolution of the CloudOne program,” he said. “As for the interface between that and the Joint Warfighting Cloud Capability (JWCC), our intent is to leverage that contract to the maximum extent possible by buying cloud services capacity through JWCC, and then managing it under the CloudOne contract. The expectation is that we would continue to acquire cloud through JWCC, where it’s cost effective to do so in bulk, and then we would provision it with security services, the DevSecOps and the other layers of services that we’ve built up over the years under the CloudOne contract.”

Three cloud contracts in the works

The Air Force released its request for information for CloudOne Next in September, and in March it offered more details on its acquisition strategy.

The Air Force expects to release three solicitations for CloudOne Next in the third quarter of 2024 and make the award in the fourth quarter of this year. It will be three single-award blanket purchase agreements on top of the schedules program run by the General Services Administration.

The three BPAs will focus on:

  • Cloud service provider (CSP) reseller and software management
  • Architecture and common shared services
  • Enterprise application modernization and migration

Beauchamp said the Air Force is evolving from silos of excellence, where every system built its own technology stack, to a series of enterprise capabilities where the burden to sustain, modernize and secure is shared.

“What we really have is an opportunity to look at the degree to which there may be commonality between those approaches, either in fact or in potential, and where we can use collective buying strategies to reduce the overall cost, collectively across the Air Force and collectively across DOD, to get the best possible deal through economies of scale,” he said. “If there’s an architectural approach that perhaps could leverage an existing enterprise service, we want to make sure that we have the ability to see them and to make those recommendations to really free up the time and resources so that those dollars can be applied towards more effective mission capability.”

This approach to IT portfolio management is one of the six lines of effort Air Force CIO Venice Goodwine outlined in her strategy.

Other lines of effort include the acceleration of cloud adoption, the future of cybersecurity, including zero trust, workforce development and training, software management and data and artificial intelligence.

Beauchamp said IT portfolio management, or line of effort 4, is one of the most exciting opportunities for the Air Force. He said IT portfolio management can create leverage across the entire department that can result in both savings and money redirected toward mission needs.

“Overall, I think that each of the sub objectives within line of effort four are going to contribute in some way in that direction. Everything from implementing a capital planning and investment control (CPIC) approach within the Department of the Air Force, which we are piloting this year, to improving our monitoring of the user’s experience, which really enables us to target our modernization efforts on those areas where folks are suffering the most, will allow us to make better use of the resources that we have for enterprise IT,” he said. “One of the things we’re going to have to do is really reexamine how we’re implementing CPIC. When I say the pilot, what we’ve done is we’ve selected a major command and a couple of functional areas, where we’re going to put a more rigorous capability in place to really meet not just the letter of the law, but the spirit as well, and apply the data to actually make business decisions. That’s the key. If you’re going to go to the trouble of collecting all this data about your programs, you might as well use that data for informing your decision making.”

Understanding the data is the first step for NIH, CMS to prepare for AI (March 29, 2024)
https://federalnewsnetwork.com/ask-the-cio/2024/03/nih-cms-finding-a-path-to-better-data-management/

NIH and CMS have several ongoing initiatives to ensure employees and their customers understand the data they are providing as AI and other tools gain traction.


The National Institutes of Health’s BioData Catalyst cloud platform is only just starting to take off, despite being nearly six years old.

It already holds nearly four petabytes of data and is preparing for a major expansion later this year as part of NIH’s goal to democratize health research information.

Sweta Ladwa, the chief of the Scientific Solutions Delivery Branch at NIH, said the BioData Catalyst provides access to clinical and genomic data already and the agency wants to add imaging and other data types in the next few months.


“We’re really looking to provide a free and accessible resource to the research community to be able to really advance scientific outcomes and therapeutics, diagnostics to benefit the public health and outcomes of Americans and really people all over the world,” Ladwa said during a recent panel discussion sponsored by AFCEA Bethesda, an excerpt of which ran on Ask the CIO. “To do this, it takes a lot of different skills, expertise and different entities. It’s a partnership between a lot of different people to make this resource available to the community. We’re also part of the larger NIH data ecosystem. We participate with other NIH institutes and centers that provide cloud resources.”

Ladwa said the expansion of new datasets on the BioData Catalyst platform means NIH also can provide new tools to help mine the information.

“For imaging data, for example, we want to be able to leverage or build in tooling that’s associated with machine learning because that’s what imaging researchers are primarily looking to do is they’re trying to process these images to gain insights. So tooling associated with machine learning, for example, is something we want to be part of the ecosystem which we’re actively actually working to incorporate,” she said. “A lot of tooling is associated with data types, but it also could be workflows, pipelines or applications that help the researchers really meet their use cases. And those use cases are all over the place because there’s just a wealth of data there. There’s so much that can be done.”

For NIH, the users in the research and academic communities are driving both the datasets and the associated tools. Ladwa said NIH is trying to make it easier for those communities to gain access.

NIH making cloud storage easier

That is why cloud services have been and will continue to play an integral role in this big data platform and others.

“The NIH in the Office of Data Science Strategy has been negotiating rates with cloud vendors, so that we can provide these cloud storage free of cost to the community and at a discounted rate to the institute. So even if folks are using the services for computational purposes, they’re able to actually leverage and take benefit from the discounts that have been negotiated by the NIH with these cloud vendors,” she said. “We’re really happy to be working with multi-cloud vendors to be able to pass some savings on to really advanced science. We’re really looking to continue that effort and expand the capabilities with some of the newer technologies that have been buzzing this year, like generative artificial intelligence and things like that, and really provide those resources back to the community to advance the science.”

Like NIH, the Centers for Medicare and Medicaid Services is spending a lot of time thinking about its data and how to make it more useful for its customers.

In CMS’s case, however, the data is around the federal healthcare marketplace and the tools to make citizens and agency employees more knowledgeable.

Kate Wetherby, the acting director for the Marketplace Innovation and Technology Group at CMS, said the agency is reviewing all of its data sources and data streams to better understand what they have and make their websites and the user experience all work better.

“We use that for performance analytics to make sure that while we are doing open enrollment and while we’re doing insurance for people, that our systems are up and running and that there’s access,” she said. “The other thing is that we spend a lot of time using Google Analytics, using different types of testing fields, to make sure that the way that we’re asking questions or how we’re getting information from people makes a ton of sense.”

Wetherby said her office works closely with both the business and policy offices to bring the data together and ensure it’s valuable.

“Really the problem is if you’re not really understanding it at the point of time that you’re getting it, in 10 years from now you’re going to be like, ‘why do I have this data?’ So it’s really being thoughtful about the data at the beginning, and then spending the time year-over-year to see if it’s something you should still be holding or not,” she said.

Understanding the business, policy and technical aspects of the data becomes more important for CMS as it moves more into AI, including generative AI, chatbots and other tools.

CMS creating a data lake

Wetherby said CMS must understand its data before applying these tools.

“We have to understand why we’re asking those questions. What is the relationship between all of that data, and how can we improve? What does the length of data look like, because we have some data that’s a little older, and you’ve got to look at that and be like, does that really fit into the use cases and where we want to go with the future work?” she said. “We’ve spent a lot of time, at CMS as a whole, really thinking about our data, and how we’re curating the data, how we know what that’s used for because we all know data can be manipulated in any way that you want. We want it to be really clear. We want it to be really usable. Because when we start talking in the future, and we talk about generative AI, we talk about chatbots or we talk about predictive analytics, it is so easy for a computer if the data is not right, or if the questions aren’t right, to really not get the outcome that you’re looking for.”

Wetherby added that another key part of getting data right is the user’s experience and how CMS can share that data across the government.

In the buildup to using GenAI and other tools, CMS is creating a data lake to pull information from different centers and offices across the agency.

Wetherby said this way the agency can place the right governance and security around the data, since it crosses several data types, including clinical and claims information.

FedRAMP’s overhaul begins with 28 near-term initiatives (March 28, 2024)
https://federalnewsnetwork.com/cybersecurity/2024/03/fedramps-overhaul-begins-with-28-near-term-initiatives/

The Federal Risk and Authorization Management Program is planning several pilots to bring in automation, test out reciprocity and speed up reviews.

The first piece of the Federal Risk and Authorization Management Program’s overhaul is out, but it’s not the document you are expecting.

Instead of the Office of Management and Budget’s revamped guidance, the program management office released a new roadmap for the cloud security program outlining four primary goals, six initiatives and 28 near-term priorities.

OMB’s updated guidance remains a work in progress after the agency released the draft memo in October and accepted comments through Dec. 22. OMB received more than 285 comments.

“Today, what federal agencies need from FedRAMP is not only computing infrastructure, but everything that’s being built on top of it. Modern enterprises today run on a kaleidoscope of cloud-based applications, large and small. It is critical that FedRAMP be well-positioned to make sure federal agencies get the full benefit of these software-as-a-service (SaaS) cloud offerings,” the PMO wrote in a blog post today. “While SaaS applications are used in government, and FedRAMP does have some in its marketplace, it’s not nearly enough and it’s not working the way that it should. We know that for many companies, especially software-focused companies, it takes too much time and money to get a FedRAMP authorization. And we’re particularly cognizant that we need to scale and automate our own processes beyond where they’re at now if we want to meaningfully grow the FedRAMP marketplace.”

The FedRAMP program office has spent much of the past decade, really ever since OMB launched the initiative in 2011, trying to address the criticisms and frustrations over how much time it takes and the cost to earn approvals and certifications.

The new roadmap puts these issues, and several others including reciprocity, front and center through a series of pilots FedRAMP will undertake over the next 18 months.

Source: FedRAMP March 2024 roadmap.

One such proof of concept will focus on enabling agile software delivery by piloting a replacement “significant change request” process that does not block on advance approval.

Another would focus on how FedRAMP could better support machine-readable “digital authorization packages” through automation using the Open Security Controls Assessment Language (OSCAL), something the program has been talking about for four years. The roadmap says FedRAMP will pilot OSCAL with commercial cloud providers and agency partners.

FedRAMP says, “pilot partners should see reduced PMO review of their packages based on their mature processes.”
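To make the idea of a machine-readable “digital authorization package” concrete, here is a minimal sketch of an OSCAL system security plan rendered as JSON with Python’s standard library. The field names follow NIST’s published OSCAL SSP model, but the title, profile URL and control remarks are invented for illustration; a real FedRAMP package carries hundreds of implemented controls and must pass schema validation.

```python
# Minimal, illustrative skeleton of an OSCAL system security plan (SSP).
# This would NOT pass FedRAMP validation as-is; required sections such as
# system-characteristics and system-implementation are omitted for brevity.
import json
import uuid
from datetime import datetime, timezone

ssp = {
    "system-security-plan": {
        "uuid": str(uuid.uuid4()),
        "metadata": {
            "title": "Example Cloud Service SSP",  # hypothetical system name
            "last-modified": datetime.now(timezone.utc).isoformat(),
            "version": "0.1",
            "oscal-version": "1.1.2",
        },
        # Points at the security baseline being claimed (URL is invented).
        "import-profile": {"href": "https://example.gov/fedramp-moderate-profile.json"},
        "control-implementation": {
            "description": "How the system satisfies its baseline controls.",
            "implemented-requirements": [
                {
                    "uuid": str(uuid.uuid4()),
                    "control-id": "ac-2",  # Account Management, NIST SP 800-53
                    "remarks": "Accounts provisioned via the IdP; reviewed quarterly.",
                }
            ],
        },
    }
}

print(json.dumps(ssp, indent=2))
```

Because the package is structured data rather than a static document, tooling on either side can validate it, diff one version against the next and flag exactly which controls changed, which is what makes a reduced-review pilot plausible.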

DISA, CISA pilots on tap

Two other pilots are focused on working with the Defense Department and the Homeland Security Department.

FedRAMP says it wants to test out how it could implement a low-review process with trusted authorizing partners such as the Defense Information Systems Agency.

“We will work with trusted authorizing partners to align our processes and eliminate the need for extensive per-package review by the program,” the PMO wrote.

Another pilot is a combination of new technology and the move toward continuous monitoring. FedRAMP says it wants to migrate to a new technology platform and pilot user workflows within that technology. Additionally, it wants to test the sharing of threat information between the FedRAMP platform and the Cybersecurity and Infrastructure Security Agency’s continuous diagnostics and mitigation (CDM) dashboard.

“We will also work closely with CISA to develop and deploy the best protections for and minimize the risk to the federal enterprise. By combining this with more public documentation and examples of how cloud providers meet FedRAMP’s security goals, we can also streamline the authorization process overall,” the PMO wrote. “There are other things we’re working on too, like exploring reciprocity with external frameworks, and partnering with our colleagues at CISA on scaling secure configuration guides and threat sharing.”

Hiring a new FedRAMP director

Mike Hettinger, a former House staff member and now president of the Hettinger Strategy Group, said while he was pleased to see the roadmap, many of the initiatives are variations of what has been tried in the past.

“I am also glad to see an attempt to address some of the more longstanding issues that have previously plagued the program. One issue that stands out to me in that respect is the proposed pilot on change management. The issue of what triggers a ‘significant change request’ has been a thorn in the side for a lot of cloud providers over recent years and any real effort to address it represents a welcome change,” Hettinger wrote in an email to Federal News Network. “I continue to believe that we must build greater efficiency into the authorization process, including increasing overall capacity and adding automation to speed up the process and reduce costs for CSPs. At the end of the day, we have to find a way to get more FedRAMP authorized products into the federal marketplace, so hopefully these changes help.”

The release of the roadmap comes on the heels of Brian Conrad, the acting FedRAMP director for the last three-plus years, stepping down earlier this month.

The General Services Administration said it will hold two information sessions on April 1 and April 3 about the opening for the new FedRAMP director role.

GSA also will hold an information session about the new roadmap on April 11 to answer questions.

“We’re hoping to see a number of outcomes from our efforts over time. We expect our industry providers to be able to more effectively deploy changes, and our agency partners to see more features — including security features — faster. We expect to stabilize our review ‘backlog,’ and keep it stabilized over the long term. We expect cloud providers, agencies and third party assessors to have a better understanding of our security requirements, leading to higher quality packages and ultimately greater trust in the FedRAMP program,” the PMO wrote. “Most importantly, we want to understand early what’s working and what’s not so that we can adapt our work and priorities as we go. That’s why we’re planning to initiate pilots and deliver minimum viable products (MVPs) early wherever we can, and why we’ll be checking in with customers throughout the process.”

DoD Cloud Exchange 2024: OSD’s Danielle Metz on moving from ‘fiefdoms’ to coherent IT enterprise (March 26, 2024)
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-danielle-metz-on-moving-from-it-fiefdoms-to-a-coherent-enterprise/

Over the past 18 months, DoD has been working to turn myriad OSD offices into an IT enterprise. A new agreement takes that effort a step further.


Up until about a year and a half ago, the 16,000 employees who make up the Office of the Secretary of Defense were the biggest technology user base in the Defense Department that didn’t much resemble an IT enterprise. Collectively, the organization is bigger than the Space Force and many large DoD agencies, but from an IT perspective, the nearly two dozen entities that comprise OSD were largely left to their own devices — figuratively and literally.

But an enormous amount has changed since October 2022, when DoD created a new CIO position to unify 17 OSD staff assistant offices and four agencies into a coherent IT management structure. Most recently — just this month — everyone involved signed a memorandum of agreement to make clear all assigned roles and responsibilities.

“Over the past 10 to 15 years of IT efficiency and consolidation drills, there was a lot of movement of money and resources, but nothing was written down,” Danielle Metz, the OSD CIO, said during Federal News Network’s DoD Cloud Exchange 2024.

“Since we weren’t really united and no one viewed themselves as part of a collective, everyone had different expectations, different thoughts. And because we didn’t have a memorandum of agreement that articulated the common services that were going to be delivered by the service provider — and the price points and metrics associated with that — there wasn’t an understanding of whether what was being delivered was considered good, what was considered not so good and how to correct that. All of that needed to be sorted through. And so just getting that baseline is what we’ve endeavored on in the past 18 months.”

The service provider is the Defense Information Systems Agency, which has been delivering IT services to tenants inside the Pentagon and the National Capital Region through its joint service provider since 2015, when DoD ordered an earlier consolidation of its IT service providers.

Buying, managing IT services at an enterprise level

But until recently, each OSD organization has been on its own when it comes to ordering and implementing those services, depending on their needs, and figuring out for themselves how to use them.

“We’re now acting as an enterprise instead of individual fiefdoms, and that works two ways,” Metz said. “One is that we have collective buying power, but we also are able to advocate for the resources that we all need, and not just the piece parts won by those who were able to navigate the Planning, Programming, Budgeting and Execution process on their own, which is what was happening. There were a lot of organizations that were struggling, and the whole point of a CIO is to democratize access so that we don’t have winners and losers.”

In its initial stages, beyond creating usage, spending and user experience baselines, Metz’s new office — part of the Pentagon’s Directorate of Administration and Management — has had some early wins in deploying common services to the parts of the DoD “fourth estate” that fall within the new OSD enterprise portfolio.

For unclassified email and collaboration services, all 21 of the organizations have now moved to DoD 365, the Pentagon’s cloud-based implementation of Microsoft 365. As of this month, all but one of those organizations has also migrated its secret-level systems to the new classified version of DoD 365, eliminating the need for a hodgepodge of aging information sharing tools at Impact Level 6.

Migrating those systems to a single cloud environment also helps mitigate the network fragmentation DoD organizations have been creating for the last several decades.

“It doesn’t make those fragmentation issues irrelevant, but it helps us prioritize the fact that we do need to do some network simplification, both on our unclassified and classified networks. That’s what DISA has been leading with what they call DoDNet,” Metz said. “We’re working with DISA to accelerate their plans to have that in the Pentagon, so that you don’t have like a Pentagon local area network that’s kind of sandwiched in between all these other various networks, whether it’s classified or unclassified. We really do need to streamline and simplify the network because we have a lot of network outages. We have performance issues.”

Moving toward a single budget for OSD IT

Another major objective: figuring out how to create a unified IT budget for nearly two dozen organizations with widely varying missions, expertise and needs.

Metz said the most sensible way to provide for each organization would be to create a single working capital fund for the entire enterprise’s IT expenditures, rather than forcing each of them to plan their technology budgets via DoD’s arduous and rigid PPBE process.

“In that model, you’re using your crystal ball to assess what is the technology that we need to be able to implement, and then you have to get a lot of details to be able to come up with a funding profile over five years — but you’re doing it two years out, and you’re going to be wrong. And even if you have it programmed, if you’re operating under a continuing resolution, you don’t have access to those dollars. It really slows your ability to drive the important changes that need to take place. In a working capital fund or fee-for-service model, you’re able to make those capital investments and technology insertions a lot more gracefully instead of having to do big bang approaches — which we know in technology never ever works.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Splunk’s LaLisha Hurt on achieving digital resilience (March 26, 2024)
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-splunks-lalisha-hurt-on-achieving-digital-resilience/

Focus on three modernization musts to achieve cloud transformation: strategy, security and buy-in, says Splunk federal leader.


Military and civilian agencies have long struggled to make the jump to cloud computing. Deciding on the right cloud approach and strategy that best aligns with their mission needs for today and tomorrow is no easy task. But more important, agencies continue to struggle with modernization efforts amid concerns about potential security gaps and vulnerabilities the cloud introduces. 

“It’s a tricky balance. The reason why it’s tricky is because organizations rely on various IT and security architecture applications and legacy systems implemented for their specific mission support. Another challenge is that many agencies struggle with having so many tools, having an influx of data coming in from various logs across all these disparate legacy systems — and they don’t integrate well. They don’t talk to one another,” said LaLisha Hurt, public sector industry advisor at Splunk.

Cloud security concerns persist for most federal agencies for a reason, Hurt said during Federal News Network’s DoD Cloud Exchange 2024.

In its 2023 CISO Report, for example, Splunk found that chief information security officers identified cloud applications and infrastructure as having the biggest security coverage gaps across industries, affecting business services (71%), healthcare (64%), technology (64%) and manufacturing (64%).

To address that problem within DoD, the Pentagon awarded the multibillion-dollar Joint Warfighting Cloud Capability contract to establish a common and secure cloud infrastructure. Last year, Chief Information Officer John Sherman instructed the military services to prioritize JWCC for their cloud modernization efforts. So far, less than 2% of the $9 billion contract has been utilized as concerns around security linger. 

Moving to the cloud, however, is essential to modernization efforts. Hurt noted that, in the end, it all goes back to the mission.

The California statewide automated welfare system, for instance, needed to ensure it delivered benefits for Californians in a highly secure and uninterrupted manner. The agency was able to replace three disparate legacy systems with one single cloud-based platform, which saved over $30 million in taxpayer dollars.

“While they also improved productivity and reduced risks, that’s really the mission that this particular entity was trying to solve for,” Hurt said, referring to safe, consistent access to benefits. “And I think it’s similar for other agencies. They have their mission, and they’re looking for help to deliver on that.”

No transformation happens without collaboration

Cloud transformation starts with a strategy and gaining the support of various stakeholders to deliver on the strategy.

“I know that sounds simple, but people want to jump to the capabilities or technologies. But what’s that strategy that you’re trying to align to? And do you have buy-in from not only your leadership but the people that are going to be implementing it — your employees — which I think is equally important,” Hurt said.

Determining the right model will depend on each agency or organization’s unique mission needs and on ensuring that the model can scale as demands grow.

“So many customers are going cloud only. Some remain on-prem for unique mission needs. And then there are others that actually operate in a hybrid environment,” Hurt said. “And I don’t think there’s a right or wrong approach, as long as it serves your business needs. And also, as long as it allows you to scale in the future. That’s important.”

She continued: “The other thing I would say is to take a risk-based approach and ensure you have a strong inventory of assets, systems and classification prior to the migration. You might find that everything does not necessarily need to go to the cloud.”

Splunk spends the most time with customers conducting business value assessments to understand the pros and cons of moving to the cloud versus staying on premises, Hurt said.

“It goes back to the mission. What are the things that are mission-critical to your agency? What are the things that you care about most? And where do you want to house them? And what levels of security do you want to put around them? That will dictate whether you keep things on prem versus move to cloud,” she said. “Where are you trying to gain and obtain more efficiencies?”

It’s also important to expand participation in these conversations and bring in “not only your cyber teams but your infrastructure teams, your chief technology officer, your chief information officer,” Hurt said. “It’s really a cross-functional effort that should be considered when you’re building that cloud strategy.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Navy Reserve tech leaders on cloud-enabled access anywhere, anytime (March 26, 2024)
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-navy-reserve-tech-leaders-on-cloud-enabled-access-anywhere-anytime/

The Navy Reserve’s far-flung workforce needed secure IT access anywhere, anytime. Now the reserve is deploying the capability across the fleet.


When the Navy needed to quickly pivot tens of thousands of remote users off of virtual private networks because of the Ivanti security vulnerability, it managed to quadruple its use of a newer solution within the span of a week. But it’s unlikely that officials would have had the confidence to make such a large move if the Navy Reserve hadn’t already paved the way.

That newer solution — called Nautilus Virtual Desktop (NVD) — started as a pilot project in the Navy Reserve in 2022. That was a logical starting point, since many of the 60,000 sailors who make up the reserve tended to be the least likely to have physical access to Defense Department networks in their day-to-day lives.

NVD makes that a nonissue. Sailors can access a virtual instantiation of the Navy-Marine Corps Intranet (NMCI) from any computer, including personal ones.

“We have Navy Reserve centers in every single state, and every sailor that is in the Navy Reserve right now is either in a fleet-concentrated area or in a non-fleet-concentrated area,” Cmdr. Stevie Greenway, the reserve’s deputy chief information officer, said during an appearance during Federal News Network’s DoD Cloud Exchange 2024. “For the sailors who are not in a fleet-concentrated area, they don’t have a lot of access to government computers and devices. So the virtual desktop is a perfect opportunity for them to be able to work from their homes and do some of their drill weekend stuff anytime they need to.”

Increased cost effectiveness

That ability has already helped the Navy Reserve cut its IT costs. Although it still maintains government computers at its reserve centers for sailors who are unwilling or unable to use their own computers via NVD, in most cases, there’s no longer a need to maintain large computer labs at those centers just so sailors can check their government email.

But making the transition also involved some up-front funding and technical challenges, said Lt. j.g. Christopher Gregory, command technology director for the Office of the Chief of the Navy Reserve.

“Our number one hurdle was kind of figuring out, ‘OK, each account represents dollars. How are we going to control this? How are we going to parse it out to our force?’ We started out with kind of a rudimentary process that involved a number of steps,” Gregory said. “I came in with that deck plate knowledge because I was the sailor coming in on drill weekends. And I thought to myself, ‘What’s the easiest way for one of my members to get online and to register and to send an email?’ So we designed an automation that removed all barriers. With a blank email sent to our addresses, a sailor is signed up, and they’re good to go with NVD.”
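The registration flow Gregory describes, a blank email that triggers account provisioning, can be pictured with a short sketch. The Python below is a hypothetical illustration using only the standard library’s IMAP and email-parsing modules; the server name, mailbox, eligibility check and provision_nvd_account helper are all invented, since the Navy’s actual automation is not public.

```python
# Hypothetical sketch of a "blank email signs you up" automation.
# Assumes an IMAP-accessible registration mailbox; all names are invented.
import imaplib
import email
from email.utils import parseaddr

REGISTRATION_SERVER = "imap.example.mil"  # assumption, not the real host

def provision_nvd_account(address: str) -> None:
    """Stand-in for whatever backend call actually creates the NVD account."""
    print(f"Provisioning virtual desktop access for {address}")

def process_registrations(user: str, password: str) -> None:
    with imaplib.IMAP4_SSL(REGISTRATION_SERVER) as conn:
        conn.login(user, password)
        conn.select("INBOX")
        # Only look at messages we have not handled yet.
        _, data = conn.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            sender = parseaddr(msg.get("From", ""))[1]
            # The body can be blank; the sender address itself is the signup.
            if sender.endswith(".mil"):  # invented eligibility check
                provision_nvd_account(sender)
            conn.store(num, "+FLAGS", "\\Seen")  # mark the message handled
```

The point of the design is the low barrier: the only thing a sailor must know is a single email address, and the automation removes every other step.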

For the broader Navy, the eventual goal is to move about 200,000 users to NVD, which Gregory said offers sailors a better user experience than connecting directly to NMCI on a government-furnished computer.

Next step: Mobile devices

Meanwhile, the reserve is in the second phase of another pilot project to accomplish essentially the same objective with mobile devices. It’s called Mobile Application Management Without Enrollment (MAM-WE). Much like the NVD concept, the idea behind MAM-WE is to let sailors use mobile apps to access their Navy accounts from their own devices.

“It’s another game changer,” Greenway said. “It allows you to use things like Microsoft Teams and Outlook, so you can send Navy emails from your personal device. Our big belief is we have to focus on the workforce — that’s one of the pillars in our information strategy. We have young sailors coming in, and we cannot tell them, ‘Hey, you can’t use your $1,000 device.’ We want to maximize that ability wherever they are to be able to work on their personal devices and do the Navy Reserve work they need to. We’re still getting user feedback, but everything so far has been very positive.”

MAM-WE works by keeping government applications in separate, virtualized containers on a user’s phone — isolated from other software that could pose a threat to government networks. And officials consider the risk of data spills to be very low because no government data is stored at rest on personal devices. Rather, it’s kept in a secure cloud environment.

For both NVD and MAM-WE, those types of advancements have helped make Navy security officials comfortable with bring your own device concepts that would have been difficult to swallow a decade ago.

“Honestly, it’s industry technology. The state of the market has risen, and DoD has smartly matched that,” Gregory said. “We’ve embraced zero trust in our architecture and how we build our networks, which is what the rest of the industry is going with. We have conditional accesses, which are kind of a first step. And then there’s artificial intelligence — our favorite buzzword — out there monitoring our networks autonomously and intelligently for the first time. So when a potential adversary tries to log in with a conventional conditional access token, from some area in the world at a certain time, the AI is able to identify that and shut them down. I would love to take the credit, but technology has come so far, so fast, and AI is really protecting our networks day in and day out.”
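
The conditional access check Gregory sketches can be reduced to a toy rule set: compare each login against a per-user baseline, then allow it, demand step-up authentication or deny it. Everything below, fields, thresholds, the baseline itself, is invented for illustration; real systems model this statistically inside commercial identity platforms.

    from dataclasses import dataclass

    @dataclass
    class LoginAttempt:
        user: str
        country: str
        hour_utc: int          # hour of day the attempt arrived
        device_enrolled: bool  # request came from a managed, registered device

    # Hypothetical per-user baseline learned from login history.
    BASELINE = {"jdoe": {"countries": {"US"}, "active_hours": range(11, 24)}}

    def evaluate(attempt: LoginAttempt) -> str:
        """Return allow, step-up or deny based on deviation from the baseline."""
        profile = BASELINE.get(attempt.user)
        if profile is None or not attempt.device_enrolled:
            return "deny"
        anomalies = 0
        if attempt.country not in profile["countries"]:
            anomalies += 1
        if attempt.hour_utc not in profile["active_hours"]:
            anomalies += 1
        if anomalies == 0:
            return "allow"
        return "step-up" if anomalies == 1 else "deny"

    print(evaluate(LoginAttempt("jdoe", "US", 15, True)))  # allow
    print(evaluate(LoginAttempt("jdoe", "RU", 3, True)))   # deny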

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: USTRANSCOM’s Michael Howard on becoming a more agile-minded organization
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-transcoms-mike-howard-on-becoming-a-more-agile-minded-organization/ (Tue, 26 Mar 2024 00:49:43 +0000)
The U.S. Transportation Command will kick off a three-year effort to make it easier for users to access data anywhere, anytime, its transformation chief says.

The U.S. Transportation Command isn’t new to cloud services. The command has put workloads and applications in off-premises compute and storage instances since 2016.

Despite its time and experience using cloud, USTRANSCOM continues to take a measured approach in how it expands the use of these capabilities.

Michael Howard, engineering and digital transformation division chief at USTRANSCOM, said about 60% of the command’s working capital fund programs are currently in the cloud, mostly through an infrastructure as a service approach.

Over the next few years, the goal is to combine process with technology to become a more agile organization, Howard said during Federal News Network’s DoD Cloud Exchange 2024.

An important step in that direction is an upcoming update to the agency’s memo guiding IT modernization.

“The memo helps us set the tone for all of our IT efforts. It really shows the strategic level importance of modernization in an organization like USTRANSCOM that really needs to strive at being at the leading edge of the transportation industry,” Howard said. “We’re not only shifting our software development processes to be more agile, but at an organizational level, we’re also shifting to be more agile-minded. The modernization memo really projects some goals over the next 12 to 18 months.”

Improving USTRANSCOM’s IT management

USTRANSCOM also wants to further expand its DevSecOps platform and end the use of the waterfall development methodology, he said. In addition, it wants to create a continuous authorization to operate process that is coupled with the agency’s software architecture and containerized microservices.

“We really will finally get after what we know today as the way things communicate through ports and protocols, and really adopting application programming interface-based communication,” Howard said. “The key ingredient to it is something we added this year when the chief financial officer joined the chief information officer as a signature authority. We’re synergized from a business perspective, and we want to be cost-informed as we strategically move through this memo, certainly over the next 12 to 18 months.”

With the CFO actively involved with technology planning and implementation, Howard said command leaders can better answer questions about the cost to develop and sustain IT programs as well as the prioritization of these initiatives.

“To quote our CFO, she says, ‘One of my biggest problems is managing our IT portfolio.’ It’s the same problem, I think, across the entire DoD. The fortunate thing for USTRANSCOM is we are a working capital fund, whereas the appropriated combatant commands have to demonstrate a lot more scrutiny. We have some flexibility, but we also know that flexibility needs to have some responsibility,” he said. “Our memo before missed an opportunity where we could be more cost-informed. The other thing that does is it now provides some audit capability of that cost, schedule and performance — and then helps us be good stewards of taxpayer dollars as we maneuver through the cloud.”

New USTRANSCOM cloud initiative

A new initiative called USTRANSCOM Anywhere illustrates this more integrated approach to cloud.

The command wants to use Microsoft Azure for cloud hosting to gain some of the capabilities that come with the disconnected Microsoft Azure stack hosted on premises today. Through the Azure stack, USTRANSCOM would deploy capabilities as microservices so the right person could access data at the right time from anywhere through the unclassified network, Howard said.

“What we realized is that once we achieve a platform as a service capability of microsegmented, data-containerized applications, the next thing would be is how can that microsegmented data exist in a denied, degraded, intermittent or limited — DDIL — environment,” he said. “USTRANSCOM Anywhere has a focus to utilize our current [unclassified network] that we provide today and to segment that capability in a continuous integration, continuous delivery fashion.”
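
Operating through DDIL conditions, in the rough terms Howard lays out, means an application keeps accepting work when its link to the cloud drops and reconciles later. One common pattern is local store-and-forward; the sketch below is generic and assumes nothing about USTRANSCOM’s actual design.

    import queue
    import random

    class StoreAndForward:
        """Buffer updates locally; replay them to the cloud when the link returns."""

        def __init__(self):
            self.outbox = queue.Queue()

        def link_is_up(self) -> bool:
            return random.random() > 0.5  # stand-in for a real endpoint health check

        def send(self, record: dict) -> None:
            self.outbox.put(record)  # enqueue first, so nothing is lost while offline
            self.flush()

        def flush(self) -> None:
            while self.link_is_up() and not self.outbox.empty():
                print("replicated to cloud:", self.outbox.get())

    client = StoreAndForward()
    client.send({"shipment": "A123", "status": "arrived"})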

The agency will roll out USTRANSCOM Anywhere in a three-phased approach over the next three years.

Howard said this first year is focused on creating the “landing zone” to determine what services users will need most in the environment.

“That culminates with a beta test. Today, I think we looked at about 25 uses. That might increase as we learn more about the environment. It also culminates with 70 of 91 zero trust target-level activities,” he said. “Phase 2 looks like a deployment phase. We will look at the migration of on-premise services that we provide today and house them in the Azure cloud capability. Then, Phase 3 looks like the test and use cases in the disconnected state. We will look to the Azure stack capability to provide some of the microsegmented data in certain parts geographically to try to get it to as close as to what we would provide the warfighter tomorrow.”

Scalability, reliability of cloud services

Howard acknowledged this would be a major culture shift from how USTRANSCOM operates today and has for decades.

As part of the USTRANSCOM Anywhere initiative, the agency expects to introduce a virtual desktop infrastructure and a disconnected capability through the cloud.

“First and foremost, I suppose it’s an understanding of how this will work for global logistics. We wouldn’t want to rush to that,” Howard said. “Secondly, I would say that in the first year, this is really a proof of concept. We, again, with our CFO, are in line to prove out in that first year that this is really something that we want to do, instead of saying, ‘Yeah, we’re all in, and we have no points of return.’ The phasing really helps us get to some decision points to ensure that this is exactly how we want to proceed forward.”

Through all of these updated memos, new procedures and technology pilots, Howard said one of the most important goals is to improve how the command takes advantage of the scalability and reliability of cloud services to improve logistics, especially in a contested environment.

Getting warfighters the data necessary to make better and faster decisions is the most important metric underlying these efforts, he said.

“What’s nice is, today, with our modernization memo, we’re able to somewhat forecast what the probability is of cost, schedule and performance for an application to migrate, whether lift-and-shift or migrate through a DevSecOps platform. What’s nice about that is we are tied in with our enterprise IT portfolio mission area manager or our chief operating officer that’s listed on our modernization memo. And we give quarterly updates. Those quarterly updates actually go into an update to our CEO,” Howard said.

“We’re describing the benefits of a fully modernized platform as a service, where you have microsegmented containerized applications that exist for business function, have immutable code and are really, for lack of better words, defendable. The end state is truly that capability to be business-focused. It’s not that you do zero trust. It’s how you use it, and this is the same thing: It’s not that we’re doing the cloud. It’s how we’re using it.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: DISA’s Korie Seville on crafting cloud products that easily adapt to user need
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-disas-korie-seville-on-crafting-cloud-products-that-easily-adapt-to-user-need/ (Mon, 25 Mar 2024 13:02:02 +0000)
The Hosting and Compute Center at DISA creates new services to lower the barrier to quick cloud adoption and scalability.

Rank Korie Seville among the Defense Department’s go-to guys for cloud computing smarts.

Seville’s title at the Defense Information Systems Agency might sound a bit cryptic: deputy chief technology officer for compute and senior technical adviser for J9 hosting and compute. But he’s clear about his two-hatted role helping Defense Department agencies succeed in their cloud deployments.

At the J9 — Joint Operations and Plans Directorate — level, “I basically act as an integration point between DISA’s hosting and compute directorate and the rest of the agency from a cloud computing perspective,” Seville said during Federal News Network’s DoD Cloud Exchange 2024. “At the external level, I basically work with other agencies’ CTO-level engineers and leaders within different agencies to advise them on their cloud portfolios, their migration strategies and their overall hosting and compute strategies.”

Seville said that cloud computing, as agencies move from simply hosting applications to reworking them into microservices, has enhanced DoD’s capability to distribute workloads and create greater resiliency. The evolution also improves computing outside the continental United States (OCONUS), a critical challenge for DISA and DoD generally.

Defense users gain “the capability to expand and get better services, no matter where they are across the world,” he said.

Tapping cloud effectively OCONUS

Asked about DISA’s signature cloud computing contracts, known collectively as the Joint Warfighting Cloud Capability, Seville said the program supports OCONUS needs by easing the buying process for cloud. JWCC “is just an acquisition vehicle,” he said. “It’s a way to purchase cloud, but it doesn’t necessarily by itself solve the OCONUS problem.”

Contractors on JWCC, though, do provide a variety of tactical edge platforms, ranging from modular data centers to backpack-sized computing units.

How teams want to use the cloud, and how cloud can transform the way they deliver the mission, is an intense focus for DISA and Seville. For instance, what if two deployed teams focused on the same area of responsibility want to collaborate, he said. How would that happen? And how could it be done as an operational expense “instead of having to put a large capital investment in to utilize these cloud capabilities?” Seville said.

One answer is Stratus, a government-owned cloud product DISA has deployed in Hawaii, he said. A second option is DISA’s joint operational edge product that uses public clouds. Seville said DISA partners with the DoD chief information officer “to push public cloud capabilities to the OCONUS user community.” A new instance of that capability is under development for Japan and a couple of other locations.

Basically, it consists of one of the commercial cloud services providers, in this case Amazon, providing its hardware housed in DISA secure facilities and operated by government employees. Seville said DISA plans to add the other JWCC suppliers “to be able to get their enterprise-grade deployable solutions, put them in our facilities and have them there for consumption.”

Seville said his group partners closely with DISA’s program executive office for transport, which manages the network connections needed for computing nodes to communicate with one another.

“Their technical director and myself stay very connected. We basically sit together and share roadmaps,” he said, adding that sometimes “my roadmap is going this way, your roadmap is going that way.”

When that happens, the two offices work out “where can we meet and take advantage of some of the resiliency that each of us is building in to make our products operate better together,” Seville said. But they leave the choice of specific transport options to the users, he said.

Providing new DISA common services for the cloud

Another DISA cloud project now under development, dubbed Olympus, focuses on common services that surround cloud workloads.

“These are things like name resolution, Domain Name System capabilities, certificates, network time — all of these things that are often overlooked in application deployment,” Seville said, “but they’re crucial to getting an application off the ground.”

Olympus will provide these services as needed. Two minimum viable products created so far for Olympus focus on core competencies:

  • Network connectivity and boundary protection
  • A basic suite of common services

The addition of common services elements will result in what Seville called a managed platform, “where the customers can just come in and drop their apps, and we remove the burden, or share the burden, of bringing all of those common services up and operational.”
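
One way to picture the managed platform Seville describes is an onboarding manifest: the application team declares which common services it needs, and the platform wires them up. The schema below is our invention for illustration; only the service names echo his examples.

    # Hypothetical manifest an app team might submit to a managed platform.
    MANIFEST = {
        "app": "logistics-dashboard",
        "common_services": ["dns", "certificates", "network_time"],
        "ingress": ["443/tcp"],  # boundary protection request, per the first MVP
    }

    SUPPORTED = {"dns", "certificates", "network_time", "logging"}

    def validate(manifest: dict) -> list:
        """Return the platform services to provision, rejecting anything unsupported."""
        requested = set(manifest["common_services"])
        unknown = requested - SUPPORTED
        if unknown:
            raise ValueError(f"unsupported services requested: {sorted(unknown)}")
        return sorted(requested)

    print(validate(MANIFEST))  # ['certificates', 'dns', 'network_time']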

The basic goal is to help DISA’s customers meet cloud needs quickly by “really lowering the barrier for entry for getting started in cloud.” He pointed out that the Air Force’s Cloud One and the Navy’s Flank Speed programs provide similar services. But because those service-driven projects are focused on their respective organizations, “we designed Olympus to catch the customers that may have fallen through the cracks,” Seville said.

DISA hosts the pilot version of Olympus in the Microsoft Azure cloud. Seville stressed that the Hosting and Compute Center (HAC) takes an iterative approach informed by customer feedback when crafting products, Olympus included.

“When we develop any of our capabilities, we really try to get away from that five-year plan, 10-year plan, where we know exactly where we’re going to go and nothing can force us to deviate,” he said, adding, “The most important thing to us is our customers, the warfighters. They know their missions better than we do. For us to prescribe where we’re going to go doesn’t make sense if our goal is to support the warfighter.”

Ensuring ‘optionality’ in all DISA cloud offerings

HAC views providing choice as a foundational factor in helping DoD users implement the cloud capabilities they need to meet their specific situations and missions.

“One of the design tenets, and one of the tenets of our entire organization, has been optionality,” Seville said. “And so when I have an OCONUS user who’s trying to build out a capability, we’re going to provide them with a menu of options.”

He used the analogy of a pizza parlor menu, where a customer can choose from a variety of toppings for their pie: “Do they want a combination of tactical edge, operational edge and maybe some data center as a service to give them the ultimate level of resilience? Or do they want to go strictly tactical edge and just maintain local ownership of that computing capability?”

As cloud hosting has taken hold in DoD, Seville said he’s now seeing increased use of the elasticity and flexibility cloud computing offers. An important reason is that early estimates of cost savings from simply shifting workloads failed to pan out.

“People are starting to realize that taking advantage of elastic scale, taking advantage of serverless capabilities, that’s how you’re going to save that money,” he said. To get there, though, application owners will have to go the refactoring or redeveloping route. And he said users will also have to keep rationalizing their application sets, retiring those that won’t work in the cloud.

“There is an app refactor model that has to take place in order for you to effectively take advantage of elastic scale,” Seville said. DISA can partner with users redoing applications to help them fully realize cloud benefits.

By going to containerization and microservices for applications, Seville said, users will get closer to cloud interoperability and easily moving workloads among competing cloud providers. That vision of a “cloud-agnostic, multicloud, hybrid cloud, pick-up-an-app-and-move-it-wherever-I-want model really relies on that app rationalization, that app modernization framework.”
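
Elastic scale, in the sense Seville uses it, means capacity follows demand instead of being provisioned for the peak. A minimal illustration, with arbitrary numbers, is a worker count derived from queue depth and bounded by a cost ceiling:

    def desired_workers(queue_depth: int, per_worker: int = 10,
                        floor: int = 1, ceiling: int = 50) -> int:
        """Scale worker count with demand, bounded below for availability, above for cost."""
        needed = -(-queue_depth // per_worker)  # ceiling division
        return max(floor, min(ceiling, needed))

    for depth in (0, 35, 1200):
        print(depth, "queued ->", desired_workers(depth), "workers")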

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Akamai’s Robert Gordon on streamlining cloud operations at scale
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-akamais-robert-gordon-on-streamlining-cloud-operations-at-scale/ (Fri, 22 Mar 2024 11:39:22 +0000)
Managed service providers can tackle the coordination necessary across cloud providers, DoD agencies and a multitude of apps, the Akamai systems engineer says.

For the Defense Department, the benefits of hosting applications in the cloud bring some challenges. Chief among them? Avoiding the cost and time of repeatedly developing computing services common across all applications.

The simple answer is that DoD agencies should instead develop once and use many times, said Robert Gordon, director of engineering for Akamai Defense. The development and deployment of such services offers an ideal way to use managed service providers, he said. MSPs operate in a value-added manner between DoD agencies and primary commercial cloud service providers.

“On one side, there are the mission application teams that are trying to move their workloads to the cloud. On the other side, MSPs sit on top of the commercial clouds, and they try to figure out ways to be able to support these applications at massive scale,” Gordon said during Federal News Network’s DoD Cloud Exchange 2024.

DoD components sometimes have hundreds or thousands of cloud-hosted apps, what he called a critical mass. Each, though, needn’t have its own unique services.

For example, every cloud application requires a user access mechanism that’s not related to the operation or logic of the app itself. Access solutions, Gordon said, can be difficult to engineer because of the many DoD rules around security and other characteristics.

Akamai “focuses on those common hard problems because the benefit of solving that problem is multiplied by hundreds or thousands of instances,” he said.

After access comes authentication “and how it fits in with the zero trust initiative that’s sweeping through DoD and is totally tied in with the cloud is another aspect of this,” he said. “The mission application teams are on their own to try to figure out how to do it, unless there’s a common services layer” providing the service.

Such common services “are the things the MSPs should look for, the things that everyone’s going to have to do,” he added. “Everyone’s going to have to solve this problem.”
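
The develop-once, use-many idea reduces to packaging the hard, common problem, here access control, as a layer every application imports instead of rebuilds. A minimal sketch, with a dummy token check standing in for real credential validation:

    from functools import wraps

    def require_access(check_token):
        """Common access layer written once and shared by every mission app (sketch)."""
        def decorator(handler):
            @wraps(handler)
            def wrapped(request):
                if not check_token(request.get("token")):
                    return {"status": 401, "body": "access denied"}
                return handler(request)
            return wrapped
        return decorator

    def demo_token_check(token) -> bool:
        return token == "valid"  # placeholder for real CAC/PKI or OIDC validation

    @require_access(demo_token_check)
    def mission_app(request):
        return {"status": 200, "body": "hello from the mission app"}

    print(mission_app({"token": "valid"}))
    print(mission_app({"token": None}))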

Taking advantage of common services at scale

Some common services occur on the back end of applications, such as database calls or network connections among apps, Gordon said. He named single sign-on systems that require connections from, say, an application in the Army to an application in the Defense Information Systems Agency.

“They may not have a plug into the Army, or Air Force or whatever DoD backend that has all the enterprise information,” Gordon said.

Plus, application owners typically face a complicated process to obtain access.

An agency’s tech or development staff might know how to write identity cloud service or security assertion markup language, Gordon said. “But that’s only part of the puzzle. The back end is equally important,” he said. “You have no way of figuring out what the attributes are that you need to make your decisions. You have no way of enforcing authorization in a common way, using those attributes.”
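
What Gordon is describing is, in effect, attribute-based access control: the protocol plumbing is the easy half, while fetching authoritative attributes and applying them uniformly is the hard half. A bare-bones sketch, with the directory contents and policy rules invented:

    # Stand-in for an authoritative back end an MSP might broker access to.
    ENTERPRISE_DIRECTORY = {
        "jdoe": {"branch": "Army", "clearance": "secret", "role": "logistics"},
    }

    POLICY = {  # hypothetical: which attributes unlock which resources
        "supply-db": lambda a: a["clearance"] in ("secret", "top_secret"),
        "admin-console": lambda a: a["role"] == "admin",
    }

    def authorize(user: str, resource: str) -> bool:
        """Common authorization path: fetch attributes once, apply a shared policy."""
        attrs = ENTERPRISE_DIRECTORY.get(user)
        rule = POLICY.get(resource)
        return bool(attrs and rule and rule(attrs))

    print(authorize("jdoe", "supply-db"))      # True
    print(authorize("jdoe", "admin-console"))  # False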

Migrating data to the cloud and operating data exchanges also provide opportunities for use of common services, he said.

“Whether database access, or even system-to-system communication, most of these are big, complex systems with a lot of trading partners that are used to being able to FTP files to each other,” Gordon said.

That’s because everyone was on the same DoD information network. The cloud complicates those exchanges and communications connections because now systems use the internet and commercial clouds.

“This is another area where the MSPs provide common services to try to streamline that,” Gordon said. “And when they can’t provide common services, MSPs at least provide playbooks so that the application teams that need to do these things know what they need to do it in a compliant, secure and data-aligned way.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Army’s Rob Schadey on applying rigor to gain cloud efficiencies
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-armys-rob-schadey-on-applying-rigor-to-gain-cloud-efficiencies/ (Fri, 22 Mar 2024 11:14:04 +0000)
PEO EIS is shifting to agile software development and applying the rigor of FinOps, while also making sure cybersecurity is top of mind.

The Army’s Program Executive Office for Enterprise Information Systems is embracing agile software development and delivery in the cloud.

Through the move, Rob Schadey, acting deputy program executive officer at EIS, said his office is shifting many software systems that were previously under the Army’s defense business systems policy into the service’s software pathway policy.

That includes the Global Force Information Management System, ArmyIgnitED, Army Contracting Writing System and several others, Schadey said during Federal News Network’s DoD Cloud Exchange 2024.

“We’re beginning with the flexibilities of software development from an agile perspective,” Schadey said.

The Army earlier this month announced a new policy for software development focused on iterative development and close coordination with users.

Agile software development typically relies on the power of cloud computing. And while there are many approaches to agile development, Schadey said PEO EIS ultimately settled on using the Scaled Agile Framework, or SAFe, method.

“It allows us to deliver capability through agile release trains and program increment, or PI, planning in coordination and cooperation with our customers,” he said.

PEO EIS can incorporate the user and the user experience consistently into software builds, rather than playing a guessing game, as Schadey put it, that comes with traditional waterfall software development.

“More often than not, you’re really focused on the integrated master schedule itself and trying to hit a date, whereas in agile, there’s flexibility there to work with the customer to hit on capability delivery and progress over two-week sprints,” he said. “So instead of playing the guessing game with a date, we’re getting more concrete with delivery of software by leveraging agile SAFe and working with our customer and functional [commands] on the capability and delivery needs that they have.”

FinOps ‘rigor’ key to cloud savings

For the last several years, PEO EIS has also fine-tuned a financial operations practice aimed at maximizing the cost savings opportunities of moving systems and data into the cloud.

Schadey said a team of three people working on FinOps has achieved an $18 million cost avoidance over the last four years.

“That includes leveraging the cost savings plans, modifying storage types, supporting automated patching and catching misconfigurations or inaccurate scripts,” he said. “So an $18 million buy-down to bend the curve over four years to me is a great return on investment. And it really shows that if you put the right rigor and you get the right team in place, you can really identify cost savings to be plugged into other places.”
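
The checks Schadey lists, savings plans, storage tiers, patching, misconfigurations, are the day-to-day substance of FinOps tooling. The toy scan below runs invented rules over invented inventory records; a real pipeline would pull this data from the cloud provider’s billing and inventory APIs.

    # Hypothetical inventory records.
    VOLUMES = [
        {"id": "vol-1", "attached": False, "gb": 500, "tier": "ssd", "reads_per_day": 0},
        {"id": "vol-2", "attached": True, "gb": 200, "tier": "ssd", "reads_per_day": 3},
        {"id": "vol-3", "attached": True, "gb": 100, "tier": "ssd", "reads_per_day": 90000},
    ]

    def finops_findings(volumes):
        """Flag unattached storage and cold data parked on an expensive tier."""
        for v in volumes:
            if not v["attached"]:
                yield f"{v['id']}: unattached {v['gb']}GB volume, candidate for deletion"
            elif v["tier"] == "ssd" and v["reads_per_day"] < 10:
                yield f"{v['id']}: cold data on SSD, candidate for a cheaper tier"

    for finding in finops_findings(VOLUMES):
        print(finding)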

The Army is also eyeing opportunities to take advantage of the Joint Warfighting Cloud Capability (JWCC) contract. But Schadey said, so far, the contract doesn’t have enough volume to keep costs down.

“We’ve done the cost comparisons, and we’re just not at that point where we could transition because it would actually increase the cost to the Army, based off of those cost differences and volumes that we’ve got in place,” he said. “At some point, it will make absolute sense to help drive down those costs on the next contract vehicle that the Defense Information Systems Agency and JWCC work to put into place.”

Keeping cloud cybersecurity a priority

Schadey also emphasized the importance of maintaining cybersecurity in the cloud and understanding the shared services model that customers enter when they move data to the cloud.

The Army’s recent shift to software as a service includes partnering with industry to receive audit logs and other cyber capabilities from cloud providers, he said.

PEO EIS is also working on a new capability, called Neighborhood Watch, to advance cloud cybersecurity.

“That very much will be a capability where we’re consolidating and collapsing to a security information/event management capability and solution,” Schadey said. “We are working to put that in place within the cloud. And we will apply the same financial operations rigor because it will be in the cloud.”

Although logging capabilities can drive up storage costs, he said it’s not always about saving money.

“Not everything that’s done from a cloud perspective is done with the intent to drive down costs,” Schadey said. “It’s there to enable efficiencies and automations and tap into the services that cloud can offer. And in this scenario, we will be able to enable machine learning and other things over those datasets to help us get better details and logs with the threats that we’re up against.”
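
The machine learning over log datasets that Schadey anticipates can start as simply as baselining event rates and flagging outliers. The sketch below uses a plain z-score on made-up failed-login counts; it illustrates the idea only and implies nothing about Neighborhood Watch’s design.

    import statistics

    # Made-up hourly failed-login counts from aggregated audit logs.
    history = [4, 6, 5, 7, 5, 6, 4, 5]
    latest = 42

    def is_anomalous(history, observation, threshold=3.0) -> bool:
        """Flag observations more than `threshold` standard deviations above the mean."""
        mean = statistics.mean(history)
        spread = statistics.stdev(history) or 1.0  # guard against a zero divisor
        return (observation - mean) / spread > threshold

    if is_anomalous(history, latest):
        print(f"alert: {latest} failed logins this hour against baseline {history}")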

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Navy’s Louis Koplin on service’s digital transformation horizon
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-navys-louis-koplin-on-services-digital-transformation-horizon/ (Thu, 21 Mar 2024 11:30:55 +0000)
Using a four-phase “horizon” process, the Navy balances user experience and security to bring innovative services to the enterprise.

The Navy bases its guiding principles for digital transformation on four basic concepts that it calls “horizons.”

Its Naval Identity Services are on the cusp of Horizon 1 — being offered as an enterprise service.

Meanwhile, the service’s virtual desktop pilot aims to make it easier for sailors, seamen, civilians and contractors to bring their own devices onto Navy networks. The service expects the pilot to remain in Horizon 2 for the foreseeable future, said Louis Koplin, executive director of the Navy’s Program Executive Office for Digital and Enterprise Services. The pilot’s team must demonstrate that the approach is ready to move into an enterprisewide service.

There are a host of other digital services that are in Horizon 3, the evaluation stage of whether it even makes sense to test them for enterprise use. And Horizon 0 is when a service or system is set to be retired.

Koplin said moving a project into Horizon 2 depends on several considerations.

The first is obvious: Does something actually need to be an enterprise service?

“If you build something that is lean learning and enabling so it’s efficient, it’s lightweight and it’s responsive to feedback and gets better, and actually lets people do their jobs faster, quicker, more securely and more cheaply, no one’s not going to want to use it. So you build a world-class product, and people are going to adopt it,” Koplin said on Federal News Network’s DoD Cloud Exchange 2024.

“On the flip side, there’s definitely value in the designation — for one thing, the process statements and criteria that really forces things to get to a certain level of maturity. We don’t want to put out something that is not credible. I personally have been very defensive about what we call an enterprise service for that reason. I don’t want to put something out that we announced as an enterprise service mandatory use, and then people go and they literally can’t order it. There’s no website. There’s no rate card, or maybe it’s centrally funded, or maybe it’s not. Maybe it’s on less than a full authority to operate. We’ve seen these attempts in the past, and it really is corrosive to the mindsets we want to have with digital experimentation and scaling, and innovation adoption.”

Finalizing memo on Navy Identity Services

The second useful part about the designation as an enterprise service is that it becomes a part of the Navy’s enterprise architecture. Navy PEO Digital detailed each of the technology horizons as part of a communication to industry.

Take the Naval Identity Services. The Navy has been testing NIS out for several years. It’s a key piece of the service’s zero trust journey and consolidates and standardizes the Navy around identity and access management.

The success of the NIS means Navy leadership is expected to sign a memo designating it as an enterprise service this spring.

“It’s got a bunch of things that are really great. The one that I think people don’t realize that I like the best is it allows us to do away with a system access authorization request, that DD-2875 form. It’s not because we’re not doing those checks and balances, of course, but because there’s a workflow and integration with authoritative data sources,” he explained. “For instance, being able to pull civilian status data from the Defense Civilian Personnel Data System instead of having someone type it in, or from the Defense Manpower Data Center for the official defense manpower data. I mean, that’s just great. It saves time, saves effort and the data quality is higher. Then, knowing that you’ve got access to an enterprise information environment when you onboarded, now we don’t need to fill out more of those PDFs. All we have to do is update the attributes, which roles and which workloads are you going to get access to. We have an automated standardized way to do that.”
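
Stripped to its essence, the workflow Koplin describes swaps typed-in form fields for lookups against authoritative sources. DCPDS and DMDC are real systems, but the toy functions and record layout below are placeholders that bear no relation to their actual interfaces.

    def dcpds_civilian_status(edipi: str) -> dict:
        """Placeholder lookup; the real DCPDS interface looks nothing like this."""
        return {"employee_type": "civilian", "grade": "GS-13"}

    def dmdc_manpower_record(edipi: str) -> dict:
        """Placeholder lookup against official manpower data."""
        return {"duty_status": "active", "organization": "NAVWAR"}

    def provision_account(edipi: str, requested_roles: list) -> dict:
        """Assemble an account from authoritative attributes rather than a typed form."""
        record = {"edipi": edipi, "roles": []}
        record.update(dcpds_civilian_status(edipi))
        record.update(dmdc_manpower_record(edipi))
        if record["duty_status"] == "active":  # simple gate before granting roles
            record["roles"] = list(requested_roles)
        return record

    print(provision_account("1234567890", ["email", "sharepoint"]))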

Koplin said the benefits of NIS will extend beyond individuals, and help the Navy in other ways, including with the goal of being financially auditable, federating privileged access management and giving users the ability to work offline, afloat and disconnected.

“Coming up in the next few months, we’ll start work on the Impact Level 6 implementation to go to the secret level,” he said. “We’re working on integration with Marine Corps directory services, synchronization with the Flank Speed attributes. In some cases, we are retiring some legacy identity infrastructure. But in other cases, it’s integrating where it’s already modernized and underpinning our zero trust cloud, and it brings additional capabilities in terms of automation, integration and control.”

Making progress on Navy-wide virtual desktop

The Navy’s other digital transformation efforts are a bit behind NIS.

The service is piloting a virtual desktop for bring your own device users with about 1,000 people. Known as Nautilus, the program aims to take these efforts across the virtual desktop, the physical endpoint and to mobile devices.

“We’re scaling all of them in line roughly with the Windows 11 mandate timeline, so that puts us around 18 months from now. The goal is to have all these composed capabilities that can meet different mission needs based on the persona of the customer that we’re supporting,” Koplin said. “There are some personas that the virtual desktop is great for them because either they don’t get issued a government device or they do today, but we’d like not to issue them a government device. The Naval Reserves have been big users of Nautilus’ virtual desktop.”

Another ideal Nautilus persona is an employee from industry who needs access to the Navy’s networks but can’t have a government-furnished device.

For most Navy servicemembers and civilians, however, the benefits of Nautilus will be two-fold.

“What this really lets us do is reset and go back to industry standard commercial configurations and commercial business processes so that you can log into a device fresh out of the box off the shelf. Just by using your Flank Speed email, your government email, it will log in, connect to the server, configure and register the asset, do all those things. So within, say, 30 minutes, you’re up and running,” Koplin said. “When we talk about moving the dial on those world-class alignment metrics in terms of having a higher customer satisfaction experience and reducing the time from potentially weeks or days to hours or minutes, it knocks them out of the park on many fronts. And that device is fully well managed in our zero trust architecture.”
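
The out-of-the-box experience Koplin outlines is zero-touch enrollment: the login email’s domain steers the device to the right tenant, which registers the asset and pushes configuration. The routing table and step list below are hypothetical, compressing that flow to its skeleton.

    TENANTS = {"us.navy.mil": "flank-speed"}  # hypothetical domain-to-tenant routing

    def enroll_device(user_email: str, serial: str) -> dict:
        """Discover the tenant from the email domain, then register and configure."""
        domain = user_email.split("@")[-1]
        tenant = TENANTS.get(domain)
        if tenant is None:
            raise ValueError(f"no tenant registered for {domain}")
        steps = ["asset registered", "baseline policy applied", "apps assigned"]
        return {"serial": serial, "tenant": tenant, "completed": steps}

    print(enroll_device("jane.doe@us.navy.mil", "SN-0042"))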

Accelerating Navy cloud capabilities

Underpinning many of these efforts is the cloud environment the Navy’s cloud management office runs, called Neptune. PEO Digital recently went live with the first incremental upgrade of a servicewide cloud portal, and now it’s going to accelerate the pace to deliver more capabilities, Koplin said.

“2024 is definitely a building year for us, and that’s where we focus our efforts on getting applications migrated and modernized. But absolutely, we need to think about how we do that and that we’re not pushing a rope; that we don’t want to lift and shift an archaic application into a modern cloud environment and not really gain benefits — potentially even incur greater costs — because it’s not really a cloud-compatible architecture,” he said. “We are being very deliberate about how we do that. There are a lot of things in flight now that I expect to see bear fruit over the rest of 2024 into 2025, whether it’s those standardized onboarding processes, cloud maturity frameworks, standardizing some of our service models and service level objectives and agreements, our financial operations capability. It’s really keeping an eye on the business command and control of our cloud environments and how we’re consuming cloud.”

Koplin said the Navy isn’t doing all of these technology initiatives in a vacuum. The end goal isn’t just to implement the technology but rather improve the end user’s experience without sacrificing cybersecurity.

“Typically, we see this trade-off between security and convenience, customer experience versus cybersecurity. But in the case of zero trust, we’re getting more of both. That’s really exciting when you can make the users happier with a more convenient, lower-friction experience and the cyber operators happier because they have greater control, greater visibility and more fine-grained control,” he said. “We are really helping to enable that change in terms of designing, delivering and sustaining this world-class digital experience. One of the ways we do that is with our world-class alignment metrics, things like reducing user time lost, improving customer satisfaction and improving our adaptability and mobility.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Army’s Leo Garciga on clearing obstacles to digital transformation
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-armys-leo-garciga-on-clearing-obstacles-to-digital-transformation/ (Thu, 21 Mar 2024 11:30:43 +0000)
The Army CIO expects the service’s new software development policy will bring better capabilities to soldiers faster.

Leonel Garciga has been on a sprint with a bulldozer.

Since becoming chief information officer of the Army last June, Garciga has been clearing policy obstacles, built up over the course of decades, to help spur digital transformation from the private to the general to the service’s secretary.

For most, the job of CIO is a marathon. But Garciga needed to start at a sprinter’s pace to — in the words of Army Secretary Christine Wormuth — “start breaking through the bureaucracy of the department.”

Even so, he acknowledges that the marathon part of his job will begin once the policy obstacles are cleared. But for now, he continues to open the throttle on the bulldozer.

“If it’s a policy challenge, if it’s a standard operating procedure challenge, I’m the guy with the pen. Help me fix that. If you’ve got lessons learned inside industry or in the commercial space, bring those standard operating procedures, bring those policies over, bring those guardrails over. Let’s put it on paper, and let’s get it signed out. Don’t let that prevent us from delivering,” Garciga said during Federal News Network’s DoD Cloud Exchange 2024.

“That’s the big thing that I keep pushing. Instead of saying, ‘Hey, policy doesn’t let me …,’ tell me this is what the policy should say, and let’s get that signed out. Let’s work through the friction and get that done. I continue to tell folks like that’s where we need the most help. We need to make sure that we get that alignment done because right now you’ve got someone who likes moving really fast, and I’m willing to underwrite a significant amount of risk when it makes sense.”

Focusing on 5 software reforms for Army DevSecOps

In March, Garciga pulled out his bulldozer to topple the Army’s approach to software development. Wormuth issued a new agile software policy detailing five changes to reform what she called the institutional processes of the Army.

The software reforms include everything from changing the way the Army writes requirements to emphasizing flexible acquisition approaches and training the workforce in these methods.

Garciga said the policy changes will help the service streamline its ability to build contracts based on agile and DevSecOps methodologies.

“A really big push includes centralizing some of that work at an acquisition digital center of excellence, which will be focused around these core agile contracts that we want to get out the door to support software development efforts,” he said. “The next big piece is really changing our approach to requirements by taking the holistic view we’ve had before to write these large dissertation type requirements and scaling them down to capability needs statements. So what it really does is take that requirements process and bring it down to core functionality versus those [individual systems] and allowing teams to have a little bit more left and right limits as they move forward.”

These changes aren’t just for IT or development teams. Garciga said the acquisition and nonacquisition workforces, as well as the test and evaluation experts, must all move in the same direction to meet the Army’s digital transformation goals. Otherwise, he said, creating a modernized foundation to build off of will be more difficult.

The Army can’t just write a policy and expect change to happen, which is why Garciga said the new digital center of excellence at Aberdeen Proving Ground in Maryland will take the lead in procuring software.

“The center will include subject matter experts who understand software development, who can help customers really flesh out how they want to get from that contract, put it in place in the most agile way that really does include all those requirements for agile development, sprint cycles and all those things that you need expertise in,” he said.

“The other piece, which is a Step 2 that’s happening simultaneously, is a team the CIO’s office is standing up. It’s a very small cell, which is really focused on helping either big programs or really critical programs in the Army run through the wickets of a better software contract. Whether it’s legacy stuff that we have that may need some shaping to get the right agile contract in place or to get the right task orders in place, we would bring our expertise with some software development experts and some engineers to help the command or the program really reshape their contracting efforts in coordination with the center of excellence for digital contracting.”

Turning to industry partners for Army cloud assist

The software expert cell already is working with a handful of Army commands on specific and critical programs. Garciga said the next step is to create a governance structure to help manage expectations and data. He said that will come together this spring.

Garciga expects that the changes will help the service work better with existing and potential vendor partners.

“With the traditional contracting approach, we alienated some of our more leading edge partners because we were telling them to go backwards to deliver,” he said. “I think that this is going to give some flexibility to these companies to bring in some expertise and so they can more healthily compete in the environment. For some of the folks that have been supporting us a long time, are good partners who haven’t had the opportunity to take that next step, this is really going to give them a landing pad to accelerate some of those efforts.”

Along with the new software policy, Garciga has led the effort to update guidance around reciprocity of security authorizations, issue a software container policy and release a new software as a service policy.

All of these efforts, of course, are underpinned by the use of cloud services. To that end, Garciga said his office is close to releasing the revamped cArmy platform, with cArmy 2.0 launching possibly in April.

The service added agility based on all the lessons learned and made the cloud platform a bit more user-friendly for Army partners, Garciga said.

“A lot of work is happening in that space. We’re working the AWS side to create a new landing zone. We’ll start to transition some of the existing customers into a new landing zone, which I’m excited about because it’s going to ease a lot of their pain and some of their challenges with just getting day-to-day operations done,” he said. “Then after that, we’ll move on to Microsoft Azure, and we are still looking at where we have opportunity with some of our other cloud service providers.”

Applying lessons from early Army cloud moves

The decision to update cArmy meant the service took a “tactical pause” over the last few months in moving workloads and applications to the cloud.

Garciga said the pause let the Army reevaluate its delivery model around cloud services.

“Like most traditional folks and enterprises who moved to the cloud, we raced in some areas, and we made some mistakes. We did some things that made sense at the time but don’t make as much sense now. And as new cloud services have become available in the regions across all our cloud service providers, it’s really caused us to rethink some of the technical work that’s been done,” he said.

“We made some decisions that made sense to do, like physically lifting and shifting a capability and just run the infrastructure as a service. It made sense at the time for the services that were available and for what we were trying to do to overcome some challenges that we had as an Army and in some of our server rooms. But we did that probably in the least optimized way. As we’re now two, three, four years down the road, we’re like, ‘Wow, that’s really suboptimized. Our costs are really high here.’ ”

That’s particularly true for some of the services and systems the Army moved to the cloud early on, Garciga said. The end result? The Army created new legacy technology debt in the cloud, he added.

The new cArmy platform should streamline the service’s ability to deliver core enterprise cloud services, reduce the number of trouble tickets the help desk receives and provide standardized templates for vendors and customers alike.

“You can be a little bit more predictable on what kind of capabilities you want to deliver and how you want them delivered. We are really focusing on some foundational things that will allow the acquisition community and our partners to understand what the environment looks like in a more streamlined way,” Garciga said.

“We will streamline onboarding services and really automate as much of the onboarding for customers as we can. We really want to deliver a lot of the information upfront. What does the environment look like? What do our images look like? What baseline managed services are we delivering as an Army to your tenant? Getting that out is hugely important. So our focus is going to be making sure that we make that available to all the folks that are coming into the environment. This will make it a little bit easier for folks to come in.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

DoD Cloud Exchange 2024: Exiger’s Cameron Holt on SCRM across the hybrid cloud enterprise
https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-exigers-cameron-holt-on-scrm-across-the-hybrid-cloud-enterprise/ (Wed, 20 Mar 2024 18:14:12 +0000)
When combined, cloud, AI and SaaS can be a powerful trio in helping DoD manage supply chain risks, the former Air Force officer says.

Cloud computing and software as a service present a special challenge to agencies trying to get a handle on their supply chains. Yet, a cloud-hosted supply chain risk management (SCRM) solution can also give agencies deeper and wider visibility into their supply chains.

That’s according to Cameron Holt, president of government solutions for Exiger. “Cloud computing allows us to really focus our investment and energies on delivering deep, rich and wide supply chain risk management solutions, rather than having to concentrate on the infrastructure,” Holt said during Federal News Network’s DoD Cloud Exchange 2024.

Data is crucial to effective supply chain risk management, he said. Beyond an agency’s own data about its suppliers, supply chain analysis requires the kind of data that could point to problems such as shell companies or entity-level information — for example, whether a single company uses multiple names.

“Exiger really concentrates on consuming massive amounts of data worldwide, both open source and very customized sources of data, to include data from our clients,” Holt said.

The retired Air Force major general said that as a contracting executive, he used this data analytic technique at the service to see into the supply chain for manufactured medical products during the pandemic. “We did not award a single bad contract out of hundreds of contracts awarded, some of them quite large,” Holt said.

To deepen its SCRM offerings, Exiger acquired a company called Ion Channel. That integration gives Exiger the ability to “see where the software came from in its various pieces and do the illumination of software bills of material, which is becoming increasingly important,” he said.
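
Illuminating a software bill of materials means walking its component list and checking each entry against intelligence sources. The sketch below reads a CycloneDX-style fragment, a real SBOM format, though the risk feed is a made-up stand-in for the vulnerability and supplier data a product like Exiger’s would consult.

    import json

    # Minimal CycloneDX-style SBOM fragment; real SBOMs carry far more metadata.
    SBOM = json.loads("""
    {"components": [
        {"name": "left-pad", "version": "1.3.0", "purl": "pkg:npm/left-pad@1.3.0"},
        {"name": "log4j-core", "version": "2.14.1",
         "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"}
    ]}
    """)

    # Made-up risk feed keyed by package URL.
    RISK_FEED = {
        "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1": "known critical CVE",
    }

    def illuminate(sbom: dict) -> list:
        """Report any component that appears in the risk feed."""
        return [
            f"{c['name']} {c['version']}: {RISK_FEED[c['purl']]}"
            for c in sbom["components"]
            if c["purl"] in RISK_FEED
        ]

    print(illuminate(SBOM))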

Why to use cloud-based AI for SCRM

Holt urged agencies to get past what he called the government’s gun-shyness about adopting SaaS more widely. One reason? The variety of artificial intelligence services hosted in clouds.

“For supply chain risk management, it’s more beneficial to understand it all the time and have it continuously monitored with the best AI out there,” he said.

Holt added that SaaS products can benefit other cyber capabilities too:

  • Speed engineering design: For instance, an agency could use digital twins in the manner of F1 racing, where “you can do literally thousands of design iterations with the entire supply chain in a government reference architecture and understand how your changes are going to show up on the manufacturing line.”
  • Keep pace with bad cyber actors: “Increasingly, as we face very sophisticated adversaries around the world, we need to understand which capabilities we should invest in that really move the needle in return on investment versus adversary capabilities.”
  • Bring new capital and nontraditional vendors to the defense market: Holt described what he called practice fields, “where you can have venture capital firms that actually have physical locations that are cleared facilities.” The facilities could host “defense contractors, large and small, and also nontraditional contractors to design their own defense products using the toolsets for digital engineering and digital twinning.”

Ultimately, ROI comes from cost avoidance, he said. Cloud computing and SaaS that include the latest engineering and AI tools will let the government “not try to reinvest or duplicate the cost of hundreds of millions of dollars that have gone into these sophisticated solutions.”

By taking advantage of that investment, “there’s going to be a lot more focused investment from the government in important areas, not just supply chain risk management, but in areas that don’t exist today like funds to actually mitigate those risks,” Holt said.

Federal News Network’s Industry Exchange Cloud 2024
https://federalnewsnetwork.com/cme-event/cloud-computing/federal-news-networks-industry-exchange-cloud-2024/ (Tue, 19 Mar 2024 19:09:32 +0000)
In a hybrid, multicloud world, how can your agency deliver services effectively, efficiently and securely?

Is your agency maximizing hybrid and multicloud to meet mission demands?

Join us for Federal News Network’s Industry Exchange Cloud on Nov. 4 to learn about the best tools, tactics and techniques to help you achieve effective, efficient and secure cloud services.

Our 2023 Industry Exchange Cloud event featured speakers from Orca Security, Nutanix, Pluralsight, Commvault, Amazon Web Services, Future Tech Enterprises and Kelyn Technologies.

Register today to save the date on your calendar and receive updates!
