Innovation in Government - Federal News Network

Using end-to-end observability for cyber, CX improvements

March 19, 2024

Brian Mikkelsen, the vice president for US public sector at Datadog, said reducing tool complexity helps agencies understand how their systems are working.

The Office of Management and Budget’s 2022 IT Operating Plan highlighted the need to reduce the complexity of systems to bring down costs. And, of course, it promoted the idea of using data to better drive decisions.

Over the years, agencies and vendors have made their technology environments more complex through too many bespoke tools and a lack of data integration. Given all the challenges that have come up over the last 20-25 years, OMB has pushed agencies toward enterprise services as one way to overcome many of these IT modernization obstacles.

There are other opportunities for agencies to become more efficient, more secure and better at delivering services. One way: end-to-end observability tools, which can help agencies innovate by consolidating the tools they use, reducing the complexity of those tools and, of course, giving them visibility across the technology stack.

Brian Mikkelsen, the vice president and general manager for US public sector at Datadog, said end-to-end observability gives organizations an opportunity to observe or monitor any application and any infrastructure, anywhere, whether they run on premises or in the cloud.

“The first of the three pillars of observability is infrastructure metrics. This is understanding the health of my operating systems, my virtual machines, my containers, all the way up into cloud native serverless functions,” Mikkelsen said on the discussion Innovation in Government, sponsored by Carahsoft. “It’s infrastructure metrics paired with application traces, so now I’m starting to think about, on top of that infrastructure, where am I running my applications, whether it’s on-premise or in the cloud, and what can I actually see in terms of how my applications are performing? What are they doing from a memory constraints perspective? What’s their overall performance? How much lag time is there between requests and actions? The third of the three pillars of observability is logs. The end-to-end part is really this idea that we’re creating context for the users of these systems.”
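
Mikkelsen’s point about context is easiest to see when all three pillars share a correlation key. Below is a minimal, vendor-neutral Python sketch (the function names and JSON shapes are illustrative, not Datadog’s API) that tags a metric, a trace span and a log line with the same trace ID, so any one signal can be pivoted to the other two.

```python
import json
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("orders")

def emit_metric(name: str, value: float, trace_id: str) -> None:
    # Stand-in for a metrics client: one metric point, tagged with the trace.
    print(json.dumps({"metric": name, "value": value, "trace_id": trace_id}))

def handle_request(trace_id: str) -> None:
    start = time.monotonic()
    # Pillar 3, logs: every log line carries the shared trace ID.
    log.info(json.dumps({"msg": "request received", "trace_id": trace_id}))
    time.sleep(0.05)  # simulated work
    duration_ms = (time.monotonic() - start) * 1000
    # Pillar 1, metrics: latency recorded for the same request.
    emit_metric("request.duration_ms", duration_ms, trace_id)
    # Pillar 2, traces: a span record sharing the same ID ties it all together.
    print(json.dumps({"span": "handle_request", "duration_ms": duration_ms,
                      "trace_id": trace_id}))

handle_request(uuid.uuid4().hex)
```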

Reducing time to solve problems

One of the biggest benefits of this approach is reducing the number of tools required to monitor networks and mitigate risks, while creating context between infrastructure, applications and logs.

“The real benefit is to try and reduce the time to know when I have a problem. And the way to reduce the time to solve that problem is correlating all that information and not having separate teams working in separate tools, all with a separate perspective,” Mikkelsen said. “With a more modern observability and security solution, we talk all the time about the cultural changes of getting people out of individual tools and individual contexts, and giving everybody the same view of the same information. I don’t want to have five tools and five teams looking at it from a different perspective. I want one tool with all the teams in that same tool, folks having the same context so we’re not arguing about what’s happening. We’re observing what’s happening, and we’re solving for it.”

The need to solve problems more quickly is as much about the evolving nature of the cyber threat as it is about meeting the growing expectations of an organization’s customers.

A recent Government Accountability Office report found agencies are struggling to meet the cybersecurity logging requirements set by President Joe Biden’s May 2021 executive order.

“What it’s really asking you to be able to do is track issues in real time, hold those logs in storage for, I think, a minimum of 12 months in hot storage, and I think 30 months in total,” Mikkelsen said. “The benefit of an end-to-end observability and security company is that we think about logs in multiple perspectives. We can talk about IT infrastructure and application. But here from a cybersecurity perspective, now, we’re really talking about cloud security management.”
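
As a back-of-the-envelope illustration of the retention math Mikkelsen cites (12 months readily searchable, 30 months in total, which implies roughly 18 months in colder tiers; the tier names and 30-day months are simplifications, not the executive order’s exact terms), a log record’s storage tier follows directly from its age:

```python
from datetime import date, timedelta

HOT_MONTHS = 12    # readily searchable ("hot") storage
TOTAL_MONTHS = 30  # total retention, so 30 - 12 = 18 months in colder storage

def storage_tier(log_date: date, today: date) -> str:
    """Classify a log record by age into hot, cold or expired."""
    age_days = (today - log_date).days
    if age_days <= HOT_MONTHS * 30:
        return "hot"
    if age_days <= TOTAL_MONTHS * 30:
        return "cold"
    return "expired"  # eligible for deletion under the policy

today = date(2024, 3, 1)
for months_old in (3, 18, 36):
    d = today - timedelta(days=months_old * 30)
    print(f"{months_old:>2} months old -> {storage_tier(d, today)}")
```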

Solving mission problems

From a customer experience perspective, end-to-end observability also includes tools that provide digital experience monitoring.

Mikkelsen said the tools help organizations understand the user’s experience from login throughout the entire front-end event.

“They can generally understand what’s working and where are the bottlenecks. What are the challenges with that customer’s front end experience?” he said. “If you think about this from a synthetics [data] point of view, what synthetics allows you to do is proactively understand ‘is that system up and is that front end application up and running the way I want it to? Is it handling requests from various operating systems? Is it working with various browsers?’ And we can actually set up proactive tests so even more important than knowing when you have an issue and fixing it is knowing you have it before it’s a real issue, and resolving it before you have a negative customer experience or citizen experience. This all boils down to the real drive for a lot of our IT mission owners across government: They’re in the business of solving for the mission. A lot of times the mission is improving the citizen’s experience with government.”
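
A synthetic test, at its simplest, is a scripted probe that fails before a citizen does. The standard-library Python sketch below (the URL and latency budget are placeholders) checks both availability and responsiveness; a real suite would repeat it across browsers, operating systems and regions:

```python
import time
import urllib.request

def synthetic_check(url: str, timeout_s: float = 5.0,
                    max_latency_ms: float = 1500.0) -> dict:
    """Probe an endpoint the way a synthetic test would: status plus latency budget."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            latency_ms = (time.monotonic() - start) * 1000
            healthy = resp.status == 200 and latency_ms <= max_latency_ms
            return {"url": url, "status": resp.status,
                    "latency_ms": round(latency_ms, 1), "healthy": healthy}
    except Exception as exc:  # DNS failure, timeout, TLS error, etc.
        return {"url": url, "error": str(exc), "healthy": False}

# Placeholder endpoint; run this on a schedule and alert on healthy == False.
print(synthetic_check("https://example.gov/login"))
```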

Mikkelsen said the Agriculture Department’s Digital Infrastructure Services Center (DISC) took advantage of end-to-end observability tools and saw immediate improvements.

“They had one ongoing problem with memory utilization. The way I think about it was it was an executable loop, and every time it fired up, it was causing memory depletion. That same systemic set of tickets had popped up something in the neighborhood of 700 times in a short period of time,” he said. “They’ve taken that memory utilization challenge from 700-plus tickets down to zero tickets relatively quickly because we were able to show them what the challenge was. On top of that, they were able to bring, I think, 95% of their target infrastructure up and running with monitors and dashboards from an observability point of view within 75 days. I think that includes over 4,000 containers as part of that infrastructure setup.”

In-Q-Tel: The translator between DoD, start-ups

February 12, 2024

Barry Leffew, the vice president of the government platform accelerator at In-Q-Tel, said areas like cybersecurity, enterprise technology, space, lightweight energy sources and biotechnology are among the company’s top investment focus areas.

The rate of change in the technology market can make anyone’s head spin.

The continued growth and acceptance of artificial intelligence, machine learning and predictive analytics is something the technology community hasn’t seen before.

Add to that the developments in pushing compute to the edge, and the technology sector will continue its whirlwind of change.

The Defense Department and the Intelligence Community must do more to take advantage of all the innovation happening.

One key piece to that preparation is the budget. The Defense Department’s proposed budget for fiscal 2024 includes $145 billion for research and development and $170 billion for procurement. It’s what DoD called a clear indication of its commitment to stay on the cutting edge.

The IC mapped out a four-year investment strategy for emerging technologies in 2022. Its plan shifts how the IC tracks research and development to ensure there is more of a connection to mission goals across the IC.

Barry Leffew, the vice president of the government platform accelerator at In-Q-Tel, also known as IQT, said for DoD, the IC and all national security agencies to take even more advantage of emerging technologies and position their missions for the future, they must be prepared for the impact of artificial intelligence and machine learning.

“We’re seeing a cross connection between AI and ML that is driving advances in other areas such as energy and battery technology, helping enable advances in biotechnology as well as cyber,” Leffew said on Innovation in Government, sponsored by Carahsoft. “Together they’re creating even more of a flywheel effect to help accelerate the growth of innovation.”

Of course, it’s not all about AI and ML. IQT, which has played a key role over the last 25 years in bringing innovative technologies to the national security space through directed investments, is paying close attention to other areas like cybersecurity, enterprise technology, space and the whole trend toward commercializing it, as well as more lightweight energy sources and biotechnology.

AI as the enabler

Leffew said there is so much focus on AI and ML, however, because these emerging technologies are acting as strong enablers for cyber, energy and biotechnology innovations.

“Take enterprise technology, for example. It’s advancing in terms of our ability to process and store massive amounts of data and information, and being able to then layer AI on top enables us to do faster analysis, support decision making and actually identify new ways of solving problems,” he said. “Then AI also becomes a key enabler of cyber because of its ability to rapidly detect and respond to cyber threats.”

DoD already is investing heavily in new cyber tools through its zero trust architecture, but with the ever-changing cyber landscape, the Pentagon is always seeking better and faster tools.

Leffew said one area where IQT is seeing a lot of interest is in using cyber tools to fully understand what devices are connected to the network, how they’re connected and what versions of software and firmware those devices are running. That way, Leffew said, if there is a deficiency, DoD can, through the AI tool, rapidly address vulnerabilities.

“Another area is basically what’s called the software bill of materials (SBOM) and really identifying what components are in either government-off-the-shelf (GOTS) or commercial-off-the-shelf (COTS) software, so the government can have a firm understanding of any underlying vulnerabilities,” he said. “Finally, another area is really leveraging AI to proactively identify incoming threats and help accelerate the ability to respond.”
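
To make the SBOM idea concrete, here is a hedged Python sketch: the SBOM fragment loosely mimics the component lists in formats like CycloneDX or SPDX, and the vulnerability feed is a hard-coded stand-in for what would really be a query against the National Vulnerability Database or a commercial service.

```python
# Hypothetical SBOM fragment: component names and versions.
sbom = {
    "application": "mission-app",
    "components": [
        {"name": "openssl", "version": "1.1.1k"},
        {"name": "log4j-core", "version": "2.14.1"},
        {"name": "zlib", "version": "1.2.13"},
    ],
}

# Stand-in for a vulnerability feed keyed by (name, version).
known_vulnerable = {
    ("log4j-core", "2.14.1"): "CVE-2021-44228",
    ("openssl", "1.0.2"): "CVE-2016-0800",
}

def flag_vulnerable(sbom: dict) -> list:
    """Return SBOM components that match a known-vulnerable (name, version) pair."""
    return [
        {**c, "cve": known_vulnerable[(c["name"], c["version"])]}
        for c in sbom["components"]
        if (c["name"], c["version"]) in known_vulnerable
    ]

# [{'name': 'log4j-core', 'version': '2.14.1', 'cve': 'CVE-2021-44228'}]
print(flag_vulnerable(sbom))
```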

Quantum is on the horizon

Leffew said IQT’s main focus is to find “dual use technologies,” where advances in commercial technology can be applied to the DoD and national security agencies.

“I think DoD is doing an excellent job of communicating their requirements. We can always do more, but what we’re really positioned to do is to be that bridge between the national security community and the startups,” he said. “There’s sometimes a translation challenge. We are taking the Federal Acquisition Regulations and Defense FARs and converting those to the world of venture capital. IQT is in a unique position to help translate, or help explain to each side, how to best work with each other.”

One area where IQT, which the CIA created in the late 1990s, is trying to help with that translation is quantum computing.

Leffew said IQT understands that quantum represents what could be a very fundamental and incredible change to public and private infrastructure.

“It really could change the way that encryption is done, and will require potentially dramatic changes to the way that we encrypt and protect information,” he said. “Another is material science, using AI to create new and improved materials. And then another area is biotechnology. Luckily, COVID is pretty much in the rearview mirror, but we, as a country, want to be prepared, that if another crisis emerges that we’re better prepared to deal with it in the future.”

Leffew said IQT believes the government is on the cusp of using quantum in limited ways.

“One of the most important things for us is to really address what’s called post-quantum encryption, which is to be able to make sure that our encryption devices can’t be broken by our adversaries’ quantum computers,” he said. “That’s a really big area of focus right now. We are seeing the testing and development of quantum computers, not only very specialized computers, but figuring out how to miniaturize them, how to bring a quantum computer down so that it can be housed in a standard data center and run in a regular type of data center environment.”

Why SASE is more than a buzzword for zero trust

December 20, 2023

Wayne LeRiche, the federal civilian field chief technology officer and solutions architect for Palo Alto Networks Federal, said secure access service edge (SASE) sets a framework for agencies to more easily implement a zero trust architecture.

Secure access service edge is one of the latest buzzwords that has emerged as part of the move to zero trust.

Agencies are looking at how they can implement SASE as part of securing their networks and systems.

Gartner predicts that by 2025, at least 60% of enterprises will have “explicit strategies and timelines for SASE adoption encompassing user, branch and edge access, up from 10% in 2020.”

Wayne LeRiche, the federal civilian field chief technology officer and solutions architect for Palo Alto Networks Federal, said SASE sets a framework for agencies to more easily implement a zero trust architecture.

“One of the main things about SASE is it’s evolving. It started off as basically a virtual private network (VPN) replacement, so you had private access or you had companies that were creating a secure internet gateway,” LeRiche said on the Innovation in Government show, sponsored by Carahsoft. “What SASE really does, though, is it brings in a bunch of different use cases under that umbrella. We’re looking at not only the users working from home, but also users working from a remote branch. We’re cascading in there other things like cost savings, moving away from expensive Multiprotocol Label Switching (MPLS)-based networks. And as applications move from the data center to the cloud with software-as-a-service (SaaS) and the like, SASE can really help address those challenges.”

At the same time, LeRiche said SASE can help agencies meet the compliance goals detailed in recent OMB memos around things like security audit logging and sending data to the cloud aggregation warehouse that is run by the Cybersecurity and Infrastructure Security Agency.

Integration of SASE and SaaS

For many agencies, implementing SASE is a key piece to their zero trust strategy.

Gartner said it expects SASE to provide the necessary agility to deal with new and emerging cyber threats as well as help organizations maintain a standardized set of policies throughout their network environments.

“Zero-trust network access is likely to be a major feature in a SASE deployment. Its use reduces your cloud’s attack footprint. We predict SASE will improve enterprise application availability,” Gartner stated.

LeRiche said by using a SASE framework, agencies can get the best of all security worlds.

“Doing things like device posture checking and stitching that user identity to that new session matters because, whether I’m working from home or in the office, you still want to tie that user to that IP and that asset. That’s one thing that we can really do from a security perspective,” he said. “From a user experience perspective, SASE really brings that data plane and middle mile optimization. Everything that we’ve done on the back end, and all the billions that have been spent on that cloud-delivered architecture, benefits the user. That’s really the most important thing, I think, when we talk to government customers. Security is important, but user experience is very important too. If it’s not beneficial to the user, they’re not going to use it and they’re going to find ways to get around it.”

Start small, iterate, expand

One of the big benefits of using the SASE framework, LeRiche said, is that it’s tunable to meet users’ needs based on the use case.

He said SASE can help eliminate latency that can come with VPN or other split-tunneling architectures.

“In that sense, we can set it up so that, as soon as that device is booted up, the device is connected into the SASE service. Even before that user puts in their common access card (CAC) or personal identity verification (PIV) card, we can actually do some pre-logon work to make sure the device is up to snuff with its security posture, with things like patches, and maybe make sure the antivirus is turned on and up to date,” he said. “Once we do that, and the user pops in their CAC card, they get their two-factor authentication. And it doesn’t matter if they’re going to the cloud, going to a private app that is backhauled into the data center, chatting with a colleague or connecting to the internet: they get that seamless user experience with that one device, one load, and don’t have to click into different clients.”
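
The pre-logon flow LeRiche describes boils down to a gate that evaluates device health before any user credential is presented. Here is a minimal sketch, with illustrative checks rather than any vendor's actual posture policy:

```python
from dataclasses import dataclass

@dataclass
class DevicePosture:
    os_patch_current: bool
    antivirus_running: bool
    disk_encrypted: bool

def pre_logon_gate(posture: DevicePosture) -> str:
    """Evaluate device health before the user ever presents a CAC/PIV credential."""
    failures = [name for name, ok in [
        ("os_patch_current", posture.os_patch_current),
        ("antivirus_running", posture.antivirus_running),
        ("disk_encrypted", posture.disk_encrypted),
    ] if not ok]
    # Only a healthy device proceeds to certificate plus MFA; otherwise quarantine.
    return "proceed-to-user-auth" if not failures else "quarantine: " + ", ".join(failures)

print(pre_logon_gate(DevicePosture(True, True, True)))   # proceed-to-user-auth
print(pre_logon_gate(DevicePosture(True, False, True)))  # quarantine: antivirus_running
```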

LeRiche said clients usually start small with maybe 50 or 100 users, see how it works and then expand based on their experiences.

“The idea in the end is it’s just another consumption model for all the security pieces that we bring for zero trust,” he said. “Once we get past that initial stage, it’s very easy because it is all just software licenses now. You already have the infrastructure built, whether it was for 100, 500 or 5,000 users. Once we do that, then it’s just a matter of turning that dial up.”

Additionally, because SASE is software-based, LeRiche said it helps future-proof network and technology infrastructure against new cyber threats and for new capabilities.

He said as artificial intelligence, machine learning and other security tools become available, agencies can more easily implement them through the software-defined network approach that SASE runs on.

Reducing the technology, integration burden of moving to zero trust

December 5, 2023

Herb Kelsey, the Project Fort Zero Team Leader at Dell Technologies, said agencies have the opportunity to focus on the policy and process side and not the technology piece of the zero trust architecture.

This January, believe it or not, it will be two years since the Office of Management and Budget released its zero trust strategy. Agencies continue down the path of meeting the 19 actions outlined in that strategy. The journey is one that may have a destination, but may never truly end as cyber threats continue to evolve.

Agencies will continue to face new threats and therefore need a continuous effort to improve.

OMB is helping agencies face those threats and complete key stops along their zero trust journey by asking for $5.5 billion in fiscal 2024 from Congress for zero trust capabilities alone.

At the same time, OMB and agency leaders are meeting at the senior executive levels to continue to give this initiative the focus and accountability needed to change the way agencies implement cybersecurity.

Herb Kelsey, the Project Fort Zero Team Leader at Dell Technologies, said agencies must understand the technical side of the zero trust equation and ensure their tools and capabilities integrate to meet the zero trust requirements.

“I think the technical discussions are happening across the zero trust spectrum. People really are picking the low hanging fruit where they can start and where they’ve made progress before. I think the real trick is whether that’s enough to be able to thwart the enemy,” Kelsey said on Innovation in Government sponsored by Carahsoft. “For us, zero trust is meeting the standard that the Defense Department has put forward, which they’re going to judge our solution against. In their determination, that’s what it takes to defeat the adversary. The concern that I would have with people starting somewhere like identity management, and not working on the other pillars of zero trust in parallel, is that all they’re doing is pushing the adversary to an area that they’re not focused on right now. So if you take care of the identity management piece, but you don’t take care of the device management piece, that’s a problem. If you don’t do all the logging that’s required, that’s just another gap that the adversaries can exploit. So the conversations that I have with agencies, whether it’s in the U.S. or around the world, is you really need to look at building across all of those pillars in order to be successful.”

Retrofitting doesn’t work well

He said one way to work across all zero trust pillars is to rely on integrated tools that can bring together disparate technologies and handle the security policy requirements.

Too often agencies are finding that they struggle with zero trust because they have too many products that must fit together, and retrofitting their current environment into the new architecture can be more complex and require more effort than they can handle.

Kelsey said removing that integration burden will lead to a quick series of successes around zero trust.

“What’s important for them to understand is that the cost and the operational impact of trying to retrofit to the standard that DoD put forward is cost prohibitive, and it’s time prohibitive. Even as the DoD put out its architecture, the idea of retrofitting the existing environment was their first course of action. But they came up with two other courses of action because the first one was going to take too long and spend too much money,” he said. “The second course of action was to try and get zero trust fit into a cloud architecture. The third was to create private clouds that already were advanced in zero trust capabilities. Those are the progression of options.”

Kelsey said there are challenges with the first one from operational and cost standpoints. He said there are even some challenges in the cloud implementations as agencies start looking at all of the logging that they want to do.

Automation, AI reduce burdens

“The concern about retrofitting is that yes, you believe you’ve already got the zero trust components in place, but I would challenge that those have not been validated by an independent third party. We live in this environment of multiple vendors, and each one having a component of zero trust, and saying that they complete the obligation that DoD wants you to meet. But it has to be validated by a third party,” he said. “It’s getting validated by a DoD red team that says, ‘We agree that this solution [helps] meet either the 91 target activities for zero trust or the 152 advanced activities for zero trust.’ That’s what’s written into the architecture specification, and that’s what we’re going to meet by reducing the integration burden on our customers.”

As part of reducing the integration and implementation burdens, agencies must take advantage of automation and look at emerging artificial intelligence capabilities.

Kelsey said the shared security model with industry partners and large cloud providers will help bring these tools to bear more quickly and protect an agency’s network and data.

“It’s a challenge to get all of that telemetry data and logging data, and combine it together. I think that has to be understood and worked on. That may mean changing the business model. I think that’s one where a lot of agencies haven’t really come to grips with what that means,” he said. “Our understanding is zero trust has a destination, which is to meet an objective standard as laid out by DoD. It is absolutely about protecting the data. It’s about achieving outcomes. I think that we have an opportunity to do a much better job of protecting data within our infrastructure because we have a defined set of practices to follow. I believe we’ll have better outcomes if we create solutions that allow organizations to focus on the policy side and the process side, and not have to focus so much on the technology and the integration burden. I think that’s a better situation for the majority of our customers, as we’ve discussed with them.”

Get in touch with Dell Technologies by reaching out to projectfortzero@dell.com.

Moving to a cyber platform can drive further tool rationalization, consolidation

September 29, 2023

Drew Epperson, the vice president of federal engineering at Palo Alto Networks Federal, said agencies need to think about modernizing their cyber tools as they transform their entire IT infrastructure.

Ever-evolving cyber threats are causing agencies to do more than just change their defense strategies. The move to zero trust requires a whole new level of thinking.

That change of thinking revolves around people, process and technology.

Agencies are moving away from one-and-done products and toward a more holistic approach to cybersecurity.

Gone are the days when agencies bought tools to solve a single challenge. Zero trust and advanced cyber approaches require agencies to use integrated platforms to improve their ability to beat back the bad actors.

Drew Epperson, the vice president of federal engineering at Palo Alto Networks Federal, said the broader use of cloud services is opening the door wider for agencies to rationalize and consolidate cybersecurity tools across their ecosystem.

“We all realize that the attack surface keeps changing and the adversaries keep evolving. So we keep creating things in the industry to close out those gaps. Eventually, you hit this inflection point where people don’t want to have 15 agents on an endpoint or 35 network security vendors sitting at the perimeter or internal in their data center,” Epperson said on the Innovation in Government show, sponsored by Carahsoft. “I think Gartner and Forrester, whichever one you want to follow, all are recognizing that, especially in cloud, instead of launching all these different cloud security focused platforms, they now just see CNAPP as the superset of capabilities that are there. They’re essentially taking vendors and saying, ‘if you want to compete in the cloud space, you really need to have a complete offering around a platform that secures all things cloud.’”

Epperson said agencies have spent years building their security toolsets, not only inside their networks and security operations centers, but also now through disparate cloud instances.

Turning on cyber capabilities

Now many agencies, and really all organizations, are spending a lot of money on disparate products that struggle to integrate and are inconsistent in applying policy rules across the network and applications.

Epperson said the move to cloud-native platforms that can work across all cloud platforms will bring a plethora of capabilities from container security to data governance to infrastructure-as-code.

“Now it’s how fast can you turn on those features on the platform to address those new security concerns, compared to the 15 to 18 different startups spinning up at any one time and buying all of them so that they each individually do their one niche thing,” he said. “I just don’t think that’s going to be the way that the industry moves going forward, and we can see that by Gartner qualifying what a cloud native application protection platform (CNAPP) is, which is all things you need for cloud, and then rating people on their ability to deliver all of them, not just one portion of them.”

There are several benefits to moving toward a cyber platform approach.

Epperson said agencies will save money, both because they will not have to buy and maintain an assortment of tools and because they will spend less on training employees to manage and use them. He said with the zero trust mandate, agencies will get consistent enforcement all the way through the digital transaction.

“That becomes increasingly hard when you have five or six things on the endpoint or 10, 15 or 20 things on the network,” he said. “I think we’re getting to the point where people just want consistent policy enforcement, regardless of who the user is, what device they’re on, where they’re going or what application they’re engaging with in order to deliver on platforms that provide a more efficient and a more streamlined and consistent way to do it.”
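
That consistent enforcement can be pictured as one decision function evaluated for every transaction, wherever it originates. A simplified sketch (the attributes and sensitivity levels are invented for illustration):

```python
SENSITIVITY = {"low": 0, "moderate": 1, "high": 2}  # illustrative data levels

def evaluate(user: dict, device: dict, app: dict) -> bool:
    """One policy decision point applied to every transaction,
    regardless of where the user, device or application sits."""
    return (
        user.get("mfa_verified", False)
        and user.get("role") in app.get("allowed_roles", ())
        and device.get("posture") == "healthy"
        and SENSITIVITY[app.get("sensitivity", "high")]
            <= SENSITIVITY[user.get("clearance", "low")]
    )

alice = {"mfa_verified": True, "role": "analyst", "clearance": "high"}
laptop = {"posture": "healthy"}
payroll = {"allowed_roles": ("analyst", "admin"), "sensitivity": "moderate"}
print(evaluate(alice, laptop, payroll))  # True, and the same answer from any location
```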

For example, Epperson said agencies are spending $15 or $20 a month for a subscription for many of these tools that they are using only occasionally. If they pivot toward a platform approach, the capabilities are turned on or off as the threat changes and new capabilities are added to address emerging threats or risks.

Opportunities to modernize more than just cyber

“I think that tool rationalization is not only something that naturally happens, but it’s also something that I think is appropriate and probably good for organizations to consider on a regular basis. What are we spending? What are we getting out of it? And then how do we convert that if there’s a better way to optimize that spend into something that can provide us more at a lower cost?” Epperson said. “Over the years, we’ve found ourselves in scenarios where we’ll be talking to customers and partners, and they’ll say something like, ‘we know we need to get off this platform, but that platform is integrated into these mission-critical applications. And there’s a little bit of concern and risk about migrating away from it just because we don’t necessarily know all the ties into it that we might even have.’ I think one of the things that we’ve tried to articulate to people is that anytime you have a modernization or a transformation project where you’re building something new, creating new applications, migrating legacy systems to the cloud or a hybrid data center, those are usually good times to reevaluate instead of bringing the legacy security infrastructure with it.”

He added agencies can pivot some of their current security investments into platform-centric tools where, over time, they dial one side up and the other down.

Agencies also can accelerate the use of automation and orchestration tools through the platform approach.

Epperson said by enabling simple policy configurations, agencies could protect against 60% to 70% of the threats they face every day.

“I think the guidance to anyone who has an investment in a platform is look to that platform to tell you how it should be used best because most of them have data and telemetry in them that will tell you exactly what you are using and what you’re not using, and how the gap between the two could be activated at low to no cost and then drive a better security outcome,” he said.

Increasing the value of data starts with further breaking down silos

August 29, 2023

Sudhir Hasbe, the chief product officer for Neo4j, said applying graphing technology can help agencies better understand relationships between people, processes and data.

Agencies have understood the importance of data for every federal mission area for years. But at the same time, the volume, veracity and variety of that data make it challenging for every federal mission to gain the valuable insights necessary to drive decisions.

It’s clear that agencies still need to break down data silos to make better data-driven decisions. This will lead to improved efficiency and better cost effectiveness for each decision and for each mission area.

Sudhir Hasbe, the chief product officer for Neo4j, said agencies must unify and mobilize complex data to improve mission critical decision making through better data.

“The first step is to break the silos and get the data. The second step is, can you make sense of that data and its relationships? And the third thing, I think, is the democratization of data and insights within the agency. Do you have access to the right insight at the right place for every individual who’s making the decision? I think there is a lot more work we can do in that regard,” Hasbe said on the Innovation in Government show, sponsored by Carahsoft. “This is where newer technologies can help a lot. There is the whole wave of generative artificial intelligence technologies coming in, which can give you a natural language interface to your agency’s data. And if you had all the data mapped, let’s say in a graph, leveraging generative AI technology to use natural language processing on that and ask questions and get answers will just democratize access to those insights and will actually help everybody within all the agencies and within all the programs.”

Visualizing data and relationships

Hasbe said a good example of this is at the Department of Veterans Affairs, which is using graph technology to track assets, people and services, and address challenges of processing records backlogs.

“In that kind of an environment, there may be more opportunities to gain efficiency by bringing all of these data assets into systems like graphs and getting better efficiency across the whole agency,” he said. “We see different use cases using technologies like graphs to map out the physical world into a software world and do more of these analyses. We are seeing various agencies becoming better at not just collecting data, but also understanding it.”

Graphing technology, as Hasbe described, is a way to take stored data and visualize it to see relationships between key entities and their associated data sets.

It can be people focused, product focused or combine several different data points to see across all relationships.

“Another example is the supply chain. If you think about supply chain, it is billions of products and each product may be related to another product. And that relationship between products, in a simpler way, is what graph technology does,” he said. “Another great example is bill of material. If you look at Lockheed Martin, they manufacture for different missions and they have massive bill of material. If you need to build a rocket, imagine how many millions and millions of parts are required, so all the parts are linked to each other. What happens when a specific part is missing in the whole process and you need to procure it? What will be the delay for the whole program? You will basically be able to analyze that. Graphs are a mechanism of storing and analyzing information in form of entities and their relationships, and then dependencies across them.”
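
In a graph database this kind of question becomes a short traversal query; the underlying idea can be sketched even in plain Python with an adjacency list. The parts and assemblies below are invented, but the traversal shows how a missing part propagates up through its relationships:

```python
from collections import deque

# Tiny bill-of-materials graph: edges point from an assembly to the parts it needs.
bom = {
    "rocket": ["engine", "guidance_unit", "fuel_tank"],
    "engine": ["turbopump", "nozzle"],
    "guidance_unit": ["imu", "flight_computer"],
    "turbopump": ["impeller", "bearing"],
}

def assemblies_blocked_by(part: str, bom: dict) -> set:
    """Walk the relationships upward: every assembly that directly or
    transitively depends on `part` is blocked if that part is missing."""
    # Invert the edges once so we can traverse from a part to its dependents.
    dependents = {}
    for assembly, parts in bom.items():
        for p in parts:
            dependents.setdefault(p, []).append(assembly)
    blocked, queue = set(), deque([part])
    while queue:
        for assembly in dependents.get(queue.popleft(), []):
            if assembly not in blocked:
                blocked.add(assembly)
                queue.append(assembly)
    return blocked

print(assemblies_blocked_by("bearing", bom))  # {'turbopump', 'engine', 'rocket'}
```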

The need to break down data silos is not a new challenge. While agencies have improved their data sharing, Hasbe said with the increased use of connected devices and internet of things devices as well as an ever-increasing volume of data from more traditional sources, it’s easy for silos to stand in the way of decision making.

Generative AI tools help see value

He said it’s a never-ending challenge for agencies, or any organization, to figure out what data is most important, and make it available to the right people at the right time to spur action.

“We may be combining the data into a single platform and we may be making it accessible and shareable with each other. But do you really understand the relationships between the different types of data that you have? Can you understand, for example, the cyber threat? If you’re looking at cyber intelligence and cyber threats, there is data that’s coming from the web assets and you need to understand where threats are coming from. You need to understand how the networks are designed and how different users are using the platform,” he said. “It’s not just about how you can break the silo and put it into a single system, but can you build, for example, a graph on top of it so you understand the entities and their relationships? It’s not just having more data and breaking down the silos; understanding the relationships between these data assets is critical for all the agencies.”

Hasbe said breaking down the data silos, applying graphing technology and generative AI tools creates a powerful set of capabilities for agencies as they strive to make better and faster decisions.

But, he said agencies must be sure the answers they are getting from generative AI are accurate and valuable.

“The most important thing I always suggest is to start small with a few use cases and implement it. Then, learn from it and figure out how you want to do it,” he said. “Once you have learned with a few smaller use cases, scaling across the agency is possible and more doable. I would take one program, figure out what you learned from that and then expand it across the whole agency.”

The evolution of the hybrid work environment depends on unified communications

August 14, 2023

Matt Mandrgoc, the head of Public Sector at Zoom, said there are several considerations agencies must keep in mind, including legacy technology and complexity, as they move further into the hybrid work environment.

Insights from the series, Innovation in Government.

This fall more federal employees will return to the office. Every week it seems another agency is drawing the line of how often employees must be in person and how often they can work remotely.

No matter where that line in the sand is drawn – four days a pay period for some agencies and six days for others seems to be the trend – one thing is definitely clear: The hybrid workplace is here to stay.

And despite agencies, and the private sector, being three years into this hybrid work environment, agencies continue to experience complexity in the technologies that support their workforce and their culture.

As traditional office space evolved into a hybrid working environment, the need for new technology that can create that flexibility for cultural change remains a key factor in an agency’s current and future success.

As agencies continue to adapt to the needs of their workforce and their customers, they also must innovate, evolve and ensure proper cybersecurity controls are in place.

Matt Mandrgoc, the head of Public Sector at Zoom, said there are three considerations agencies must keep in mind as they continue to evolve in the hybrid work environment.

The first is forced technology, where employees are required to use certain tools and that impacts their productivity.

Second is adding complexity to the user’s ability to work across the agency or among agencies because of technology that doesn’t integrate or impacts productivity.

And third, the continued reliance on legacy technology that impacts users, security and adds complexity.

“As the government continues with IT modernization, they need to look at how their systems allow these individuals to work in the hybrid environment. We’ve seen one agency that was spending almost $5 million to $7 million just for equipment that was sitting there because, as people were working in a hybrid environment, it didn’t scale. Now as the government is making these decisions, they have to look at total cost of ownership,” Mandrgoc said on the Innovation in Government show, sponsored by Carahsoft. “The other issue is recruiting and retaining workers, and the productivity of the government end user. As you look across these executive orders around customer experience, it really allows for them to take a look at this much differently. How do you start to look at, understand and get input from these end users, bringing it back and utilizing the technology or the solutions or the platform to enable them to do their job in this hybrid environment?”

Strive for a unified communications platform

Agencies have to evaluate all their technology investments and analyze what drives up productivity, what may be an inhibitor to productivity and what easily integrates across the environment.

Mandrgoc said agencies also have to be aware of supporting two different, and not necessarily equal, technology environments – one for the leadership and one for everyone else.

He said agencies should strive to create a unified communications platform that serves all their needs whether in the office or remotely. There are five factors that must be part of any communications platform.

“The first is simplicity, being able to assemble it and it works. We saw this happen with the Department of Treasury when it had to pull lawmakers together to make some decisions on the economy. We saw them being able to quickly move and pivot around the end user,” Mandrgoc said. “The second important piece is around scalability, being able to scale quickly across an organization, being able to add people and move around events or meetings. That scalability is important without having to go through gyrations and gyrations of work to put that together.”

The third piece of this unified communications platform is innovation.

Mandrgoc said that means having the ability to easily move into and out of breakout rooms and ensuring accessibility for all end users.

“The fourth is around extensibility, being able to leverage investments that you’ve already made. A great example is in a FedRAMP environment and what we’ve done with ServiceNow, which is to be able to provision phones automatically, where it was a manual process before. Now it cuts down and takes away that manual process and allows that IT team to be more productive as they’re helping their end users, even getting into a self-service component to be able to automate a process that was manual before,” he said. “The last is around security. When you start to dig further into security, it is not just what certifications people have, but really down to how do you have that same capability at an administrator level? Will you control that down at the user level? Security really moves to the forefront on that as well.”
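
The extensibility point is essentially workflow automation: when a request is approved, an API call replaces a manual ticket. A toy sketch of that loop (none of the data or functions reflect an actual ServiceNow or Zoom interface):

```python
# Hypothetical approved requests pulled from a ticketing system.
approved_requests = [
    {"user": "jdoe", "site": "HQ", "extension": None},
    {"user": "asmith", "site": "Region5", "extension": None},
]

def next_free_extension(site: str, assigned: set) -> str:
    """Pick the next unused extension in the site's invented numbering block."""
    ext = {"HQ": 1000, "Region5": 5000}[site]
    while str(ext) in assigned:
        ext += 1
    return str(ext)

def provision(requests: list) -> None:
    assigned: set = set()
    for req in requests:
        ext = next_free_extension(req["site"], assigned)
        assigned.add(ext)
        req["extension"] = ext  # in practice, one API call instead of a manual ticket
        print(f"provisioned softphone for {req['user']} at {req['site']}: x{ext}")

provision(approved_requests)
```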

Accessibility becoming more important

Mandrgoc added the overarching factor agencies need to consider when developing a unified communication platform is the total cost of ownership. He cautioned agencies against believing they are getting tools or capabilities for free as part of a package.

He said there is always a cost to integrate with other technologies and platforms as well as the challenge of guarding against shadow IT.

“A lot of times with unified communications, and especially in the hybrid environment, people have been so focused around what accessibility looks like in the office. But what does it look like from home? How do we really do a better job in industry of highlighting what we’re doing around accessibility? A lot of times decisions get made on technology and accessibility sometimes becomes an afterthought. Here’s the opportunity now for industry to come forward and highlight what we’re doing,” he said. “Accessibility is not a checkbox. It’s something that’s evolving. There are constantly things we’re doing around movement in a meeting. There are all kinds of different capabilities out there that we in industry really need to highlight more for individuals and agencies.”

Preparing for advanced AI begins by getting your data ready

August 4, 2023

Chris Townsend, the vice president of U.S. public sector sales at Elastic, said for agencies to operationalize their data to drive better decisions, they need to break down silos and make it easier to search.

Insights from the series, Innovation in Government.

June marked the four-year anniversary of the Federal Data Strategy, and it’s clear how much progress agencies have made since OMB released that vision.

Every agency has a chief data officer and its own data strategy.

Governance and frameworks have been taking shape over the last few years and agencies now are starting to see the impact of all this work.

The Federal Data Strategy calls for agencies to use data to drive decisions, or operationalize their data.

When you add to that the advancements in artificial intelligence, machine learning and automation, having the right data at the right time and ensuring it’s accessible by the right people becomes more important than ever.

The Information Technology and Innovation Foundation (ITIF) found in a June report that agencies need help advancing through the maturity model outlined in the 2021 update to the Federal Data Strategy.

ITIF says agencies still need a lot of help to better understand which tools and techniques are most appropriate for them and their customers and how to prioritize data insights that are most important to their own missions.

Chris Townsend, the vice president of U.S. public sector sales at Elastic, said whether it’s effectively processing health care claims, performing fraud detection or better enabling the warfighter, agencies are trying to understand how to operationalize their data in the most efficient and most cost effective way possible.

“The scale at which agencies operate with data in multiple clouds and data on premise, as well as unstructured data and structured data, the challenge is how do they effectively operationalize that data across all those silos?” Townsend said on the Innovation in Government show, sponsored by Carahsoft. “Some large agencies have individual organizations within the agency, and each of those sub agencies has its own data stores as well, and they’re not easily accessible. So being able to operationalize data at scale, and across all of these multiple silos, is a real challenge.”

The solution to that challenge, Townsend said, is to take search to the data rather than bring the data to the search tools.

“The thinking for a long time is we have to bring all this data back to a common data store or centralized location like a data warehouse or a data lake. But that’s just not practical for a lot of our large complex public sector customers,” he said. “You hear the Defense Department talk a lot now about their data mesh strategy, which is the idea of being able to take the analytics in the search to the data, where the data resides, across all these different silos, and then bring just the relevant information back to a centralized location.”
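
The data mesh pattern Townsend describes can be sketched as federated search: the query is pushed down to each silo, and only the matching records travel back. A toy Python version, with invented silo names and records:

```python
def search_silo(silo_name: str, records: list, query: str, limit: int = 2) -> list:
    """Stand-in for a query pushed down to one silo; only matches travel back."""
    hits = [r for r in records if query.lower() in r.lower()]
    return [{"silo": silo_name, "record": h} for h in hits[:limit]]

# Each organization keeps its own data store; nothing is bulk-copied to a central lake.
silos = {
    "navy":      ["supply request 7 anomaly", "routine maintenance log"],
    "army":      ["anomaly in sensor feed 3", "inventory count"],
    "air_force": ["flight schedule", "fuel anomaly report 12"],
}

query = "anomaly"
federated = [hit for name, records in silos.items()
             for hit in search_silo(name, records, query)]
for hit in federated:
    print(hit)  # only the relevant slices, not the full stores, are centralized
```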

CISA, DoD examples

Townsend said a good example of that approach is the Continuous Diagnostics and Mitigation (CDM) dashboard run by the Cybersecurity and Infrastructure Security Agency.

ECS won the $276 million CDM dashboard contract in 2019 to implement the Elastic technology.

Townsend said CISA is bringing together data from more than 100 disparate agencies and performing data analysis.

“The whole idea of operationalizing data at scale to support a mission outcome, whether it’s improving your cybersecurity posture or improving your threat hunting, the more data that you can get into that environment, access and normalize, the better your results are and the better the outcomes are,” he said. “In addition to that, if you’re trying to implement a better decision making process and analytics process across all of the Department of Defense, the Navy’s got its own data store, the Army’s got its own data store, the Air Force has its own, so how can you query all of that data in a common way in a common framework and be able to garner results across that entire environment?”

Townsend said another example is DoD’s Joint Regional Security Stack (JRSS) effort, which now includes, for the first time, the Army bringing together data in both the strategic and tactical environments through a unified architecture.

“You can’t replicate all that data and bring it back to a centralized location. You need to be able to search that data out at the edge, and then just bring the relevant information back in with JRSS. You’ve got these massive amounts of data flowing through those Joint Regional Security Stacks, and if you want to provide that dashboard and threat analytics, you can’t expect to bring all that data back to a centralized location because that just doesn’t work,” he said. “We’re talking a lot about security, but that applies to everything else too, whether it’s artificial intelligence operations or any data analytics or search function. We are increasingly doing more with government agencies around the executive order on improving customer experience. There’s a lot that we could do to improve interaction with government websites, leveraging things like large language models and ChatGPT to make data more accessible to the citizenry.”

Data to fuel AI tools

Agencies still are dipping their toes into generative AI, but to prepare for a possible future they have to change the way they work with the data.

Townsend said where search tools in the past relied on keywords, generative AI tools provide meaning and context.

“I think everyone is obviously trying to understand how to best use that technology in a secure way,” he said. “Agencies recognize the tremendous potential benefit of being able to access data to do generative AI, and things like improving security posture, doing threat hunting, or providing better access to the citizenry under the executive order on improving customer experience. There are tons of applications and we’re just scratching the surface of these technologies.”

Townsend added the benefits of applying AI tools to securing data and protecting systems are another attractive aspect.

But to prepare for using AI in any of the potential areas, Townsend said agencies have to continue to get their data houses in order. He said while the federal data strategy has helped create some momentum, there are still things agencies can do.

“We’re starting to see a lot of convergence in building more cohesive agency-wide data strategies. I think we saw the use of data in pockets. If you had an operations group over here that was doing fraud detection, they may be indexing and using their data for something over here. The cybersecurity folks may be using data too, and the customer experience folks may be using different data,” he said. “But now agencies are looking at what they should be doing agency-wide and enterprise-wide in terms of their data strategy. What tools should they be consolidating? How are they indexing their data, and are they duplicating their data and paying for multiple storage solutions? Are they paying for multiple tools to index the same data repeatedly? I think we’re seeing a lot of consolidation around data and seeing a lot of consolidation of the tool sets, so that they can buy one tool set and be able to use multiple third-party solutions that can sit on top of a platform that can use that data in different ways.”

People plus technology: Building a resilient federal cyber workforce

July 18, 2023

In our new ebook, we look at sweeping federal initiatives and agency-specific efforts to empower the government’s cyber workforce, featuring insights from Air Force, FEMA, DoD, DHS, Navy and OPM leaders.

Why can’t federal agencies fill cyber jobs?

The challenges are many, but government agencies are taking aim. In our exclusive ebook, we look at sweeping federal initiatives and agency-specific efforts to empower the government’s cyber workforce.

You’ll hear from:

  • Gen. C.Q. Brown and Lt. Gen. Leah Lauderback at the Air Force
  • Patrick Johnson at the Defense Department
  • Charles Armstrong at the Federal Emergency Management Agency
  • Jen Easterly at the Homeland Security Department
  • Sarah Nather at the Navy
  • Jason Barke and Sarah Brickner at the Office of Personnel Management

Plus, industry security leaders — ExtraHop’s Mark Bowling, Fortinet’s Felipe Fernandez and Ivanti’s Mareike Fondufe — address the connection between technology and workforce.

Critical factors in the continued cloud modernization journey
https://federalnewsnetwork.com/innovation-in-government/2023/06/critical-factors-in-the-continued-cloud-modernization-journey/ (Thu, 15 Jun 2023)
Jay Fohs, the senior customer advocate for financial and healthcare agencies at Veritas, said agencies must focus more heavily on data protection and resiliency as they move into a multi-cloud environment.


As agencies continue further down the digital transformation path, they are finding that among their biggest challenges is the need to integrate and manage enterprise data.

Over the last decade, agencies have realized how important data is to drive mission successes and outcomes.

But without an enterprise data management strategy, agencies can struggle to take advantage of their data to make decisions and improve services.

The good news is the tools and technologies are better than a decade ago and are continually improving. The use of cloud services and better cybersecurity capabilities are helping agencies meet the long-held goals of secure information sharing.

Jay Fohs, the senior customer advocate for financial and healthcare agencies at Veritas, said for many public-facing applications, the easy button is software-as-a-service (SaaS).

“A lot of applications that are used by agencies are somewhat common. You’ll look at your human resources, your payroll, some of your e-discovery; those types of things are very common throughout a lot of the agencies. If you put the responsibility right on the cloud provider, with that SaaS architecture, it does relieve a good amount of burden on the IT organization,” Fohs said on the discussion Innovation in Government sponsored by Carahsoft. “I think they want to focus more on some of the mission-critical applications that drive services to the consumer. Those are the things that we see them trying to focus on a little bit more.”

At the same time, not every application can be moved to the cloud and digitally transformed. This is why Fohs said as agencies continue to develop their modernization plans, they need to consider two critical factors: security of the data and the criticality of the application itself to the mission.

“Federal agencies need to ask what is their risk factor for taking this application from their data center, where they manage it and are responsible for it, and giving it to somebody else?” he said. “When you look at cloud providers and a lot of the native tools that they provide, how well do they do from an availability standpoint? Those tools could be coming from a lot of different places, so how are those infrastructures secured?”

What is driving modernization efforts?

That cybersecurity challenge will continue to drive a lot of the modernization decisions.

Fohs said public and private sector organizations continue to be concerned about the ever-changing cyber threat environment. He said agencies want more details about how cloud providers are developing their defense-in-depth and data protection strategies.

“We have to ensure that we maintain a specific level of security within our own rights, so that our customers are confident that there are no vulnerabilities there,” he said. “There are so many different solutions and capabilities out there from a lot of different security vendors that it becomes challenging. Where do you prioritize that defense-in-depth aspect of securing your data?”

Part of that defense-in-depth cyber strategy is knowing your data and ensuring you can be resilient when a cyber incident does happen.

Fohs said agencies should rely more than ever on technologies such as encryption of data at rest, snapshotting and/or deduplication, and replication to ensure they can recover from an attack.

“It’s very critical that you look at the way that data protection solution is architected. Zero trust is a component of a framework,” he said. “When we talk about developing applications or utilizing tools that meet zero trust, there are some common questions and some common variables that you should ask your vendors. It’s more important now than ever that, as a data protection vendor or a storage vendor, you’re paying attention to that threat vector. If they can infiltrate your data protection system, and you don’t have a resilient platform to recover your data, game over.”
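A minimal sketch of the resiliency stack Fohs lists might look like the following, with content-hash deduplication, encryption at rest and replication to a second copy. The two storage dictionaries are stand-ins for real backup targets, and the example assumes the third-party Python cryptography package for the encryption step.

```python
import hashlib
from cryptography.fernet import Fernet  # third-party 'cryptography' package

key = Fernet.generate_key()        # in practice, held in a key vault or HSM
cipher = Fernet(key)
primary, replica = {}, {}          # stand-ins for two isolated storage locations

def back_up(blob: bytes, chunk_size: int = 52) -> list:
    """Deduplicate by content hash, encrypt at rest, replicate for recovery."""
    manifest = []
    for i in range(0, len(blob), chunk_size):
        chunk = blob[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in primary:              # dedup: store each chunk only once
            sealed = cipher.encrypt(chunk)     # encrypted at rest
            primary[digest] = sealed
            replica[digest] = sealed           # replicated copy for recovery
        manifest.append(digest)
    return manifest

def restore(manifest: list) -> bytes:
    """Recover the original blob entirely from the replica."""
    return b"".join(cipher.decrypt(replica[d]) for d in manifest)

data = b"benefits batch 2023-06-15 " * 8
m = back_up(data)
assert restore(m) == data
print(len(m), "chunks referenced,", len(primary), "stored after dedup")
```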

Understanding cost is key

A big consideration underlying the second critical factor driving modernization, the criticality of the application itself to the mission, is the cost to move it to the cloud.

Agencies have come to realize over the last decade that cloud services aren’t always cheaper, but do provide better services at the same cost.

“It’s very difficult to estimate the amount of resources you will need. It’s challenging sometimes to understand your ingress and your egress costs as it relates to your applications. Those are really, really important things moving forward,” Fohs said. “There are tools out there to help you plan that as well. Within today’s cloud provider world, you should be able to use resources when you need them and let them go when you don’t. You should understand, most importantly, the shared responsibility models with the cloud providers. I think that’s critical.”
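The ingress and egress arithmetic Fohs mentions can be made concrete with a back-of-the-envelope model. Every rate below is a placeholder chosen for illustration, not any provider's actual pricing.

```python
# Illustrative-only rates; real pricing varies by provider, region and tier.
STORAGE_PER_GB_MONTH = 0.023
EGRESS_PER_GB        = 0.09    # data leaving the cloud is what surprises people
INGRESS_PER_GB       = 0.0     # inbound transfer is often free
COMPUTE_PER_HOUR     = 0.10

def monthly_estimate(storage_gb: float, egress_gb: float,
                     ingress_gb: float, compute_hours: float) -> float:
    """Rough monthly cost of running one application in the cloud."""
    return (storage_gb * STORAGE_PER_GB_MONTH
            + egress_gb * EGRESS_PER_GB
            + ingress_gb * INGRESS_PER_GB
            + compute_hours * COMPUTE_PER_HOUR)

# An app pushing 1.2 TB out per month: egress alone nearly matches storage.
print(f"${monthly_estimate(5000, 1200, 300, 720):,.2f}")
```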

Managing multi cloud through the ‘by design’ approach
https://federalnewsnetwork.com/innovation-in-government/2023/05/managing-multi-cloud-through-the-by-design-approach/ (Mon, 15 May 2023)
Kelsey Monaghan, the lead for federal strategic programs and partnerships for cloud and edge at Dell Technologies, said agencies need the ability to govern and provide workload flexibility across all individual cloud deployments to ensure consistency.


Agencies are in year 12 of the federal cloud strategy. The Office of Management and Budget launched its cloud first strategy back in December 2010, and June will be the four-year anniversary of the shift from cloud first to cloud smart.

Over the last 12-plus years, agencies have made a lot of progress in taking advantage of cloud services.

For example, the FedRAMP cloud security authorization program reached a milestone of having more than 300 approved cloud services. A majority of them are in the software-as-a-service category, meaning agencies are moving toward a more advanced stage of cloud services where infrastructure is mostly in the cloud and now they are looking at applications and workloads.

The challenge now, of course, is most agencies are living in a multi-cloud environment. This is where agencies are integrating services from multiple cloud providers to meet their mission needs.

Kelsey Monaghan, the lead for federal strategic programs and partnerships for cloud and edge at Dell Technologies, said while cloud services aren’t new for agencies, what is changing is that the definition of cloud continues to expand as the operating model evolves.

“We’re seeing that definition and that use case really expand to the core and out to the edge, so that continuum of cloud is really expanding,” Monaghan said on the discussion Innovation in Government. “As we look at that, the need to have a ‘by design’ approach to avoid some of those silos and improve that operational benefit is absolutely where we’re seeing a focus not only of the federal government, but also of industry in that deep partnership.”

Portability, flexibility are key

The concept of “by design” relates directly to the use of more than one cloud provider, whether a public, private or hybrid instantiation.

Monaghan said agencies need to understand their workloads and which cloud makes sense to optimize those efforts. She said key factors include portability between cloud providers and flexibility within those vendors.

“When we talk about multi cloud by design, we’re really talking about an ability to govern and provide workload flexibility across all those individual cloud deployments, so that consistency is really the underpinning of that by design methodology,” she said. “When we talk about workload optimization, or when we talk about by design, what we really mean is understanding those workloads with our agency partners and bringing the best of breed from an industry perspective to place those workloads in the public, hosted or co-located cloud deployments, or even edge deployments, that would best fit that mission or that outcome.”

Agencies have come to realize over the last decade that not all workloads are created equal and the cloud isn’t always the only answer based on consumption trends.

Monaghan said understanding your workloads and applications becomes more important as agencies push data and services to the edge.

She said more agencies are looking at a distributed model for services to the edge.

“There’s analytics use cases and those time-to-value discussions are continuing to drive cloud discussions, but in a slightly different fashion,” Monaghan said. “What we’re really seeing is that in some cases there is repatriation of workloads, or the movement of workloads from the public cloud back to on premise or, in many cases, to co-located or to different hosted environments. Some of those are because they weren’t necessarily entirely cloud ready. They weren’t re-platformed to really enhance and utilize the benefits that the public cloud providers and those tool sets really provided them day one. So to get those better cost and operational efficiencies, we have seen some of that movement.”

Multiple clouds vs. multi cloud

Another factor that agencies are becoming smarter about is the difference between using multiple clouds and taking a multi-cloud approach.

Multi-cloud is about portability of workloads, consistent governance and management of cloud deployments and creating a continuum of services from the core to the edge.

“When we say multi-cloud, we’re talking about whether you have application consistency in the data layer, in the fabric and in the connection strategy across these cloud environments. That really is an ecosystem approach,” Monaghan said. “When we look at these discussions with our agency partners, we’re looking at industry and what each provider really brings from a best-of-breed perspective. We believe it isn’t a public cloud or edge cloud or co-location discussion, and because of that we’ve partnered not only from a cloud-native perspective, bringing those tool sets right across that continuum, but also from a data layer perspective to provide that consistency.”

She added by relying on a broker of brokers approach to manage a multi-cloud approach, agencies can improve their experience and ensure operational efficiencies.

“What we look at is understanding if that workload today is really supporting the outcome or the mission of the agencies. What we hear a lot about today is access to data or collaboration experiences as different areas of the organization are using IT,” Monaghan said. “Some of the challenges that we’re hearing are absolutely around proximity, access and collaboration tool sets, but also around latency and the time-to-value discussion. Some of those can be indicators of whether a workload is really running in the best place. Others can be around the application itself. Is it legacy? Has it been re-platformed? And is it running in an environment where it’s taking advantage of the best capability that an agency is paying for? You really can look at both that operational and financial discussion as well as that operational and technical capability. Is the workload running in a place where it can utilize and really integrate with the necessary back-end resources that it needs?”
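Those indicators lend themselves to a simple scoring sketch. The deployment targets, workload attributes and weights below are invented for illustration and are not Dell's methodology; a real assessment would weigh far more factors.

```python
# Toy placement test: score each deployment option against a workload's
# latency sensitivity, re-platforming status and data residency needs.
targets = ["public_cloud", "co_location", "edge"]

def placement_score(workload: dict, target: str) -> int:
    score = 0
    if workload["latency_sensitive"] and target == "edge":
        score += 2                    # proximity wins for time-to-value needs
    if workload["replatformed"] and target == "public_cloud":
        score += 2                    # cloud-native apps exploit native tooling
    if not workload["replatformed"] and target == "co_location":
        score += 1                    # legacy apps often fit hosted better
    if workload["data_residency"] and target != "public_cloud":
        score += 1                    # residency constraints favor local control
    return score

wl = {"latency_sensitive": True, "replatformed": False, "data_residency": True}
best = max(targets, key=lambda t: placement_score(wl, t))
print("best fit:", best)   # edge, for this particular profile
```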

Three trends driving digital transformation, improving trust, transparency
https://federalnewsnetwork.com/innovation-in-government/2023/04/three-trends-driving-digital-transformation-improving-trust-transparency/ (Fri, 14 Apr 2023)
Mike Shortino, a principal digital strategist for federal civilian government at Salesforce, said taking a platform approach to bring together data to make better decisions is leading to more trust and transparency in government.


Trust and transparency are at the heart of any agency’s digital transformation strategy.

It’s part of how agencies use data. It’s part of how they secure their systems. And it’s part of how they serve their mission area.

Given the pace of change across the federal landscape over the last few years, agencies must take advantage of this rare opportunity to transform their services to serve internal and external customers differently, better and in a way that engenders both trust and transparency.

The January 2023 report from the American Customer Satisfaction Index (ACSI) found satisfaction with federal government services surged in 2022, up 4.6% from 63.4 to 66.3 on the 100-point scale.

ACSI found that among the four primary drivers of satisfaction, two increased substantially between 2021 and 2022: the efficiency and ease of government processes, and the ease of accessing and clarity of information.

The other two, the courtesy and professionalism of customer service and perceptions of government website quality, declined slightly.

Mike Shortino, a principal digital strategist for federal civilian government at Salesforce, said agencies must continue to improve customer experience through digital transformation by connecting service siloes across the government.

“There’s an aspect of leveraging the masses there that’s inherent in scalable cloud and software-as-a-service. But even with platform tools, the more folks that are using it, the more feedback we get as an organization, the better we can make those tools for our end users,” Shortino said on the discussion Innovation in Government sponsored by Carahsoft. “Recognizing that and adopting out-of-the-box principles, where they make sense for the organization, has been a big shift. A lot of what we do is connect to those capabilities, but also the patterns of success that agencies have achieved by leveraging those capabilities. So think about what we call a platform mindset: leverage first, then code or create second, as a way to really speed time to value for your constituents.”

This platform approach of bringing together data to make better decisions is leading to more trust and transparency in government.

Three trends pushing transformation forward

Shortino said there are several trends driving these changes.

First, he said agencies need to recognize that change is constant and they have to be prepared for the next crisis or the next opportunity.

“Now, there’s layers of our architecture in our organization where we can predict where that change is going to be more acute. Where you touch your constituents, I think we’re finding those expectations are changing,” Shortino said. “That’s a natural layer of change within your organization that you need to be attuned to. So by systematically investing in the experience layer, we can really impact that trust over time.”

A second trend is the need to collaborate across agencies, across governments and with the private sector.

He said agencies need to lean on a wider variety of partners to accomplish their mission goals.

A third trend is to take advantage of the ever-changing technology that industry is bringing to the government. This includes everything from robotics process automation (RPA) to advanced artificial intelligence to enhanced cloud platforms.

“Transparency is important. But so is trust in the use of data and how it’s being leveraged as it relates to someone’s service experience and their knowledge of their personal service experience. Transparency is a great idea and it can build trust,” Shortino said. “At the same time, there are risks that can be associated with too much transparency, or transparency where we haven’t been given the right to extend it. It’s a nuanced view of how we treat data in terms of the experience: being really protective, and a little bit risk averse, in one way can help build trust, while at the same time extending transparency in other areas can also help build trust.”

In the end, Shortino said providing an improved digital experience for the customer is, like any process improvement, as much about culture change as anything else.

“I’m also a big believer in the fact that culture is built on success. The culture exists because it’s a set of behaviors that worked in the past for us. So the only way to change that culture is to show that there’s a better way, or at least as good a way, doing it a different way,” he said. “The government often is challenged with its culture. But I think we need to honor that culture and see how we can leverage it for an advantage. That’s a consistent headwind. But I think your more enlightened leaders are starting to turn that into a tailwind.”

2 tactics to help agencies make data-driven decisions faster
https://federalnewsnetwork.com/innovation-in-government/2023/02/2-tactics-to-remove-obstacles-to-making-data-driven-decisions/ (Fri, 24 Feb 2023)
Winston Chang, the federal chief technology officer at Snowflake, said the next generation of cloud services is giving agencies the tools to focus on business needs rather than the technology.


Data is the engine that powers agency IT modernization efforts, powers improvements in customer experience and, of course, powers better cybersecurity.

The challenge for all agencies is the ability to harness that data and how they can use the cloud to drive better and faster decisions. Agencies need help to break down information silos and ensure secure data sharing is the rule, not the exception.

One example of this comes from the Veterans Affairs Department. It is creating a new Master Data Management system. The goal of that back-end system is to take data elements, create a centralized data repository for the entire agency and then share those elements across the department.

The General Services Administration’s Data and Analytics Center of Excellence identifies three steps agencies should consider to move in this direction:

  • Develop a roadmap
  • Create a resourcing plan
  • Determine your governance approach, whether centralized, decentralized or federated

As agencies continue to develop their data-centric approaches, there are several considerations to use the next generation of technologies to improve data management, said Winston Chang, federal chief technology officer at Snowflake.

“All of the infrastructure that is in the cloud has now been used and some of the first order problems have been attacked and understood,” Chang said on the Innovation in Government show. That work led to abstractions built on top of the cloud, which in turn has led to the development of purpose-built services ideal for different federal uses, he added.

The end result? Data management now is seamless across different cloud vendors. “It essentially can bring a network effect to how data is utilized,” whether within a single organization, across an agency or even at locations globally, Chang said.

He shared two approaches agencies can use to take advantage of cloud to tap data to inform their decision-making.

Tactic 1: Tie data and applications to mission functions

Agencies should focus on the data layer instead of the application layer of the network architecture, which will simplify how an organization ties data and applications to a specific business function, he said.

“A great example of something that’s been very hot recently is the low-code, no-code development. It’s because it allows an abstraction of not having to do full stack development, where you really can just align to your business need,” Chang said. “Similarly, when we deal with data, which is sort of our bailiwick, the abstraction allows the architecture, the data management, the governance and all of those things to be closer to how the agency needs to operate and less about the technical things.”
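A toy example of that abstraction: the business rule lives in a declarative spec that mission staff can edit, and a small generic engine runs it, so nobody writes full-stack code. The field names, operators and review queue below are hypothetical, not any low-code product's actual format.

```python
# Hypothetical low-code artifact: a declarative rule spec plus a tiny generic
# engine. The spec aligns to the business need; the plumbing stays out of sight.
rule = {
    "dataset": "benefit_claims",
    "when":    {"field": "amount", "op": "gt", "value": 10000},
    "then":    {"route_to": "senior_review_queue"},
}

OPS = {"gt": lambda a, b: a > b, "eq": lambda a, b: a == b}

def apply_rule(record: dict, spec: dict):
    """Evaluate one declarative rule against one record."""
    cond = spec["when"]
    if OPS[cond["op"]](record[cond["field"]], cond["value"]):
        return spec["then"]["route_to"]
    return None

print(apply_rule({"amount": 25000}, rule))   # -> senior_review_queue
print(apply_rule({"amount": 500}, rule))     # -> None
```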

The other benefit of abstracting the data layer is a better experience across multiple cloud instances. Agencies can then focus on solving problems and not worry about where specific applications and data sets reside, he said.

Expanded government use of software as a service (SaaS) is also leading to abstraction of data. “So much can be automated, and it can be automated at scale — all those processes, all the infrastructure — agencies can just pay for as a service,” Chang said. “You don’t have to build it,” which makes that initial piece of the build versus buy discussion simpler too because there’s zero infrastructure cost.

Tactic 2: Enhance collaboration, vertically and horizontally

With the move to SaaS and by abstracting data from siloes and legacy systems, agencies can improve internal data sharing initially and then externally with other agencies and partners, including suppliers. They can also bring in open source data to further the understanding of agency mission needs and to help inform decision-making.

It can also aid in collaboration so that agencies make decisions at multiple levels, Chang said. By layering in code, an agency can tailor who sees what and how different organizations and people within an agency interact with the data and one another.

“This can work for the Defense Department. This can work for the Department of Commerce or for any agency within the federal government. They can have those fine-grained, fine-tuned definitions, which allow for the collaboration to be optimized,” he said. “These sharing policies can really reduce the risk of sharing because you can put policies on top that automatically check permissions, and you can watch the metadata that moves back and forth.”
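A sketch of that governed-sharing pattern, with an automatic policy check on every request and a metadata audit trail, might look like the following. The grants and dataset names are invented; this shows the pattern Chang describes, not Snowflake's actual API.

```python
from datetime import datetime, timezone

# Hypothetical grants: which requester may read which governed dataset.
grants = {"DoD": {"fraud_summary"}, "Commerce": {"fraud_summary", "trade_stats"}}
audit_log = []   # the metadata that "moves back and forth" on every request

def share(dataset: str, requester: str) -> str:
    """Serve a governed view only if policy allows; log the metadata either way."""
    allowed = dataset in grants.get(requester, set())   # automatic policy check
    audit_log.append({"at": datetime.now(timezone.utc).isoformat(),
                      "dataset": dataset, "requester": requester,
                      "granted": allowed})
    if not allowed:
        raise PermissionError(f"{requester} has no grant for {dataset}")
    return f"<governed view of {dataset}>"              # data is never copied out

print(share("fraud_summary", "DoD"))
print(audit_log[-1]["granted"])   # True, and the request is on the record
```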

This approach to collaboration can speed decision-making because “we can actually reduce the process procedure policy piece,” Chang said. “That completely changes the entire game.”

Understanding the critical role of UX to zero trust
https://federalnewsnetwork.com/cme-event/innovation-in-government/understanding-the-critical-role-of-ux-to-zero-trust/ (Thu, 15 Dec 2022)
Cyber leaders at Customs and Border Protection, Education, FDIC and USCIS, along with experts from CrowdStrike, Okta and Zscaler, share their thinking on how to layer in security for zero trust while also minimizing friction on users.

How can you keep user experience at the center of zero trust? Is there such a thing as “smart friction?”

In this executive briefing, we tap a panel of cyber leaders from government and industry to share their thinking on how to layer in security for zero trust while also minimizing friction on users:

  • Shane Barney of Citizenship and Immigration Services
  • Zachary Brown of Federal Deposit Insurance Corporation
  • Scott Davis of Customs and Border Protection
  • Sean Frazier of Okta
  • Steven Hernandez of Education Department
  • Ned Miller of CrowdStrike
  • Jose Padin of Zscaler

Discover more now!

Zero Trust: Case Study
https://federalnewsnetwork.com/innovation-in-government/2022/12/zero-trust-case-study/ (Tue, 13 Dec 2022)
Agencies have made their way down the zero trust path, but how are they working through the challenges? Jason Miller gets an industry perspective from Okta, CrowdStrike and Zscaler.
