Four IT trends for 2020 agencies need to prepare for

This content is provided by Red Hat.

Change is a constant, especially in the world of technology. To keep up with the population they serve, federal agencies have to be ready to embrace that change by building a culture of flexibility that can capitalize on technologies and ideas that don’t even exist yet. And while that’s easier said than done in an environment where disruption is the name of the game, agencies should at the very least be keeping a weather eye on what’s just around the corner in the next quarter, the next fiscal year, the next calendar year.

Toward that end, here are four projections for 2020 from David Egts, chief technologist of North America Public Sector for Red Hat.

FedRAMP is going to get easier

Egts said he’s heard through various channels that it’s getting easier to get certified through FedRAMP, which means more companies will pursue that certification. That’s especially true for software-as-a-service providers, due to the fast-track program known as FedRAMP Tailored. That makes it far more likely that the government will be able to start adopting the kinds of low-impact SaaS technologies being used in the private sector.

“Having the speed of government at parity with the speed of businesses, that’s something I’m looking forward to,” Egts said. “FedRAMP is seen as a de facto standard for cloud security, not just as a standard within the federal government, but state and local agencies, as well as other companies that work with the federal government, and even other governments look at FedRAMP as being this gold seal of approval that due diligence has been done, that a third party has looked at it.”

That will mean more choices for government, and more opportunities for the integration of services. But it’s not all good news: it also increases an agency’s risk of vendor abandonment. After all, Egts said, 90% of startups fail. That’s why agencies also need to have a cloud exit strategy in place before they begin their cloud journey. They have to prepare – and budget – for the possibility that they may need to move their data between cloud providers or back on premises.

“Having a data management plan as part of your cloud strategy is really important. So if you’re generating tons of data, are you budgeting for that?” Egts said. “How do you retire your data? How long are you going to keep your data? Especially in the government world, where agencies are afraid, due to policy, to keep their data for too long, or afraid to delete it at all?”
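The retention questions Egts raises can be made concrete in a small sketch. The dataset names and retention periods below are hypothetical, purely to show how an agency might encode a data management plan as policy rather than tribal knowledge.

```python
from datetime import date, timedelta

# Hypothetical retention policy: each dataset is kept for a fixed window,
# then flagged for review or retirement. Names and periods are illustrative.
RETENTION_DAYS = {
    "audit-logs": 365 * 3,   # e.g., keep audit logs for three years
    "raw-telemetry": 90,     # e.g., keep raw telemetry for 90 days
}

def retirement_date(created: date, dataset: str) -> date:
    """Return the date on which a dataset becomes eligible for retirement."""
    return created + timedelta(days=RETENTION_DAYS[dataset])

def is_expired(created: date, dataset: str, today: date) -> bool:
    """True once the dataset has passed its retention window."""
    return today >= retirement_date(created, dataset)
```

Encoding the policy this way means the “how long do we keep it?” answer is reviewable, testable, and enforceable by automation instead of living in a binder.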

Increased tempo of ATO and CDM will require automation of infrastructure, compliance

Continuous Diagnostics and Mitigation, the Department of Homeland Security’s cybersecurity dashboard program, is picking up steam as more agencies launch and complete CDM pilots. In the past, cybersecurity audits involved documents printed out and tucked into a binder, once per year. CDM is changing that.

“I compare it to you go to your annual doctor visit, they take your blood pressure, and that’s one reading at one point in time,” Egts said. “But that may not be an accurate representation over a year. Compare that to having a Fitbit, whether it’s tracking your heart rate, sleep quality, or whatever your health statistics are that you want to measure, and it’s doing that continuously and alerting you of anomalies.”

The only way this is going to be feasible, Egts said, is to remove humans from the loop.

“The only way to continuously check your security posture is through automation,” he said.
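The shift Egts describes, from annual binders to continuous checks, can be sketched in a few lines. The checks and the sample configuration below are hypothetical stand-ins, not an actual CDM rule set; a real deployment would use an agency-approved scanning tool run on a schedule.

```python
# Encode compliance checks as functions and evaluate them on every run,
# not once a year. A scheduler or agent would call run_checks() continuously
# and alert on any failure. Checks here are illustrative examples only.

def check_password_auth_disabled(config: dict) -> bool:
    # Hypothetical rule: SSH password authentication must be off.
    return config.get("PasswordAuthentication") == "no"

def check_protocol_version(config: dict) -> bool:
    # Hypothetical rule: only SSH protocol 2 is allowed.
    return config.get("Protocol") == "2"

CHECKS = {
    "ssh-password-auth-disabled": check_password_auth_disabled,
    "ssh-protocol-v2": check_protocol_version,
}

def run_checks(config: dict) -> dict:
    """Return a pass/fail result for every registered check."""
    return {name: check(config) for name, check in CHECKS.items()}
```

The point is the Fitbit analogy in code: the same assessment runs every few minutes instead of once at audit time, so an anomaly surfaces as an alert, not as a finding a year later.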

Hybrid and multi-cloud are here, and they’re not going away

The Cloud Smart policy is an evolution of Cloud First. Agencies are realizing that not everything needs to go to the cloud, and not every cloud is right for every application. Agencies need a cloud strategy, and they need an open substrate that spans from on premises to multiple public clouds. That substrate helps them accelerate cloud adoption: they don’t have to cross-train people on technologies specific to certain cloud providers, and they can run some workloads on public clouds and others in private data centers.

Every cloud provider does containers differently, so standardizing on an underlying platform gives agencies more options with fewer functional variations.
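The idea of one workload definition spanning many clouds can be sketched as a thin translation layer. The provider names, field mappings, and image reference below are invented for illustration; they are not real cloud APIs.

```python
# One provider-neutral workload spec; only the thin render() layer knows
# each provider's vocabulary. All names and fields here are hypothetical.
WORKLOAD = {
    "name": "case-intake",
    "image": "registry.example.gov/case-intake:1.4",
    "replicas": 3,
}

def render(provider: str, workload: dict) -> dict:
    """Translate the neutral spec into a provider-shaped request."""
    if provider == "cloud-a":
        return {"service": workload["name"],
                "container": workload["image"],
                "instances": workload["replicas"]}
    if provider == "cloud-b":
        return {"app": workload["name"],
                "imageRef": workload["image"],
                "count": workload["replicas"]}
    raise ValueError(f"unknown provider: {provider}")
```

Because the workload spec never changes, moving between providers (or negotiating with one) is a matter of swapping the translation layer, which is the leverage Egts describes below.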

“If an agency goes all in with one particular cloud, and they’re locked into that cloud, the strength of that agency’s hand for negotiation is weakened,” Egts said. “By making multicloud a part of your cloud strategy, you’ll be able to ensure the cloud providers are delivering value, at the right price, and the right features and the right capability, or you’re free to go elsewhere.”

Products won’t be a panacea. Agencies need to focus on people and process too

Cloud Smart, the Federal Cyber Reskilling Academy, and the Executive Order on Maintaining American Leadership in Artificial Intelligence recognize that it’s not just about the technology, it’s about the people as well. Getting the right programs won’t help if employees aren’t able to use them to their fullest potential.

Egts points to Conway’s law: “organizations which design systems … are constrained to produce designs which are copies of the communication structures of these organizations.” So if agencies have siloed, top-down communications styles, that’s what their systems will wind up looking like as well.

Instead, agencies need to adopt an open culture built around agile principles and DevSecOps. The most engaged agencies, as measured by the Federal Employee Viewpoint Survey, are the ones that embody those principles, communicate effectively, and empower their workforce.

“Establishing guiding principles at the top and empowering employees at all levels is the only way for agencies to scale as agency expectations go higher and higher and technologies move faster and faster,” Egts said.
