Amid new mandates for government data use and cloud migration, data management is top of mind for agency leaders. But cost, infrastructure modernization and threat prevention remain challenges. MeriTalk spoke with David Bailey, senior director of U.S. Public Sector Technical Sales at Veritas, about how agencies can plan their first steps toward a cohesive, sustainable data strategy, as well as the factors behind today’s unprecedented data growth.

MeriTalk: Federal agencies are collecting data at an ever-growing rate, and they must manage it across complex, hybrid infrastructures. What are the most important considerations for IT leaders as they modernize their data strategy?

David Bailey: The most important consideration is determining what’s crucial and how to treat the data stream in the long term. Hybrid infrastructures present a challenge as they provide many more locations for mission data to exist, which inherently increases management complexity.

With data, we tend to have a desire to keep everything, forever, at an unassigned “impact” level. I think of data storage like the stuff in your attic: You’re not sure if you’ll ever need it, but you keep shoving stuff up there, and eventually you run out of room and have to find another place for it. Ultimately this increases the chance of loss, as you move things around and potentially move valuable items to less-secure locations.

We can apply the same attic and “stuff” analogy to how government agencies manage data. The limitations of data storage are commonly expressed as physical or logical space, budget, risk and compliance. You could face an accidental loss of critical data, or expose critical data by keeping it in the wrong location. Data is also coming at us from many more sources, and our ability to manage it manually has reached its limit, especially as our data becomes more multitenant. So, to extend the analogy: imagine the management problem if all your neighbors also began putting their stuff in your attic, but left it to you to manage.

By identifying what data is critical and applying policies to that data, agencies can start to get the data flood under control. The location itself is less important. IT must consider how it will manage the data influx while ensuring availability to stakeholders. IT needs to gather metadata on the data, establish clear guidelines and policies, and automate those processes as much as possible.
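To make that concrete, here is a minimal, purely illustrative sketch of metadata-driven policy automation; the asset fields, classification labels and retention threshold are hypothetical examples for this article, not a Veritas feature or any agency’s actual policy.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical metadata record for a stored asset; the fields and
# classification labels are illustrative, not a real schema.
@dataclass
class DataAsset:
    path: str
    classification: str      # e.g., "critical", "internal", "public"
    last_accessed: datetime

def apply_policy(asset: DataAsset, now: datetime) -> str:
    """Return an action for the asset based on simple example policies."""
    age = now - asset.last_accessed
    if asset.classification == "critical":
        # Critical mission data stays on protected primary storage and is replicated.
        return "retain-primary-and-replicate"
    if age > timedelta(days=365):
        # Stale, non-critical data is archived instead of filling up the "attic".
        return "archive-cold-tier"
    return "retain-primary"

now = datetime.now()
assets = [
    DataAsset("/share/mission/plan.docx", "critical", now - timedelta(days=2)),
    DataAsset("/share/old/report-2015.pdf", "internal", now - timedelta(days=900)),
]
for asset in assets:
    print(asset.path, "->", apply_policy(asset, now))
```

In practice the metadata would come from discovery and classification tooling rather than being hand-entered, but the pattern (tag first, then let policy act on the tags) is the point.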

MeriTalk: Do agencies today have the “attic” mindset or do you see them shifting more toward automation?

David Bailey: In truth, both. However, a lot of agencies are struggling with where to start. There is a fear of “what if I delete the wrong thing?” or “what if I move data and expose myself to risk?” They spend a lot of time examining the problem while the data continues to grow and the problem only gets bigger.

The sooner agencies embark on a plan, the better. Recognize that managing your data stream poses less risk to the organization than allowing the status quo to persist. You can then implement increasing levels of automation to let personnel focus on what’s most important to the organization, and to form better policies and procedures for dealing with hybrid data topologies and new data sources.

MeriTalk: Why is data more vulnerable today?

David Bailey: Data is spread out over more locations, and there are many more data generators. The latter has caused a dramatic increase in data types, and in locations of data that may not be properly tagged or understood, making it hard for agencies to protect it properly. With this variety of sources and locations, organizations lose the direct control over fundamental elements that they would have in a single on-premises data center.

Moreover, when data overflows its original location into new locations – as in the earlier “attic” example – we tend to apply uniform protection policies that may ultimately be insufficient for the value (or risk) the data carries. It’s critical for organizations to implement strong controls and visibility across overall data management.

MeriTalk: We’ve seen a surge in ransomware attacks and weather-related outages, especially at the local level. How can agencies make sure they are prepared from a protection and recovery perspective?

David Bailey: Agencies must assume an event – like a major storm, a fire, a flood or a ransomware attack – will occur. It’s not “if” but “when.” Some organizations view data protection as a kind of insurance policy, but it goes beyond that; protecting data is key to business or mission enablement. Also, data protection and disaster recovery are not automatic, even in the cloud.

We must continuously ask ourselves the hard questions around how we will restore our mission functions should we lose connectivity to any major site or application. We must also practice recovery operations and rehearse disaster recovery plans. Knowing what is in our data and where it specifically resides becomes crucial to that capability. By thinking through the problem completely, agencies can protect the organization against what’s bound to happen and guarantee continuity.

MeriTalk: Developments such as the Office of Management and Budget’s Federal Data Strategy, the Foundations for Evidence-Based Policymaking Act and the Open Government Data Act are bringing data management front and center for the public sector. What new challenges do agencies face in addressing these mandates?

David Bailey: I view these policies as a continuation of some things that have been present for a while – for example, the notion that data must be accessible, yet protected. What I think is changing is the concept that data should be relevant to specific purposes, as opposed to being kept for non-specific reasons. There is also growing awareness that data impacting individuals may belong more to the individual than to the entity that collected it.

The move to more open data also adds complexity. Even as we strive to protect sensitive data, citizens and organizations must be able to use data for societal improvement and to ensure transparent government operations.

The biggest issue government entities will face is determining what data to make available and how, while still protecting data that could be sensitive to individuals and organizations. How do we establish clear policies for how the data stream is managed and accessed? Government must become much more agile and automated in the way it delivers, stores and accesses data.

MeriTalk: How should agencies approach a cohesive data strategy as opposed to a pure cyber or cloud strategy?

David Bailey: Locking data down as tightly as possible, or just moving it to the cloud, will not solve data challenges; it can actually exacerbate them. For example, moving everything to the cloud may generate a cost structure that is not tenable in the long term. Similarly, locking data down too aggressively may limit its usefulness. Without planning, these approaches can restrict the ability to make meaningful business decisions using data.

A comprehensive, cohesive data strategy takes into account IT’s major requirements for proper data management and cybersecurity: making data available to meet the mission, protecting it from unforeseen circumstances and enabling stakeholders to make decisions on that data in line with mission needs. Agencies should view the expanded locations now available to them as increased capabilities with dynamic security boundaries, while still applying decades of lessons learned in keeping data structures secure.

MeriTalk: This year, Veritas rolled out its enterprise data services platform, encompassing availability, protection and insights. What’s the thought process behind this specific approach?

David Bailey: Our approach addresses the key foundations organizations need to have from a data management perspective.

First, “Availability” ensures that data is available for the mission function. For instance, we’ve seen several Federal customers challenged to move to the cloud to comply with Cloud First or Cloud Smart. We helped them move to the cloud faster by re-pointing part of their protection stream and ensuring availability in the same construct.

The second pillar – “Protection” – ensures agencies can protect information as it moves through the data lifecycle. For example, when ransomware does occur in an environment, how can government entities take their mission data and re-present it in a protected way? That’s core to Veritas’ protection approach.

Third, our “Insights” pillar is all about gleaning real value from massive data stores. With so much data captured today, organizations must be able to make intelligent decisions on data by assigning characteristics through metadata. Let’s say you are focused on cost. An automated policy could use that metadata to show which types of data are more cost-efficient to keep in certain locations or constructs.
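As a hedged illustration of that idea, the short sketch below uses two pieces of metadata, size and monthly access count, to recommend a storage location and estimate cost; the tier names, thresholds and per-GB rates are invented for the example and do not reflect actual pricing or any specific Veritas capability.

```python
# Hypothetical per-GB monthly storage rates; the numbers are placeholders.
TIER_COST_PER_GB = {
    "on_prem_primary": 0.10,
    "cloud_standard": 0.023,
    "cloud_archive": 0.004,
}

def recommend_tier(size_gb: float, reads_per_month: int) -> tuple[str, float]:
    """Pick a storage tier from metadata (size and access frequency) and
    return it with an estimated monthly cost for that placement."""
    if reads_per_month >= 100:
        tier = "on_prem_primary"   # hot data stays close to mission applications
    elif reads_per_month >= 1:
        tier = "cloud_standard"    # warm data moves to lower-cost object storage
    else:
        tier = "cloud_archive"     # cold data goes to archival storage
    return tier, round(size_gb * TIER_COST_PER_GB[tier], 2)

print(recommend_tier(500, 250))   # frequently read dataset
print(recommend_tier(2000, 0))    # untouched dataset -> archive tier
```

The same metadata could just as easily drive protection or access decisions; cost is simply the easiest dimension to show in a few lines.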

As we tie availability, protection and insights together, agencies can benefit from a uniform platform approach that lets them manage their data holistically for long-term mission success.

To learn more about how to take control of your data, register for the Veritas Public Sector Vision Day, taking place on December 10 at the Newseum in Washington, D.C.
