Big data analytics is helping Federal agencies empower their users to better serve citizens, but agencies have yet to harness the vast amount of data in the Federal space, IT leaders said during a panel at the ATARC Data & Analytics Summit on Tuesday.
“I’ll tell you what I’ve learned: in my big data journey over the last six years, we’ve had a pretty robust set of users on it. Users are smarter than they think. They’re a lot smarter than they think,” said Leonel Garciga, CTO at the Joint Improvised-Threat Defeat Organization within the Department of Defense.
He stressed the importance of building platforms around how users actually work, rather than around expected uses. Garciga also cautioned against ignoring “cutting room floor data,” a mindset he attributed to his experience with the intelligence community. That approach, he noted, helped shrink time to deployment; the agency’s philosophy is to let users “build out the analytics they need” while his office works to push those analytics out to the edge.
“I will tell you, I don’t think there is this thing called shadow IT. I know every CIO just cringed, (but) I think it is my job to deliver a platform that lets the user do what they need to do. Shadow IT is a lack of capability to innovate,” said Garciga. “The big part about how you get the security piece and the policy piece around it, that really is building out an ecosystem that’s specifically built to provide the maximum amount of support and capability to the user,” he added, likening the ideal ecosystem to Apple’s app developer ecosystem.
Harnessing data can also help improve government efficiency, a major component of the President’s Management Agenda (PMA), especially when it comes to IT.
“One of my challenges is to help manage the spend across IT,” said Bill Spencer, an IT category management program manager at the General Services Administration (GSA). “For those of you who go onto OMB’s (the Office of Management and Budget’s) website and take a look at what the IT budget is, it says about 58 billion dollars. That’s a planned budget, that’s not actually spent. What I specifically do is dissect that information in such a manner to help address and integrate specific questions for CIOs to be more efficient in their IT acquisitions.”
Spencer noted that his role supports Cross-Agency Priority Goal 7, which aims to reduce fragmentation in government spending. He said GSA is working to create a simple framework, and a set of questions that need answering, to help agencies understand the Federal IT landscape. Interoperable data is essential for seeing what different agencies are doing and comparing the results, he noted, but it remains a complex problem.
“We are the single largest IT buyer on the globe, but we buy like we are thousands upon thousands of small companies,” Spencer said. “The ability to make sense of all those base contracts and help decisionmakers with a level of confidence, and those of you in IT know that happens to be through executive language…, that’s what I do, is create language in such a manner with data to allow those people making decisions to actually give them the information they need to make decisions,” he added.
Spencer shared some insights that GSA has found along the way, both through data and discussions.
“People are generating data through acquisitions in very diverse manners,” he said. “It varies from collecting no data, to using PDFs, to very finite data elements. We’re finding various anomalies that result in different price points for commodities. What we find in the data is (bulk) is not generally the best indicator,” he added, citing a conversation with an executive who told him that late fiscal year orders disrupt the supply chain and add cost.
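To make the kind of analysis Spencer describes concrete, here is a minimal sketch, assuming a hypothetical contract-level dataset; the file name, columns, and late-fiscal-year flag are all invented for illustration, not GSA’s actual data or methods:

```python
import pandas as pd

# Hypothetical contract-level acquisition data; the file name and
# column names are illustrative, not GSA's actual schema.
orders = pd.read_csv("it_acquisitions.csv", parse_dates=["order_date"])

# Normalize to a comparable unit price so orders can be compared
# across agencies that collect data in very different ways.
orders["unit_price"] = orders["total_cost"] / orders["quantity"]

# Compare each order's unit price against the government-wide median
# for that commodity to surface pricing anomalies.
medians = orders.groupby("commodity")["unit_price"].transform("median")
orders["price_ratio"] = orders["unit_price"] / medians

# Flag late-fiscal-year orders (Aug-Sep), which the panel suggested
# can disrupt the supply chain and add cost.
orders["late_fy"] = orders["order_date"].dt.month.isin([8, 9])

# Summarize: do late-fiscal-year orders pay more, per commodity?
summary = (orders.groupby(["commodity", "late_fy"])["price_ratio"]
                 .median()
                 .unstack("late_fy"))
print(summary.head())
```

Normalizing to a per-unit price is one simple way to make orders comparable across agencies that, as Spencer noted, record acquisitions anywhere from PDFs to very finite data elements.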
While some agencies may be fast movers in the race to build big data platforms, others are carefully laying the groundwork for truly impactful initiatives.
When it comes to electronic health records (EHR) at the Pentagon, “we haven’t been able to aggregate, make interoperable, or do machine learning on that data. My job is to try and help fix that,” said Colonel John Scott, data manager at the Defense Health Agency. As a clinician himself, Scott noted the frustration caused by that lack of deeper insight. “The biggest challenge is to get all of the data in an integrated data platform, where we really understand it and make it work together. We’re doing that in the DoD, and the Veterans Administration (VA) is also doing that, and if we can do that very smartly together, we’ll have one of the largest, most powerful clinical datasets in the world…but we’re struggling to do that.”
Scott emphasized that DoD is not helpless in the meantime. “It’s much better than it would appear if you only watched the congressional hearings,” he said. He pointed to the department’s Joint Legacy Viewer and its common electronic filing cabinet standards. “But what we can’t do, which is really important, is that we don’t have big data analytics.”
An interoperable DoD-VA EHR database would present some compelling use cases.
“We are going to study everybody who has had to leave the military because of healthcare concerns, look at their records, look at what might have been in their medical records that could have predicted that, and then we get into prevention and emerging risk,” said Scott. “We were asked, based on an executive order, to do a better job of identifying veterans who are at risk of negative outcomes after leaving the military. We collaborated with VA trying to link all of our data together, and looking at those persons’ records and finding out what we can in their military record and target for prevention in their last year in the military and in the transition. We have a lot of promise.”
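As a rough sketch of the record linkage Scott describes, assuming hypothetical de-identified DoD and VA extracts that share a common patient identifier (the file names, columns, and screening criteria here are invented for illustration, not DHA’s or VA’s actual systems):

```python
import pandas as pd

# Hypothetical de-identified extracts; file names, columns, and the
# screening criteria below are invented for illustration only.
dod = pd.read_csv(
    "dod_health_records.csv",
    parse_dates=["separation_date", "diagnosis_date"],
)
va = pd.read_csv("va_outcomes.csv")

# Link the two datasets on a shared patient identifier -- the core
# interoperability step the panel says DoD and VA are working toward.
linked = dod.merge(va, on="patient_id", how="inner")

# Example screen: diagnoses recorded in a member's final year of
# service, the window Scott identified as a target for prevention.
in_last_year = linked["diagnosis_date"] >= (
    linked["separation_date"] - pd.DateOffset(years=1)
)
at_risk = linked[in_last_year & linked["negative_outcome"]]

print(f"{len(at_risk)} linked records flagged for follow-up review")
```

In practice, linking records across the two departments involves far harder problems than this toy join suggests, including identity matching and clinical data standards, which is the integration work Scott says both departments are still struggling with.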