With emerging technologies like artificial intelligence (AI) and blockchain continuing to reveal their capabilities to the marketplace, Federal IT leaders discussed the potential, and the pitfalls, of implementing new technology in government during a Thursday session at an event hosted by the Armed Forces Communications and Electronics Association (AFCEA).
One of the main topics of discussion was the use of AI in cybersecurity, particularly within the Continuous Diagnostics and Mitigation (CDM) program administered by the Department of Homeland Security (DHS) to improve Federal civilian agency security.
“You see a lot of human resources are being leveraged to normalize this data, and say, ‘This is really good, this is bad’. I think what AI will give us is the ability to reduce a lot of that counting aspect, as I call it,” said Kevin Graber, CISO at the U.S. Secret Service.
“If I’m chasing down a series of assets every month that don’t quite fit the mold or are not in some sort of FISMA [Federal Information Security Management Act] system, then that’s really a waste, and a repetitive process, and now I have to go find the processes allowing for that and fix it. Hopefully I can minimize this work,” he added, though he voiced doubts that the human element would be removed from the CDM process.
George Chambers, deputy CIO at the Department of Health and Human Services, noted that his department had recently completed Phase I of CDM and is looking at the potential of AI in the next step of the program.
“I think there are opportunities for some AI, even with the infrastructure side of it. We’re using multiple discovery tools, you’re getting different data at any point in time, there’s probably an opportunity for AI to look at the anomalies,” he said. “There’s opportunities for AI in particular between the aggregation and the display. There’s that opportunity to make sense of it all,” he added.
Chambers added that even though AI may not be a part of the standard CDM implementation today, there are still opportunities to incorporate new technologies in the program.
“If you don’t do the prescribed DHS platform, they give you opportunities to go outside of that. You can have a conversation with DHS…as long as the end product can get to them in the format that they’re looking for, everyone will be ecstatic,” he said.
Chambers also discussed the potential for AI to pose its own security challenges.
“As you see AI, RPA [robotic process automation], deep learning applications, and these robotic scripts being implemented into your environment, there’s got to be a control and understanding of the configuration, and how that impacts the applications they’re playing in,” said Chambers. “I look at it almost like interfaces. They’re now growing within the environment, but there’s no central view of what configuration they have. As I’m updating and upgrading my applications, am I leaving vulnerabilities in these scripts? Am I leaving vulnerabilities within my network, and how am I keeping track of them?” he asked.
Other cybersecurity threats associated with emerging technologies surfaced in the discussion. Michel Cukier, director of the Advanced Cybersecurity Experience for Students (ACES) program at the University of Maryland, noted the need to approach internet of things (IoT) security in a realistic way.
“You cannot really stop it, but you can figure out the risks and how you can teach people to figure out the solutions they should not implement, basically trying to avoid the worst-case scenario,” he said. Cukier brought up a worst-case example of hacking into an internet-connected oven and burning down a building. “I can see the problem of ‘oh, this will be bad, but how can we mitigate risk,’” he said.
No matter where emerging technology may head, panelists agreed that agencies still need to address the weakest link in cybersecurity: the people.
“We should really focus on the human side,” said Cukier. “You could come up with extremely expensive technologies that will be useless if your weakest point becomes the human.”
“We do need to get past these human elements, the weakest links in this process. When you look at all the cybercrime that’s out there, the majority of it was all done based upon some human failing,” said Graber.