Ninety-five percent of Federal IT executives say their agencies are investing in in-house AI skills development – but in the past year, half of agencies had an AI project fail due to lack of in-house expertise, according to research from MeriTalk and Future Tech.

Martin Stanley, the strategic technology branch chief at the Cybersecurity and Infrastructure Security Agency (CISA), said that in order to close that skills gap in the Federal workforce, agencies must bring in diverse AI talent and educate IT leadership on the importance of AI and other emerging technologies.

“One of the biggest challenges that we have is making sure that we set ourselves up for success,” Stanley said during MeriTalk’s “Identifying and Closing the AI Gap” webinar on April 18. “At CISA we just released, or opened up, our cyber talent management system, which gives us the opportunity to bring folks in through an alternate path and pay them the market rate.”

“We’ve got a lot of opportunities. On the training side, there’s all kinds of free training that’s really good out there for folks. So, I don’t think that that’s an issue,” Stanley said. “I think the pipeline may be a little bit of an issue.”

Dan Chaney, Future Tech’s vice president of enterprise AI and data science solutions, said that partnering with academic institutions is a “very strong, powerful, successful model” for developing long-term solutions for the AI skills gap in the Federal workforce.

“There’s a war for talent, because everybody wants to do AI,” Chaney said. “It’s not a trivial skill, so anyone who can do it, and has a proven portfolio, they’re in high demand. So, you have to retain the talent where you can by keeping it interesting.”

He continued, “But also you have to go to where the talent is. Sometimes that’s in the academic community … whatever works best for your specific needs. If you have long-term needs, I find that academic institutions do a little bit better. But again, it’s an ongoing conversation. As the problem evolves, so, too, will your skill set requirements.”

Chaney noted that a shorter-term solution to remedy Federal agencies’ gap in AI skills would be to bring in outside AI talent.

“Bringing in high-level AI talent – sharpshooters or architects – early on in the phase of development” is important, he said. “So that when you’re defining your problem, you’re finding your initial solution, you’re doing the initial model training – those are key points where maybe you need a little bit more expertise than you would need just for the running and the care and feeding of the solution once it’s up and running.”

“We find that bringing in high talent for a short-term burn to ensure that your solution is as robust, and scalable, and applicable as possible is a great way to make sure you get what you need, but you don’t take on the recurring cost of some very high end and very expensive data science talent,” Chaney said.

“It’s a conversation, it’s communication and collaboration,” he continued, adding, “Understanding what your project is intended to do, what your project needs are, and then just being open and honest about here’s where we have strengths, here’s where we don’t, here’s our long-term fix, let’s bring in some folks on the short term to address and provide bridging support.”

CISA’s Stanley said that the biggest challenge in all of this is the culture change. Educating leadership to help them understand that their agency needs diverse talent isn’t an easy task.

“The biggest challenge is educating the leadership that they need different folks, and they need to work differently, and they need to be more transparent about the information that they have,” Stanley said. “We’ve got to articulate those needs in a way that produces outputs that close the gaps that we have in our mission space – because that’s how we’re going to be able to best serve our stakeholders.”

Stanley cited legislation approved by Congress last year that requires reporting of cyber incidents to CISA, and said that’s the type of automation project he wants to take on.

“We have lots and lots of cyber incidents coming in. We’ve got to be able to respond and gain knowledge out of those and triage those as quickly as possible,” Stanley explained. “That’s hard to do if you’re doing it manually.”

He continued, adding, “This is precisely the kind of application that we want to look at, but we also want to make sure that we’re doing it in the right way that keeps all of our values sound.”

The CISA official offered a final word of advice when it comes to agencies working to close the AI skills gap. Humans, he said, will always need to be in the loop.

“Make it clear to leadership [that] human-machine teaming has to be created, and to identify the right roles for the humans in the process and the right roles for the technology in the process,” Stanley said. “That’s where we’re going to have the most success.”

“You get all kinds of concerns that people have had since the beginning of industrialization,” Stanley explained. “I’m going to lose my job to a machine – people worry about that.”

He continued, adding, “We have so many other critical needs to apply humans to, and particularly working with other humans. That’s a really good thing for humans to do – to free them up from a lot of the manual repetitive tasks.”

“It’s change, and we all know how hard it is to get folks to want to change. So, I think that’s probably the best part and the most important part to start with is, what is your plan for introducing this kind of change into your organization? And then how do you connect everybody to the outcome,” Stanley said. “I think there’s a lot of opportunity here for the existing humans to be part of that oversight process.”

“What we’re talking about is bringing the technology in and transforming the organization in such a way that you still use those skill sets that are critical for the mission, and you don’t lose that,” Stanley said.

Cate Burgan is a MeriTalk Senior Technology Reporter covering the intersection of government and technology.