stackArmor, a provider of authority to operate (ATO) acceleration services for government organizations, today released its ATO for AI accelerator.

ATO for AI is a governance model that the company said will help public sector and government organizations “rapidly implement security and governance controls to manage risks associated with Generative AI and General AI Systems” as defined by the National Institute of Standards and Technology (NIST).

“ATO for AI extends NIST AI RMF risk categories to NIST SP 800-53 security controls to accelerate the implementation of policies, procedures, plans and security controls necessary to accelerate safe AI systems adoption by public sector organizations and regulated industries,” the company explained.

“The security and compliance experts at stackArmor have developed a unique suite of AI overlays for NIST 800-53 controls that are directly [mapped] to NIST AI RMF risk categories to allow agencies to authorize AI systems rapidly,” it said.

ATO for AI, stackArmor said, “builds on the decades of experience in managing digital and information systems risk using open NIST standards like NIST RMF, NIST SP 800-53 and NIST SP 800-171 and integrates them with emerging frameworks like NIST AI RMF specifically tailored to manage AI risk.”
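The overlay concept described above amounts to a lookup from NIST AI RMF risk functions to the NIST SP 800-53 Rev 5 controls that address them. The sketch below illustrates that idea in Python; the specific function-to-control pairings are hypothetical examples for illustration only, not stackArmor's actual overlay content.

```python
# Illustrative sketch: mapping NIST AI RMF core functions to
# NIST SP 800-53 Rev 5 control identifiers. The pairings are
# hypothetical examples, not a published overlay.

AI_RMF_OVERLAY = {
    "GOVERN": ["PM-1", "PM-9", "RA-1"],  # program management, risk policy
    "MAP": ["RA-3", "PM-30"],            # risk assessment, supply chain context
    "MEASURE": ["CA-2", "CA-7"],         # control assessment, continuous monitoring
    "MANAGE": ["RA-7", "IR-4"],          # risk response, incident handling
}

def controls_for(function: str) -> list[str]:
    """Return the SP 800-53 controls mapped to an AI RMF function."""
    try:
        return AI_RMF_OVERLAY[function.upper()]
    except KeyError:
        raise ValueError(f"Unknown AI RMF function: {function}")

print(controls_for("measure"))  # ['CA-2', 'CA-7']
```

In practice an overlay would also carry tailoring guidance per control, but the core value proposition is this kind of direct traceability from AI risk category to implementable security control.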

The company explained that ATO for AI tracks with policy steps aimed at responsible AI development being taken by the Federal government and the state of California.

“The government agencies recognize the numerous benefits available from using AI but also understand it’s not without its own unique set of challenges and risks,” commented Gaurav “GP” Pal, stackArmor’s founder and chief executive officer.

“The recent executive order from [California] Governor Newsom and the upcoming federal executive order on AI demonstrate the urgency of driving safe and secure AI-adoption. ATO for AI™ uniquely accelerates the adoption of secure and safe AI by agencies,” the CEO said.

The company said that ATO for AI “addresses obstacles to large-scale AI adoption by public sector organizations including the lack of an actionable framework to assess and manage risk, an evolving understanding of risk controls and a shortage of skilled resources, tools and methods to implement a safe solution.”

“Extending cyber risk management programs like FedRAMP, FISMA/RMF, DOD CC SRG, and StateRAMP, ATO for AI™ provides a ready to use blueprint consisting of risk categories, security controls and a governance model that includes independent assessments, continuous monitoring and application of technical controls,” the company said.

AI Risk Management Center of Excellence

At the same time, stackArmor pointed to its recently established AI Risk Management Center of Excellence (CoE) that includes several Federal government technology veterans including: former Federal Deputy CIO Maria Roat, former Federal Acquisition Service Commissioner Alan Thomas, former Department of Homeland Security CIO Richard Spires, and former Amazon Web Services executive Teresa Carlson.

“The adoption of risk-based methods for managing and governing AI systems that leverage security controls defined in NIST SP 800-53 Rev 5, as well as established governance programs like FedRAMP can help agencies adopt AI more rapidly,” commented Roat. “Reducing the time and cost burden on agencies and supporting contractors by enhancing existing protocols is critical to ensuring timely deployment of AI systems for supporting the government mission.”

“The unique combination of AI-enabled applications on cloud-computing powered services offers a once-in-a-generation opportunity to truly enable a digital-first government,” said Carlson. “Transforming legacy applications at scale by using accelerators that deliver safe and secure AI-native applications developed by innovative ISVs on FedRAMP accredited cloud service providers can help us dramatically shorten the time and cost of AI adoption.”

John Curran is MeriTalk's Managing Editor covering the intersection of government and technology.