Sen. John Hickenlooper, D-Colo., announced Monday that he plans to introduce legislation directing the National Institute of Standards and Technology (NIST) to develop detailed guidelines for third-party evaluators to work with AI companies and provide independent external verification of their systems.

Sen. Hickenlooper – Chair of the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security – said he will formally introduce the Validation and Evaluation for Trustworthy (VET) AI Act when the Senate returns from recess on July 8.

“AI is moving faster than any of us thought it would two years ago,” Sen. Hickenlooper said in a July 1 statement. “But we have to move just as fast to get sensible guardrails in place to develop AI responsibly before it’s too late. Otherwise, AI could bring more harm than good to our lives.”

Sen. Hickenlooper’s office said his forthcoming bill would create a pathway for independent evaluators, similar to those in the financial industry and other sectors, to work with companies as neutral third parties to verify that their development, testing, and use of AI comply with established guardrails.

Specifically, the VET AI Act would direct NIST, in coordination with the Department of Energy and the National Science Foundation, to develop voluntary specifications and guidelines for developers and deployers of AI systems to conduct internal assurance and to work with third parties on external assurance regarding the verification and red teaming of AI systems.

The senator said such NIST specifications would have to account for data privacy protections, mitigations against potential harms to individuals from an AI system, dataset quality, and the governance and communications processes of a developer or deployer throughout an AI system’s development lifecycle.

The bill would also establish a collaborative advisory committee to review and recommend criteria for individuals or organizations seeking to obtain certification of their ability to conduct internal or external assurance for AI systems.

Finally, the forthcoming bill would require NIST to conduct a study examining the AI assurance ecosystem, including current capabilities and methodologies, the facilities and resources needed, and overall market demand for internal and external AI assurance.

Cate Burgan is a MeriTalk Senior Technology Reporter covering the intersection of government and technology.