Working Group
Autonomous Action Runtime Management (AARM)
AARM is an open specification for a category of systems that secure AI-driven actions at runtime: systems that intercept, authorize, and audit autonomous actions before they execute.
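The intercept-authorize-audit loop described above can be sketched in a few lines. This is a minimal illustration, not code from the AARM specification; all names here (`ActionGuard`, `AuditRecord`, the `policy` callable) are hypothetical.

```python
# Hypothetical sketch of the runtime pattern: intercept an AI-driven action,
# authorize it against a policy, and record an audit entry before it executes.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class AuditRecord:
    action: str
    allowed: bool

@dataclass
class ActionGuard:
    # policy returns True when the named action is authorized (illustrative)
    policy: Callable[[str], bool]
    audit_log: list = field(default_factory=list)

    def run(self, action: str, execute: Callable[[], str]) -> Optional[str]:
        allowed = self.policy(action)                        # 1. authorize
        self.audit_log.append(AuditRecord(action, allowed))  # 2. audit
        return execute() if allowed else None                # 3. execute only if allowed

# Usage: deny any destructive action, allow the rest.
guard = ActionGuard(policy=lambda a: not a.startswith("delete"))
result = guard.run("send_email", lambda: "sent")        # allowed
blocked = guard.run("delete_database", lambda: "gone")  # denied before execution
```

The point of the pattern is that the execution callable never runs unless the policy check passes, and every attempt, allowed or denied, leaves an audit record.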
Working Group Leadership

Josh Buker
Research Analyst, CSA
Working Group Co-Chairs

Herman Errico
| Publications in Review | Open Until |
|---|---|
| Confidential Computing: Simplifying Trust in the Modern Enterprise | May 20, 2026 |
| CSA Zero Trust Program Management Guidance | May 31, 2026 |
Who can join?
Anyone can join a working group, whether you have years of experience or just want to participate as a fly on the wall.
What is the time commitment?
The time commitment for this group varies depending on the project. You can spend 15 minutes helping review a publication that's nearly finished or help author a publication from start to finish.
Virtual Meetings
Attend our next meeting. You can just listen in to decide if this group is a good fit for you, or you can choose to actively participate. During these calls we discuss current projects, as well as share ideas for new projects. This is a good way to meet the other members of the group. You can view all research meetings here.
Open Peer Reviews
Peer reviews allow security professionals from around the world to provide feedback on CSA research before it is published.
Premier AI Safety Ambassadors
Premier AI Safety Ambassadors play a leading role in promoting AI safety within their organization, advocating for responsible AI practices and pragmatic solutions to manage AI risks. Contact [email protected] to learn how your organization could participate and take a seat at the forefront of AI safety best practices.