
Microsoft recently signed testing and evaluation agreements with the governments of the UK and U.S., the latter of which also sealed deals with Google DeepMind and xAI. The company went on to claim that AI regulation would be a boon for development of frontier systems.
The U.S. software giant used a blog post to explain that government scrutiny drives it to improve its advanced AI systems.
“Well-constructed tests help us understand whether our systems are working as intended”, it stated, and keep the company alert to “risks, such as AI-driven cyberattacks and other criminal misuses”.
Microsoft announced it is working with the AI Security Institute (AISI) in the UK and the U.S. Center for AI Standards and Innovation (CAISI) to assess its cutting-edge models and prepare protections.
CAISI is a unit of the U.S. Department of Commerce which acts as industry’s primary point of contact with the government on matters including testing, research and the development of best practices.
It signed safety and evaluation deals with Anthropic and OpenAI in 2024.
Spat
The U.S. government, in particular, made headlines in recent weeks over a well-publicised spat with Anthropic, so the latest moves by each government appear timely.
CAISI’s arrangements grant it early access to Microsoft, Google DeepMind and xAI’s models, which director Chris Fall said is “essential” to understanding the potential dangers posed by their technologies. He said the deals come at a “critical moment”.
The companies committed to improving the U.S. government’s understanding of AI capabilities, potential national security risks and the state of global competition.
CAISI has conducted at least 40 evaluations to date.
In the 2025 edition of its Frontier AI Trends Report, AISI noted “longer, more sophisticated attacks” were required to jailbreak AI models with the most rigorous protections “for certain malicious request categories”, though cautioned the “efficacy of safeguards varies between models” and no system tested was free of vulnerabilities.
Source: Mobile World Live