The US government is rolling out some new house rules for AI developers.
Driving the news: US President Joe Biden signed a first-of-its-kind executive order to regulate development in the AI industry. It includes guardrails to protect consumers and is the most significant regulation that the world's leading AI developers have faced.
- The new rules — enforced by the Justice Department — will force developers to share the results of safety tests with the government before releasing new products.
- They will apply only to new AI models that could be considered a threat to national security, not to tools that are already on the market.
Catch-up: Until now, the US has taken a hands-off approach to regulating AI, prioritizing economic growth instead. Given that the US is home to the industry's leading developers, this executive order — and the legislation likely to follow — could stifle that growth.
- The White House is buying time so that regulators — not only in the US but internationally — can catch up before rolling out more robust legislation.
Why it matters: Though it's a stopgap measure until AI legislation passes through Congress, the order goes well beyond any of the voluntary commitments signed so far in holding AI developers accountable for the safety of their products.—LA