AI-powered weapons stoke new fears

Remember the Transformers movies? Experts from around the world are looking for ways (aside from hiring Shia LaBeouf) to keep that scenario from becoming a reality. 

What happened: Civilian, military, and technology leaders from more than 140 countries met in Vienna yesterday to discuss how to check the growing risks associated with autonomous and AI-powered weapons. 

  • One of the conference's hosts, Austrian Foreign Minister Alexander Schallenberg, said that confronting the challenge of this technology is “the Oppenheimer Moment of our generation.”

Catch-up: The UN has hosted discussions about controlling the use of AI in weapons systems for years, but because of the broad scope of the technology, regulation has proved difficult to agree on. 

  • Experts say that a traditional approach to arms control is unlikely to work because it’s not just one specific weapon that needs to be regulated.

  • The use cases for AI-powered hardware are also broader than traditional weapons — unlike, say, a cruise missile, an AI-powered drone could be used for fighting wildfires or fighting wars.

Why it matters: Researchers have warned that even the best autonomous weapons have the potential to make serious (and fatal) mistakes, and that passing military decisions to AI systems could have catastrophic consequences.

  • Ukraine is already using autonomous drones that experts say are powered by AI, and Israel has recently been accused of using AI to select bombing targets in Gaza.

  • Meanwhile, the Pentagon — which has already invested millions into AI startups — has poured US$1 billion so far into a program dedicated to building a fleet of small, weaponized autonomous vehicles.

Bottom line: Defence contractors — like firms in nearly every industry these days — are heavily incentivized by investors to crank up the AI dial. As the development of these weapons kicks into high gear and stock valuations continue to skyrocket, it’s hard to see this trend slowing down without intervention.—LA