Inside AI Policy

AI Daily News

Former NSC counsel says nuclear arms control is not a model for mitigating AI risks

By Rick Weber  / November 20, 2023

A former National Security Council lawyer is pushing back against calls by foreign policy analysts to apply the Cold War-era process for negotiating nuclear arms control to international efforts to mitigate emerging risks from artificial intelligence, citing key differences including the inability to quantify AI systems and the lack of government monopolies or control over the increasingly pervasive technology.

“I was maybe a little surprised to encounter a lot of analogies to nuclear arms control, to the Non-Proliferation Treaty, for example. And I think that the idea that is motivating that comparison is the idea that artificial intelligence is really, really significant and impactful, and it's going to change the way in which major states relate to each other in the military space,” Ashley Deeks, former deputy legal adviser to President Biden’s National Security Council, said Nov. 17 at the American Bar Association’s national security law conference.

Deeks told the several hundred lawyers at the ABA event, hosted by the Standing Committee on Law and National Security, that the stakes are “very high” for controlling AI risks, similar to the nuclear threat, where governments “were willing to negotiate controls, regulations over those weapons.”

Ashley Deeks, Professor, University of Virginia Law School

“So, I think the argument goes, if states could do that in the nuclear space, well, then surely states should be able to do that in the AI space,” she added.

But Deeks said there are key differences between AI and nuclear weapons that make the analogy less useful than lessons drawn from the bilateral and other agreements and systems put in place for cybersecurity.

“So, in thinking about how those comparisons didn't really work very well, at least in my mind, I did notice that there were a fair number of similarities with the cyber sphere, and what's happened in the international space with cyber,” she said.

In both cases, Deeks said, you have a “broad range of users” with governments looking to use AI as a “potential offensive tool,” similar to nations developing cyber offensive capabilities.

Also, there are “a large mix of public and private actors involved,” similar to cyber activities, she said.

“That also means that the number of states in the room for a conversation about a cyber treaty or an AI treaty have to be bigger than the conversation about nuclear weapons were,” she said, noting “sometimes it was just the U.S. and the USSR or Russia.”

Yet, the “misalignment” of goals and regulations for using AI by the U.S. and other Western powers versus authoritarian regimes such as China and Russia may complicate reaching any global agreements on mitigating AI risks, Deeks said.

“I think Russia and China may be unwilling to give up the battlefield advantage of some types of national security AI,” she said.

“Russia and China might be unwilling to take some targets off the table, if that was something you were thinking about in the treaty space,” Deeks said, noting “the way in which something like deep fakes would affect Russian or Chinese society versus how it would affect U.S. society are also pretty different because we have different levels of control for our population and internet.”

“And I think there are also verification challenges,” she added. “So, all that is to say I think cyber is the better paradigm” for addressing AI risks than nuclear disarmament, Deeks asserted.

Deeks’ comments come as foreign policy experts including former Secretary of State Henry Kissinger have been touting decades-old agreements and protocols for avoiding a nuclear war as the basis for discussions on managing emerging AI risks.

As global leaders confront these AI risks, they should draw on the “lessons learned in the nuclear era [that] can inform their decisions,” wrote Kissinger and Harvard government professor Graham Allison in an Oct. 13 article in Foreign Affairs.

“The challenges presented by AI today are not simply a second chapter of the nuclear age. History is not a cookbook with recipes that can be followed to produce a soufflé. The differences between AI and nuclear weapons are at least as significant as the similarities. Properly understood and adapted, however, lessons learned in shaping an international order that has produced nearly eight decades without great-power war offer the best guidance available for leaders confronting AI today,” wrote Kissinger and Allison in Foreign Affairs.

Deeks spoke at the ABA event as part of a panel on the national security implications of AI that included retired Marine Corps Lt. Gen. Michael Groen, who was director of the Joint Artificial Intelligence Center, and Jocelyn Aqua, a former privacy official at the Justice Department and principal for data risk, privacy and AI governance at PricewaterhouseCoopers. Deeks is a professor at the University of Virginia School of Law.