Regulating New Weapons Technology
in The Impact of Emerging Technologies on the Law of Armed Conflict (Eric Talbot Jensen & Ronald T.P. Alcala, eds.) (forthcoming 2019)
When confronted with a new weapons technology, international law scholars, military lawyers, and civil society activists regularly ask two questions: Are new regulations needed? And are they needed now? This paper reviews the main categories of technology-fostered legal disruption; tackles the question of whether a given technology will require new law; and weighs the respective benefits of precautionary bans, a wait-and-see approach, and proactive regulation.
Jurisprudential Space Junk: Treaties and New Technology
in Resolving Conflicts in the Law: Essays in Honour of Lea Brilmayer 106 (Chiara Giorgetti & Natalie Klein, eds.) (2019)
New technologies have fundamentally altered the ways in which international law develops, evolves, and sometimes inappropriately persists. This chapter discusses the problem of “jurisprudential space junk”: treaty laws that remain formally in force but in practice merely clutter the relevant legal regime. After reviewing problems associated with multilateral treaties regulating new technologies, this chapter suggests that other, more flexible forms of international lawmaking—namely, soft law and customary international law—will sometimes be far better suited to international technological governance.
The Malicious Use of Artificial Intelligence
Report (2018) (with Miles Brundage, Shahar Avin, Jack Clark, Helen Toner, Peter Eckersley, Ben Garfinkel, Allan Dafoe, Paul Scharre, Thomas Zeitzoff, Bobby Filar, Hyrum Anderson, Heather Roff, Gregory C. Allen, Jacob Steinhardt, Carrick Flynn, Seán Ó hÉigeartaigh, Simon Beard, Haydn Belfield, Sebastian Farquhar, Clare Lyle, Owain Evans, Michael Page, Joanna Bryson, Roman Yampolskiy, and Dario Amodei)
Artificial intelligence and machine learning capabilities are growing at an unprecedented rate. These technologies have many widely beneficial applications, ranging from machine translation to medical image analysis. Countless more such applications are being developed and can be expected over the long term. Less attention has historically been paid to the ways in which artificial intelligence can be used maliciously. This report surveys the landscape of potential security threats from malicious uses of artificial intelligence technologies, and proposes ways to better forecast, prevent, and mitigate these threats. We analyze, but do not conclusively resolve, the question of what the long-term equilibrium between attackers and defenders will be. We focus instead on what sorts of attacks we are likely to see soon if adequate defenses are not developed.
A Meaningful Floor for "Meaningful Human Control"
30 Temple Int’l & Comp. L.J. 53 (2016) (invited workshop contribution)
The broad support for “meaningful human control” of autonomous weapon systems comes at a familiar legislative cost: there is no consensus as to what the principle requires. This paper describes attempts to define the concept; discusses the benefits of retaining imprecision in a standard intended to regulate new technology through international consensus; and argues for an interpretative floor grounded in existing humanitarian protections.
War, Responsibility, and Killer Robots
40 N.C. J. Int’l L. 909 (2015) (invited symposium contribution)
Although many are concerned that autonomous weapon systems may make war “too easy,” no one has discussed how their use may affect the constitutional war power. When conventional weaponry required boots on the ground, popular outrage at the loss of American lives incentivized Congress to check presidential warmongering. But as human troops are augmented and supplanted by robotic ones, it will be politically easier to justify using force, especially for short-term military engagements. Like drones and cyber operations, autonomous weapon systems will contribute to the growing concentration of the war power in the hands of the Executive, with implications for the doctrine of humanitarian intervention.
The Varied Law of Autonomous Weapon Systems
in NATO Allied Command Transformation, Autonomous Systems: Issues for Defence Policy Makers (Andrew P. Williams & Paul D. Scharre, eds., 2015)
What law governs autonomous weapon systems? Those who have addressed this subject tend to focus on the law of armed conflict. But the international laws applicable to the development or use of autonomous weapon systems are hardly limited to rules regarding the conduct of hostilities. Other legal regimes—including international human rights law, the law of the sea, space law, and the law of state responsibility—may also be relevant to how states may lawfully create or employ autonomous weapon systems, resulting in a complex and evolving web of international governance.
Judicious Influence: Non-Self-Executing Treaties and the Charming Betsy Canon
Note, 120 Yale L.J. 1784 (2011) (cited in Brownlie's Principles of Public International Law 79-80 (James Crawford, ed., 8th ed. 2012))
Non-self-executing treaties are commonly, and inappropriately, dismissed as irrelevant in domestic law. This Note examines how judges employ the Charming Betsy canon to interpret ambiguous statutes in accord with U.S. international obligations, including those expressed in non-self-executing treaties, and concludes that this practice is justified and beneficial.