Autonomous weapons systems (AWS) have been described as the “third revolution in warfare,” after gunpowder and nuclear weapons. Currently in development, these weapons systems are powered by advanced algorithms capable of deciding, on their own and without human intervention, to target and use lethal force against enemy soldiers. Countries around the world are racing to be the first to develop AWS and capture their advantages, while scholars and activists have sounded the alarm over the legal and ethical problems of delegating the decision to kill an enemy soldier to an algorithm. Described as the dehumanization of war, the unique nature of AWS highlights an unresolved question of international law: whether and how international humanitarian law and human rights law can operate concurrently in armed conflict. Specifically, AWS raise the question whether international humanitarian law, the specialized body of law governing the armed conflicts in which AWS would be deployed, would be the sole body of international law regulating AWS, or whether human rights law would also govern their use in armed conflict. This Note argues that: 1) human rights law applies to the use of AWS and prevails over international humanitarian law where the two bodies of law conflict, and 2) AWS’ use of lethal force violates human rights law’s prohibition on arbitrary deprivations of life.