Thankfully, the biggest story to come out of Monday’s lengthy shooter standoff in East Dallas was that of Sgt. Robert Watson, the Dallas police officer who pulled wounded Dallas Fire and Rescue paramedic William An into the back of his patrol car and rushed him to Baylor, likely saving his life. Otherwise, the shooting itself was merely another episode of senseless violence erupting from unknown, though undoubtedly troubled, circumstances. Derick Lamont Brown shot and killed his godfather before wounding An and a neighbor, finally taking his own life in his East Dallas home. Brown was 36. He had a criminal record, had previously been involved in African-American activist causes, and was under investigation by the FBI. He was found dead after DPD SWAT deployed a robot to survey the scene inside the house.
Throughout the day, breaking news about a shooter armed with an automatic weapon could not help but conjure memories of the terrible events of last summer. But it was the end of Monday’s standoff – and a conspicuous player in its conclusion – that bore the closest resemblance to last July’s tragic police shooting. Just as last year’s nightmare ended when Dallas police used a robot to deliver a package of explosives to shooter Micah Johnson, who was holed up in the El Centro Community College parking garage, so on Monday a robot entered the East Dallas home to find that this latest shooter had already taken his own life.
Robots are becoming an increasingly common tool for local police departments, and not just in Dallas, which served as the cinematic backdrop for the prescient sci-fi satire Robocop, now celebrating the 30th anniversary of its theatrical release. Around the world, police forces are using robots to survey crime scenes, enforce crowd control, and even walk street patrols. This month, Dubai is expected to roll out its first robot patrolman, a wheeled officer with a touchscreen chest that allows people to report crimes and pay parking tickets. This Robocop even salutes.
After Dallas police used a robot to kill Micah Johnson last July, there was much discussion about the ethical implications of using robots to do police work and, particularly, to execute suspects. The advantage robots offer police in these kinds of situations is obvious: they reduce the need for officers to put their own bodies in harm’s way. But Rasha Abdul Rahim, an arms control adviser with Amnesty International who advocates for an international ban on killer robots, argued that this advantage comes at a cost. What happens when future advances in robotic policing put non-human law enforcers in situations that are not as black and white as the letter of the law and the official call of duty?
“For example,” Rahim argues, “during mass protests in Egypt in January 2011 the army refused to fire on protesters, an action that required innate human compassion and respect for the rule of law.” How would robotic crowd-control devices programmed merely to enforce the law have responded in a similar riot, protest, or revolutionary circumstance?
This may sound like a concern tied to technologies that don’t yet exist. After all, Dallas’ police robots are basically expensive remote-control cars still operated by people. But more advanced – and potentially problematic – police robotics already exist. This July 2016 article in Wired surveys some of the robots being used today around the world. In India, riot-control drones can shower crowds with pepper spray and paintballs. Prisons in South Korea are using robotic guards equipped with 3D cameras and programmed to monitor inmates and detect fights and escapes. Israeli police have the so-called “Deadly Rover,” which looks like a remote-control car with a 9 mm Glock attached to the top. The “Drone-Catching Drone” in Japan uses nets to catch rogue drones. Traffic Robocops are already on the streets in the Congo; the humanoids function much like our red-light cameras, but they also wave their arms to direct traffic. And Poland’s lightweight “Tactical Bot” weighs around four pounds and can be thrown by hand into buildings to conduct surveillance or deliver stun grenades.
Reading about these real-life Robocops can be unnerving, especially when considered in light of the rapid advancements in artificial intelligence, the rapidly evolving means and methods of violence, and Barrett Brown’s recent dealings with the Bureau of Prisons. If the broken, autocratic institutions that already exist in our society are equipped with deadly devices that possess their own capacity for decision making, learning, and enforcing codes and regulations, dystopian visions on the order of Robocop no longer feel outlandish. Questions like those raised in the wake of last year’s shootings become not merely pertinent, but pressing and necessary.
But fears of new technology shouldn’t cloud our vision of the potential benefits of using robotic technology in police work. Monday’s events demonstrated a clear example in which new machines can keep our police officers safe. Then there is Emily, a robot being used by the Greek coast guard to help with the Mediterranean refugee crisis. Developed by researchers at Texas A&M University, Emily (which stands for Emergency Integrated Lifesaving Lanyard) is a nautical drone equipped with a life vest and tethered to a rescue boat. She can speed across the water at up to 20 miles per hour to deliver her flotation device to drowning refugees.
Emily is a reminder that science and technology, as such, are morally neutral fields. Whether particular technologies are developed, and whether they are used to harm or benefit society – these are the results of human decisions and the human systems that produce and deploy new technology. Whatever perceived threat we may see in the expansion of robots into law enforcement, the real threat is, ultimately, a human one. As Rahim reminds us, it is up to those who possess human compassion and social awareness to get out in front of emerging technologies and ensure that the robots we make serve, and do not destroy, the social good.