Axon, the company that developed the Taser, announced Sunday that it would halt plans to develop a stun-gun drone intended to prevent mass shootings, after nine members of its ethics board resigned.
After the mass shootings in Buffalo and Uvalde, Texas, last month, Rick Smith, the founder and chief executive of Axon, announced a proposal for a non-lethal Taser drone that schools and other locations could use to prevent mass shootings. The drones, Mr. Smith said, could “play the same role as sprinklers and other firefighting equipment for firefighters: preventing a catastrophic event, or at least mitigating its worst effects.”
The announcement, on Thursday, came weeks after a two-thirds majority of Axon’s ethics board voted to recommend that the company not proceed with a pilot study exploring the concept of Taser-equipped drones.
The ethics board quickly issued a public statement on Thursday, saying it had not had time to review the proposal and that Axon’s decision was “deeply regrettable.”
Three days later, on Sunday, nine of the 13 members of the ethics board informed Mr. Smith that they would resign. Mr. Smith said in a statement Sunday that Axon would pause its plans for the drone project. It was unclear whether the decision to halt the project was made before or after the board members told Mr. Smith they intended to step down.
“It is unfortunate that some members of Axon’s ethics advisory panel chose to withdraw from being directly involved with these issues before we had heard their technical questions or had a chance to answer them,” said Mr. Smith. “We respect their choice and will continue to seek different perspectives to challenge our thinking and help guide other technology options we should consider.”
The nine board members who resigned said in a statement Monday that “none of us expected the announcement.”
“We all feel the desperate need to do something to address our epidemic of mass shootings,” they said. “But Axon’s proposal to elevate a tech-and-policing response when there are far less harmful alternatives is not the solution. Before Axon’s announcement, we pleaded with the company to pull back. But the company rushed forward in a way that struck many of us as trading on the tragedy of the Uvalde and Buffalo shootings.”
In Axon’s announcement of the concept, Mr. Smith said, “I know it sounds a little ridiculous to some.” He offered three caveats: that the drones should not have the ability to kill; that people, not the drones, should decide when a drone acts; and that the drones would need “rigorous oversight.”
“For example, if a gunman comes into a church and a drone is deployed and puts the gunman down, we can’t just cheer that success,” said Mr. Smith. “We need to examine the video carefully and thoroughly.”
The board members who stepped down said in their statement that the ethics board had warned the company for years against using products that can monitor people in real time.
“This kind of surveillance will undoubtedly harm communities of color and others who are over-policed, and likely well beyond that,” they said. “The Taser-equipped drone also has no realistic chance of solving the mass shooting problem that Axon is now prescribing it for, only distracting society from real solutions to a tragic problem.”
One of the board members who resigned, Barry Friedman, the director of the Policing Project at the New York University School of Law, said in an interview that he was pleased Axon was pausing its plans for the drone project, and that he hoped the company would drop it entirely.
“I think it’s very important that we find a way to curb the adoption of technologies, which often happens with very little concern for harm to privacy, harm to racial justice, or for how much data about all of us the government retains, and what is accessible to the government,” he said.
One of the four board members who decided not to step down, Giles Herdale, said he hoped that by staying on the board he “could try to mitigate the damage caused by developments like these.”
“What we need to do is try to provide perspectives to make them think,” said Mr. Herdale, an associate fellow at the Royal United Services Institute, a London think tank that specializes in security issues.
“Because the idea of arming drones, or any other autonomous robot, is such a far-reaching decision,” he said, “we would like to have really careful, deliberate consideration and a lot of guardrails around deploying those kinds of technologies.”