Nine of the 12 members of an ethics board appointed by Axon to advise its technology decisions have resigned, writing that the company has failed to embrace the values they tried to instill. "We have lost faith in Axon," they said.

In recent years, Axon has grown into a powerhouse of law enforcement hardware and software: Taser electroshock weapons, body cameras, and digital platforms for evidence management. Setting aside for now the inherent risks of privatizing policing technology, Axon has been rather thoughtful with its tech, soliciting advice both from the communities these tools will be used in and from the officers who will wear or wield them.

A few years ago it became clear that machine learning was an extremely valuable tool, but also one that could easily be poorly built and abused. The ethics board, made up of experts, academics, and industry professionals, was to provide a cautious outside perspective on the tech, suggesting safeguards, accountability measures, and so on.

By the resigning members' account, the arrangement got off to a good start:

"Each of us joined this Board in the belief that we could influence the direction of the company in ways that would help to mitigate the harms that policing technology can sow and better capture any benefits. For a time, we saw that influence play out in some of Axon's decisions. From not equipping any of its products with facial recognition capabilities, to withdrawing a new software tool to collect data from social media websites, to promoting desperately needed legislation to bring the use of license plate readers under control, we observed tangible evidence of the difference we were making."

Rick Smith, Axon's CEO, had a refreshingly honest take on whether tech is the answer to an ongoing policing crisis.

Tech isn't a cure-all, he said, and it won't solve policing's problems on its own; but some of those problems, he argued, will be insoluble without technology. Body cameras and other digital tracking of police encounters are not always a good idea, yet the companies that make these tools are the ones who will define them, and that, he said, is what Axon is trying to do.


But the company may have gone too far on the question of how much tech should be brought to bear to deter mass shootings.

According to the board, Axon wants to develop Taser-equipped drones that could be deployed against mass shootings and school shootings, and to encircle those sites with real-time streaming capabilities.

The use of this kind of equipment raises so many unanswered questions that the board agreed only to a strictly controlled pilot. Instead, said one former board member, the advocacy director for NYU's Policing Project, the board was given very short notice that the tool would be announced as a much broader concept. "We had to ask ourselves what we were doing here if it was that easy to move the Board to the side."

Members warned there would be resignations if Axon pursued the plan. It did, and they quit.

After the backlash from the community, Smith wrote a post acknowledging that the company may have gotten ahead of itself.

"In light of feedback, we are pausing work on this project and refocusing to further engage with key constituencies to explore the best path forward," he wrote. "It is an idea, not a product, and it is a long way away. There is a lot of work to be done to see if this technology is viable and if public concerns can be adequately addressed before moving forward."

Whatever Axon's process for collecting alternative opinions, it seems to have steamrolled over the system it already had in place, and it is hard to see what enhancements would keep it from doing the same to any group, however well staffed, that sits in a purely advisory role. Axon didn't reply when I asked about the future of the board.

Smith claimed that the resigning members of the ethics board had "decided to withdraw from directly engaging with these issues before we heard or had a chance to address their technical questions."

But the resignation letter suggests Smith's account has it backwards: for over a year, the ethics board engaged with Axon to discuss the parameters of a narrow pilot program. The company, the members wrote, broke its promise to consult the board before making such momentous decisions, and it is not sufficiently committed to developing this technology in a responsible way. In another message, one member said they "pleaded" with Axon not to make the idea public.

Axon's position appears to be that the board weighed in only on a stun gun-equipped drone for police, not the version meant for use in schools. But that is a thin basis for saying the board didn't weigh in at all, and the resignation letter notes that Axon gave members little time to respond in any case.

Whatever the case, the stun gun drone plan is on ice, and Axon may think twice before jumping into such a debate again. Tech has an important role to play in safety and law enforcement, but moving fast here does no one any good.