Some Utah schools are about to implement AI weapon-detection systems. While this may sound reassuring, the reality is far less impressive.
What is AI Weapon Detection?
AI weapon detection is supposed to spot weapons early and alert police, limiting the damage an active shooter can do in a school. No one wants an active shooter inside a school, but these systems provide a false sense of security.
ZeroEyes is the company the state has contracted with to implement its AI weapon-detection system. Here is how it is supposed to work: the company's AI monitors a school's existing cameras for weapons. When the AI detects one, the camera feed is sent to a human reviewer who verifies whether it actually shows a weapon. The reviewer then either calls the school and law enforcement or dismisses the feed as a false alarm.
Such companies claim that their systems detect weapons with 99 percent accuracy, and that accuracy reaches 100 percent after a human reviews 15 seconds of the live stream.
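The workflow described above amounts to a human-in-the-loop pipeline: the model flags a possible weapon, a person confirms or dismisses it, and only confirmed detections trigger an alert. The sketch below illustrates that routing logic only; the threshold, names, and return values are assumptions for illustration, not ZeroEyes' actual design, which is not public.

```python
# Illustrative sketch of a human-in-the-loop weapon-detection pipeline.
# All names and values here are hypothetical, not ZeroEyes' implementation.

from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str
    label: str         # what the model thinks it saw
    confidence: float  # model confidence, 0.0 to 1.0

ALERT_THRESHOLD = 0.9  # assumed escalation threshold; real value unknown

def review(detection: Detection, human_confirms: bool) -> str:
    """Route a model detection through human verification."""
    if detection.confidence < ALERT_THRESHOLD:
        return "ignored"      # model not confident enough to escalate
    if human_confirms:
        return "alert_sent"   # notify the school and law enforcement
    return "dismissed"        # human reviewer rules it a false alarm

# Usage: only a high-confidence detection confirmed by a reviewer escalates.
print(review(Detection("cam-3", "handgun", 0.97), human_confirms=True))    # alert_sent
print(review(Detection("cam-7", "lunchbox", 0.95), human_confirms=False))  # dismissed
```

Note that every failure mode described later in this article maps onto this sketch: a missed gun never reaches `review` at all, and a lunchbox flagged as a bomb depends entirely on the human reviewer to dismiss it.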
School Security Theater
AI weapon detection isn’t reliable.
These companies' claims tend to be inflated, and there has yet to be significant independent research into what the systems are actually capable of.
In practice, the systems fail in both directions: they flag harmless objects as weapons while sometimes missing real guns. In New York, a police officer carried his service revolver through an AI weapon-detection system twice without being detected. The school was told to turn up the system's sensitivity.
The AI then detected a bomb, which turned out to be a seven-year-old’s lunchbox.
Later that month, the system did not detect a knife that was used to attack another student.
These systems provide a false sense of security to the schools.
“The Education Industrial Complex”
The Utah State Board of Education delayed implementation of the AI weapon detection, and for good reason. Originally, the state authorized $3 million to fund test cases on the effectiveness of the software. But the board and the Utah Legislature urged delay because they believed the company was "leveraging" its contract to get more than the $3 million allowed by law.
The Unseen Costs
AI weapon detection also violates students' right to privacy. Parents, lawmakers, and school officials should think twice before implementing these or similar programs in their schools.
Right now, parents and students are trading their privacy for the illusion of security. We all lose if those privacy rights are gone.