Colleges around the country have frozen in fear over school shooting threats that eventually revealed themselves as “false alarms.” Villanova University was the first to report a false alarm related to a possible armed threat at the beginning of its fall semester in August.
Six other universities reported false alarms on Aug. 25 and two other campuses in Texas reported similar incidents the following day.
A few days later on Aug. 30, The New York Times (NYT) reported that an online group called “Purgatory” was behind some of these false alarms.
“The online group is suspected of being connected to several of the episodes, including reports of shootings, according to cybersecurity experts, law enforcement agencies and the group members’ own posts in a social media chat. The group’s claims could not be independently verified,” said NYT.
Purgatory is aiming to profit off these incidents. The group charges $95 for a school swatting, $120 for a mall, $140 for an airport and $150 for a hospital. The group treats causing mass panic as a joke and uses artificial intelligence (AI) to carry out these destructive actions.
Since AI is a recent invention, crimes of this nature are appearing more frequently. Due to a lack of regulation on AI, it is increasingly difficult to convict guilty parties or get justice for victims. New scams and crimes developed with the help of AI are also gaining traction, and many believe lawmakers are not keeping pace.
At this point, the question is not whether lawmakers are acting fast enough, but whether they can act faster than AI. The technology has proven just how quickly it can evolve, and how readily it can exploit loopholes in current laws and policies. Many remain worried about AI’s future, especially as its full capabilities remain unknown.
CMU’s Associated Student Government (ASG) President Leilani Domingo tackled these concerns during the most recent ASG cabinet meeting on Wednesday, Aug. 25. The leaders of each student organization from Student Life were in attendance.
“We just need to be ready [in such an event] and remember we represent our organizations,” said Domingo.
If a false alarm were to occur at CMU, she reminded student organization leaders to maintain a consistent standard and timeline for their responses. Campus Safety at CMU is also aware of the recent trend of false shooting threats and will aim for a quick response time as well.
Recently identified as a potential resource by KKCO 11, “Eagle Eye Networks” is camera-sharing software that would give police access to cameras on college campuses. The technology would provide a direct livestream for local law enforcement to monitor for active threats.
With this software in place at CMU, Grand Junction police would not have to spend valuable time and resources chasing false reports.
Propeller Insights, a Los Angeles-based market research firm, conducted a survey in 2024 to gauge public support for the implementation of Eagle Eye Networks.
“86% of parents feel safer with live video security connected to 911,” said Propeller Insights.
At this time, CMU does not have Eagle Eye Networks, and it is unclear whether there is any plan for its implementation in the future. However, it is just one possible method CMU could use to stay ahead of the false alarm incidents other universities have encountered nationwide.
In an era of rapidly developing AI, it is important for CMU students to stay informed on new false reporting methods and scams, as well as methods to determine what is real and what is not. For now, students should stay informed and look out for others, as empathy is what separates artificial from human intelligence.