Schools Using AI to Send Police to Students’ Homes



“It was one of the worst experiences of her life.”

Worst Experience

Schools are employing dubious AI-powered software that accuses teenagers of wanting to harm themselves and then sends the cops to their homes — often with chaotic and traumatic results.

As the New York Times reports, software installed on high school students’ school-issued devices tracks every word they type. An algorithm then analyzes that language for signs that a teenager may want to harm themselves.

Unsurprisingly, the software can get it wrong by woefully misinterpreting what the students are actually trying to say. A 17-year-old in Neosho, Missouri, for instance, was woken up by the police in the middle of the night.

As it turns out, a poem she had written years earlier had triggered the alarms of software called GoGuardian Beacon, which its maker describes as a way to “safeguard students from physical harm.”

“It was one of the worst experiences of her life,” the teen’s mother told the NYT.

Wellness Check

Internet safety software sold by educational tech companies took off during the COVID-19 shutdowns, leading to widespread surveillance of students in their own homes.

Many of these systems are designed to flag keywords or phrases to figure out if a teen is planning to hurt themselves.

But as the NYT reports, we have no idea whether they’re at all effective or accurate, since the companies have yet to release any data.

False alarms aside, schools have reported that the systems have, at least some of the time, allowed them to intervene before students were at imminent risk.

However, the software remains highly invasive and could represent a massive intrusion on students’ privacy. Civil rights groups have criticized the tech, arguing that in most cases, law enforcement shouldn’t be involved, according to the NYT.

In short, is this really the best weapon against youth suicide, which has emerged as the second leading cause of death among Americans aged five to 24?

“There are a lot of false alerts,” Ryan West, chief of the police department that covers the 17-year-old’s school, told the NYT. “But if we can save one kid, it’s worth a lot of false alerts.”

Others, however, tend to disagree with that assessment.

“Given the total lack of information on outcomes, it’s not really possible for me to evaluate the system’s usage,” Baltimore city councilman Ryan Dorsey, who has criticized these systems in the past, told the newspaper. “I think it’s terribly misguided to send police — especially knowing what I know and believe of school police in general — to children’s homes.”

More on AI: Suspected Assassin of Insurance CEO Studied Artificial Intelligence, Spoke of “Singularity”


