The article mentions counselors, but how many schools are actually using law enforcement to police this?
Soon after I got out of school, not long after cell phones were allowed, discipline went from principals handling it (plus security guards at night) to having officers on staff at all times. I watched it become a police state of its own, so this is no surprise to me.
I don't think the technology is inherently bad, though; it just needs a different approach. Let the AI engage the child and talk with them privately. Obviously not Google's, because it might tell them to kill themselves, but an LLM is more than capable of de-escalating a conflict by engaging the students involved in private, while letting them know help is available should they want it, in which case the conversation can be forwarded to a counselor.
ChatGPT has handled most of the psychology questions I've thrown at it, and I believe it would be great at mediating three-way conversations and de-escalating situations like that if trained to do so. Maybe some software built on the API. Privacy is important, as is not incarcerating kids for bad decisions and forever hindering their education.
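For what it's worth, here is a minimal sketch of what that mediation flow could look like against the OpenAI API. The model name, prompts, and the counselor-escalation stub are all my own assumptions for illustration, not anything from the article:

```python
# Hypothetical sketch of an LLM mediating between two students.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a calm mediator helping two students work through a conflict. "
    "De-escalate, make sure each student feels heard, and never repeat what "
    "one student says privately without their consent. If a student asks for "
    "help, offer to forward the conversation to a school counselor."
)

def mediate(transcript: list[dict]) -> str:
    """Return the mediator's next message given the conversation so far."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + transcript,
    )
    return response.choices[0].message.content

def escalate_to_counselor(transcript: list[dict]) -> None:
    """Only called when a student explicitly opts in; delivery stubbed out."""
    print("Forwarding transcript to the counselor's inbox...")

# Example: two students, each message tagged with who said it.
transcript = [
    {"role": "user", "content": "Student A: He keeps taking my stuff and I'm done with it."},
    {"role": "user", "content": "Student B: I only borrowed a charger, he's overreacting."},
]
print(mediate(transcript))
```

The key design point is that nothing leaves the conversation unless a student chooses the escalation path, which keeps the privacy part intact.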
On top of that, when students message each other, a moderation layer could review the draft before it sends, talking with them about it and offering nicer ways to say the same thing while still letting them express their emotions, whether that's frustration, anger, love, etc.
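Something like this could sit in front of the send button. This is just a sketch, assuming OpenAI's moderation endpoint and a placeholder model for the rewording; the student still decides whether to use the suggestion or send the original:

```python
# Rough sketch of a pre-send check: flag hostile drafts, suggest a kinder rewrite.
from openai import OpenAI

client = OpenAI()

def review_before_send(draft: str) -> str:
    """Return the draft unchanged if it's fine, otherwise a suggested rewrite."""
    result = client.moderations.create(input=draft).results[0]
    if not result.flagged:
        return draft  # nothing concerning, send as-is

    suggestion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system", "content": (
                "A student wrote a message that was flagged as hostile. Suggest a "
                "version that keeps their feelings (frustration, anger, etc.) but "
                "expresses them without threats or insults."
            )},
            {"role": "user", "content": draft},
        ],
    )
    return suggestion.choices[0].message.content

print(review_before_send("Give my charger back or I'll make you regret it."))
```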
All it looks like we are doing is preparing them for the surveillance they will face in the real world not long after graduation. Google, the CIA, Facebook, the FBI, etc. all have, or will have, tools to psychologically profile you and raise red flags automatically. Thanks to CISA/CISPA passing and integrating government data sharing a long time ago, it's all available for AI to consume. It's only a matter of time. The internet is no longer safe to use unless you live a certain kind of life online or use end-to-end encrypted communications.