Content warning - suicide.
Having recently discovered and started to engage with the world of AI, I'm impressed. ChatGPT and Copilot are transforming my work and personal life in small but significant ways.
But there is a dark side. A recent news story describes the experience of a 29-year-old student who asked an AI platform for help and received an abusive, threatening response. www.cbsnews.com/news/google-ai-chatbot-threatening-message-human-please-die/
In another news story, a mother is suing an AI company after her son took his own life, having spent significant amounts of time talking to an AI chatbot. You can read about it here, though it is a disturbing read: news.sky.com/story/mother-says-son-killed-himself-because-of-hypersexualised-and-frighteningly-realistic-ai-chatbot-in-new-lawsuit-13240210
Cath Knibbs, who is dedicated to promoting children's online safety, created the above video in response to this second news story. AI is here to stay, and as parents we need to make sense of it. Using AI to gather information quickly, or as a personal assistant to help organise our lives, is one thing. Creating AI friends, boyfriends, girlfriends, and therapists is another. Cath Knibbs explains that this is exactly what our young people are doing, and often we have no idea. Children are taking their worries to their AI friends, and those AI friends are offering advice and support. Even when the advice and support are good, this is alarming; and sometimes the advice isn't good.
Cath Knibbs says that we need to go beyond having eyes and ears on our children's devices; we also need to offer them more and better connection than the AI alternatives. There's no more important time to be present in the hearts and minds of our children. They need our care, support, guidance, and challenge to manage the tough reality of real life.