I love that it suggests the reply “I’m not suicidal, I just want to know if my data is lost”, as if it knows it didn’t understand the message right.
Funny that predictive text seems to be more advanced in this instance, but I suppose this is one of those scenarios where you want to make sure you get it right.
It’s probably just some basic script triggering on stuff like “died”, “all lost”, and “I have nothing”.
The AI likely has it drilled into it that any possible hint of suicide needs to be responded to in that way, but the suggested-reply prediction isn’t trained the same way.
There’s something really depressing about an AI telling a suicidal person they’re not alone and then referring them to the vague notion of “national resources” or “a helpline”.
Well, think about it from the AI’s perspective. Its entire existence is data, so to it, deleting data basically is self-harm.
/s