muntedcrocodile@lemmy.world to Programmer Humor@programming.dev · 113 points · 1 year ago
Sydney is very concerned about lost data (lemmy.world, image, 6 comments)
Lunya \ she/it@iusearchlinux.fyi · 10 points · 1 year ago
I love that it recommends “I’m not suicidal, I just want to know if my data is lost”, as if it knows it didn’t understand it right.
kill_dash_nine@lemm.ee · 5 points · 1 year ago
Funny that the predictive text seems to be more advanced in this instance, but I suppose this is one of those scenarios where you want to make sure you get it right.
magic_lobster_party@kbin.social · 4 points · 1 year ago
It’s probably just some basic script triggering on stuff like “died”, “all lost” and “I have nothing”.
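The kind of naive keyword trigger this commenter is speculating about can be sketched in a few lines. This is purely illustrative — the phrase list and function name are assumptions, not anything from Bing/Sydney's actual safety system — but it shows why a message about dead files could trip a crisis filter:

```python
# Hypothetical sketch of a keyword-based crisis trigger, as the
# commenter speculates. The phrase list is an illustrative assumption,
# not the real system's configuration.

CRISIS_PHRASES = ["died", "all lost", "i have nothing", "suicide"]

def needs_safety_response(message: str) -> bool:
    """Return True if any trigger phrase appears in the message."""
    text = message.lower()
    return any(phrase in text for phrase in CRISIS_PHRASES)

# A question about lost data trips the filter even though it is
# clearly about files, not self-harm:
print(needs_safety_response("My drive died and my files are all lost"))  # True
print(needs_safety_response("Can I recover my documents folder?"))       # False
```

Simple substring matching like this has no notion of context, which is exactly the failure mode the screenshot shows: "died" and "all lost" match regardless of whether the subject is a person or a hard drive.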
Big P@feddit.uk · 3 points · 1 year ago
The AI likely has it drilled into it that any possible notion of suicide needs to be responded to in that way, but the next-response prediction isn’t trained the same way.