The moderators of a pro-artificial intelligence Reddit community announced that they have been quietly banning “a bunch of schizoposters” who believe “they’ve made some sort of incredible discovery or created a god or become a god,” highlighting a new type of chatbot-fueled delusion that started getting attention in early May.
tfw it’s no longer just the AI hallucinating
Yeah, there was an article shared on lemmy a few months ago about couples or families destroyed by AI.
Like the husband thinks he’s discovered some new truth, kind of a religious-level revelation about how the world works and stuff. Then he becomes an annoying guru and ruins his social life.
Kind of Qanon people but with chatGPT…
Turns out it doesn’t really matter what the medium is, people will abuse it if they don’t have a stable mental foundation. I’m not shocked at all that a person who would believe a flat earth shitpost would also believe AI hallucinations.
I dunno, I think there’s credence to considering it as a worry.
Like with an addictive substance: yeah, some people are going to be dangerously susceptible to it, but that doesn’t mean there shouldn’t be any protections in place…
Now what the protections would be, I’ve got no clue. But I think a blanket, “They’d fall into psychosis anyway” is a little reductive.
I don’t think I suggested it wasn’t worrisome, just that it’s expected.
If you think about it, AI is tuned using RLHF, or Reinforcement Learning from Human Feedback. That means the only thing the AI is optimizing for is “convincingness”. It doesn’t optimize for intelligence; anything that seems like intelligence is literally just a side effect as it forever marches onward towards becoming convincing to humans.
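To make the point concrete, here’s a toy sketch (made-up numbers, hypothetical candidate replies, not a real RLHF training pipeline): the reward signal is built purely from human preference votes, so truthfulness never enters the objective at all.

```python
# Toy illustration of the RLHF incentive problem (hypothetical data).
# Each candidate reply: (text, is_actually_true, times_humans_preferred_it)
candidates = [
    ("Hedged, accurate answer", True, 3),
    ("Confident, flattering answer", False, 9),
]

def reward(times_preferred, total_votes):
    # The learned signal is just "fraction of humans who preferred this".
    # Nothing in this function can see whether the answer was true.
    return times_preferred / total_votes

total_votes = sum(votes for _, _, votes in candidates)

# The policy gets pushed toward whichever reply maximizes reward...
best = max(candidates, key=lambda c: reward(c[2], total_votes))

print(best[0])  # "Confident, flattering answer"
print(best[1])  # False -- convincing won, true didn't
```

If the convincing-but-wrong reply collects more votes, it wins the reward comparison every time; truth only survives to the extent that raters happen to prefer it.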
“Hey, I’ve seen this one before!” You might say. Indeed, this is exactly what happened to social media. They optimized for “engagement”, not truth, and now it’s eroding the minds of lots of people everywhere. AI will do the same thing if run by corporations in search of profits.
Left unchecked, it’s entirely possible that AI will become the most addictive, seductive technology in history.
Ah, I see what you’re saying – that’s a great point. It’s designed to be entrancing AND designed to actively try to be more entrancing.
This feels a bit like PTA-driven panic about kids eating Tide Pods when like one person did it. Or razor blades in Halloween candy. Or kids making toilet hooch with their juice boxes. Or the choking game sweeping playgrounds.
But also, man on internet with no sense of mental health … sounds almost feasible.
I directly work with one of these people - they admit to spending all of their free time talking to the LLM chatbots.
On our work forums, I see it’s not uncommon at all. If it makes you feel any better, AI devotion is highly correlated with being someone you shouldn’t ever listen to in the first place.
What an absolutely pathetic life that is holy shit
The Internet is a pretty big place. There’s no such thing as an idea that is too stupid. There’s always at least a few people who will turn that idea into a central tenet of their life. It could be too stupid for 99.999% of the population, but that still leaves about 5,000 people who are totally into it.
And the glory of the interwebz is that those 5000 people are bound to find each other and start a movement around it, where just 25 years ago they would have been laughed out of the local pub as a raving idiot…
And that’s exactly why we have flat-earthers, antivaxxers and “truthers” of various kinds. Although, due to the same phenomenon, we also have communities like !WhatsThisRock@lemmy.world, !capybara@lemmy.smeargle.fans, !NatureIsMetal@kbin.social, !captionthis@hilariouschaos.com, !HandmadeMarketplace and so many other interesting and quirky places. You win some, you lose some.
And that’s not even getting started on “AI girlfriends”, which are isolating vulnerable people to a terrifying degree. And since they’re garbage at context, you get things like that case last year where one could seem to be encouraging a suicidal teen.
Crazy.
It never was. Hallucinations are in no way unique to LLMs.
I always wanted to take drugs with my computer!