

Yep, you’re exactly right. That’s a great way to express it.
This is an increasingly bad take. If you work in an industry where LLMs are becoming genuinely useful, you realize that hallucinations are at worst a minor inconvenience for the applications they are well suited to, and the tools are getting better by leaps and bounds, week by week.
edit: Like it or not, it’s true. I use LLMs at work, most of my colleagues do too, and none of us use the output raw. Hallucinations are not an issue when you are actively collaborating with the model rather than using it to either “know things for you” or “do the work for you.” Neither of those things is what LLMs are really good at, but that’s what most laypeople use them for, so these criticisms look obviously short-sighted to those of us with real-world experience in a domain where they work well.
Your own logic can be applied in reverse to argue that nonviolent diplomatic alternatives to war (like this one) are a good thing even if they are not perfect or the best option.
FWIW:
Not impossible that it’s just an absent admin.
I’m pretty sure they’re just referring to using the techniques to replicate things after learning, not hallucinating the whole game as though it were a 1:1 copy.
“Is it right?” Are you kidding? Yes, it’s obviously a better alternative than invading another country and killing people. It’s one of the ways we have learned, as a species, to avoid massive wars and losses of life. If you’re advocating for war as an alternative then you should fuck off and die so you don’t get other people killed in the process.
That’s not the issue I was replying to at all.
Yeah, that sucks, and it’s pretty stupid, too, because LLMs are not good replacements for humans in most respects.
Don’t “other” me just because I’m correcting misinformation. I’m not a fan of corporate bullshit either. Misinformation is misinformation, though. If you have a strong opinion about something, then you should know what you’re talking about. LLMs are a nuanced subject, and they are here to stay, for better or worse.