

And that’s making them larger and “think.”
Aren’t those the two big strings to the bow of LLM development these days? If those don’t work, how is it not the case that hallucinations “are here to stay”?
Sure, some new trick might be devised that fixes the issue, and I suspect one will be eventually, but there’s no promise of it arriving anytime even remotely soon.
Don’t feed the troll, folks.