An interesting development, but I doubt it’ll be a good thing, especially at first. This looks like an entirely new threat vector and a huge liability, even when used in the most secure way possible, and especially when used in the haphazard way we’ll certainly see from some of the early adopters.

Just because you can do a thing does not mean that you should.

I almost feel like this should have an NSFW tag because this will almost certainly not be safe for work.

Edit: looks like the article preview is failing to load… I’ll try to fix it. … Nope. Couldn’t fix.

  • ɔiƚoxɘup@beehaw.org (OP) · 23 hours ago

    It makes sense that you don’t buy it. LLMs are built on highly simplified renditions of neural structure; they’re totally rudimentary.