• 0 Posts
  • 20 Comments
Joined 7 months ago
Cake day: June 8th, 2025


  • No, laziness is good. Laziness begets engineering.

    The issue is that “generative AI” (which is neither generative nor intelligent) is built upon the stolen works of countless artists.

    The issue is that it consumes massive amounts of resources and energy to produce mediocre results at best.

    The issue is that it threatens the livelihood of whole segments of society, especially the ones who contribute the most to human culture.

    The issue is that it’s not sustainable. Once it runs out of new content to plagiarize it will be unable to produce anything new. It can’t replace what it’s destroying.

    The issue is that it’s so vastly inefficient that the data centres needed to sustain it are becoming a major contributor to global warming.

    The issue is that its bubble is causing massive price increases in consumer computer parts.

    The issue is that when it pops it’ll take the rest of the economy with it.

    The issue is that it’s a gateway drug. It’s being sold at a loss to destroy the human competition, and will inevitably increase massively in price once it’s become a necessary part of everyone’s process.

    The issue is that it’s being forced in everywhere regardless of its uselessness for the task, replacing technologies that were actually useful and making everything less usable and more inefficient.

    The issue is that it’s making everything less reliable, and will inevitably cause massive damage and loss of life.

    The issue is that LLM use has been demonstrated to cause brain damage, yet they elude regulation and the companies selling them have yet to face consequences.

    The issue is that all of this makes it an existential threat to humanity, and a significant contributor to the existential threats we were already facing.

    The issue is that, once you’ve taken into account all the pros and cons, doing everything possible to ensure it ceases to exist as soon as possible in any way, shape, or form, together with the companies selling it and the CEOs responsible for them and any politicians and investors enabling them, becomes an evident moral and ethical imperative.


  • I mean, they were never designed to work; they were designed to pose interesting dilemmas for Susan Calvin and to torment Powell and Donovan (though it’s arguable that once robots get advanced enough, as with R. Daneel, for instance, they do work, as long as you don’t mind aliens being genocided galaxy-wide).

    The in-world reason for the laws, though (to allay the Frankenstein complex and to make robots safe, useful, and durable), is completely reasonable and applicable to the real world: obviously not through the three laws themselves, but through any means that actually work.


  • You’re playing a middle-aged detective (though he looks older, or at least more worn down) who just woke up from an alcoholic coma after taking all the drugs, unable to remember anything about himself or the world he lives in, except for the fact that there might have been a woman, who was somehow both the best and the worst, and possibly some trivia about disco.

    I don’t think you’re supposed to be able to remember or understand everything the game throws at you, at least on a first playthrough. That’s what Kim is for.

    Just go with the flow, and remember that in this game failure often leads to more enjoyable outcomes than success.