• 0 Posts
  • 21 Comments
Joined 3 months ago
Cake day: February 10th, 2025


  • Load it and it fingerprints your browser. You can attach a signature to that fingerprint.

    Make whatever changes you want to resist fingerprinting, then reload the page. If it displays your signature, it has identified you; if not, your changes worked.

    Ideally, every page refresh would generate a new, unique fingerprint so the page can’t link you to the last time you loaded it (which is essentially what tracking is).

    The site also displays all of the data it can see, for advanced users.
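    The test-and-reload loop described above can be sketched in a few lines of Python: hash the browser-observable attributes into an ID, change one attribute, and the ID no longer matches. This is a minimal sketch; the attribute names are illustrative, and real fingerprinting scripts run in the browser and collect far more signals (canvas, WebGL, audio, fonts, and so on).

    ```python
    import hashlib
    import json

    def fingerprint(attributes: dict) -> str:
        """Hash a set of browser-observable attributes into a short stable ID."""
        canonical = json.dumps(attributes, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()[:16]

    # Illustrative attributes a fingerprinting script might observe.
    browser = {
        "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/128.0",
        "screen": "1920x1080",
        "timezone": "America/New_York",
        "fonts": ["Arial", "DejaVu Sans", "Noto Sans"],
    }

    before = fingerprint(browser)

    # Resist fingerprinting: change any observable attribute and the ID
    # changes, so the site can no longer link this visit to the last one.
    browser["timezone"] = "UTC"
    after = fingerprint(browser)

    print(before != after)  # True: the two visits no longer match
    ```

    The same logic explains why half-measures fail: if every attribute except one is randomized, the ID still changes, but if the script ignores the randomized attributes and hashes only the stable ones, the visits link right back up.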


  • The CVE system protects everyone who uses computers. It is a public service that forms the core of cybersecurity in the US and many other places, and it costs the database nothing extra when people use it to provide services to clients.

    Letting a private corporation take it over and put it behind a paywall means that security, like so many other things, will only be available to people with money. It will make software and hardware more expensive by adding yet another license fee or subscription if you want software that gets security updates.

    In addition, a closed database is simply less useful. The system works because when one person reports an exploit, everyone else learns about it, and that kind of network is far more valuable when more people can access it.

    An entire industry earning money by providing cybersecurity services shows how useful the system is for everyone. There are good-paying jobs that depend on this data being freely available. New startups only need to provide service; they don’t need to raise funds to buy into a security database, because it is a public service. They also pay taxes (a significant amount if they’re charging $30,000 per audit), which is more than enough to fund a government-operated database.
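    The "one person reports, everyone knows" property works because a CVE record is just public, machine-readable JSON that anyone can fetch and parse. A minimal Python sketch, using a trimmed record loosely modeled on the public NVD JSON shape (the field names here are illustrative, not an exact schema reference):

    ```python
    import json

    # A trimmed CVE record; field names are illustrative, loosely modeled
    # on the public NVD JSON feed, not an exact schema.
    record_json = """
    {
      "id": "CVE-2024-0001",
      "descriptions": [
        {"lang": "en", "value": "Example buffer overflow in a parsing routine."}
      ],
      "metrics": {"severity": "HIGH"}
    }
    """

    record = json.loads(record_json)

    # Pull the English-language description out of the record.
    description = next(
        d["value"] for d in record["descriptions"] if d["lang"] == "en"
    )

    print(record["id"])                   # CVE-2024-0001
    print(record["metrics"]["severity"])  # HIGH
    ```

    A scanner vendor, an auditor, and a hobbyist patching their home server can all consume the same record; that shared, open format is exactly what a paywall would break.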


  • > Otherwise I think that the idea of deleting all IP laws is just wishful (and naive) thinking, assuming people would cooperate and build on each other’s inventions/creations.

    > Given the state the world is currently in, I don’t see that happening soon.

    There are plenty of examples of open sharing systems that function well.

    Science, for example. Nobody ‘owns’ the formulas that calculate orbits, or the underlying mathematics that AI models are built on, like Transformer networks or convolutional networks. The information is openly shared and given away to everyone who wants it, and it is so powerful that it has completely reshaped society everywhere on Earth (except perhaps North Sentinel Island).

    Open source projects, like Linux, are the foundation of the modern tech world. The ‘IP’ is freely available, and you can copy or modify it as much as you’d like. Linus ‘owns’ the Linux project, but anyone is free to take a copy of the Linux source code, modify it to whatever extent they like, and form their own project.

    Much of the software and many of the services people use are built on top of open source tools made by volunteers, for free; and most of the useful knowledge and progress of human society comes from breakthroughs in the sciences, whose discoveries are also free and openly shared.


  • What’s the follow on effect from making generated images illegal?

    Do you want your freedom to be at stake where the question before the jury is “How old is this image of a person (who doesn’t exist)?” or “Is this fake person TOO child-like?”

    When that happens, how do you tell which images are AI-generated and which are real? How do you know who is peddling real CP and who isn’t, if AI-generated CP is legal?

    You won’t be able to tell; we can assume that this is a given.

    So the real question is:

    Who are you trying to arrest and put in jail and how are you going to write that difference into law so that innocent people are not harmed by the justice system?

    To me, the evil people are the ones harming actual children. Trying to blur the line between them and people who generate images is a morally confused position.

    There’s a clear distinction between the two groups and that distinction is that one group is harming people.


  • Child Sexual Abuse Material is abhorrent because children were literally abused to create it.

    AI generated content, though disgusting, is not even remotely on the same level.

    The moral panic around AI has people implying that these two things are the same, which is absurd.

    Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.

    Don’t dilute the horror of the production of CSAM by equating it to fake pictures.