• @PM_Your_Nudes_Please@lemmy.world
    26 points · 7 months ago

    Yeah, it’s very similar to the “is loli porn unethical” debate. No victim, it could supposedly help reduce actual CSAM consumption, etc… But it’s icky, so many people still think it should be illegal.

    There are two big differences between AI and loli though. The first is that AI would supposedly be trained on CSAM in order to generate it, whereas an artist can create loli porn without ever using CSAM references. The second is that AI is much, much easier for the layman to use. It doesn’t take years of practice to create passable porn; anyone with a decent GPU can spin up a local instance and be generating within a few hours.

    In my mind, the former difference is much more impactful than the latter. AI becoming easier to access is likely inevitable, so combating it now is likely only delaying the inevitable. But if that AI is trained on CSAM, it is inherently unethical to use.

    Whether that makes the porn it generates unethical by extension is still difficult to decide, though. If artists hate AI, then CSAM producers likely do too: artists worry AI will put them out of business, and couldn’t the same be said of CSAM producers? If AI has the potential to run CSAM producers out of business, then it would be a net positive in the long term, even if the images being created in the short term are unethical.

    • @Ookami38@sh.itjust.works
      23 points · 7 months ago

      Just a point of clarity, an AI model capable of generating csam doesn’t necessarily have to be trained on csam.

        • @Ookami38@sh.itjust.works
          5 points · edited · 7 months ago

          Why is that? The whole point of generative AI is that it can combine concepts.

          You train it on the concept of a chair using only red chairs. You train it on the color red, and the color blue. With this info and some repetition, you can have it output a blue chair.

          The same applies to any other concepts. Larger, smaller, older, younger. Man, boy, woman, girl, clothed, nude, etc. You can train each of these individually, gradually, and then generate things that combine them.

          Obviously this is harder than just using training data of what you want. It’s slower, it takes more effort, and results are inconsistent, but they are results. And then, you curate the most viable of the images created this way to train a new and refined model.

    • @JovialMicrobial@lemm.ee
      3 points · 7 months ago

      I think one of the many problems with AI generated CSAM is that as AI becomes more advanced, it will become increasingly difficult for authorities to tell the difference between what was AI generated and what wasn’t.

      Banning all of it means authorities don’t have to sift through images trying to distinguish between the two. If an image is declared to be AI generated and it’s not… well… that doesn’t help the victims or create fewer victims. It could also make the horrible people who do abuse children far more comfortable putting that stuff out there, because it can hide amongst all the AI generated material, meaning authorities will have to go through far more images before finding ones with real victims in them. All of it being illegal prevents those sorts of problems.

      • @PM_Your_Nudes_Please@lemmy.world
        2 points · edited · 7 months ago

        And that’s a good point! Luckily it’s still (usually) fairly easy to identify AI generated images. But as they get more advanced, that will likely become harder and harder to do.

        Maybe some sort of required digital signature for AI art would help: something like a cryptographic signature embedded in the metadata, which can’t be forged after the fact. Anything without a known and trusted AI signature would by default be treated as the real deal.

        But this would likely require large scale rewrites of existing image formats, if they could even support it at all. It’s the type of thing that would require people way smarter than myself. But even that feels like a bodged solution to a problem that only exists because people suck. And if it required registration with a certificate authority (like an HTTPS certificate does) then it would be a hurdle for local AI instances to jump through. Because they would need to get a trusted certificate before they could sign their images.
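The signing idea in the comment above can be sketched in miniature. The snippet below is a toy illustration only, and it makes a simplifying assumption: it uses an HMAC with a shared secret held by the generator as a stand-in for a real signature. A production provenance scheme (e.g. C2PA-style content credentials) would instead use asymmetric keys and certificate chains, so that anyone can verify an image without ever holding the signing key.

```python
import hmac
import hashlib

def sign_image(image_bytes: bytes, key: bytes) -> str:
    """Toy signature: HMAC-SHA256 over the raw image bytes.

    A real scheme would use an asymmetric signature (e.g. Ed25519),
    with the public verification key vouched for by a certificate
    authority, as the comment above suggests.
    """
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify_image(image_bytes: bytes, key: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_image(image_bytes, key), tag)

# Hypothetical usage: the generator signs its output at creation time,
# and a checker later verifies the tag stored in the image metadata.
key = b"generator-secret"               # assumption: held by the AI service
image = b"\x89PNG fake pixel data"      # stand-in for real image bytes
tag = sign_image(image, key)

assert verify_image(image, key, tag)              # untouched image verifies
assert not verify_image(image + b"x", key, tag)   # any edit breaks the tag
```

Note the design consequence the comment already identifies: because verification keys must be trusted, local AI instances would need to obtain a certificate before their signatures meant anything, which is exactly the registration hurdle described.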

    • Kalcifer
      1 point · edited · 7 months ago

      But it’s icky so many people still think it should be illegal.

      Imo, not the best framework for creating laws. Essentially, it’s an appeal to emotion.