• JovialMicrobial@lemm.ee · 6 months ago

    I’m a professional artist and I have no issue with banning AI-generated CSAM. People can call it self-expression if they want, but that doesn’t change its real-world consequences.

    Allowing AI-generated CSAM basically creates camouflage for real CSAM. As AI gets more advanced, it will become harder to tell the difference. The scum making real CSAM will be emboldened to make even more, because they can hide it among the growing amounts of AI-generated versions, or simply tag it as AI-generated. Authorities will then have to sift through all of it, trying to decipher what’s artificial and what isn’t.

    Identifying, tracing, and convicting child abusers will become even more difficult as more and more of that material is generated and uploaded to various sites with real CSAM mixed in.

    Even with hyper-realistic paintings you can still tell it’s a painting. Anime loli stuff can never be mistaken for real CSAM. Do I find that sort of art distasteful? Yep. But it doesn’t create an environment where real abusers can distribute CSAM with a higher chance of getting away with it.

    • jeremyparker@programming.dev · 6 months ago

      I guess my question is: why would anyone continue to “consume”, or create, real CSAM? If fake and real are both illegal, but one involves minimal risk and zero children, the only reason to create real CSAM is for the cruelty, and while I’m sure there’s a market for that, it’s got to be a much smaller one. My guess is the vast majority of “consumers” of this content would opt for the fake stuff if it took some of the risk off the table.

      I can’t imagine a world where we didn’t ban AI-generated CSAM. Imagine being a politician and explaining that policy to your constituents; it’s just not happening. And I get the core point of that kind of legislation: the whole concept of CSAM needs the aura of prosecution to keep it from being normalized, and normalization would embolden worse crimes. But imagine if AI made real CSAM too much trouble to produce.

      AI-generated CSAM could put real CSAM out of business. If possession of fake CSAM carried a lesser penalty than the real thing, the real stuff would be much harder to share, let alone monetize. I don’t think we have the data to confirm this, but my guess is that most pedophiles aren’t sociopaths and recognize their desires are wrong, and giving them an outlet that didn’t actually hurt children would be huge. And you could seriously throw the book at anyone still going after the real thing when AI content exists.

      • JovialMicrobial@lemm.ee · 5 months ago

        I try to think about it this way: simulated rape porn exists, and yet terrible people still upload actual recordings of rapes to porn sites. And despite the copious amounts of fake material available all over the internet, rape statistics haven’t gone down and sexual assaults still happen.

        I don’t think porn causes rape btw, but I don’t think it prevents it either. It’s the same with CSAM.

        Criminally horrible people are going to be horrible.