A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: the easy creation of nonconsensual pornography of ordinary people.

  • WaxedWookie@lemmy.world · 3 months ago

    What’s hard to understand is why you skipped the question I asked, and answered a different one instead.

    The creation of the CSAM is unquestionably far more harmful, but I wasn’t talking about the *creation*; I was talking about the possession. The harm of the creation is already done, and whether or not the material exists after that does nothing to undo that harm.

    Again, is your prescription the same as it relates to the possession, not the generation, of CSAM?