A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • 0x0@programming.dev · 3 months ago

    I wouldn’t put actual non-consensual pornography and fake pornography of any kind in the same bag, but, geez, I’m not a doctor.

    Deepfakes certainly improve on the (technical) realism of ’90s Photoshop. Doesn’t that still qualify as defamation? (Also not a lawyer.)

    • eatthecake@lemmy.world · 3 months ago

      The question is why, with an internet full of porn, men want non-consensual pornography that they know women are opposed to. It’s as if the hurtfulness, the lack of consent, and the control over the woman in the video are actually the point.

      • 0x0@programming.dev · 3 months ago

        Non-consensual pornography is called rape and it’s a crime in most of the world.