A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10, provided customers send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. But they are particularly notable because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • @Jrockwar@feddit.uk · 8 points · 7 months ago (edited)

    And so is straight male-focused porn. We men are seemingly not considered attractive, except in perfume ads. It's unbelievable that gender roles are still so strongly coded in 2024. Women must be pretty; men must buy the products whose ads show pretty women. Men don't look pretty, and women don't buy products - they clean the house and care for the kids.

    I’m aware of how much I’m extrapolating, but a lot of this is the subtext under “they’ll make porn of your sisters and daughters” - while leaving your good-looking brother or son out of the thought process, when that would be just as hurtful for them and for you.

    • @lud@lemm.ee · 5 points · 7 months ago

      Or your bad-looking brother, or bad-looking me.

      Imo, people making AI fakes for themselves isn’t the end of the world; the real problem is distribution and blackmail.

      You can be blackmailed regardless of your gender, and it happens to both.