Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

While parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the accounts buying them, remained live until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

    • chatokun@lemmy.dbzer0.com · 8 months ago

      There isn’t, but emphasis on why it’s an issue is always a good thing to do. Same reason people get upset when some articles say “had sex with a minor” or “involved in a relationship with a minor” when the accurate crime is “raped a minor.”

      • Hello Hotel@lemmy.world · 8 months ago (edited)

        If you (the news) are going to use flowery language, at least imply it’s a crime!

        • “Sexually coerced a minor”
        • or “groomed a minor for sex”
        • or “had a relationship where the power dynamics were so one-sided that the child could not give consent”
        • or maybe just say “raped a minor”

        It’s not that hard!

    • lengau@midwest.social · 8 months ago

      Theoretically any of these apps could be used with consent.

      In practice I can’t imagine that would be a particularly large part of their market…

      • Schadrach@lemmy.sdf.org · 8 months ago

        Now I have this image of an OnlyFans girl who just fake nudes all her pictures. Would make doing public nudity style pictures a lot easier.

      • dream_weasel@sh.itjust.works · 8 months ago

        “hey send me some nudes!”

        “Ugh… I’m already on the couch in my pajamas. Here’s a pic of me at the coffee shop today, just use the app, it’s close enough.”

    • Alien Nathan Edward@lemm.ee · 8 months ago

      I guess it’s really in whether you use it with consent. I used one on my own picture just to see how it worked. It gave me huge tits but other than that was scarily accurate.

        • Alien Nathan Edward@lemm.ee · 8 months ago

          I’m a dude, it’s just a clever name. It’ll do dudes, it’s just gonna give you huge tits. What you’re into is, of course, your business.

          • evranch@lemmy.ca · 8 months ago

            My interest in this topic just went from 0 to 10 upon realizing the humour potential of passing it around to see all my bros with huge tits, but only if it worked like a Snapchat filter.

            Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do

            • Schadrach@lemmy.sdf.org · 8 months ago

              Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do

              Being serious for a moment, it depends on the source image. If it can tell where the contours of the tits are in the source image, they’ll be closer to the right size and shape; otherwise it’s going to find something it thinks are the contours, map out tits that match those, then generate a generic torso that matches the shape of where it thinks the torso is and the skin tone of the face. It’s not magic; it’s just automating what a horndog with Photoshop, a photo of you, and a big enough porn collection to find someone with a similar body type could do back in the ’90s.

              • evranch@lemmy.ca · 8 months ago

                I’m familiar with how ML works, so it’s not magic to me either, but the actual result is what would intrigue me. Since she has big naturals, obviously they hang pretty heavy when they’re set free.

                But if I fed it a picture of her wearing a tight push-up bra, which could easily give off the impression that she had implants, would I get a pair of bolt-ons back? Or would it be able to pick up on the signs of real tits and add some sag?

                Seeing how it’ll put tits on men, it’s obviously not an exact science lol

    • UnderpantsWeevil@lemmy.world · 8 months ago (edited)

      I assume that’s what you’d call OnlyFans.

      That said, the irony of these apps is that it’s not the nudity that’s the problem, strictly speaking. It’s taking someone’s likeness and plastering it on a digital mannequin. What’s happening on social media has become the online equivalent of going through a girl’s trash to find an old comb, pulling the hair off, and putting it on a Barbie doll that you then use to jerk/jill off.

      What was once the domain of 1980s perverts in comedies about awkward high schoolers has now become a commodity we’re supposed to treat as normal.

      • CaptainEffort@sh.itjust.works · 8 months ago

        Idk how many people are viewing this as normal; I think most of us recognize all of this as incredibly weird and creepy.

        • UnderpantsWeevil@lemmy.world · 8 months ago

          Idk how many people are viewing this as normal

          Maybe not “Lemmy” us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.

          • CaptainEffort@sh.itjust.works · 8 months ago

            Yeah damn, that’s true.

            An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize are bad because they were taught correctly.

            Unfortunately, due to how puritan our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.