An in-depth police report obtained by 404 Media shows how a school, and then the police, investigated a wave of AI-powered “nudify” apps in a high school.

  • abhibeckert@lemmy.world · 10 months ago

    Um… the Taylor Swift porn deepfakes were made with DALL-E.

    Sure - they try to prevent that stuff, but it’s hardly perfect. And not all bullying is easily spotted. Imagine a deepfake of a kid sending a text message, but the bubbles are green. Or maybe they’re smiling at someone they hate.

    Also, Stable Diffusion is more than good enough for this stuff. It’s free, and any decent gaming laptop can run it. Mine takes 20 seconds to produce a decent deepfake… I’ve used it to touch up my own photos.

    • just another dev@lemmy.my-box.dev · 10 months ago

      the Taylor Swift porn deepfakes were made with DALL-E.

      Got a source for that? I only have experience with DALL-E 3, but that is really picky about nudity, copyright, and portrait rights.