• Whoresradish@lemmy.world · 1 year ago

    Don't they often train the program with adult porn, and then the AI just puts a child's face onto bodies generated from that training data? I imagine these AI companies are scraping data from popular porn sites, or just paying for the data, and those porn sites work hard not to have CP on them. The result is a child's face on a body too mature for it. Remember that some actual adult actresses have body proportions that many would consider underdeveloped, and someone generating these pictures could regenerate until the AI uses those body proportions.

    The point is that you don't need CP to train an AI to make CP. I am not justifying any moral position here, just pointing out a fact about AI technology.

      • Echo Dot@feddit.uk · 1 year ago

        You don’t know what you’re talking about.

        In this case the guy did have real images, but you don't need them. AI is kind of intelligent in a hard-to-define way; it picks up on stuff.

        It picked up that people like younger individuals in pornography, so it took that to the logical extreme. AI is weird because it's intelligence without any actual thought. But it can totally generate variations on things it's already seen, and a kid is just a variation on a young adult.

        • mayoi@sh.itjust.works · 1 year ago

          Ah yes, I don't know what I'm talking about; it's just that the guy happened to have real images, like they do every time, because it's impossible to get your garbage model to produce CP otherwise.