• R00bot@lemmy.blahaj.zone · 4 months ago

    I feel like the amount of training data required for these AIs serves as a pretty compelling argument as to why AI is clearly nowhere near human intelligence. It shouldn’t take thousands of human lifetimes of data to train an AI if it’s truly near human-level intelligence. In fact, I think it’s an argument for them not being intelligent whatsoever. With that much training data, everything that could be asked of them should be in the training data. And yet they still fail at any task not in their data.

    Put simply: a human needs less than one lifetime of training data to be more intelligent than AI. If throwing more training data and compute at the problem were going to solve it, I think it would have by now.
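    The "thousands of lifetimes" framing can be made concrete with a rough back-of-the-envelope calculation. Every figure below is an assumption chosen purely for illustration, not a measured value:

```python
# Back-of-the-envelope only: all figures are illustrative assumptions.
llm_training_tokens = 15e12   # assumed corpus size for a large frontier model
words_per_day = 30_000        # assumed words a person hears/reads per day
years_of_exposure = 30        # assumed years of language exposure

# Total language input one person accumulates under these assumptions.
human_tokens = words_per_day * 365 * years_of_exposure

# How many such "lifetimes of language" the assumed LLM corpus represents.
lifetimes_equivalent = llm_training_tokens / human_tokens
print(f"Roughly {lifetimes_equivalent:,.0f} human lifetimes of language input")
```

    Even shifting these assumed figures by an order of magnitude in either direction, the gap stays in the thousands of lifetimes.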

    • rdri@lemmy.world · 4 months ago

      There is no “intelligence” here; “AI” is a PR term. It’s just a language model that feeds on a lot of data.
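      The "just a model of its training data" point can be illustrated with a toy example. A bigram model (a deliberately crude stand-in for a transformer, not how LLMs actually work) can only ever emit words in contexts it has already seen:

```python
import random
from collections import defaultdict

def train_bigram(text):
    """Record which word follows which in the training text."""
    model = defaultdict(list)
    words = text.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, n=5, seed=0):
    """Sample a continuation; it can only produce word pairs seen in training."""
    random.seed(seed)
    out = [start]
    for _ in range(n):
        followers = model.get(out[-1])
        if not followers:
            break  # dead end: this word was never followed by anything
        out.append(random.choice(followers))
    return " ".join(out)

model = train_bigram("the cat sat on the mat the cat ran")
print(generate(model, "the"))
```

      Scale aside, the generated text is a statistical echo of the corpus: ask it to start from a word it has never seen and it produces nothing at all.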

      • R00bot@lemmy.blahaj.zone · 4 months ago

        Oh yeah, we’re 100% agreed on that. I’m thinking of the AI evangelists who will argue tooth and nail that LLMs have “emergent properties” of intelligence, and that it’s simply a matter of training data/compute power before we get some digital god-being. Unfortunately these people exist, and they’re depressingly common. Their numbers have definitely dwindled since the AI hype died down, though.

        • noobdoomguy8658@feddit.org · 4 months ago

          We’re very proficient at walking, but somehow haven’t produced a walking home or anything like that.

          It’s not very linear.

        • wizardbeard@lemmy.dbzer0.com · 4 months ago

          Definitely not the same thing. Just because you can make use of the end result of major efforts does not somehow magically give you access to all the knowledge from those major efforts.

          You can use a smart phone easily, but that doesn’t mean you magically know how to make one.

    • stupidcasey@lemmy.world · 4 months ago

      You’ve had the entire history of evolution to get the instinct you have today.

      Nature vs. nurture is a huge ongoing debate.

      Just because it takes longer to train doesn’t mean it’s not intelligent; kids develop more slowly than chimps.

      Also, “intelligent” doesn’t really mean anything on its own. I personally think intelligence is the ability to distill unusable amounts of raw data and intuit a result beneficial to oneself. But very few people agree with me.

      • Peanut@sopuli.xyz · 4 months ago

        I see intelligence as filling areas of concept space within an eco-niche in a way that proves functional for actions within that space. I think we are discovering that “nature” has little commitment, and is just optimizing preparedness for expected levels of entropy within the functional eco-niche.

        Most people haven’t even started paying attention to distributed systems building shared enactive models, but those systems are already capable of things that should be considered groundbreaking given the time and money spent on their development.

        That being said, localized narrow generative models are just building large individual models of predictive processing that don’t, by default, actively update their information.

        People who attack AI for just being prediction machines really need to look into predictive processing, or learn how much we organics just guess and confabulate on top of vestigial social priors.

        But no, corpos are using it, so it’s “computer bad, human good,” even though the main issue here is the humans who have unlimited power and are encouraged into bad actions by flawed social posturing systems and the conflating of wealth with competency.

      • R00bot@lemmy.blahaj.zone · 4 months ago

        Strange to equate the other senses with performance on intellectual tasks, but sure. Do you think feeding data from smells, touch, taste, etc. into an AI along with the video will suddenly make it intelligent? No, it will just make it more likely to guess what something smells like. I think it’s very clear that our current approach to AI is missing something much more fundamental to thought than that; it’s not just a dataset problem.