• gaja@lemm.ee · 1 day ago

    I’ve hand-calculated forward propagation (neural networks). AI does not learn; it’s statically optimized. AI “learning” is curve fitting. Human learning requires understanding, which AI is not capable of.
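
    A forward pass really is just arithmetic on fixed numbers. Here’s a minimal sketch of the kind of thing you hand-calculate (toy sizes, made-up weights, NumPy only; not any particular trained model):

    import numpy as np

    # Toy network: 2 inputs -> 3 hidden units -> 1 output. The weights are made up;
    # a trained network just has different numbers in these arrays.
    x  = np.array([0.5, -1.0])                              # input
    W1 = np.array([[0.2, -0.4], [0.7, 0.1], [-0.3, 0.5]])   # hidden weights (3x2)
    b1 = np.array([0.1, 0.0, -0.2])                         # hidden biases
    W2 = np.array([[0.6, -0.8, 0.3]])                       # output weights (1x3)
    b2 = np.array([0.05])                                   # output bias

    h = np.maximum(0, W1 @ x + b1)   # hidden layer: matrix multiply + ReLU
    y = W2 @ h + b2                  # output layer: another matrix multiply
    print(h, y)

    # "Training" only nudges W1, b1, W2, b2 to reduce error on a dataset (curve
    # fitting); after that the numbers are frozen and inference is just the lines above.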

    • nednobbins@lemmy.zip · 10 hours ago

      Human learning requires understanding, which AI is not capable of.

      How could anyone know this?

      Is there some test of understanding that humans can pass and AIs can’t? And if there are humans who can’t pass it, do we consider them unintelligent?

      We don’t even need to set the bar that high. Is there some definition of “understanding” that humans meet and AIs don’t?

      • gaja@lemm.ee · 5 hours ago

        It’s literally in the phrase “statically optimized.” This is like arguing over your preferred deity: it’ll never be proven, but we have enough evidence to draw our own conclusions. As it stands, AI doesn’t learn or understand the same way humans do.

        • nednobbins@lemmy.zip · 2 hours ago

          So you’re confident that human learning involves “understanding,” which is distinct from “statistical optimization.” Is this something you feel in your soul, or can you define the difference?

          • gaja@lemm.ee · 49 minutes ago (edited)

            Yes. You learned not to touch a hot stove either from experience or from a warning. That fear was cemented by your understanding that it would hurt. An AI will tell you not to touch a hot stove (most of the time) because the words “hot,” “stove,” “pain,” etc. appear together in its dataset millions of times (there’s a toy sketch of that below, after the quote). As things stand, the two are barely comparable. The only reason people keep arguing is that the output is very convincing. Go download PyTorch and read up on it, or Google it. I’ve even asked DeepSeek for you:

            Can AI learn and understand like people?

            AI can learn and perform many tasks similarly to humans, but its understanding is fundamentally different. Here’s how AI compares to human learning and understanding:

            1. Learning: Similar in Some Ways, Different in Others

            • AI Learns from Data: AI (especially deep learning models) improves by processing vast amounts of data, identifying patterns, and adjusting its internal parameters.
            • Humans Learn More Efficiently: Humans can generalize from few examples, use reasoning, and apply knowledge across different contexts—something AI struggles with unless trained extensively.

            2. Understanding: AI vs. Human Cognition

            • AI “Understands” Statistically: AI recognizes patterns and makes predictions based on probabilities, but it lacks true comprehension, consciousness, or awareness.
            • Humans Understand Semantically: Humans grasp meaning, context, emotions, and abstract concepts in a way AI cannot (yet).

            3. Strengths & Weaknesses

            AI Excels At:

            • Processing huge datasets quickly.
            • Recognizing patterns (e.g., images, speech).
            • Automating repetitive tasks.

            AI Falls Short At:

            • Common-sense reasoning (e.g., knowing ice melts when heated without being explicitly told).
            • Emotional intelligence (e.g., empathy, humor).
            • Creativity and abstract thinking (though AI can mimic it).

            4. Current AI (Like ChatGPT) is a “Stochastic Parrot”

            • It generates plausible responses based on training but doesn’t truly “know” what it’s saying.
            • Unlike humans, it doesn’t have beliefs, desires, or self-awareness.

            5. Future Possibilities (AGI)

            • Artificial General Intelligence (AGI)—a hypothetical AI with human-like reasoning—could bridge this gap, but we’re not there yet.

            Conclusion:

            AI can simulate learning and understanding impressively, but it doesn’t experience them like humans do. It’s a powerful tool, not a mind.

            Would you like examples of where AI mimics vs. truly understands?
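
            And to make the word-association point concrete, here’s a toy sketch of purely statistical “learning” (a crude next-word counter in Python; nowhere near a real transformer, and the corpus is invented for illustration, but it’s the same flavor of pattern matching):

            import random
            from collections import Counter, defaultdict

            # Invented mini corpus; real training data is billions of sentences.
            text = ("do not touch the hot stove it will cause pain "
                    "the stove is hot and touching it causes pain "
                    "a hot stove burns you and the burn causes pain").split()

            # "Learning": count which word follows which.
            following = defaultdict(Counter)
            for prev, nxt in zip(text, text[1:]):
                following[prev][nxt] += 1

            # "Answering": pick each next word in proportion to those counts.
            word, output = "the", ["the"]
            for _ in range(6):
                counts = following[word]
                if not counts:                      # dead end in this tiny corpus
                    break
                word = random.choices(list(counts), weights=counts.values())[0]
                output.append(word)

            print(" ".join(output))   # plausible-sounding output, driven purely by counts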