• Sir Arthur V Quackington@lemmy.world · 18 hours ago

    There is nothing intelligent about “AI” as we call it. It parrots based on probability. If you remove the randomness value from the model, it parrots the same thing every time based on its weights, and if the weights were trained on Harry Potter, it will consistently give you giant chunks of Harry Potter verbatim when prompted.

    Most of the LLM services attempt to avoid this by adding arbitrary randomness values to churn the soup. But this is also inherently part of the cause of hallucinations, as the model cannot preserve a single correct response as always the right way to respond to a certain query.
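    The “randomness value” here is the sampling temperature. A minimal toy sketch (made-up logits, not a real model) of the difference between greedy decoding and temperature sampling:

    ```python
    import math
    import random

    def sample(logits, temperature=1.0, rng=random.Random(0)):
        """Pick a token index from raw next-token scores (logits)."""
        # temperature == 0 -> greedy: always the argmax token, fully deterministic
        if temperature == 0:
            return max(range(len(logits)), key=lambda i: logits[i])
        # otherwise: scale logits by temperature, softmax, draw from the distribution
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        r = rng.random()
        acc = 0.0
        for i, p in enumerate(probs):
            acc += p
            if r < acc:
                return i
        return len(probs) - 1

    logits = [2.0, 1.0, 0.1]  # toy next-token scores
    print(sample(logits, temperature=0))  # greedy: always index 0, every run
    ```

    With temperature at 0 the same prompt always yields the same continuation; raising it spreads probability over lower-ranked tokens, which is the “churn” being described.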

    LLMs are insanely “dumb”, they’re just lightspeed parrots. The fact that Meta and these other giant tech companies claim it’s not theft because they sprinkle in some randomness is just obscuring the reality and the fact that their models are derivative of the work of organizations like the BBC and Wikipedia, while also dependent on the works of tens of thousands of authors to develop their corpus of language.

    In short, there was an ethical way to train these models. But that would have been slower. And the court just basically gave them a pass on theft. Facebook would have been entirely in the clear had it not stored the books in a dataset, which in itself is insane.

    I wish I knew when I was younger that stealing is wrong, unless you steal at scale. Then it’s just clever business.

  • Dr. Moose@lemmy.worldOP · 17 hours ago

      Except that breaking copyright is not stealing and never was. Hard to believe that you’d ever see copyright advocates on FOSS and decentralized networks like Lemmy - it’s like people had their minds hijacked because “big tech is bad”.

    • Sir Arthur V Quackington@lemmy.world · 2 hours ago

        Ingesting all the artwork you ever created by obtaining it illegally and feeding it into my plagiarism remix machine is theft of your work, because I did not pay for it.

        Separately, keeping a copy of this work so I can do this repeatedly is also stealing your work.

        The judge ruled the first was okay but the second was not, because the first is “transformative”. Sadly, that tells me the judge, despite best efforts, does not understand how a weighted matrix of tokens works: while they may have some prevention steps in place now, early models showed the tech for what it was as it regurgitated text with only minor differences in word choice here and there.

        Current models have layers on top to try to prevent this, but escaping those safeguards through user input is common, and it’s also only masking the fact that the entire model is built off of the theft of others’ work.

    • josefo@leminal.space · 13 hours ago

        What name do you have for the activity of making money from someone else’s work or data, without their consent or any compensation? If the tech were just tech, it wouldn’t need any non-consenting human input to work properly. These are just companies feeding on various types of data; if justice doesn’t protect an author, what do you think would happen if these same models started feeding on user data instead? Tech is good, ethics are not.

      • Dr. Moose@lemmy.worldOP · 12 hours ago

          How do you think you’re making money with your work? Did your knowledge appear from a vacuum? Ethically speaking, nothing is an “original creation of your own merit only” - everything we make is transformative by nature.

          Either way, this talk is moot, as we’ll never agree on what is transformative enough to be harmful to our society unless it’s a direct 1:1 copy with the direct goal of displacing the original. But that’s clearly not the case with LLMs.