• Pieisawesome@lemmy.dbzer0.com
    1 day ago

    And if you tried this 5 more times for each, you'd likely get different results.

    LLM providers introduce "randomness" into their models, controlled by a parameter called temperature.

    Via the API you can usually modify this parameter, but I don't know if the chat UI lets you do the same…
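
    For the curious, here's a rough sketch of what temperature does under the hood (not any provider's actual code, just the standard softmax-with-temperature idea): logits get divided by the temperature before sampling, so low values make the model nearly deterministic and high values make it more random.

    ```python
    import math
    import random

    def sample_with_temperature(logits, temperature=1.0, rng=random):
        # Scale logits by 1/temperature: low T sharpens the distribution
        # (near-deterministic), high T flattens it (more random).
        scaled = [l / temperature for l in logits]
        # Numerically stable softmax over the scaled logits.
        m = max(scaled)
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw one token index from the resulting distribution.
        r = rng.random()
        cum = 0.0
        for i, p in enumerate(probs):
            cum += p
            if r < cum:
                return i
        return len(probs) - 1
    ```

    With temperature near 0, the highest-logit token wins almost every time; crank it up and the other tokens start getting picked, which is why rerunning the same prompt gives different answers.
    
    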