
  • I’ll take the bait. Let’s think:

    • there are three humans who are 98% right about what they say, and where they know they might be wrong, they indicate it

    • now there is an llm (fuck capitalization, I hate how much they are shoved everywhere) trained on their output

    • now the llm is asked about the topic and computes an answer string

    By definition that answer string can contain all the probably-wrong things without the proper indicators (“might”, “under such and such circumstances”, etc.): the hedges exist in the training text, but nothing forces the model to reproduce them next to the claims they qualified.

    If you want to claim that a 40%-wrong llm implies 40%-wrong sources, prove me wrong.
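
    A toy simulation of the point, if anyone wants numbers. Everything here is made up for illustration: the 98% accuracy, the hedging rate, and the assumption that the model repeats claims while mostly dropping the hedge words. It is a sketch of the mechanism, not a measurement of any real model.

    ```python
    # Toy model: hedge markers get lost in training, so confidently-wrong
    # output can grow even when the underlying facts stay exactly as right.
    # All constants below are assumptions for illustration.
    import random

    random.seed(0)

    N = 100_000              # claims in the training corpus
    SOURCE_ACCURACY = 0.98   # humans are right 98% of the time
    HEDGE_RECALL = 0.9       # when wrong, humans flag uncertainty 90% of the time
    MODEL_KEEPS_HEDGE = 0.3  # assumed: model reproduces a hedge only 30% of the time

    def human_claim():
        correct = random.random() < SOURCE_ACCURACY
        # Humans mostly mark the claims they are unsure about.
        hedged = (not correct) and (random.random() < HEDGE_RECALL)
        return correct, hedged

    def model_claim(correct, hedged):
        # The model repeats the claim but often drops the uncertainty marker.
        return correct, hedged and (random.random() < MODEL_KEEPS_HEDGE)

    corpus = [human_claim() for _ in range(N)]
    model_out = [model_claim(c, h) for c, h in corpus]

    def confidently_wrong(claims):
        return sum(1 for correct, hedged in claims
                   if not correct and not hedged) / len(claims)

    print(f"sources: {confidently_wrong(corpus):.2%} confidently wrong")
    print(f"model:   {confidently_wrong(model_out):.2%} confidently wrong")
    ```

    With those made-up numbers the sources emit roughly 0.2% confidently-wrong claims while the model emits roughly 1.5%: several times more unhedged wrongness from the exact same facts, without the sources getting any worse.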