LOOK MAA I AM ON FRONT PAGE

  • Knock_Knock_Lemmy_In@lemmy.world
    3 days ago

When given explicit instructions to follow, models failed because they had not seen similar instructions before.

This paper shows that there is no reasoning in LLMs at all, just extended pattern matching.