• THB@lemmy.world
    edited 5 days ago

    It’s like talking with someone who thinks the Earth is flat. There isn’t anything to discuss. They’re objectively wrong.

    Humans like to anthropomorphize everything. It’s why you can see a face in a car’s front grille. LLMs are ultra-advanced pattern-matching algorithms. They do not think, reason, or have any kind of opinion or sentience, yet they are being used as if they do. Let’s see how it works out for the world, I guess.