• luciferofastora@feddit.org
    14 days ago

    LLMs are highly impressive text generators, amazing facsimiles of human writing and wholly unsuited to anything involving semantic understanding and critical thought. They cannot generate facts, and they don’t understand how the patterns they analyse and reproduce relate to actual concepts or things, but they’re extremely “knowledgeable” about those patterns.

    They’re a technological marvel, relentlessly abused by grifters posing as prophets to scam the gullible.

    Unfortunately, the gullible are executives and representatives.