• clucose@lemmy.ml · 3 days ago

    It is possible for AI to hallucinate elements that don’t work, at least for now. This requires some level of human oversight.

    So, the same as LLMs, and they got lucky.

    • ATDA@lemmy.world · 1 day ago

      It’s like putting a million monkeys in a writers’ room, but supercharged on meth and consuming insane resources.

      • john89@lemmy.ca · 1 day ago

        That monkey analogy is so far removed from reality, I think less of anyone who perpetuates it.

        A room full of monkeys banging on keyboards will always generate gibberish, because they’re fucking monkeys.