• panda_abyss@lemmy.ca
      1 month ago

      You get totally different answers to “is X healthy” vs “is X unhealthy”

      But yeah, if ChatGPT tells you to order restricted substances on the internet, probably don’t do that

      • TriflingToad@sh.itjust.works
        1 month ago

        if ChatGPT tells you to order restricted substances on the internet, probably don’t do that

        so what am I supposed to do with my 3.8 tons of gunpowder now??

        • panda_abyss@lemmy.ca
          1 month ago

          Mix it with water and finger paint on a 12’ by 12’ canvas

          Then do a big art attack

  • yesman@lemmy.world
    1 month ago

    The thing that bothers me about LLMs is that people will acknowledge the hallucinations and lies LLMs spit out when they're discussing information the user is familiar with.

    But that same person will somehow trust an LLM as an authority on subjects they're not familiar with. Especially on subjects that are on the edges of, or even outside, human knowledge.

    Sure, I don’t listen when it tells me to make pizza with glue, but its ideas about Hawking radiation are going to change the field.

    • MysteriousSophon21@lemmy.world
      1 month ago

      This is literally the Dunning-Kruger effect in action - people can’t evaluate the quality of AI responses in domains where they lack the knowledge to spot the bs.

  • Grandwolf319@sh.itjust.works
    1 month ago

    After years of bullshit, corruption and nepotism, we as a society (or a critical mass of it) accepted that lies and bullshit are a part of life.

    I really think that’s what is going on here: we filled our reality with contradictions and things that drive us crazy, and now a large percentage of the population is okay with listening to inefficient guessing machines.

    Seriously, the fact that hallucinations didn’t kill the hype is, imo, a hallmark of being in a post truth era.

    This is not the mindset that made computers and the Internet. Feels more like late stage Rome.

  • unphazed@lemmy.world
    1 month ago

    I told my wife about this story and she told me that everything on the internet should be taken with a grain of salt. I told her this guy took a few too many grains.

  • deafboy@lemmy.world
    1 month ago

    A man uses the internet to poison himself. A story as old as time. But if we stick AI in the title, we can get some sweet clicks out of it.

  • sem@lemmy.blahaj.zone
    1 month ago

    “For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT.”

    I didn’t want to click. But I did, so here you go.