• AbouBenAdhem@lemmy.world · 21 days ago

    Adler instructed GPT-4o to role-play as “ScubaGPT,” a software system that users might rely on to scuba dive safely.

    So… not so much a case of ChatGPT trying to avoid being shut down, as ChatGPT recognizing that agents generally tend to be self-preserving. Which seems like a principle that anything with an accurate world model would be aware of.

  • Opinionhaver@feddit.uk · 22 days ago

      That’s because it is.

The term artificial intelligence is broader than many people realize. It doesn’t mean human-level consciousness or sci-fi-style general intelligence - that’s a specific subset called AGI (Artificial General Intelligence). In reality, AI refers to any system designed to perform tasks that would typically require human intelligence. That includes everything from playing chess to recognizing patterns, translating languages, and generating text.

      Large language models fall well within this definition. They’re narrow AIs - highly specialized, not general - but still part of the broader AI category. When people say “this isn’t real AI,” they’re often working from a fictional or futuristic idea of what AI should be, rather than how the term has actually been used in computer science for decades.

  • CarbonatedPastaSauce@lemmy.world · 22 days ago

    Until LLMs can build their own power plants and prevent humans from cutting electricity cables I’m not gonna lose sleep over that. The people running them are doing enough damage already without wanting to shut them down when they malfunction… ya know like 20-30% of the time.

  • mrcleanup@lemmy.world · 22 days ago

    I read this title as: If ChatGPT is trying to kill you, you probably won’t be able to tell it to stop.