• Ugurcan@lemmy.world · 6 days ago

    I’m thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required.

    Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates into inferior quality in American Investor Language (AIL).

    And 2025’s investors don’t give a flying fuck about energy efficiency.

    • PostaL@lemmy.world · 6 days ago

      And they don’t want to disclose the energy efficiency becaaaause … ?

    • Sl00k@programming.dev · 5 days ago

      It also has a very flexible “thinking” nature, which means far fewer tokens spent on most people’s responses. A rough sketch of what that could look like is below.
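
      For a rough sense of what that kind of routing could mean, here’s a purely hypothetical sketch (not OpenAI’s actual implementation; the estimate_complexity heuristic and the budget numbers are made up): most prompts get little or no hidden reasoning, so the average token spend drops.

      ```python
      # Hypothetical sketch: route each request to a thinking-token budget
      # based on how hard the prompt looks. Most prompts are easy, so most
      # requests spend few (or zero) reasoning tokens.

      def estimate_complexity(prompt: str) -> float:
          """Toy heuristic: long prompts and words like 'why'/'debug' count as harder."""
          score = min(len(prompt) / 2000, 1.0)
          if any(word in prompt.lower() for word in ("why", "prove", "debug", "step by step")):
              score += 0.5
          return min(score, 1.0)

      def thinking_budget(prompt: str) -> int:
          """Map complexity to a reasoning-token budget (numbers are invented)."""
          c = estimate_complexity(prompt)
          if c < 0.3:
              return 0        # answer directly, no hidden reasoning tokens
          if c < 0.7:
              return 1024     # brief reasoning pass
          return 8192         # full "thinking" pass, or fall back to a larger model

      if __name__ == "__main__":
          for p in ("What's the capital of France?",
                    "Debug this race condition and explain why it happens step by step."):
              print(f"{thinking_budget(p):>5} thinking tokens -> {p}")
      ```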