• redsunrise@programming.dev · 11 hours ago

    Obviously it’s higher. If it were any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.

    • Chaotic Entropy@feddit.uk · 5 hours ago

      I get the distinct impression that most of the focus for GPT-5 was on making it easier to divert their overflowing volume of queries to less expensive routes.

    • Ugurcan@lemmy.world · edit-2 · 8 hours ago

      I’m thinking otherwise. I think GPT-5 is a much smaller model with some fallback to previous models when needed.

      Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to inferior quality in American Investor Language (AIL), which hasn’t given a flying fuck about energy efficiency for a while now.