• Blue_Morpho@lemmy.world · 5 days ago

    Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.

    Fiber buildouts were cancelled back in 2000 because DWDM made existing fiber far more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.

    • contrafibularity@lemmy.world · 5 days ago

      yeah, genAI as a technology and field of study may not disappear. genAI as an overinflated product marketed as the be-all end-all that would solve all of humanity’s problems may. the bubble can’t burst soon enough

      • scarabic@lemmy.world · 3 days ago

        Sometimes the hype bubble bursts and then the product eventually grows to be even larger than the hype. But you never know how connected hype actually is to any realistic timeline. Hype can pop like a cherry tree flowering on the first sunny day of spring, thinking summer has arrived, only to get drenched by another few weeks of rain. And as stupid as that cherry tree is, summer will eventually arrive.

    • FooBarrington@lemmy.world · 5 days ago

      I’m gonna disagree - it’s not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI since it allows you to scale even further.

      This means that MS isn’t expecting these data centers to generate enough revenue to be profitable, and they’re not willing to bet on further advancements that might make them profitable. In other words, MS doesn’t have a positive outlook for AI.

      • Blue_Morpho@lemmy.world · 5 days ago

        > More efficient hardware use should be amazing for AI since it allows you to scale even further.

        If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave RTX 5090 performance to games on GTX 1080-class hardware, would you still buy a new video card this year?

        When all the telcos scaled back on building fiber in 2000, was that because they didn’t have a positive outlook for the Internet?

        Or when video game companies went bankrupt in the 1980s, was it because video games were over as entertainment?

        There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that this proves AI is over.

        • Takumidesh@lemmy.world · 5 days ago

          If buying a new video card made me money, yes.

          This analogy doesn’t really work, because the goal when you buy a video card isn’t to have the most processing power possible, and playing video games doesn’t scale linearly, so an additional card adds nothing.

          If I were mining crypto, or selling GPU compute (which is basically what AI companies are doing), and the existing cards got an update that made them perform on par with new cards, I would buy out the existing cards, and when there were no more, I would buy up the newer cards - both would still be generating revenue.
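
          To make that concrete, here’s a rough back-of-envelope sketch in Python. Every number in it is hypothetical - the point is just that an efficiency gain shortens each card’s payback period, which makes buying cards more attractive, not less:

          ```python
          # Back-of-envelope sketch: all numbers are hypothetical.
          # If a software update doubles tokens/sec on existing cards,
          # each card pays for itself faster, so demand for cards grows.

          CARD_COST = 1_500.0      # hypothetical GPU price, USD
          PRICE_PER_MTOK = 0.10    # hypothetical revenue per million tokens, USD
          TOKENS_PER_SEC = 2_000   # hypothetical throughput before the update

          def monthly_revenue(tokens_per_sec: float) -> float:
              seconds_per_month = 60 * 60 * 24 * 30
              return tokens_per_sec * seconds_per_month / 1e6 * PRICE_PER_MTOK

          before = monthly_revenue(TOKENS_PER_SEC)
          after = monthly_revenue(TOKENS_PER_SEC * 2)  # efficiency doubled

          print(f"payback before: {CARD_COST / before:.1f} months")  # ~2.9 months
          print(f"payback after:  {CARD_COST / after:.1f} months")   # ~1.4 months
          ```

          Halving the cost per token halves the payback period, so under these (made-up) numbers the rational move is to buy more compute, not less.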

          • Blue_Morpho@lemmy.world · 5 days ago

            > If buying a new video card made me money, yes.

            But this rests on the supposition that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once software improvements stop.

            And that’s assuming this has anything to do with AI at all, rather than long-term macroeconomics: Trump destroying the economy, so MS is putting off spending while businesses slow down because of the tariff war.

        • FooBarrington@lemmy.world · 5 days ago

          > If a new driver came out that gave RTX 5090 performance to games on GTX 1080-class hardware, would you still buy a new video card this year?

          It doesn’t make any sense to compare games and AI. Games have a well-defined upper bound for performance - even Crysis has “maximum settings” that you can’t go above. Supposedly, this doesn’t hold true for AI: scaling it should continually improve it.

          So: yes, in your analogy, MS would still buy a new video card this year if they believed that progress was possible and reasonably likely.

          • Blue_Morpho@lemmy.world · 5 days ago

            Just as games have diminishing returns on better graphics (they’re already photorealistic; few pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.

            If people are paying you money and the next level of performance isn’t appreciated by the general consumer, why spend billions that will take longer to recoup?

            And again, data centers aren’t just used for AI.

            • FooBarrington@lemmy.world · 5 days ago

              It’s still not a valid comparison. We’re not talking about diminishing returns; we’re talking about an actual ceiling. There are only so many options implemented in games - once they’re maxed out, you can’t go higher.

              That’s not the situation we have with AI; it’s supposed to scale indefinitely.

              • Blue_Morpho@lemmy.world · 4 days ago

                Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.

                • FooBarrington@lemmy.world · 4 days ago

                  I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement. I can’t make the settings in Crysis 100x higher than they can go.

                  Games always have a limit; AI is supposed to get better with scale. Which part do you not understand?
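
                  To spell out what “better with scale” means: the empirical scaling-law papers fit test loss to a power law in model size. Here’s a toy sketch - the constants are roughly in the ballpark of the published Kaplan et al. fits, but treat them as illustrative only:

                  ```python
                  # Toy illustration of a neural scaling law: loss falls as a
                  # power law in parameter count. Constants are illustrative
                  # approximations, not authoritative values.
                  N_C = 8.8e13   # "critical" parameter-count constant (illustrative)
                  ALPHA = 0.076  # scaling exponent (illustrative)

                  def test_loss(n_params: float) -> float:
                      return (N_C / n_params) ** ALPHA

                  for n in (1e9, 1e10, 1e11, 1e12):
                      print(f"{n:.0e} params -> loss {test_loss(n):.3f}")
                  # Every 10x in size still lowers the loss - by less each time
                  # (diminishing returns), but with no hard ceiling like a
                  # game's maxed-out settings.
                  ```

                  Diminishing returns, sure - but no fixed “maximum settings”: each scale-up is still supposed to buy some improvement.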

    • scarabic@lemmy.world · 3 days ago

      This is a good point. It’s never sat right with me that LLMs require such overwhelming resources and cannot be optimized. It’s possible that innovation has been too fast to worry about optimization yet, but all this BS about building new power plants and chip foundries for trillions of dollars and whatnot just seems mad.

    • bean@lemmy.world · 5 days ago

      Yeah, you echo my thoughts actually: that efficiency could be found in multiple areas, DeepSeek included, and that perhaps some other political factors make things a bit more uncertain.

  • ohshittheyknow@lemmynsfw.com · 5 days ago

    AI is a tool, like a hammer. Useful when used for its purpose. Unfortunately every tech company under the sun is using it for the wrong fucking thing. I don’t need AI in my operating system or my browser or my search engine. Just let it work on protein folding, chemical synthesis and other more useful applications. Honestly can’t wait for the AI hype to calm the fuck down.

    • Tattorack@lemmy.world · 5 days ago

      The only way it’s going to die down is if it gets replaced with the next tech bro buzzword.

      The previous one was “smart”, and it stuck around for a very long time.

    • yeehaw@lemmy.ca · 5 days ago

      Preach it. I have been so sick of AI hype, rolling my eyes any time a business advertises it, and in some cases moving on. I don’t care about your glorified chat bot or search engine.

      • bitjunkie@lemmy.world · 4 days ago

        AI is the buzzword for a search engine that actually fucking works - something we used to have that gradually got enshittified out of existence.

    • Dr. Moose@lemmy.world · 4 days ago

      It’ll balance out. I’m old enough to remember many web technologies being this way, from Flash to Bluetooth to the cloud.

    • Petter1@lemm.ee · 4 days ago

      It works pretty well as a research/learning tool at my job… I learned a lot very fast using AI as a tool in my browser.

  • BlameTheAntifa@lemmy.world · 4 days ago

    There’s no need for huge, expensive datacenters when we can run everything on our own devices. SLMs (small language models) and local AI are the future.
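
    For a sense of how low the barrier already is, here’s a minimal sketch using Hugging Face’s transformers library. The model name is just one example of a small open model - swap in whatever fits your hardware:

    ```python
    # Minimal sketch: run a small language model entirely on-device.
    # Assumes `pip install transformers torch`. The model below is one
    # example of an SLM (~0.5B parameters); it runs fine on a laptop CPU.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="Qwen/Qwen2.5-0.5B-Instruct",
    )

    result = generator(
        "In one sentence, why run language models locally?",
        max_new_tokens=60,
    )
    print(result[0]["generated_text"])
    ```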

  • Jesus@lemmy.world · 5 days ago

    My guess is that, given Lemmy’s software developer demographic, I’m not the only person here who is close to this space and these players.

    From what I’m seeing in my day to day work, MS is still aggressively dedicated to AI internally.

    • jj4211@lemmy.world · 5 days ago

      That’s compatible with a lack of faith in profitable growth opportunity.

      So far they have gone big with what I’d characterize as more evolutionary enhancements to tech. While that may find some acceptance, it’s not worth quite enough to pay off the capital investment in this generation of compute. If they overinvest and hope to eventually recoup by not upgrading, they are at severe risk of being superseded by another company that saved some expenditure to have a more modest, but more up-to-date, compute infrastructure.

      Another possibility is that they predicted a huge boom of other companies spending on Azure hosting for AI, and they are now predicting those companies won’t see that growth either.

    • Optional@lemmy.world · 5 days ago

      I am sure the internal stakeholders of Micro$oft’s AI strategies will be the very last to know. Probably as they are instructed to clean out their desks.

      • Jesus@lemmy.world · 5 days ago

        There are a few of us here who are closer to Satya’s strategic roadmap than you might think.

        • Optional@lemmy.world · 5 days ago

          I’m sure, but they’re not going to hedge on a roadmap. Roadmaps are always full-steam-ahead.

    • turnip@sh.itjust.works · 5 days ago

      Because investors expect it, whether it generates profit or not. I guess we will see how it changes workflows, or whether people continue to do things like they always have.

    • Alex@lemmy.ml · 5 days ago

      Context is king, which is why even the biggest models get tied in knots when I try them on my niche coding problems. I’ve been playing a bit with NotebookLM, which promises to be interesting with enough reference material, but unfortunately when I tried to add the Vulkan specs it complained it couldn’t accept them (copyright, maybe?).

      We have recently been given clearance to use the Gemini Pro tools with Google office at work. While we are still not using them for code generation I have found the transcription and meeting summary tools very useful and certainly a time saver.

  • gmtom@lemmy.world · 5 days ago

    Yeah I mean, when has Microsoft of all companies ever been wrong about the future of technology…

    • yeehaw@lemmy.ca · 5 days ago

      Hmmm, let me just pull this up in Internet Explorer on my Windows Phone.

  • RememberTheApollo_@lemmy.world · 4 days ago

    Maybe, thanks to tariffs, importing components made overseas will become cost-prohibitive versus any expected gains from further LLM/AI development. Or, perhaps in addition, an expected economic downturn has caused them to re-evaluate large investments in the immediate future. Or maybe they think AI is dumb.

  • Petter1@lemm.ee · 4 days ago

    I see it more as them being confident they can make running LLMs less resource-intensive 🤔