• sorghum@sh.itjust.works · 10 days ago · ↑68 ↓2 · edited

    Considering that the AI craze is what’s fueling the shortage and massive increase in GPU prices, I really don’t see gamers ever embracing AI.

    • ElectroVagrant@lemmy.world · 10 days ago · ↑56 ↓1

      […] I really don’t see gamers ever embracing AI.

      They’ve spent years training to fight it, so that tracks.

    • brucethemoose@lemmy.world · 10 days ago · ↑15 ↓6 · edited

      The Nvidia GPUs in data centers are separate from gaming GPUs (they’re even built on different nodes, with different memory chips). The sole exception is the 4090/5090, which do see some data center use, but at low volumes. And this problem is pretty much nonexistent for AMD.

      …No, it’s just straight-up price gouging and anti-competitiveness. It’s Nvidia being Nvidia, AMD being anticompetitive too (their CEOs are like cousins twice removed), and Intel unfortunately not getting traction, even though Battlemage is excellent.

      For local AI, the only things that get sucked up are 3060s, 3090s, and, for the rich/desperate, 4090s/5090s; anything else has too little VRAM to be worth the money. And this is a pretty small niche.
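
      As a rough back-of-the-envelope sketch of why VRAM is the cutoff (the 4-bit quantization and ~20% overhead figures below are assumptions, not benchmarks):

      ```python
      # Rough VRAM estimate for a local LLM: quantized weights plus ~20%
      # overhead for KV cache/activations (real usage varies with context length).
      def vram_needed_gb(params_billion: float, bits_per_weight: float = 4.0,
                         overhead: float = 1.2) -> float:
          weight_gb = params_billion * bits_per_weight / 8  # 1e9 params cancel against GB
          return weight_gb * overhead

      for size_b, fits_on in [(8, "a 12 GB 3060"), (32, "a 24 GB 3090/4090"),
                              (70, "no single gaming card")]:
          print(f"{size_b}B params -> ~{vram_needed_gb(size_b):.0f} GB ({fits_on})")
      ```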

      • RejZoR@lemmy.ml · 10 days ago · ↑36 ↓1

        Chip fab allocations are limited, and whatever wafer capacity the AI datacenter chips take up, desktop GPUs don’t get made with. What’s left of the desktop chips gets sold for workstation AI use, like the RTX 5090 and even the RX 7900 XTX, because they have more memory. Meanwhile they still sell 8 GB cards to gamers when that hasn’t been enough for a while. The whole situation is just absurd.

        • ohulancutash@feddit.uk · 10 days ago · ↑4 ↓4

          Fabbing is limited to keep prices high. Just like OPEC turning down oil extraction when the price gets too low.

        • brucethemoose@lemmy.world · 9 days ago · ↑2 ↓2

          Unfortunately, no one is buying a 7900 XTX for AI, and mostly not a 5090 either. The 5090 didn’t even work until recently and still doesn’t work with many projects, doubly so for the 7900 XTX.

          The fab capacity thing is an issue, but not as much as you’d think, since the process nodes are different.

          Again, I’m trying to emphasize that a lot of this is just Nvidia being greedy as shit. They’re skimping on VRAM/buses and gouging gamers because they can.

      • sorghum@sh.itjust.works · 10 days ago · ↑7

        Still have limited wafers at the fabs. The chips going to datacenters could have been consumer stuff instead. Besides, they (Nvidia, Apple, AMD) all fabricate at TSMC.

        Local AI benefits from platforms with unified memory that can be expanded. Watch platforms based on AMD’s Ryzen AI MAX 300 chip, or whatever they call it, take off. Framework lets you configure a machine with that chip up to 128 GB of RAM, iirc. It’s the main reason I believe Apple’s memory upgrades cost a ton: so they aren’t a financially viable option for local AI applications.

        • brucethemoose@lemmy.world · 9 days ago · ↑1 · edited

          The chips going to datacenters could have been consumer stuff instead.

          This is true, but again, they do use different processes. The B100 (and I think the 5090) is TSMC 4NP, while the other chips use a lesser process. Hopper (the H100) was TSMC 4N, Ada Lovelace (RTX 4000) was TSMC N4. The 3000 series/A100 was straight up split between Samsung and TSMC. The AMD 7000 was a mix of older N5/N6 due to the MCM design.

          Local AI benefits from platforms with unified memory that can be expanded.

          This is tricky because expandable memory is orthogonal to bandwidth and power efficiency. Framework (ostensibly) had to use soldered memory for their Strix Halo box because it’s literally the only way to make the traces good enough: SO-DIMMs are absolutely not fast enough, and even LPCAMM apparently isn’t there yet.
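
          To see why bandwidth dominates, a rough sketch (the bandwidth numbers are ballpark assumptions, and real throughput lands below this ceiling):

          ```python
          # Each generated token streams the quantized weights through memory once,
          # so generation speed is roughly capped at bandwidth / model size.
          def tokens_per_s_ceiling(model_gb: float, bandwidth_gb_s: float) -> float:
              return bandwidth_gb_s / model_gb

          model_gb = 18  # e.g. a ~32B model at 4-bit
          for platform, bw_gb_s in [("dual-channel DDR5 SO-DIMMs", 90),
                                    ("Strix Halo soldered LPDDR5X", 256),
                                    ("RTX 3090 GDDR6X", 936)]:
              print(f"{platform}: ~{tokens_per_s_ceiling(model_gb, bw_gb_s):.0f} tok/s ceiling")
          ```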

          AMD’s Ryzen AI MAX 300 chip

          Funny thing is, the community is quite lukewarm on the AMD APUs due to poor software support. They work okay… if you’re a Python dev who can spend hours screwing with ROCm to get things fast :/ But they’re quite slow/underutilized if you just run popular frameworks like ollama or the old diffusion ones.
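
          For the curious, a minimal sketch of the sanity check you end up doing on such a box (the ROCm build of PyTorch reuses the torch.cuda API, so this just confirms the GPU is visible; nothing here is AMD-specific documentation):

          ```python
          import torch

          # ROCm builds of PyTorch expose AMD GPUs through the torch.cuda API.
          print(torch.cuda.is_available())   # True if a ROCm (or CUDA) device is found
          print(torch.version.hip)           # HIP/ROCm version string; None on CUDA builds
          if torch.cuda.is_available():
              print(torch.cuda.get_device_name(0))  # whatever the APU/dGPU reports
          ```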

          It’s the main reason why I believe Apple’s memory upgrades cost a ton so that it isn’t a viable option financially for local AI applications.

          Nah, Apple’s been gouging memory way before AI was a thing. It’s their thing, and honestly it kinda backfired because it made them so unaffordable for AI.

          Also, Apple’s stuff is actually… not great for AI anyway. The M-chips have relatively poor software support (no PyTorch, MLX is barebones, leaving you stranded mostly with GGML). They don’t have much compute compared to a GPU or even an AMD APU, and the NPU part is useless. Unified memory doesn’t help at all by itself; it’s just that their stuff happens to have a ton of memory hanging off the GPU, which is useful.
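
          For what “stranded with GGML” looks like in practice, a minimal sketch using llama-cpp-python with the Metal backend (the model path and settings are placeholders/assumptions):

          ```python
          # Minimal GGUF inference on an M-series Mac via llama-cpp-python,
          # which runs on the llama.cpp/GGML Metal backend.
          from llama_cpp import Llama

          llm = Llama(
              model_path="models/example-8b-q4_k_m.gguf",  # placeholder path
              n_gpu_layers=-1,  # offload all layers to the Apple GPU
              n_ctx=4096,
          )

          out = llm("Explain unified memory in one sentence.", max_tokens=64)
          print(out["choices"][0]["text"])
          ```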

      • Dultas@lemmy.world · 10 days ago · ↑5

        I’m pretty sure the fabs making the chips for datacenter cards could be making more consumer-grade cards, but those are less profitable. And since fab capacity isn’t infinite, the price of datacenter cards is still going to affect consumer ones.

        • brucethemoose@lemmy.world · 9 days ago · ↑3

          Heh, especially for this generation, I suppose. Even the Arc B580 is on TSMC and overpriced/out of stock everywhere.

          It’s kinda their own stupid fault too. They could’ve used Samsung or Intel, with a bigger, slower die for each SKU, but didn’t.

          • Buddahriffic@lemmy.world · 8 days ago · ↑1

            TSMC is the only proven fab at this point. Samsung is lagging and current emerging tech isn’t meeting expectations. Intel might be back in the game with their next gen but it’s still to be proven and they aren’t scaled up to production levels yet.

            And the differences between the fabs mean that designing a chip to be made at more than one would be almost like designing entirely different chips for each fab. Not only are the gates themselves different dimensions (and require a different layout), but they also have different performance and power profiles. So even if two chips are logically the same, and you could trade area efficiency for a more consistent higher-level layout (think two buildings with the same footprint but different room layouts), they’d need different setups for things like buffers and repeaters. And even if they did design the same logical chip for both fabs, they’d end up being different products in the end.

            And with TSMC leading not just performance but also yields, the lower end chips might not even be cheaper to produce.

            Also, each fab requires NDAs and such and it could even be a case where signing one NDA disqualifies you from signing another, so they might require entirely different teams to do the NDA-requiring work rather than being able to have some overlap for similar work.

            Not that I disagree with your sentiment overall; it’s just a gamble. Like, what if one company goes with Samsung for one SKU and their competition goes with TSMC for the competing SKU, and they end up with a whole bunch of inventory that no one wants because the performance gap is bigger than the price gap, making waiting for stock the no-brainer choice?

            But if Intel or Samsung do catch up to TSMC in at least some of the metrics, that could change.

            • brucethemoose@lemmy.world · 8 days ago · ↑2

              Yeah you are correct, I was venting lol.

              Another factor is that the fab choices were locked in way before the GPUs launched, when everything you said (TSMC’s lead/reliability, in particular) rang even more true. Maybe Samsung or Intel could offer steep discounts for the lower performance (so Nvidia/AMD could translate that into bigger dies), but that’s quite a fantasy, I’m sure…

              It all just sucks now.

    • Affidavit@lemm.ee · 10 days ago · ↑5 ↓15

      Speak for yourself. As an avid gamer, I am excitedly looking toward the future of AI in games. Good models (with context buffers much longer than the 0.9s in this demo) have the potential to revolutionise the gaming industry.

      I really don’t understand the amount of LLM/AI hate on Lemmy. It is a tool with many potential uses.

  • mindbleach@sh.itjust.works · 9 days ago · ↑5 ↓2

    Demonstrating some crazy idea always confuses people who expect a finished product. The fact this works at all is sci-fi witchcraft.

    Video generators offer rendering without models, levels, textures, shaders, anything. And they’ll do shocking photorealism as easily as cartoons. This one runs at interactive speeds. That’s fucking crazy! It’s only doing one part of one game that’d run on a potato, and it’s not doing it especially well, but holy shit, it’s doing it. Even if the context length stayed laughably short, this is an FMV you can walk around in. This is something artists could feed and prune and get real fuckin’ weird with, until it’s an inescapable dream sequence that looks like nothing we know how to render.

    The most realistic near-term application of generative AI technology remains as coding assistants and perhaps rapid prototyping tools for developers, rather than a drop-in replacement for traditional game development pipelines.

    Sure, let’s pretend text is all it can generate. Not textures, models, character designs, et very cetera. What possible use could people have for an army of robots if they only do a half-assed job?

  • digitalnuisance@lemm.ee · 9 days ago · ↑16 ↓14 · edited

    AAA dev here.

    Carmack is correct. I expect to be dogpiled by uninformed disagreements, though, because on social media all AI = Bad and no nuance is allowed. If that’s your knee-jerk reaction, please refrain for a moment and calmly re-think through your position.

    EDIT: lol called it

      • MetaStatistical@lemmy.zip · 9 days ago · ↑4

        Stable Diffusion does a lot already, for static pictures. I get good use out of Eleven for voice work, when I want something that isn’t my own narration.

        I’m really looking forward to all of these new AI features in DaVinci Resolve 20. These are actual useful features that would improve my workflow. I already made good use of the “Create Subtitles From Audio” feature to streamline subtitling.
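
        (As a rough illustration of the audio-to-subtitles idea, here’s a sketch using the open-source whisper library; this is not Resolve’s implementation, and the file names are placeholders.)

        ```python
        # Rough audio -> .srt pipeline, similar in spirit to Resolve's
        # "Create Subtitles From Audio" (not its actual implementation).
        import whisper

        def srt_time(t: float) -> str:
            h, rem = divmod(int(t), 3600)
            m, s = divmod(rem, 60)
            return f"{h:02}:{m:02}:{s:02},{int((t - int(t)) * 1000):03}"

        model = whisper.load_model("base")
        result = model.transcribe("narration.wav")  # placeholder file name

        with open("narration.srt", "w", encoding="utf-8") as f:
            for i, seg in enumerate(result["segments"], start=1):
                f.write(f"{i}\n{srt_time(seg['start'])} --> {srt_time(seg['end'])}\n"
                        f"{seg['text'].strip()}\n\n")
        ```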

        Good AI tools are out there. They’re just invisibly doing the work for people who pay attention, while all of the billionaires make noise about LLMs that do almost nothing.

        I compare it to CGI. The best CGI is the effects you don’t even notice. The worst CGI is when you try to employ it everywhere it’s not designed for.

        • funkforager@sh.itjust.works · 9 days ago · ↑13 ↓3

          Obviously AI is coming for sound designers too. You know that, right? https://elevenlabs.io/sound-effects

          And if you work on games and you haven’t seen your industry decimated in the past 16 months, I want to know what rock you have been living under and if there’s room for one more.

          • digitalnuisance@lemm.ee · 5 days ago · ↑1 · edited

            Okay, dude. Mute this video (no cheating by listening to it first; you have to act like a REAL designer here), then use that AI to generate for me some sound design that works for the visuals at the timestamp. Should be simple to do better than the sound designers over at Riot for an expert like yourself with access to an AI that makes their expertise irrelevant and will totally steal their job.

            Oh, what’s that? It sounds awful and doesn’t represent the character at all? Hmm…

          • digitalnuisance@lemm.ee · 9 days ago · ↑6 ↓12 · edited

            I love when regular folks act like they understand things better than industry insiders near the top of their respective field. It’s genuinely amusing.

            Let me ask you a simple question: do YOU want to play a game with mediocre, lowest-common-denominator AI-generated audio (case in point, that AI audio generator sounds like dogshit and would never fly in a retail product)? Or do you want something crafted by a human with feelings (a thing an AI model does not have) and the ability to create unique design specifically meant to build emotional resonance in you (a thing an AI has exactly zero intuition for), tailored for the game in question, as any good piece of art demands?

            Answers on a postcard, thanks. The market agrees with me as well; no AI-produced game is winning at the Game Awards any time even remotely soon, because nobody wants to play stuff like that. And you know what’s even funnier? We TRIED to use tools like this a few years ago when they began appearing on the market, and we very quickly ditched them because they sounded like ass, even when we built our own proprietary models and trained them on our own designed assets. Turns out you can’t tell a plagiarism machine to be original and good because it doesn’t know what either of those things mean. Hell, even sound design plugins that try to do exactly what you’re talking about have kinda failed in the market for the exact reasons I just mentioned. People aren’t buying Combobulator, they’re buying Serum 2 in droves.

            And no, I have not seen my industry decimated by AI. Talk to any experienced AAA game dev on LinkedIn or any one of our public-facing Discord servers; it’s not really a thing. There still is, and always will be, a huge demand for art created by humans and for humans, for the exact reasons listed above. What has ACTUALLY decimated my industry is the overvaluation and inflation of everything in the economy, and then the interest-rate hikes put in place to counter it, which leads to layoffs once giant games don’t hit the insane profit targets the suits set. That is likely what you are erroneously attributing to AI displacement.

            • Vanilla_PuddinFudge@infosec.pub · 9 days ago · ↑14 ↓3 · edited

              Do you remember the music from the last Marvel film you watched?

              I don’t.

              Quality isn’t directly correlated with success. Buy a modern pair of Nikes, or… go to McDonald’s, play a modern mobile game.

              I love when industry insiders think they’re so untouchable that a budget cut wouldn’t have them on the chopping block. You’re defensive because it’s your ass on the line, not because it’s true.

              People gargle shit products and pay for them willingly all day long. So much so that it’s practically the norm. You’re just insulated from it, for now.

              • digitalnuisance@lemm.ee · 9 days ago · ↑1 ↓7 · edited

                “Oh no, all my quality work won’t be in the next marvel movie or in mcdonalds’ next happy-meal promo campaign, darn. Guess I’ll have to make and sell something else.”

                ~ Literally every artist with a modicum of talent, ambition and a brain

                What’s your favorite big-budget, AI-generated game/movie/show that you’ve given money to, again?

                This is such a flimsy argument that it’s barely worth responding to. People by and large are absolutely sick of Marvel slop and still seek quality art elsewhere; this is not a novel concept, nor will it be outmoded by the introduction of AI. The internet and the entertainment industry at large are still actively exploding with monetized, unique, quality content, because not everybody wants slop; most people are actively sick of it. Talented visual artists are still being hired in the entertainment industry, and will continue to be, and will also continue to be able to independently release stuff online, because they have their own individual perspective and the x-factor of “human creativity” that AI slop just cannot compete with.

                Interesting that you didn’t address that. What’s also interesting is that you’re touching on the reason most people are mad: AI models tend to churn out mediocre work, and people feel threatened because they aren’t good enough at their craft to compete with it. So instead of becoming better, they scream at anybody trying to advance the technology of their particular discipline for taking away extremely easy kinds of work that they barely had to do anything to get before (Patreon commissions, etc.). Work a tad harder, try to express yourself more effectively, and I promise you somebody will value your work above the forgettable music from “The Eternals”. People with talent tend to break through if they try hard enough; it’s not rocket science.

                And I addressed the budget-cut thing earlier, so no I am not acting the way you described. Budget cuts are not an AI problem, they’re a capitalism problem, as I stated previously. Please read.

                INB4 people scream “survivorship bias”. No, you’re just not good enough, and you’d rather scream and yell at sensible takes from experts in their field or craft than accept that fact. Legitimately. I know you don’t like hearing that, but you need to accept it in order to improve. Get better at your craft. If you can’t make stuff of greater quality than AI slop, you’re not going to be capable of making things that resonate with people anyway. AI will never be able to do this, and this kind of quality creates sales. AI will be used, sure, but it will be leveraged to improve efficiency, not to replace artists.

                • Vanilla_PuddinFudge@infosec.pub · 9 days ago · ↑7 ↓1 · edited

                  At this point, it should be obvious that no one is downvoting you because they believe you’re wrong. Rather, it’s because you’re an inflated, insecure douchebag who’s so threatened by the opinions of two federated users on the ass end of the internet that he feels the need to write an essay about it, not to us, no, to his own ego.

                  And for the record, I’m not one of the believers. On a long enough timeline, you’ll be playing birthday parties dressed as a cowboy. AI is improving, while people have a bell curve. It’s only a matter of time. Cheers. I hope you find happiness one day.

            • MetaStatistical@lemmy.zip · 9 days ago · ↑3 ↓1

              What has ACTUALLY decimated my industry is the overvaluation and inflation of everything in the economy

              The real answer, like every creative industry over the past 200+ years, is oversaturation.

              Artists starve because of oversaturation. There is too much art and not enough buyers.

              Musicians starve because of oversaturation, and music is now easier than ever to create. Supply is everywhere, and demand pales in comparison. I have hundreds of CC BY-SA 4.0 artists in a file that I can choose from for use in my videos, because the supply is everywhere.

              Video games are incredibly oversaturated. Throw a stick at Steam, and it’ll land on a thousand games. There’s plenty of random low-effort slop out there, but there’s also a lot of passionate indie creators trying to make their mark, and failing, because the marketing is not there.

              Millions of people shouting in the wind, trying to make their voices heard, and somehow become more noticed than the rest of the noise. It’s a near-impossible task, and it’s about 98% luck. Yet the 2% of people who actually “make it” practice survivorship bias on a daily basis, preaching that hard work and good ideas will allow you to be just like them.

              It’s all bullshit, of course. We don’t live in a meritocracy.

              • digitalnuisance@lemm.ee · 8 days ago · ↑1 · edited

                Nah, good art breaks through with enough perseverance, time, improvement in your work, and a little bit of luck (which you need less of, the more of the first three you have). People just underestimate what “good art” is defined as. The bar is now just where it always should have been, which is JUST above somebody copying your work without any underlying understanding of why it works or of the cultural gestalt involved. Not a very high bar to clear, tbh, but I can understand why some entry-level folks feel frustrated. If that’s you, keep your head down, push through and improve, and you’ll get there.

            • mindbleach@sh.itjust.works · 9 days ago · ↑1 ↓1

              When it’s other people’s work, well, people need a nuanced opinion about this nascent technological breakthrough.

              When it’s your specific area of expertise, it’s “the plagiarism machine.”

              You are Knoll’s law personified.

              • mindbleach@sh.itjust.works · 9 days ago · ↑1

                And yeah yeah yeah, it does a mediocre job of whatever you do. That’s the opposite of safety. Disruptive change only cares about whether it can do the job. Already the answer seems to be a soft yes.

                Right now is the worst the tech will ever be, again.

              • digitalnuisance@lemm.ee · 8 days ago · ↑1 ↓1 · edited

                I love how you didn’t read anything else I wrote regarding this and boiled it down to a quippy, holier-than-thou and wrong statement with no nuance. Typical internet brainrot.

  • chunes@lemmy.world · 10 days ago · ↑2 ↓13 · edited

    I get it, AI has some significant downsides, but people go way overboard. You don’t have to tell people who use AI to kill themselves.