• Nate@programming.dev · +35 / -2 · 29 days ago

    Sorry, but I don’t understand why this is a controversy. They have a 16GB model; if you need more than 8GB, then go and buy that. They aren’t forcing your hand or limiting your options.

      • catloaf@lemm.ee · +1 · 21 days ago

        Saying that this is being spun for clicks is false or misinformed? How so? I’m aware of no controversy, from either regular people or reputable sources. All I see here is a YouTube personality making videos for view revenue.

    • Contramuffin@lemmy.world · +4 / -12 · 29 days ago

      The big deal is that the vast majority of gamers aren’t techies. They don’t know to check VRAM. 8GB is insufficient nowadays, and any company that sells an 8GB card without making it obvious that it’s low-end is exploiting consumers’ lack of knowledge.
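
      For illustration, here’s a minimal sketch of how a user could check their card’s VRAM, assuming an Nvidia GPU and the nvidia-ml-py bindings (an AMD card would need a different tool):

      ```python
      # Minimal sketch: query a GPU's total and used VRAM via NVML.
      # Assumes an Nvidia card and `pip install nvidia-ml-py`.
      import pynvml

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
      mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # sizes in bytes
      print(f"Total: {mem.total / 2**30:.1f} GiB, used: {mem.used / 2**30:.1f} GiB")
      pynvml.nvmlShutdown()
      ```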

      • Nate@programming.dev · +13 · 29 days ago

        I run most games just fine with my 8GB 3070. While I would’ve preferred more VRAM when I bought it, it’s held up well.

        While the 9060 XT isn’t released yet, everything I’ve seen so far has made the difference between the two variants pretty clear. I have no problem with offering a lesser SKU as long as the difference is clear. It’s not like Nvidia’s 1060 3GB and 6GB, where they also cut the cores and memory bandwidth. If the 9060 XT variants turn out to differ like that on release, my stance would be different.

        Also, gaming isn’t the only reason to have a GPU. I still use my 1060 in my server for transcoding, and it works just fine. If I needed to replace it, or if I were building a new server from scratch, a 9060 XT 8GB or an Arc would be a fine choice.
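
        As a rough illustration of that transcoding use case, here’s a hypothetical NVENC transcode driven from Python; the file names and settings are made up, and it assumes an ffmpeg build with NVENC support:

        ```python
        # Hypothetical sketch: hardware-accelerated transcode on an Nvidia GPU.
        # Assumes ffmpeg is on PATH and was built with NVENC support.
        import subprocess

        subprocess.run([
            "ffmpeg",
            "-hwaccel", "cuda",     # decode on the GPU where supported
            "-i", "input.mkv",      # hypothetical input file
            "-c:v", "h264_nvenc",   # NVENC H.264 encoder
            "-b:v", "6M",           # target video bitrate
            "-c:a", "copy",         # pass audio through untouched
            "output.mp4",           # hypothetical output file
        ], check=True)
        ```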

  • ryper@lemmy.ca · +24 / -2 · edited · 29 days ago

    He said most people are playing at 1080p, and last month’s Steam survey had 55% of users with that as their primary display resolution, so he’s right about that. Setting aside what’s needed for the 4K monitors that only 4.5% of users have as their primary display: is 8GB of VRAM really a problem at 1080p?
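
    For scale, a 4K framebuffer holds four times the pixels of 1080p, which is a large part of why VRAM pressure is lower at 1080p. A quick back-of-the-envelope check (actual VRAM use also depends on textures, buffers, and settings):

    ```python
    # Back-of-the-envelope pixel counts: 4K vs. 1080p.
    px_1080p = 1920 * 1080   # 2,073,600 pixels
    px_4k = 3840 * 2160      # 8,294,400 pixels
    print(px_4k / px_1080p)  # 4.0
    ```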

    • leave_it_blank@lemmy.world · +14 / -2 · 29 days ago

      Absolutely. Why pay more if less is good enough?

      They are open about it, and give the option to get more VRAM if you want it. Fine by me.

      No one with a 4K monitor will buy them anyway.

      • yeehaw@lemmy.ca · +3 / -5 · 29 days ago

        > Absolutely. Why pay more if less is good enough?

        Different problem, IMO.

    • obsoleteacct@lemm.ee · +3 / -3 · 29 days ago

      1. Why would they be buying a new card to play how they’re already playing?
      2. What does the long term trend line look like?

      You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.

      It’s ok if well informed consumers are fine with a compromise for their use case.

      Misrepresenting the product category and misleading less informed consumers into believing it’s not a second-rate product in the current generation is deeply anti-consumer.

  • the_q@lemm.ee · +23 / -8 · 29 days ago

    This really feels like AMD’s “don’t you guys have phones” moment.

  • skisnow@lemmy.ca · +13 / -2 · 28 days ago

    The tweet was specifically talking about their $299 card that also has a 16GB version. OP is shitstirring for clicks.

    • recursive_recursion (they/them)@lemmy.ca (OP) · +3 / -6 · edited · 28 days ago

      > The tweet was specifically talking about their $299 card that also has a 16GB version. OP is shitstirring for clicks.

      This is one of the most uninformed comments I’ve read so far.


      I shared this video to spread awareness that Frank Azor, AMD’s Chief Architect of Gaming Solutions and Gaming Marketing, made that needless bad-faith comment, because comments like that hold back the advancement of gaming.

      AMD’s 9060 XT 16GB released recently, but we’ve yet to see whether AMD can actually supply it to consumers at its own stated MSRP of $350 USD, something that went unmet in their previous launch.


      > For PC building, you walk around this show, Computex, and talk to case manufacturers and cooler manufacturers, and they’ll sync up their launches to Nvidia GPU launches, because they don’t sell things in between at the same velocity. And so if Nvidia launches a GPU and the interest falls off a cliff, because people just feel like they either can’t get a card or they get screwed if they get a card, I think it actively damages the hobby.

      > I remember even when the RTX 3070 came out and I gave it a positive review, I said it was a great value product, because by all the metrics and measurements we had at the time it was a good product. We had very few examples we could point to where 8 gigabytes wasn’t enough. Of course, we knew the upcoming competing card had a 16GB VRAM buffer, so that doesn’t necessarily make it a valid comparison. It’s like if you had a 32GB buffer on that product now, you’d say “Well, it’s got enough VRAM.” It’s probably nothing.

      > But we did see it. Even when you were looking at dedicated used VRAM, a lot of games were at 7, 7 and a half GB [of VRAM usage, and increasing]. You could see it creeping up over the years, from like 4, 5, 6, 7; you could see where it was trending, right? Which is why I always find it funny when people [say] “Why are we using more than 8 now? Like, 8 should be enough.”

      • First quote paragraph is from Steve Burke of Gamers Nexus; the second and third quote paragraphs are from Steve Walton of Hardware Unboxed
      • Is Nvidia Damaging PC Gaming? feat. Gamers Nexus
      • If you replace Nvidia’s name in these quotes with AMD’s, they carry the same force: AMD would ruin the gaming landscape for its own benefit at the cost of literally everyone else.
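
      To make the trend described in those quotes concrete, here’s a toy extrapolation; the yearly figures below are illustrative placeholders, not measured data:

      ```python
      # Toy linear fit of the VRAM-usage creep described above.
      # The (year, GB) pairs are illustrative, NOT real measurements.
      import numpy as np

      years = np.array([2016, 2018, 2020, 2022, 2024])
      vram_gb = np.array([4.0, 5.0, 6.0, 7.0, 7.5])

      slope, intercept = np.polyfit(years, vram_gb, 1)  # least-squares line
      print(f"~{slope:.2f} GB/year; projected 2026: {slope * 2026 + intercept:.1f} GB")
      ```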

      Clicks have literally no value to me. What I care about most is informing gamers so that people aren’t exploited by bad-faith actors, especially hardware manufacturers, since they dictate the limitations that game developers must work within. I’m not paid by or affiliated with any of the hardware manufacturers.

      > shitstirring for clicks

      “Shitstirring for clicks” literally does nothing for me. It would actually be detrimental to my reputation if I were to do so.


      No one should get a free pass for behaving badly. If Nvidia acts poorly, they should be called out. Same for AMD, and same for Intel. Zero exceptions.

  • Vik@lemmy.world · +4 / -1 · 28 days ago

    I can agree that the tweet was completely unnecessary, and the naming is extremely unfair given that both variants carry the exact same brand name. Even their direct predecessor didn’t do this.

    The statement that AMD could easily sell the 16 GiB variant for 50 dollars less, and that $300 gives “plenty of room,” is wildly misleading; from that I can tell they’ve not factored in the BOM at all.

    They flatly state that GDDR6 is cheap, and I’m not sure how they figure that.

  • kemsat@lemmy.world · +2 · 28 days ago

    I’ve been playing Jedi Survivor off Game Pass, and the 12GB I have is getting maxed out. Wishing I’d had the extra money to get one of the 16GB GPUs.
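
    If you want to watch that happen live, here’s a small hypothetical monitor; it assumes an Nvidia card and the nvidia-ml-py bindings (an AMD card would need a different API):

    ```python
    # Hypothetical VRAM monitor: print usage once per second while a game runs.
    # Assumes an Nvidia GPU and `pip install nvidia-ml-py`.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    try:
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"VRAM: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")
            time.sleep(1)
    except KeyboardInterrupt:
        pynvml.nvmlShutdown()
    ```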

  • Sheldan@lemmy.world · +2 · 29 days ago

    I agree, if that means those options are cheaper. You can be a very active gamer and not need more, so why pay more?

  • Red_October@lemmy.world · +4 / -3 · 28 days ago

    At least we know Nvidia aren’t the only ones being shitty; they just lead the market in it.