Nvidia is using the “it's fake news” strategy now? My, how the mighty have fallen.
I’ve said it many times, but publicly traded companies are destroying the world. The expectation that they increase revenue every single year is not sustainable; it just leads to underpaid employees, products built ever cheaper, and invasive data collection to offset their previous poor decisions.
AMD & Intel Arc are king now. All that CUDA nonsense is just price-hiking justification.
Bought my first AMD card last year, never looked back
AMD’s Windows drivers are a little rough, but the open source drivers on Linux are spectacular.
Since when did gfx cards need to cost more than a used car?
We are being scammed by Nvidia. They are selling stuff whose equivalent 20 years ago would have been some massive research prototype. And there would have been, like, 2 of them in an Nvidia bunker somewhere powering Deep Thought whilst it calculated the meaning of life, the universe, and everything.
3k for a gfx card. Man, my whole PC cost 500 quid and it runs all my games and PCVR just fine.
Could it run better? Sure
Does it need to? Not for 3 grand…
Fuck me!..
I haven’t bought a GPU since my beloved Vega 64 for $400 on Black Friday 2018, and the current prices are just horrifying. I’ll probably settle with midrange next build.
“and the drivers, for which NVIDIA has always been praised, are currently falling apart”
What? They’ve been shit since HL2.
My last Nvidia card was the GTX 980; I bought two of them. Then I heard about the 970 scandal. It didn’t directly affect me, but fuck Nvidia for pulling that shit. Haven’t bought anything from them since, and I stopped playing games on PC afterwards; now it’s just occasionally on console or my laptop’s iGPU.
It still blows my mind that people are so interested in spending 2x the cost of the entire machine they’re playing on, plus a hefty power bill, to run these awful products from Nvidia. Generational improvements are minor on the performance side and fucking AWFUL on the product and efficiency side. You’d think people would have learned their lesson a decade ago.
They pay because AMD (or anyone else, for that matter) has no product to compete with a 5080 or 5090.
I have overclocked my AMD 7900 XTX as far as it will go on air alone.
Undervolted every step on the frequency curve, cranked up the power limit, 100% fan duty cycle. (Rough sketch of the Linux sysfs knobs below, for the curious.)
At its absolute best, it’s competitive with or trades blows with the 4090D, and is 6% slower than the RTX 4090 Founders Edition (the slowest of the stock 4090 lineup).
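On Linux, that tuning goes through the amdgpu sysfs interface. This is a minimal sketch, assuming root, the GPU at card0, and the overdrive bit enabled via amdgpu.ppfeaturemask; the paths vary per machine and every number here is a made-up example, not a recommendation:

```python
#!/usr/bin/env python3
# Minimal sketch of 7900 XTX tuning via the Linux amdgpu sysfs interface.
# Assumes root, the GPU at card0, and the overdrive bit enabled in
# amdgpu.ppfeaturemask; paths and safe values differ machine to machine.
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")
hwmon = next((dev / "hwmon").iterdir())  # e.g. .../hwmon/hwmon3

# Raise the board power limit (power1_cap is in microwatts).
(hwmon / "power1_cap").write_text(str(400 * 1_000_000))  # 400 W, example

# 100% fan duty cycle: pwm1_enable=1 switches to manual, pwm1 is 0-255.
(hwmon / "pwm1_enable").write_text("1")
(hwmon / "pwm1").write_text("255")

# Overclock/undervolt through the overdrive table, then commit with "c".
od = dev / "pp_od_clk_voltage"
od.write_text("s 1 3200")  # max shader clock in MHz (example value)
od.write_text("vo -150")   # RDNA3-style voltage offset in mV (example value)
od.write_text("c")         # apply the staged changes
```

GUI tools like CoreCtrl and LACT drive these same files under the hood.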
The fastest AMD card is equivalent to a 4080 Super, and the next gen hasn’t shown anything new.
AMD needs a 5090-killer. Dual-die or whatever monstrosity that pulls 800 W, but it needs to slap that greenbo with at least a 20-50% lead in frame rates across all titles, ray-traced ones included. Then we’ll see some serious price cuts and competition.
And/or Intel. (I can dream, right?) Hell, perform a miracle, Moore Threads!
That’s exactly it, they have no competition at the high end
Because they choose not to go full idiot, though. They could build top-line cards to compete if they slammed enough into the pipeline and required a dedicated PSU, but that’s not where their product line is aimed. That’s why it’s smart.
For reference: AMD has the most deployed GPUs on the planet as of right now. There’s a reason it’s in every gaming console except the Switch 1/2, and why OpenAI just partnered with them for chips. The goal shouldn’t just be making a product that churns out results at the cost of everything else; it should be being cost-effective and efficient. Nvidia fails at that on every level.
This OpenAI partnership really stands out, because the server world is dominated by Nvidia even more than the consumer card market is.
Yup. You want a server? Dell just plain doesn’t offer anything but Nvidia cards. You want to build your own? The GPGPU translation stuff like ZLUDA is brand new and not really supported by anyone. You want to participate in the development community? You buy Nvidia and use CUDA.
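The lock-in runs deep: even AMD’s own ROCm stack ships by impersonating the CUDA API. A hedged illustration with PyTorch (assuming either a CUDA or a ROCm build is installed):

```python
# Sketch: the mainstream GPGPU ecosystem is written against the CUDA API.
# PyTorch's ROCm build keeps the torch.cuda namespace and maps it to HIP
# underneath, so even AMD support works by emulating Nvidia's interface.
import torch

if torch.cuda.is_available():  # True on both CUDA and ROCm builds
    backend = "HIP/ROCm" if torch.version.hip else "CUDA"
    print(f"Backend: {backend}, device: {torch.cuda.get_device_name(0)}")
    x = torch.randn(1024, 1024, device="cuda")  # "cuda" even on AMD cards
    print((x @ x).sum().item())
else:
    print("No GPU backend available; running on CPU.")
```

On a ROCm install the code path is identical, with `device="cuda"` and all, which is exactly the point: the API everyone targets is Nvidia’s.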
Yeah, I helped draw up the hardware requirements for two servers recently; an alternative to Nvidia wasn’t even on the table.
Actually… not true. Nvidia recently became bigger in the datacenter because of their terrible inference cards being bought up, but AMD overtook Intel in chips at all major cloud platforms last year, and their Xilinx chips are slowly overtaking sales of regular CPUs for special-purpose processing. By the end of this year, I bet AMD will be the most deployed brand in datacenters globally. FPGAs are the only path forward in the architecture world at this point for speed and efficiency in single-purpose processing. Nvidia doesn’t have a competing product.
We’re talking GPUs; idk why you’re bringing FPGAs and CPUs into the mix.
Unfortunately, this partnership with OpenAI means they’ve sided with evil and I won’t spend a cent on their products anymore.
enjoy never using a computer again i guess?
Oh so you support grifting off the public domain? Maybe grow some balls instead of taking the status quo for granted.
Well, to be fair, the 10 series was actually an impressive improvement over what was available. I switched to AMD after that for better software support, but from what I gather the improvements have dwindled since then.
AMD is at least playing the smart game with their hardware releases, going for generational leaps instead of just jacking up power requirements and clock speeds like Nvidia does. Hell, even Nvidia’s latest Jetson lines are just recooked versions of years-old designs.
Cause numbers go brrrrrrrrr
Have a 2070 Super. Been thinking for a while now that my next card will be AMD. I hope they get back into high-end cards again :/
The 9070 XT is excellent, and FSR 4 actually beats DLSS 4 in some important ways, like disocclusion handling.
Concur.
I went from a 2080 Super to the RX 9070 XT and it flies. Coupled with a 9950X3D, I still feel a little bit like the GPU might be the bottleneck, but it doesn’t matter. It plays everything I want at way more frames than I need (240 Hz monitor).
E.g., Rocket League went from struggling to hold 240 fps at the lowest settings to 700+ at max settings. Pretty stark improvement.
AMD only releases high-end parts for servers and high-end workstations now.
I wish I had the money to change to AMD
And it only took them 15 years to figure it out?