And DIY ML use cases (local LLM, video/audio upscaling, image generation).
Hope it will be possible to use two cards together for 48GB of VRAM 🤞.
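If that pans out, the usual way to use the pooled VRAM is plain model parallelism: keep part of the weights on each card and pass activations between them. A minimal sketch, assuming a PyTorch build with Intel XPU support (the "xpu" device strings and the layer sizes are placeholders; swap in "cuda" on NVIDIA hardware):

```python
import torch
import torch.nn as nn

# Assumes a PyTorch build that exposes Intel GPUs as "xpu" devices.
dev0 = torch.device("xpu:0")
dev1 = torch.device("xpu:1")

class SplitModel(nn.Module):
    def __init__(self):
        super().__init__()
        # First half of the network lives on card 0 ...
        self.part0 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to(dev0)
        # ... second half on card 1, so each card only holds half the weights.
        self.part1 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to(dev1)

    def forward(self, x):
        x = self.part0(x.to(dev0))
        # Only the activations hop between cards; the weights stay put.
        return self.part1(x.to(dev1))

model = SplitModel()
out = model(torch.randn(8, 4096))
print(out.device)  # xpu:1
```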
But does it support PyTorch?
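For what it's worth, recent PyTorch releases (2.5+) ship an upstream Intel GPU backend exposed as torch.xpu, in addition to the older intel_extension_for_pytorch route. A hedged sketch of how you might check whether a given build actually sees an Arc card (API names are from the upstream XPU backend and may differ on older setups):

```python
import torch

# Check for an Intel GPU ("xpu") device; fall back to CPU otherwise.
if hasattr(torch, "xpu") and torch.xpu.is_available():
    dev = torch.device("xpu")
    print("Arc GPU visible:", torch.xpu.get_device_name(0))
else:
    dev = torch.device("cpu")
    print("No XPU device found, falling back to CPU")

# Run a small matmul on whichever device was selected.
x = torch.randn(1024, 1024, device=dev)
print((x @ x).sum())
```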
If I understand correctly, these cards are less energy efficient than at least NVIDIA's, which makes me sad that nobody takes this into account, both environmentally and price-wise.
Less efficient than NVIDIA, but still more efficient than last gen.
They just entered the market; it will take some time to mature. The B580 is definitely a great contender when you can get it for MSRP. In my country in Western Europe I'd pay €329, which is $344, so the benefit of a sharply priced alternative to the 4060 with 50% more VRAM is moot, since it's actually the same price or more for me.
Probably lower power draw on an absolute basis, though. I believe the 5090 will be 600 watts and the 5080 will be 400 watts.
You should compare it to a model that’s in the same range, not those monsters.