> they've never dealt with AMD (who have supply problems for other reasons), whose GPUs haven't been the premium product for a while (arguably, ever),
Yep. AMD can't write Windows graphics drivers that are stable and performant to save their lives, their cards use enormous amounts of power compared to similarly performing NVIDIA cards, they're nearly useless for computational stuff because most everyone goes for CUDA, and AMD cheated on benchmarks by overclocking/overvolting their cards and slapping huge heatsinks on them so that they'd last just long enough for a benchmark to run before starting to overheat.
There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.
AMD is properly fucked in the GPU market now that Apple is no longer using discrete GPUs, unless they manage to bring to bear the sort of engineering might they did for Ryzen (where they're also slipping as Intel pushes back). NVIDIA can carry on with all their datacenter-class ML products and lose a bunch of money on the consumer side. The problem is... if AMD drops out of the GPU market, that leaves NVIDIA and Intel, and reportedly Intel's product is even worse than AMD's. If NVIDIA is insufferable now, imagine them when they're the only game in town...
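To make the CUDA lock-in point above concrete, this is roughly what typical GPU compute code looks like in the wild; PyTorch is just an illustrative example here, and the ROCm aside is my understanding rather than anything official:

```python
# Most GPU compute code in the Python ecosystem is written against CUDA by name;
# anything that isn't NVIDIA has to slot in behind that assumption.
import torch

# The near-universal idiom: check for "cuda", fall back to CPU. Other backends
# rarely get a code path of their own.
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(4096, 4096, device=device)
y = x @ x  # lands on the GPU only if the "cuda" backend is present

print(f"running on: {device}")
```

AMD's ROCm builds of PyTorch do exist and, as far as I know, piggyback on that same torch.cuda namespace, but the point stands: the default assumption baked into most tooling is NVIDIA.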
> There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.
You're forgetting two more, one of which is massive: consoles and high-performance embedded gaming systems (basically, the Steam Deck and other high-end handhelds). Consoles aren't going to be super profitable on a per-unit basis, but AMD is making the chips both Microsoft and Sony are using in their boxes, and they're selling a lot of those boxes (right now, about 36 million since they debuted a little under two years ago). Similarly, the Steam Deck is a pretty damn good first attempt at a cohesive handheld that runs PC titles, and I could see that extending past Valve and into a generalized marketplace (and the fact that you can plug a USB-C cable into a dock and get a real Linux desktop is really, really cool).
"The AMD MI250X GPU is in a class of its own now and for a long time to come. The technical supremacy and performance per watt were the primary reasons why AMD’s MI250X GPUs were selected for LUMI, explains Pekka Manninen, Director of LUMI Leadership Computing Facility."
You might look at the current No. 1 on the Top500 list, Frontier at ORNL. https://www.olcf.ornl.gov/frontier/
Arguably the most powerful machine today, delivering >1 Exaflop/s performance.
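Rough back-of-envelope for where that exaflop number comes from; the node count and per-GPU figures below are approximate public numbers, so treat this as illustrative only:

```python
# Approximate peak FP64 math for Frontier (all figures rough, publicly cited estimates).
nodes = 9408                  # approx. number of Frontier compute nodes
gpus_per_node = 4             # MI250X accelerators per node
fp64_tflops_per_gpu = 47.9    # approx. MI250X peak FP64 (vector) in TFLOPS

peak_eflops = nodes * gpus_per_node * fp64_tflops_per_gpu / 1e6
print(f"theoretical peak: ~{peak_eflops:.1f} EFLOPS")  # ~1.8 EFLOPS
# The measured HPL (Linpack) run that tops the Top500 list comes in around 1.1 EFLOPS.
```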
I was very specifically speaking about the market for AMD's GPU cards, in the context of a discussion about GPU cards. I'm not being pedantic here.
AMD's embedded GPU unit may do fine, and it would be smart for them to shift their development priority to retain that market share. They lost the desktop/laptop discrete GPU battle years ago.
I just looked at different benchmarks, and the general consensus seems to be that the latest Radeon and Nvidia cards are quite similar in performance.
Nvidia seems to lead in raytracing and upscaling but in my opinion only the upscaling part is really important. And I bet that all GPU makers will have a decent solution available for that soon.
People on enthusiast forums are now talking about buying 1.2kW+ power supplies for the new generation of nvidia cards. I don't think you can point at one or the other as bad here.
people on enthusiast forums have been sold a bill of goods by a couple attention-seeking twitter stars like kopite7kimi.
remember his rumors about the 4090 having an 800-900W TGP? That got whacked way down in recent months by more reputable leakers, and now the number is rumored to be around 450W TBP (more like 375-400W TGP), which is plausible given the size of the die (and much more comparable to the rumored 400W of the dual-die AMD products).
Similarly he had 4080 at 600W and other Shit That Ain't Gonna Happen. Stupid partner boards pushing everything to 2x the official spec? maybe, but as a reference TGP? lol no.
if you believed that 800W was ever a plausible configuration for a consumer product, then I have a bridge to sell you. It was most likely some kind of engineering test bios that turned everything up to 11 at max clocks and he heard about that number and just didn't apply any common sense. Which, why would he when he could get months of attention off it from twitter?
People just race to buy into anything negative that is rumored about NVIDIA products and it's taken as absolute gospel as soon as anyone hears it. And the leakers have realized that and play into it.
I think the fact that it was even plausible to people who are fans of nvidia, and that it's being discussed on said forums, means that customers don't have any expectation of nvidia keeping a lid on the power requirements. Both big manufacturers are recommending at least 750W supplies for their current flagships.
As such I still don't think it's fair to characterise just AMD as wasteful power hogs here; the post I was responding to seems to me to be unnecessarily partisan.
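For what it's worth, the 750W-1.2kW recommendations on both sides fall out of pretty simple arithmetic; all the figures below are assumed round numbers for illustration, not anyone's official spec:

```python
# Rough PSU sizing behind the 750W-1.2kW recommendations (assumed round numbers).
gpu_board_power_w = 450      # flagship GPU board power (assumed)
cpu_power_w = 250            # high-end desktop CPU under load (assumed)
rest_of_system_w = 100       # motherboard, RAM, drives, fans (assumed)
headroom = 1.5               # margin for transient spikes and PSU efficiency sweet spot

recommended_psu_w = (gpu_board_power_w + cpu_power_w + rest_of_system_w) * headroom
print(f"suggested PSU: ~{recommended_psu_w:.0f} W")  # ~1200 W
```

Plug in a ~300W card and a less aggressive margin and you land right around the 750W figure both vendors quote, which is why the recommendations look similar regardless of the logo on the box.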
Yes, and I'm absolutely certain they're kicking themselves for having agreed to do so.
As far as I know, console manufacture comes with supply agreements: you agree to supply X number of processors at Y cost... even if the price of those processors, when fitted to a PCI Express card instead of a PS5 motherboard, would net 3 times the profit.
It's also a nice way to get out of having bad drivers: if developers are making games to the PS5 standard, then any errata become their problem more than strictly AMD's...
It's far more nuanced than this, really. Much like Apple funds a lot of TSMC's early R&D for those nodes - it's not just that they're willing to bid more, they are a risk partner in the development of the technology itself - console companies fund a lot of AMD's early-stage graphics development. Without their funding, the Polaris/Vega era of GCN would have seen even less iteration from AMD. And they often also fund custom variations on the general architectures, which helps keep hardware engineers employed. And they also contribute to AMD's wafer purchases, which lets AMD get better deals on the rest of its stuff because they're buying 2x as many wafers as they could use themselves.
The situation with TSMC being super super limited during the pandemic is unusual, and it was significantly amplified by the crypto boom happening at the same time. Normally it's actually a pretty good deal for AMD in a lot of ways.
The console companies tend to be very, very conservative in their designs, though, and letting them set the direction has left AMD significantly behind on many task-specific accelerators like tensor and RT. And tensor is no longer a niche thing: neural accelerators have been everywhere in phones for ages, Intel's laptop chips have had them since 10th gen, Intel's 11th-gen desktop chips had them, AMD's own 7000-series CPUs have neural accelerators, Intel GPUs have them, etc. Different product segments are finding different uses for them, and AMD has not just been left out in the cold but is actually slowing the development of the whole field because of how tardy to market they are. And beyond that, they've taken a strongly anti-consumer position against open APIs that allow other brands to utilize hardware accelerators or closed-source libraries.
Similar problem with RT... AMD's implementation is very lackluster and has been slowing the whole industry down. It's about half the performance of NVIDIA's first-gen implementation. A lot of the time it just seems like AMD automatically assumes that NVIDIA is incompetent and wasteful and that there are big savings to be had, so they'll take whatever NVIDIA does and cut it in half or a third or whatever (which is what they're rumored to do with tensors next generation: it has "accelerator instructions", but they're less powerful than the full dedicated units on CDNA, and presumably weaker than NVIDIA's dedicated units as well). Not sure if it's that, or if they're just cheap on silicon, especially with the console market's influence...
You're overselling the importance of console chips for AMD's bottom line. Console chips have notoriously low margins: they're custom in nature and cost-reduced to oblivion, which makes their development an expensive PITA for very little return.
It's why Intel and Nvidia refused to work with Sony and Microsoft on their console chips, and only AMD took the original contract: they were close to bankruptcy at the time, so every extra dollar mattered to them.
What I'm saying is that the final sale price isn't the only relevant number here: Sony and MS are paying to develop the graphics architecture in the first place. That's revenue on the front end that helps get the chips designed at all. The B2B sales of the console chips that result are lower-margin than dGPU sales into the broader consumer market, but it's hundreds of millions of dollars of R&D money that AMD doesn't have to find elsewhere, half a decade before the product goes to market. That money has a lot of time value, because it's available now, not 5 years and 10 billion dollars from now.
Early VC funding rounds are way less than you'll see when your product goes to market too. Applying your accounting philosophy, a couple million bucks surely must be insignificant compared to billions of dollars of sales revenue we'll see later, right? And they want how much equity? What a terrible deal.
Except for the part where you fold years before you get to "later", of course.
And that was a very, very, very real concern with AMD. They were right on the edge all through the Bulldozer years; Ryzen was very much a last-chance move for them, and they were obviously still scraping bottom for the first couple of years even then. There are numerous factors that all aligned to keep them alive: they would not have been alive without Intel misstepping on 14nm (the early days were very bad) and then 10nm, they would not have been alive without Apple writing huge R&D checks to advance TSMC 7nm and 5nm far enough ahead of the market that GloFo bowed out of the game, they would not have been alive without consoles keeping the lights on while they spent every penny on Zen R&D, and they would not have been alive without their interconnect turning all their IPs into Legos and letting them break them apart and scale them.
Things were really, really, really touch-and-go for AMD in 2016/2017/2018. I figured they were going under. I have no doubt that console R&D money (along with those other factors) was instrumental in keeping the lights on at RTG.
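To put the money-now-versus-money-later point in rough numbers (the amounts and discount rate here are made up purely for illustration):

```python
# Time value of money: R&D funding received up front is worth more than the same
# nominal amount arriving years later. All numbers are hypothetical.
def present_value(amount, annual_rate, years):
    """Discount a future cash amount back to today's dollars."""
    return amount / (1 + annual_rate) ** years

upfront_funding = 300e6     # hypothetical console NRE money paid today
later_revenue = 300e6       # the same nominal amount arriving 5 years later
discount_rate = 0.15        # hypothetical cost of capital for a cash-strapped firm

print(f"$300M today:               ${upfront_funding / 1e6:.0f}M")
print(f"$300M five years from now: ${present_value(later_revenue, discount_rate, 5) / 1e6:.0f}M in today's money")
```

At that (made-up) 15% rate, cash five years out is worth roughly half as much today, and that's before you price in the real possibility of not surviving long enough to collect it, which was the whole point above.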
It's not like they have any other options. Who else is there? NVIDIA? Microsoft's issues with them show how precarious that relationship is. I'm sure issues with NVIDIA are why we haven't seen an improved Switch yet, and I think there are clear concerns about potential SoCs for a Switch 2. Unless you have to, you don't want to work with NV.
RDNA2 APUs are in a similar place to where the Tegra X1 was when the Switch first came out. I don't think it's impossible that Nintendo switches back to AMD graphics; they have a long history of shipping Radeon in their home consoles, all the way back to the GameCube.
I've had zero problems with AMD hardware on Windows (the Linux kernel, on the other hand, still won't boot out of the box with this Vega card, because the amdgpu driver is a disgrace), and other than Windows 11 being a complete failure, nobody has reported serious problems.
Re benchmarks/power: their cards scale clocks dynamically and don't have as tight a power envelope as other vendors', so obviously they will use more power [citation needed also].
Their generic GPUs are very fast, but why do they even care when they own the console market, the higher end of the car-infotainment market, etc.?