
> they've never dealt with AMD (who have supply problems for other reasons), whose GPUs haven't been the premium product for a while (arguably, ever),

Yep. AMD can't write Windows graphics drivers that are stable and performant to save their lives, their cards use enormous amounts of power compared to similarly performing NVIDIA cards, they're nearly useless for computational stuff because most everyone goes for CUDA, and AMD cheated on benchmarks by overclocking/overvolting their cards and slapping huge heatsinks on them so that they'd last just long enough for a benchmark to run before starting to overheat.
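To make the CUDA point concrete, here's a minimal sketch of how most compute/ML code is written (stock PyTorch assumed; the tensor shape is just a placeholder). The "GPU path" is literally spelled CUDA:

    # Typical ML code: the GPU path is hard-coded as CUDA.
    import torch

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    batch = torch.randn(8, 3, 224, 224).to(device)  # placeholder input
    print(device)

    # On a Radeon card this only reports "cuda" if you're running the ROCm
    # build of PyTorch (which reuses the torch.cuda namespace) on a supported
    # GPU/driver combo; otherwise it silently falls back to the CPU.

That's why "just use AMD for compute" is a much harder sell than the raw hardware would suggest.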

There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to the new Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.

AMD is properly fucked in the GPU market now that Apple is no longer using discrete GPUs, unless they manage to bring to bear the sort of engineering might they did for Ryzen (where they're also slipping as Intel pushes back). NVIDIA can carry on with all their datacenter-class ML products and lose a bunch of money on the consumer side. The problem is...if AMD drops out of the GPU market, that leaves NVIDIA and Intel, and reportedly Intel's product is even worse than AMD's. If NVIDIA is insufferable now, imagine them when they're the only game in town...




> There are only three markets for AMD cards that I see: Hackintoshes (increasingly irrelevant with the shift to the new Apple Silicon and the inevitable phase-out of Intel-supporting macOS releases), Linux systems, and AMD fanboys. All pretty small markets.

You're forgetting two more, one of which is massive: consoles and high-performance embedded gaming systems (basically, the Steam Deck and other high-end handhelds). Consoles aren't going to be super profitable on a per-unit basis, but AMD is making the chips both Microsoft and Sony are using in their boxes, and they're selling a lot of those boxes (right now, about 36MM since they debuted a little under two years ago). Similarly, the Steam Deck is a pretty damn good first attempt at a cohesive handheld that runs PC titles, and I could see that extending past Valve and into a generalized marketplace (and the fact that you can plug a USB-C cable into a dock and get a real Linux desktop is really, really cool).


You are also forgetting another market: HPC. AMD MI200 GPUs are the boss cards to have if you want to do HPC right now.


That's the first time I've heard about that; do you have any examples? Google shows me one example of Microsoft using them, but that's all I can find.


One I know of is the EU-funded LUMI supercomputer: https://www.lumi-supercomputer.eu/lumis-full-system-architec...

"The AMD MI250X GPU is in a class of its own now and for a long time to come. The technical supremacy and performance per watt were the primary reasons why AMD’s MI250X GPUs were selected for LUMI, explains Pekka Manninen, Director of LUMI Leadership Computing Facility."


You might look at the current No. 1 on the Top500 list, Frontier at ORNL (https://www.olcf.ornl.gov/frontier/), arguably the most powerful machine today, delivering >1 exaflop/s of performance.


Also El Capitan supercomputer, and there are more.


I was very specifically speaking about the market for AMD's GPU cards, in the context of a discussion about GPU cards. I'm not being pedantic here.

AMD's embedded GPU unit may do fine, and it would be smart for them to shift their development priority to retain that market share. They lost the desktop/laptop discrete GPU battle years ago.


Replace GPU with CPU and you sound like someone who was very wrong after 2018 came around. Never say never.


I just looked at different benchmarks and the general consensus seems to be that the latest Radeon and Nvidia cards are quite similar in performance. Nvidia seems to lead in raytracing and upscaling, but in my opinion only the upscaling part is really important. And I bet that all GPU makers will have a decent solution available for that soon.


and Teslas. Seems like AMD’s play is high-performance, (somewhat) custom silicon as a service.


> AMD is properly fucked in the GPU market now

AMD GPUs are in XBox and Playstation.

> AMD can't write Windows graphics drivers that are stable and performant to save their lives

When they changed to RDNA they had teething issues. But their drivers are stable now.

> their cards use enormous amounts of power compared to similarly performing NVIDIA cards

Is this a joke? The 6800 XT draws less power than a 3080 12GB while having similar FPS. Ignoring FSR2 or DLSS.
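Rough perf-per-watt sanity check, using the advertised board-power specs (roughly 300 W for the 6800 XT, 350 W for the 3080 12GB); the equal frame rate is an assumption, per the claim above:

    # Back-of-envelope FPS-per-watt comparison. Board power from spec sheets;
    # frame rate assumed equal at matched settings. Illustrative only.
    cards = {"RX 6800 XT": 300, "RTX 3080 12GB": 350}
    assumed_fps = 100  # hypothetical common average FPS

    for name, board_power_w in cards.items():
        print(f"{name}: {assumed_fps / board_power_w:.2f} FPS per watt")

Under those assumptions the Radeon comes out slightly ahead on FPS per watt, not "enormously" behind.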

Disclaimer: I've only ever had one AMD card (a 5700 XT) and it was rock solid. Currently using a 3070 and 3050 Ti.


> their cards use enormous amounts of power

People on enthusiast forums are now talking about buying 1.2kW+ power supplies for the new generation of nvidia cards. I don't think you can point at one or the other as bad here.


people on enthusiast forums have been sold a bill of goods by a couple attention-seeking twitter stars like kopite7kimi.

remember his rumors about the 4090 having an 800-900W TGP? That got whacked way down in recent months by more reputable leakers, and now the number is rumored to be around 450W TBP (more like 375-400W TGP), which is plausible given the size of the die (and much more comparable to the rumored 400W of the dual-die AMD products).

So, off by basically double.

https://hothardware.com/news/geforce-rtx-4090-ti-allegedly-a...

Similarly, he had the 4080 at 600W and other Shit That Ain't Gonna Happen. Stupid partner boards pushing everything to 2x the official spec? maybe, but as a reference TGP? lol no.

if you believed that 800W was ever a plausible configuration for a consumer product, then I have a bridge to sell you. It was most likely some kind of engineering test bios that turned everything up to 11 at max clocks and he heard about that number and just didn't apply any common sense. Which, why would he when he could get months of attention off it from twitter?

People just race to buy into anything negative that is rumored about NVIDIA products and it's taken as absolute gospel as soon as anyone hears it. And the leakers have realized that and play into it.


I think the fact that it was even plausible to people who are fans of nvidia, and that it is discussed on said forums, means that customers don't have any expectations of nvidia keeping a lid on the power requirements. Both big manufacturers are recommending at least 750W supplies for their current flagships.

As such, I still don't think it's fair to characterise just AMD as wasteful power hogs here; the post I was responding to seems to me to be unnecessarily partisan.


>AMD can't write Windows graphics drivers that are stable and performant to save their lives

I have an AMD GPU in my Windows PC and it works fine. It's stable. n=1 of course.

The reason I bought it is because I have a linux machine that currently has an Nvidia card. Fuck Nvidia and their linux drivers.

>their cards use enormous amounts of power compared to similarly performing NVIDIA cards

Don't think this is true at all. E.g., the first link I came across: https://www.tomshardware.com/features/graphics-card-power-co...


Aren't all the game consoles running AMD?


Yes, and I'm absolutely certain they're kicking themselves for having agreed to do so.

As far as I know, console manufacture comes with supply agreements: you agree to supply X number of processors at Y cost... even if the price of those processors, when fitted to a PCI Express card instead of a PS5 motherboard, would net 3 times the profit.

It's also a nice way to get out of having bad drivers: if developers are making games to the PS5 standard, then any errata becomes their problem more than strictly AMD's...


It's far more nuanced than this, really. Much like Apple funds a lot of TSMC's early R&D for those nodes (it's not just that they're willing to bid more; they're a risk partner in the development of the technology itself), console companies fund a lot of AMD's early-stage graphics development. Without their funding, the Polaris/Vega era of GCN would have seen even less iteration from AMD. They often also fund custom variations on the general architectures, which helps keep hardware engineers employed. And they contribute to AMD's wafer purchases, which lets AMD get better deals on the rest of its lineup because they're buying 2x as many wafers as they could use themselves.

The situation with TSMC being super super limited during the pandemic is unusual, and it was significantly amplified by the crypto boom happening at the same time. Normally it's actually a pretty good deal for AMD in a lot of ways.

The console companies tend to be very, very conservative in their designs, though, and letting the console companies set the direction has left AMD significantly behind on many task-specific accelerators like tensor and RT. And tensor is no longer a niche thing: neural accelerators have been everywhere in phones for ages, Intel's laptop chips have had them since 10th gen, Intel's 11th-gen desktop chips had them, AMD's own 7000-series CPUs have them, Intel GPUs have them, etc. Different product segments are finding different uses for them, and AMD has not just been left out in the cold but is actually slowing the development of the whole field because of how tardy to market they are. And beyond that, they've taken a strongly anti-consumer position against open APIs that allow other brands to utilize hardware accelerators, or closed-source libraries.

https://youtu.be/8ve5dDQ6TQE?t=974

Similar problem with RT... AMD's implementation is very lackluster and has been slowing the whole industry down. It's about half the performance of NVIDIA's first-gen implementation. A lot of times it just seems like AMD automatically assumes that NVIDIA is incompetent and wasteful and there are some big savings to be had, so they'll take whatever NVIDIA does and cut it in half or a third or whatever (which is what they're rumored to do with tensors next generation: it has "accelerator instructions", but they're less powerful than the full dedicated units on CDNA, and presumably weaker than NVIDIA's dedicated units as well). Not sure if it's that, or if they're just cheap on silicon, especially with the console market's influence...


You're overselling the importance of console chips for AMD's bottom line. Console chips have notoriously low margins due to being custom in nature and cost-reduced to oblivion, making their development an expensive PITA for very little return.

It's why Intel and Nvidia refused to work with Sony and Microsoft on their console chips, and only AMD took the original contract: they were close to bankruptcy at the time, so every extra dollar mattered to them.


What I'm saying is that the final sale price isn't the only relevant number here: Sony and MS are paying to develop the graphics architecture in the first place. That's revenue on the front end that helps get the chips developed at all. The B2B sales of the console chips that result are lower-margin than dGPU sales into the broader consumer market, but it's hundreds of millions of dollars of R&D money that AMD doesn't have to find elsewhere, half a decade before the product goes to market. That money has a lot of time-value, because it's available now, not 5 years and 10 billion dollars from now.
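As a back-of-envelope illustration of that time-value point (every number here is hypothetical, just to show the shape of the argument):

    # Present value of the same R&D money arriving now vs. five years later,
    # at an assumed cost of capital for a cash-strapped chip company.
    # All figures are hypothetical.
    rd_money = 300e6       # hypothetical console R&D payment
    discount_rate = 0.15   # hypothetical cost of capital
    years_later = 5

    pv_if_delayed = rd_money / (1 + discount_rate) ** years_later
    print(f"${rd_money/1e6:.0f}M arriving in {years_later} years is worth only "
          f"${pv_if_delayed/1e6:.0f}M in today's terms")

Money in hand today, when you're burning cash, is worth a lot more than the same check later.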

Early VC funding rounds are way less than you'll see when your product goes to market too. Applying your accounting philosophy, a couple million bucks surely must be insignificant compared to billions of dollars of sales revenue we'll see later, right? And they want how much equity? What a terrible deal.

Except for the part where you fold years before you get to "later", of course.

And that was a very, very, very real concern with AMD. They were right on the edge for years and years during the Bulldozer era; Ryzen was very much a last-chance move for them, and they were obviously still scraping bottom for the first couple of years even then. There are numerous factors that all aligned to keep them alive: they would not have been alive without Intel misstepping on 14nm (the early days were very bad) and then 10nm, they would not have been alive without Apple writing huge R&D checks to advance TSMC 7nm and 5nm far enough ahead of the market that GloFo bowed out of the game, they would not have been alive without consoles keeping the lights on while they spent every penny on Zen R&D, and they would not have been alive without their interconnect turning all their IPs into legos and letting them break them apart and scale them.

Things were really, really, really touch-and-go for AMD in 2016/2017/2018. I figured they were going under. I have no doubt that console R&D money (along with those other factors) was instrumental in keeping the lights on at RTG.


> even if the price of those processors, when fitted to a PCI Express card instead of a PS5 motherboard, would net 3 times the profit.

The GPUs in Xboxes and Playstations are custom silicon.


True, but they are based on the RDNA arch, just as the CPU cores are monolithic Zen cores.


I doubt they're kicking themselves, considering they knew what was coming based on their experience with the last-gen consoles.


It's not like they have any other options. Who else is there? NVIDIA? Microsoft's issues with them show how precarious that relationship is. I'm sure issues with NVIDIA are why we haven't seen an improved Switch yet, and I think there are clear concerns about potential SoCs for a Switch 2. Unless you have to, you don't want to work with NV.


All except Nintendo ;(


RDNA2 APUs are in a similar place to where the Tegra X1 was when the Switch first came out; I don't think it's impossible that Nintendo switches back to AMD graphics. They have a long history of shipping Radeon in their home consoles, all the way back to the GameCube.


What? Did you drop out of the loop at least 4 years ago?


citation needed.

I've had zero problems with AMD hardware on Windows (the Linux kernel, on the other hand, still won't boot out of the box with this Vega, as the amdgpu driver is a disgrace), and other than Windows 11 being a complete failure, nobody has reported serious problems.

re benchmarks/power, their cards scale dynamically and don't have as tight a power envelope as other vendors', so obviously they will use more power [citation needed also].

Their generic GPUs are very fast, but why do they even care when they own the console market, the higher end of the car infotainment market, etc.?


Holy shit, is this guy being paid by nVidia?



