Ryzen 8000G review: An integrated GPU that can beat a graphics card, for a price (arstechnica.com)
114 points by kristianp on Jan 30, 2024 | 106 comments



The Ryzen 8700G delivers comparable CPU performance to the Core i5-14600K while using only about one-third of the TDP.

65 vs. 181 watts

now that is impressive!


Ryzen has been more power friendly than Intel Core for a while now. I don't think consumers care much about eco friendliness, though, which is a shame. I suppose it could always be sold as "doesn't heat up the room as much", but anyone with a dedicated GPU will have a computational space heater sitting next to them regardless these days.


In Europe electricity is expensive, though. If you pay 50c€ per kWh, energy efficiency does matter.

Less heat also allows for a more compact or mobile design. Smaller builds are gaining popularity.


>Smaller builds are gaining popularity.

Gaining popularity in the ever-shrinking market of PCs is nothing to brag about. I'd wager it's still not as popular as you make it out to be. Everyone just buys laptops nowadays, and when they do buy PCs they get full-sized towers with discrete GPUs for gaming, not small form factors with APUs. That's a niche of a niche.

That's why AMD hasn't focused too much on APUs for desktop PCs. It's not a big market. Their biggest sellers are APUs for notebooks and consoles, CPUs for datacenters and desktops, discrete GPUs for ... everything.


It's been an "ever shrinking market" since the 2000s, if you believe the news.

A thing to consider is, for the consumer market, machines have gotten good enough that most don't need a new machine for 5+ years, in terms of performance. Especially for casual household tasks.


No, you are right. I think this APU is basically a spin-off mobile processor. The technology comparison would still translate, though, since battery life and weight matter with laptops.

However, here we talk about the desktop market, don't we? And not every desktop PC has to be a gaming capable machine. Power-savings will probably be even more relevant for organizations running hundreds of office machines and terminals.


>And not every desktop PC has to be a gaming capable machine.

Yes, but non-gaming desktop PCs aren't really selling in volume nowadays; they're a tiny niche being eaten by notebooks and NUCs (also a niche), which Intel has been dominating, as it has shipped CPUs with iGPUs (basically APUs without the marketing) since ... forever.

>Power-savings will probably be even more relevant for organizations running hundreds of office machines and terminals.

When was the last time you saw organizations buy PCs for their workforce? In volume, I mean, not just one PC for Bob who needed a CAD workstation, because then Bob most likely isn't getting an APU for CAD but a proper workstation with a beefy CPU and some Quadro GPU.

Organizations mostly transitioned their office machines to notebooks a long time ago, even more so after WFH became popular and there was less need for PCs tied to a fixed location. PCs that aren't for gaming or heavy compute workstations aren't big sellers anymore.


In my experience, bureaucracy doesn't run on laptops.


What does it run on, then?

Yeah, I'm sure governments do buy some amount of desktop PCs because they don't intend for their workforce to work from anywhere else other than a specific office, but you also can't tell me with a straight face that selling PCs to the DMV, FBI and other bureaucrats represents a big lucrative market that was waiting for the APU revolution.

PCs are a low selling niche now. APUs won't move the needle on that, that's why AMD ignored it for so long. End of story.


I think you are moving the goal post a bit.


How so? Please enlighten us with your arguments on the matter.


Well, for starters, the argument wasn't about economics, but about who would value a power-efficient processor like this.

Scroll up, enlighten yourself.


Economics matter. People can value a lot of things, but if there's too few of them, then the market won't bother catering to them. See small smartphones.


Sure, but maybe this objection to the original argument shouldn't have come as a moved goalpost, don't you think?

Because moving goalposts is lame and exhausting. End of story.


They 100% do. What company or government agency have you ever seen running on desktops that isn't the DMV?


True, I actually have been underclocking my RTX 3080 to keep heat and costs down.


For mobile processors, though, it's one of the first things I look at. And like you and others said, I don't just look at the TDP but also rely on the big testing sites for load heat tests.

Or at least I did, until the M-series processors came out and you could get a passively cooled laptop for a decent price.


Maybe not everyone, but I definitely care. Lower consumption means smaller PSU and cooling requirements and fewer issues during heatwaves (no AC). Furthermore, my computer corner, with a fair bit of electronics, is served by only one electrical socket and a bunch of extension cords that I really would rather not overload.


Yeah, my gaming (obviously I'm not playing AAA action games) PC runs on a 350W PSU and is rarely audible.


I think they're more power friendly when doing work, but consume more power at idle than Intel CPUs. At least that used to be the case; I don't know if it still is.

As for eco-friendliness... it's something I now consider when buying electronics but more from an electricity bill point of view.


Most of AMD's desktop processors for the past several generations have had poor idle power because they're built using multiple chiplets with a cheap, power-hungry interconnect. The 8000G series (and earlier -G parts) are monolithic laptop processors re-packaged for the desktop socket, so they are unaffected by the main cause of poor idle power in AMD's main desktop processor product line.


Intel is very likely still the IdleM@ster[1][2].

Also, most CPUs spend most of their time idling or at very low load, not belching hellfire upon the scorched earth. If power consumption over time is what matters, efficiency at idle/low load is far more significant than efficiency while belching flames.

[1]: https://news.ycombinator.com/item?id=38823514

[2]: https://news.ycombinator.com/item?id=38829531


Idle power consumption is all over the place, with AMD taking both the lowest and highest spots on this chart: https://www.reddit.com/r/Amd/comments/10evt0z/ryzen_vs_intel...

Who wins depends on both the system and how it’s used.


I recall this blog post[1], discussed here[2] at the time, about how a SATA controller driver prevented the CPU from reaching sleep state C6, keeping it at C3 instead and leading to higher power consumption.

Switching to a SATA card with a different chip fixed that.

There were other surprising factors that affected power draw.

My takeaway was that, as in real life, sleeping can be hard.

[1]: https://mattgadient.com/7-watts-idle-on-intel-12th-13th-gen-...

[2]: https://news.ycombinator.com/item?id=38823514
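If anyone wants to check whether their own machine actually reaches the deeper C-states, the kernel exposes per-state residency counters in sysfs. A minimal sketch (Linux cpuidle interface; state names and counts vary by platform):

  # Print accumulated residency for each idle state of cpu0 (Linux cpuidle sysfs).
  # If a driver keeps blocking deep sleep, the C6/deeper "time" counters barely move.
  from pathlib import Path

  def cstate_residency(cpu: int = 0) -> dict:
      base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle")
      result = {}
      for state in sorted(base.glob("state*")):
          name = (state / "name").read_text().strip()
          usec = int((state / "time").read_text())  # total residency in microseconds
          result[name] = usec
      return result

  if __name__ == "__main__":
      for name, usec in cstate_residency().items():
          print(f"{name:>10}: {usec / 1e6:10.1f} s")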


In fact, you linked the exact thing I linked. :P


Oh lol. I opened the links in a new tab but didn't read them before commenting, classic n00b fail. The "visited" look didn't faze me since, as mentioned, I had read it here before.


How big is the difference on a regular setup? And is it the same kind of CPU, with integrated graphics? One of the examples you linked says 7 watts, which is impressive, but I think you could achieve similar optimizations on an AMD system. Say I install Windows or some Linux desktop and don't do any optimizations: how big is the difference?


> I don't think consumers care much about eco friendliness, though, which is a shame.

Many do care about noise though!


I do care about fan noise; my current system was specifically built for it (Ryzen 3700X, RX 5600 GPU).


Why don't more people put their PC in another room or on another floor? Even Amazon's $20 HDMI cables are pretty resilient at 50 feet.

Stuck my gaming PC in the basement decades ago and haven’t worried about cooling or noise since.


Most people don't have another room or floor to put it in?


Why don't people just buy bigger houses with more rooms, d'uh? /s


Be very careful about comparing TDPs, especially across manufacturers. It doesn't really say anything about real power consumption. The TDP difference is so big it probably represents some real difference in power consumption, but actual measurements are necessary to say anything concrete.


While you can't compare TDPs directly, basically every test of Ryzen processors with 3D V-Cache shows them using about half as much power or less than their Intel counterparts. So this wouldn't surprise me.


Yeah, the Ryzen part is almost certainly significantly lower power


TDP is arbitrary; they actually calculate it with arbitrary constants that change per chip, so it's meaningless at this point.

See https://gamersnexus.net/guides/3525-amd-ryzen-tdp-explained-...
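For reference, the formula that article quotes from AMD is (paraphrasing from memory, so treat the constants below as illustrative) roughly TDP = (tCaseMax - tAmbient) / thetaCA, i.e. a thermal spec rather than an electrical power measurement:

  # Paraphrase of the AMD TDP formula discussed in the linked GamersNexus article:
  # TDP (W) = (tCaseMax - tAmbient) / theta_ca, where theta_ca is the assumed
  # thermal resistance (degC/W) of the reference cooler. None of these inputs
  # are measured electrical power, which is the article's point.
  def amd_tdp(t_case_max_c: float, t_ambient_c: float, theta_ca_c_per_w: float) -> float:
      return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

  # Illustrative constants (close to the ones the article cites for a 105 W part):
  print(round(amd_tdp(61.8, 42.0, 0.189), 1))  # ~104.8 -> marketed as "105 W"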


Also, chiplet-based Ryzens are notorious for higher idle power consumption than Intel's monolithic dies, and idle or low load is where CPUs spend most of their time in consumer use (web browsing, Netflix, shitposting on Reddit, etc.), not running at full load all the time.

Synthetic tests that stress everything at full bore are not realistic representations of the average consumer PC workload. Even video games don't draw as much power as synthetic benchmarks, which are more akin to power viruses.

In real life daily driving, Ryzen might still be more efficient than Intels but with a much smaller margin than such tests might lead you to believe.


Every Ryzen with the letter G at the end is a monolithic design...


Interesting, I didn't know that.


I'm currently typing this on an i7-1165G7 running Windows 11, with a few background tabs in Edge, plus Teams and Outlook. CPU usage as reported by Task Manager hovers around 7-10%. The CPU clock hovers around 2.2 GHz with regular peaks above 3 GHz.

Is that idle? Not sure I could get this box at any lower use than this, even though I understand this is different from having all cores pegged at 100%.


Doesn’t idle mean something more like C:\> than multitasking billions of instructions per second?


We're talking idling from the perspective of an actual user at home, so there will obviously be a web browser and some background tasks running.

Idling without anything running, not even an OS, is interesting for metrics, but it isn't realistic for any user.


I agree that we should be comparing real scenarios, but I also notice that my desktop Linux systems tend to idle at <=1% CPU use if I don't have a browser open, so there's probably some genuine room to consider what counts as idle even with an OS running.


Reporting the clock frequency of a core requires running a bit of code on it which necessarily means the core is in an active state while you take the reading, so you will always read something like "2.5 GHz" even if the core does not execute anywhere near 2.5 billion cycles while you do that. Some tools can report metrics like "average effective clock" which count time spent in inactive power states as zero clock.
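To make the "average effective clock" idea concrete, here's a rough sketch that weights the reported frequency by the fraction of time the core was not idle (Linux cpufreq/cpuidle sysfs; a concept demo only, not what HWiNFO actually does internally, since real tools use hardware APERF/MPERF-style counters):

  # Concept demo: "effective clock" ~= reported clock x fraction of time not idle.
  import time
  from pathlib import Path

  CPU = Path("/sys/devices/system/cpu/cpu0")

  def idle_time_us() -> int:
      # Sum residency across all idle states of cpu0 (microseconds).
      return sum(int((s / "time").read_text()) for s in (CPU / "cpuidle").glob("state*"))

  def reported_khz() -> int:
      return int((CPU / "cpufreq" / "scaling_cur_freq").read_text())

  t0, idle0 = time.monotonic(), idle_time_us()
  time.sleep(1.0)
  t1, idle1 = time.monotonic(), idle_time_us()

  busy = max(0.0, 1.0 - (idle1 - idle0) / 1e6 / (t1 - t0))
  ghz = reported_khz() / 1e6
  print(f"reported:  {ghz:.2f} GHz")
  print(f"effective: {ghz * busy:.2f} GHz  (core busy {busy:.0%} of the interval)")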


HWiNFO64 can tell you the percentage of time the CPU spends in the idle state.


This should be great for battery life in the next few generations of laptops and handhelds.


It's a rebadged mobile APU (7040 series Phoenix APU).


Along with what others have said about comparing TDPs between vendors (or sometimes even between product lines), TDP describes multi-core draw, so you'd also need to normalize for the multithreaded performance difference between the two rather than just note that single-thread performance is near equal (single-threaded loads will draw significantly less than 65 watts on either chip).

I think the 8700G is probably more efficient, but not anything like the 65 vs. 181 watt figures would lead one to believe.


CPU Mark scores are 32k and 39k respectively, so it's at least twice as efficient.


Assuming:

- PL1/PL2 from Intel can be compared to TDP from AMD

- CPU Mark scores alone are the best indicator of what full utilization looks like vs another test

You'd be accurate. You can't really assume either of these things, though. The better counterpart to AMD's TDP is probably the Intel part's 125 W "base power" rather than the 181 W PL1/PL2. Other benchmarks, like encoding that uses all cores efficiently with matrix extensions, show more like a 25-30% difference. Combine those two points and it's probably closer to 1.5x as efficient (instead of the roughly 2.5x the raw numbers above suggest, a 3x relative improvement), but you'd have to actually measure active power doing the same amount of total work to know for sure. Even ignoring one of those points, the encoding benchmark differences are enough to guarantee it would actually be less than twice as efficient, not at least twice as efficient.
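To make the arithmetic explicit (a rough sketch: the scores are the ones quoted above, and the power figures are rated limits rather than measured package power, so the ratios are bounds, not measurements):

  # Points-per-watt under different assumptions about the Intel part's power draw.
  score_8700g, score_14600k = 32_000, 39_000
  eff_amd = score_8700g / 65  # assume the 8700G actually runs at its 65 W TDP

  for label, watts in [("14600K at PL1/PL2 181 W", 181), ("14600K at base power 125 W", 125)]:
      eff_intel = score_14600k / watts
      print(f"{label}: 8700G is {eff_amd / eff_intel:.2f}x the points/watt")
  # -> roughly 2.3x against 181 W and 1.6x against 125 W: the headline ratio
  #    depends almost entirely on which Intel power figure you plug in.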


The higher power consumption on the Intel side is mostly a reflection of the futility of feeding it more power. A better way to compare would have been to dial back the power limits on the Intel CPU until it was only just as fast as the AMD part, and compare at that power level.
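On Linux, one way to run that experiment is through the RAPL powercap interface. A sketch, assuming the package-0 path and root privileges (the firmware may still enforce its own limits on top of this):

  # Cap the package long-term power limit (PL1), then re-run the benchmark and
  # step the cap down until the Intel part only just matches the 8700G's score.
  from pathlib import Path

  RAPL = Path("/sys/class/powercap/intel-rapl:0")  # package 0; requires root to write

  def set_pl1(watts: int) -> None:
      # constraint_0 is the "long_term" limit; the value is in microwatts.
      (RAPL / "constraint_0_power_limit_uw").write_text(str(watts * 1_000_000))

  def read_pl1() -> float:
      return int((RAPL / "constraint_0_power_limit_uw").read_text()) / 1_000_000

  if __name__ == "__main__":
      print(f"current PL1: {read_pl1():.0f} W")
      # e.g. set_pl1(65) before the next benchmark run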


A better comparison of efficiency is task energy, which the article did cover.

In a video encoding task, the 8700G used 30% less energy than the 14600K, but it actually used more energy than a 13400.


> But when integrated graphics push forward, it can open up possibilities for people who want to play games but can only afford a cheap desktop

Or those who play on recent handhelds. The Steam Deck with its integrated RDNA2 GPU is very capable even with modern games. Sure, you can't crank everything up to ultra quality at hundreds of fps, but it's enough, and above the "cheap" level of performance.

Also, looking at Geekbench results, it basically matches the M2 Max in a MacBook Pro.


There’s either a mistake or this is a bad metric, perhaps.

Currently the most powerful AMD iGPU is the Radeon 780M (found in 7840U/HS and 8700G CPUs). Judging by Notebookcheck's results, the M2 Max GPU gets up to 2× the fps in Borderlands, 2.5× the fps in The Witcher 3, and 3× the fps in Shadow of the Tomb Raider.

As for synthetic benchmarks, the M2 Max GPU gets 4-6× the fps of the Radeon 780M in GFXBench. And the RDNA3-based 780M already has twice the raw compute performance of the Steam Deck's RDNA2 GPU.

Unfortunately, GPUs in handhelds are always severely underpowered.


The biggest problem with the M2 Max: the games you mentioned are 3 of maybe 10 that run on it (might be more, I'm exaggerating a bit).

I wish Asahi would move forward much faster and I could game on it, but at this point, gaming on a Mac isn't really a thing.

Source: I have one from work that I would really like to game on. Whisky, CrossOver, Parallels: I've tried them all.


At least historically, handhelds can't be too expensive if they want to achieve wide success.

The M2 Max isn't available on the open market (as a part), but if it were, it alone would likely cost noticeably more than what a full handheld can afford to cost.


I meant the CPU part of 8000G, not the GPU. Could've been clearer about it.


>Steamdeck with integrated RDNA2 is very capable with even modern games.

With a caveat: at its native 1280×800 resolution.


Sometimes even lower with upscaling. But yes, one "advantage" of handhelds is that, thanks to their tiny screens, they can run at lower resolution without too much visible degradation.


Can these integrated graphics units (APUs) work similarly to the Apple M series for generative AI inference, combining the GPU and system memory, and give AMD a competitive advantage?


Memory bandwidth is severely limited compared to the M1. So, not yet.


They currently don't.


What effective vram can they use?


If the memory is properly unified: all of it?


It's an iGPU. There's no dedicated GPU memory.
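For what it's worth, on Linux the amdgpu driver does report both a fixed BIOS carve-out ("VRAM") and a GTT pool of ordinary system RAM the iGPU can map on demand. A sketch, assuming the iGPU is card0 and amdgpu is loaded:

  # What the amdgpu driver reports for an APU: a fixed carve-out plus a shareable
  # GTT pool of system memory.
  from pathlib import Path

  DEV = Path("/sys/class/drm/card0/device")

  def gib(name: str) -> float:
      return int((DEV / name).read_text()) / 2**30

  print(f"carve-out 'VRAM': {gib('mem_info_vram_total'):.1f} GiB")
  print(f"GTT (shared RAM): {gib('mem_info_gtt_total'):.1f} GiB")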


If AMD instead made a graphics card with expandable DDR5 slots, that would be a serious threat to Nvidia's business model.


Why would a GPU with 1/3rd of the memory bandwidth and no CUDA be a threat to Nvidia?


Anything without CUDA is not an existential threat, but anything that supports PyTorch and provides lots of RAM at a low price is a monopoly-profits threat.


A regular CPU supports Pytorch and has lots of RAM at low price.

The problem is memory bandwidth.


There are lots of people finding M2 Max faster than graphics cards for their tasks.


How does the memory bandwidth of the M2 Max compare to that of the Ryzen 8700G? About 400 GB/s vs. 80 GB/s? Exactly my point. 400 GB/s approaches lower-end graphics card territory (e.g. a 3060).
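Back-of-the-envelope numbers, assuming the 8700G runs dual-channel DDR5-5200 (its officially supported speed) against Apple's published 400 GB/s for the M2 Max:

  # Theoretical peak bandwidth: transfer rate x channels x 8 bytes per 64-bit channel.
  def ddr_bandwidth_gbs(mt_per_s: int, channels: int, bus_bits: int = 64) -> float:
      return mt_per_s * channels * (bus_bits // 8) / 1000  # GB/s

  print(f"8700G, 2x DDR5-5200: {ddr_bandwidth_gbs(5200, 2):.1f} GB/s")  # ~83 GB/s
  print("M2 Max, unified LPDDR5: 400 GB/s (Apple's published figure)")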


Would AMD be able to produce a design with unified memory like that of Apple's M series, i.e. a Ryzen part with integrated GPU cores _and_ unified memory? If they did, it would open up a lot of interesting doors.

As good as Apple's M-series CPUs are, one cannot (reasonably) just build server rigs with them and exploit all that power.


That would eat into their discrete GPU sales. It would take courage to cannibalize their own sales.


They've already created such a design. It's called the MI300A and costs $20k.


No, it wouldn't. Socketed DDR5 doesn't have the bandwidth necessary to feed a massive compute complex like the one in a modern graphics card.


DIMM does not, but CAMM does.


Isn't that a latency issue?


I wonder if there is a detailed benchmark of Stable Diffusion on this hardware (and others)...


Is there a way to get this with working ECC RAM support? AMD lately seems to be going all out to prevent normal users from getting ECC support...


AMD leaves ECC support to motherboard vendors. ASRock has Ryzen boards with ECC support:

https://www.asrockrack.com/minisite/Ryzen7000/


Yes, but not really; see e.g. this comment from a relevant thread on the Framework laptop forum: https://community.frame.work/t/will-the-new-amd-boards-suppo...

That is an explicit "ECC: No" from AMD on the 7640U spec page.

There is currently no ECC mentioned at all on the 8700G spec page: https://www.amd.com/en/product/14066

Vendors can only do so much if AMD starts actively disabling/removing/preventing ECC, as it quietly seems to be doing now.


Now that's annoying. That specs page for the 8700G used to say "ECC Support: Yes (Requires platform support)" three weeks ago. They've since removed it.


Yes, I have just noticed this too.

Even weirder is that the AMD specification pages have stated that both Rembrandt (Ryzen 6000 series) and Phoenix (Ryzen 7040 series) support ECC, for about a year and a half.

Then suddenly, some time in Q3 2023, they have removed ECC from their specifications.

Because the Ryzen 8000G is just Phoenix or Hawk Point (AMD's information about this, i.e. about the speed of the NPU, has been contradictory) in another package, I was initially surprised that they had decided to support ECC in it, but now I see that they have reversed that decision again.


Because people get boards that do not support it and then complain to AMD.

Easier to just not claim it's there, like AMD used to.

That said, because AMD's UEFI modules always export their settings databases, you can always try to enable ECC even if the vendor didn't provide options to configure it (and various other things; beware of possibly damaging the hardware!).


Hopefully that's it. In the previous generation only the PRO parts had ECC and those were only sold to OEMs.


The PRO parts are "certified" to have ECC working.

Essentially it's binning: AMD takes a portion of the bins and runs them through more verification steps, including ECC tests. The rest don't go through ECC testing but contain the exact same parts, and, unlike Intel, AMD does not lock ECC out on them.

The CPUs are physically the same except for branding details.


This is (slightly) wrong. ALL Socket AM4 Ryzen CPUs without an iGPU have working ECC support.

ONLY AM4 Ryzen APUs (i.e., CPUs with an iGPU) with the "PRO" infix in their name have working ECC support.

However, ALL AM4 platform ECC support also depends on the motherboard vendor/firmware. There are boards on which ECC simply will not work, no matter if your CPU and DIMM combination actually supports it. You have to check the mobo specsheet to make sure it does - ASUS and ASRock, for example, support ECC on most of their boards with a suitable selection of CPU/APU and DIMM.

To illustrate the CPU situation with a few examples:

  - Ryzen 5 3600 - proper, working ECC support possible.
  - Ryzen 7 5800X - proper, working ECC support possible.
  - Ryzen 9 3900X - proper, working ECC support possible.
  - Ryzen 5 5600G - no ECC support possible.
  - Ryzen 7 5700G - no ECC support possible.
  - Ryzen 5 PRO 5650G - proper, working ECC support possible.
  - Ryzen 7 PRO 4750G - proper, working ECC support possible.
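Whichever SKU and board you end up with, you can verify at runtime that ECC is actually active on Linux through the kernel's EDAC sysfs interface (a sketch; it assumes the platform EDAC driver, e.g. amd64_edac, is loaded):

  # If no memory controllers show up under EDAC, ECC is almost certainly not in
  # use (or the EDAC driver for the platform isn't loaded).
  from pathlib import Path

  EDAC = Path("/sys/devices/system/edac/mc")

  controllers = sorted(EDAC.glob("mc*")) if EDAC.exists() else []
  if not controllers:
      print("No EDAC memory controllers registered: ECC likely inactive.")
  for mc in controllers:
      name = (mc / "mc_name").read_text().strip()
      ce = (mc / "ce_count").read_text().strip()
      ue = (mc / "ue_count").read_text().strip()
      print(f"{mc.name}: {name}  corrected={ce}  uncorrected={ue}")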


Apparently this might be specific to AM4 iGPU models?

I've seen suggestions that for the PRO parts, the memory controllers were redone. It wouldn't surprise me if the story is that Cezanne had a design error resulting in non-functional ECC, especially given that, unlike some other models in the series, they are not chiplet-based.


But parts that fail the PRO tests can be sold as non-pro parts, so you better hope you don't get one of those.


Ryzen desktop chips can support ECC; the consumer mobile range can't.


Are the new 8**G parts desktop chips, or are they consumer mobile chips in a desktop format?


With APUs, you must buy the PRO SKUs for ECC support. I specifically had to get the PRO 5650G (as opposed to the regular 5600G) for my home server for exactly this reason.


Just curious, but why opt for an APU in a server config? They tend to be marketed toward gamers on a budget, so not including ECC support isn't that big a deal.


Still need video output to set up the system and for occasional maintenance. Additionally, I host a media server that needs to transcode videos. The GPU in the APU is perfect for these tasks since it both costs less and draws less power than a discrete card.
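For the transcoding part, the usual route on an AMD iGPU is VAAPI through ffmpeg. A sketch of that pipeline (the device path and filenames are placeholders; your media server most likely issues an equivalent command itself):

  # Decode and encode on the iGPU via VAAPI so frames never leave GPU memory.
  import subprocess

  cmd = [
      "ffmpeg",
      "-hwaccel", "vaapi",
      "-hwaccel_device", "/dev/dri/renderD128",  # the APU's render node
      "-hwaccel_output_format", "vaapi",
      "-i", "input.mkv",                         # placeholder input
      "-c:v", "h264_vaapi",                      # hardware H.264 encoder
      "-b:v", "4M",
      "output.mp4",                              # placeholder output
  ]
  subprocess.run(cmd, check=True)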


So the 8700G's GPU is at the level of a GTX 1070? Not bad... The M3 Max is about 2x faster (2080 Ti level).


Not being able to beat an RX 570 isn't something to brag about. Those cards go for under 50 USD used and can be undervolted pretty easily. I'm just having a hard time figuring out why someone wouldn't go with a 7700 instead.


It would be interesting to revisit this review for the forthcoming Strix Halo, particularly for ML tasks on relatively large memory sizes that are prohibitively expensive to reach with GPUs.


I'm not sure how well this will work with Linux. I had an AM4 "5XXXG" CPU and there were constant freezes. The Linux kernel does eventually catch up, but it takes time.


I still have that CPU and I remember the freezes; they affected Windows too. It was because of the fTPM; you needed to update your firmware or disable fTPM.


Does fTPM not crash anymore? I had the same issue and have kept it disabled since then; can it be enabled now? I also have a 5xxx-series AMD chip.


https://www.phoronix.com/review/amd-ryzen7-8700g-linux

Appears to work well, although I've not heard of other people having the issues that you did on the 5000 series.

I'm still on the 3000 series so my experience is older, but always very stable even when new.


I just installed Ubuntu on a Ryzen 7950X, which has an RDNA2 iGPU. Since Ubuntu delivers both Chrome and Firefox through snap, both browsers are completely broken with hardware acceleration enabled. With Debian, which delivers Firefox from deb packages, everything works great... https://bugs.launchpad.net/ubuntu/+source/firefox/+bug/20045...


I apparently am forgetful, as the couch PC that I've got SteamOS on is a 5600X.

I've not seen any issues on that.


I have a variety of AMD desktops with XXXXG chips and laptops with AMD integrated graphics. Hardware video decoding doesn't work in most cases, neither in Chrome nor in Firefox. My next desktop and laptop will have Intel integrated graphics, the only option that works properly with Linux + Wayland Plasma + Chrome.

> The Linux kernel does eventually catch-up but it takes time.

Sometimes, and only to a certain degree.


As long as you don't run into any BIOS issues, it should work as well as recent AMD laptops do (which, AFAIK, work fine if you use a recent stable kernel).


Strange that they talk so much about the GTX 1650 but didn't bother to benchmark it.

Someone in the comments linked to an actual benchmark[0], and it looks like it's ~30% faster in most cases...

[0] https://www.computerbase.de/2024-01/amd-ryzen-8700g-8600g-te...



