Apple's best GPU is close to a mid-range previous gen nvidia GPU?
I'm sure it'll run a 4 year old game on medium settings great! But that's kind of the point, that it doesn't stand a chance against the rest of the industry.
Who's supposed to be impressed by getting middling performance on Death Stranding 4 years after it came out? Literal definition of "also ran".
That's neat and impressive, but that's all they have. They don't have a high end. For $3000 you can get a top end gaming PC. There's no amount of money you can spend on a Mac to equal that.
I think you're missing the point that parent (and Siracusa) made - Apple invests a significant amount into the software and graphics stack, only to fumble it at the last minute by not having high-end graphics hardware and by not caring enough to court "triple A" game developers to their platforms, despite creating and maintaining Metal and this 20k-line WINE patch.
There's this weird mismatch of Apple dedicating a non-trivial amount of time in their keynote to "Mac Gaming", as if it's supposed to be impressive to finally play a 4-year-old game on a Mac, while they still don't ship high-end graphics hardware.
Apple silicon wins on performance per watt but not in performance outright and suddenly everyone cares about power consumption. Whichever spec everyone's favorite fruit company excels at gets put on a pedestal.
The PC industry is no longer driven by desktops; laptops took over long ago. There is a gaming PC crowd, but that is a small captive audience that wants performance, wattage be damned.
Apple is selling around 80% laptops versus desktops, and the rest of the industry is something like 77%. The fact Apple is winning the laptop GPU race doesn't mean it should automatically be entered into the desktop GPU race, where it is not winning.
I would take a guess that Apple is shipping (far) more TFlops of GPU power than Nvidia or anyone else in the mobile GPU market. Few people are buying laptops with 80-150w TDP GPUs, as those start to stretch the definition of both 'laptop' and 'battery powered'. Big gaming laptops with an hour of battery life are more akin to the luggables of yore.
That's fair. Nvidia has issues scaling their full systems down to laptop spec, and Apple almost has the opposite problem. They're both impressive in their own right, but right now Nvidia has both the performance and performance-per-watt crown in this space. The disparity in 3D applications (like gaming and Blender[0]) is so ugly it's not even close.
And in all fairness - Apple's products might not need more GPU power. Cyberpunk and Elden Ring appear to be CPU-bottlenecked, and if people are comfortable with upscaling they could get a pretty decent Retina experience. The 2D optimization and media accelerators are a good focus for mobile hardware. For more demanding applications, though, it looks like Apple's current approach is not scaling well.
Yeah I'm really curious what Apple's next-gen GPU (with raytracing and a bunch of other stuff) brings to fix some of these shortcomings. It was supposed to show up on last year's iPhone 14 followed presumably by inclusion in the M-series, and the 3nm process was supposed to be shipping this year, but everything got set back a year. In Mac-land the M2 wound up just being an overclocked M1, so we're left waiting for M3 to bring us a more competitive GPU.
The other half of the story is a lot of software (inc Blender, looking at these crazy results) just isn't well optimized and Apple is still struggling to win over developers in certain sectors of the market. Nvidia's decade+ investment in the software side has paid off so incredibly well for them, it's basically made the company.
shockingly, I think there might be more than one person on the internet and these people might have varying opinions
but yea you can say the same thing about tons of brands. Last summer all the AMD fans were talking about 1€/kWh electricity and saying they were going to buy whatever dGPU was most efficient... when that turned out to be Ada by a country mile, everybody pivoted to whining about price and bought RDNA2 GPUs with half of the perf/w.
During RDNA2 everyone insisted that a 10% perf/w advantage for AMD was a buying point, back during the Vega years they insisted that a 2x perf/w disadvantage didn't matter. Rinse and repeat.
I generally think power matters when it rises to the level of a tangible difference... 200W difference between 4070 and 6950XT means the latter is really a non-starter even if it's 10% faster (at a 5% higher price), especially considering the big-picture featureset (DLSS improves both perf and perf/w). And really it matters more in laptops. You're right that Mac Studio/Mac Pro are not really a place where it hugely matters, but, in a laptop, the next-best thing would be a Ryzen 6800U which is about GTX 1630 performance, so 3060 performance in the same envelope is a big step upwards!
And really this "big differences matter, small ones don't" applies to most stuff in general. 5% this way or the other, who cares. That kind of thing is often less important than general UX/quality/features, I'll take a laptop that's 5% slower but way longer battery life or better screen/trackpad/whatever. When things start rising to the level of 25% or 30% difference in some spec, or in price... yeah that's immediately noticeable.
But yea I generally agree that desktops like Studio or outright workstations like Mac Pro are dGPU territory and people are generally not looking for a super efficient iGPU with 3060 performance. On the other hand, being able to talk to 192GB of VRAM is definitely novel, especially with large AI models being the talk of the town this year (and accessible to even the most casual of artists/developers), and the unified APU approach with uniform memory/zero-paging has other advantages for development too. AMD had a lot of this stuff hammered out 10 years ago, supposedly, and then... just never did anything with it, other than sell it to consoles. It's great for PS5 and Xbox, why can't I buy a PC laptop with 96GB of unified/uniform memory with 3060-level performance in a 25W envelope?
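For anyone curious what that "uniform memory/zero-paging" point actually buys you, here's a minimal Metal sketch (my own illustration, not anything Apple ships as sample code): the CPU writes straight into the same allocation a compute pass would read, with no staging buffer or upload copy in between.

    import Metal

    // Minimal sketch of unified memory on Apple silicon: one shared
    // allocation, visible to both CPU and GPU, no explicit upload step.
    let device = MTLCreateSystemDefaultDevice()!
    let count = 1_000_000
    let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // The CPU writes straight into the memory the GPU will read.
    let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
    for i in 0..<count { values[i] = Float(i) }

    // A compute encoder can now bind `buffer` directly - no blit, no copy.
    // (Kernel dispatch omitted; the point is the missing upload step.)

On a discrete GPU you'd normally allocate a private buffer and copy the data across PCIe first; on an APU with uniform memory that whole step just disappears.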
Really I think a lot of the people who have bought MacBooks recently are not "traditional" Apple customers. The MBP and even MBA are legitimately really nice laptops with a good screen, good keyboard, good trackpad, good sound, etc. I have said before that I really think a lot of MBP customers would be interested in a "Macbook Tough" toughbook if they ever did that, although of course that's the most un-Jony Ive product possible.
There is a clear demand for a high-quality AMD-based non-GPU ultrabook using a 6800U or 7040U or whatever. Framework is the first company to even try, and they're using crappy 13" hardware on the upcoming AMD model while the market clearly wants more like a 15" or 16" (and their 16" will not have AMD boards). Why didn't anybody else do it first? Apple is catching on because they're filling a market niche that everyone else is ignoring, and they're not even really exactly filling it squarely, they just happen to be vaguely closer than the rest of the market.
And now that the nerd crowd has the hardware... the software is following. It's the same reason that CUDA has taken off while AMD's GPGPU programme has spun its wheels for 15 years, and the same reason AMD has good Linux drivers now. Give the nerds the hardware and innovation will follow - when they tinker they'll be tinkering with your platform.
Big missed opportunity for AMD, yet again. Or Intel, but, they're so far behind on APUs/integration that I think disappointment is basically the baseline expectation at this point. AMD had all the pieces, and yet again just chose not to do anything with them.
The most popular GPUs in use TODAY are 1650, 1060, 3060, 2060.
If Apple can get the M2 MacBook Air to run like one of these, it essentially makes the most popular laptop also the equivalent of the most popular gaming rig.
Two days ago we could speculate that maybe the $6000+ Mac Pro would bring better graphics performance, but now we know it's a $7000 Mac Studio with PCIe slots. And as far as we know you can't put a GPU in those slots.
Not that it would've been in my price range anyway, but it could've indicated that thunderbolt eGPU support would make a return.
Lack of that is a weird omission if Apple is trying to act like they have a gaming platform.
However, I imagine that they'd still like to sell more games in the Mac App Store (in addition to iOS ports, iPad games that can run on Apple Silicon, and Apple Arcade subscriptions) and this might help.
It might also make it easier to port games to Apple Arcade.
This is wildly false - see https://www.reddit.com/r/buildapc/comments/143ugg4/comment/j... for a real-world example of a gaming PC you can buy right now for $2,600. It's top of the line and an absolutely wild gaming machine that's vastly better at gaming than a $5,000 M2 Ultra Mac Studio, even if the comparisons are limited to games that actually work, and work well, on the Mac.
If you were just after an experience comparable to the fastest Macs on Earth, you could configure a PC that's another $1,000 cheaper than that ($1,600).
Actually, you can get Nvidia's top consumer GPU today, the RTX 4090, for $1,500-1,600. Go back one generation and you can get an RTX 3090 for $750, which still packs a punch.
So it's quite possible to build a well-performing gaming PC for sub-$2000 with RTX 3090 which is still significantly more performant than Apple's latest Mac, in terms of GPU throughput.
I snagged myself a gaming PC for $1,300 at last year's Thanksgiving sales; it came with an AMD Ryzen, an RTX 3080 (10 GB VRAM model), and 32 GB of DDR4 RAM. No way I could have gotten a Mac with that performance for anything close in terms of price.
Yea, that’s true as of today in the US. There’s no way you could get such a deal where I live (Norway) due to our weak currency, and it used to be the other way around just a few years ago
Great, you can keep playing your old ass game at low frame rates. Enjoy your power savings... the game still runs like crap and the GPU still isn't that great.
Game enthusiasts can be so weird. Do y'all even have fun playing games, or do you just keep buying hardware and optimizing settings until they run at 120fps, declare victory, and move on to the next AAA game?
I don't think I have worried about game frame rate since the days of Quake 1. I set my graphics settings to "Medium" and then spend the rest of my time actually enjoying my old ass games.
It's exactly why PC/console gamers (generally, not talking about GP) make terrible customers and Apple is right not to play with fire by courting them too closely - they're loud, cheap, immature, mercurial, and demanding, and being associated with them is probably a net negative for the brand. Let them stew in forums arguing over red vs green, tinkering with and voiding the warranty of their PC parts, being disloyal to the brand they loved 5 minutes ago, etc.
Better strategy is to make sure the door isn't closed on gaming for those that want to use their expensive Macs to occasionally play (protect the downside), rather than swing the door open enthusiastically for gamers to rush in.
Re: worrying about fun vs tinkering, I'm reminded of the 4 quadrants of hobbies. What we see on forums are generally gamers interested in gear & discussing, not 'doing the hobby'. https://brooker.co.za/blog/2023/04/20/hobbies.html
Yes, gamers have standards, and that's why GPU performance on the PC does not suck and why it is relatively affordable. A tough market to be in for sure; everyone wishes they could just be Apple.
In many genres of game, frame rate makes a huge difference in the ability to control the game and catch what's happening. Framerate is probably the worst thing you could have picked to mock when you're making an argument about enjoyment. Especially when citing an FPS.
The M2 Max is closer to the 3060 than it is to the 3060 Ti, let alone the 3070. And those numbers are quite possibly overly optimistic; workloads essentially never reach peak TFLOPS, and I would not be surprised if practical workloads are better matched to Nvidia's architecture than Apple's, if only through sheer industry momentum (but that could go either way).
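Back-of-the-envelope peak-FP32 numbers (using published ALU counts and boost clocks - approximate figures, and peak rates say nothing about what real workloads hit) land in roughly that order:

    // Peak FP32 = ALUs x 2 FLOPs per cycle (FMA) x clock in GHz / 1000.
    // All counts and clocks below are approximate published figures.
    func peakTFLOPS(alus: Double, clockGHz: Double) -> Double {
        alus * 2 * clockGHz / 1000
    }

    print(peakTFLOPS(alus: 38 * 128, clockGHz: 1.40)) // M2 Max 38-core ~13.6
    print(peakTFLOPS(alus: 3584,     clockGHz: 1.78)) // RTX 3060       ~12.8
    print(peakTFLOPS(alus: 4864,     clockGHz: 1.67)) // RTX 3060 Ti    ~16.2
    print(peakTFLOPS(alus: 5888,     clockGHz: 1.73)) // RTX 3070       ~20.4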
While the perf/watt is impressive, Apple is also using 4 times the number of transistors on TSMC's latest process - and the comparison here is Samsung's 8nm, I believe. It's not really all that impressive that that huge silicon investment has some results...
It's a tantalizing hint at what might be possible, but as it stands, I'm not really all that impressed, personally.
I didn't say 3070; that was from the parent: "who is impressed by 3070 performance [in a 25W envelope]?" And the answer is a lot of people.
I actually added the 3060 bit myself lol, because yeah, that seems to be more like where it actually lands, more like desktop 3060.
edit: also desktop vs mobile is a factor here too... mobile 3070 is not the same thing as desktop 3070, and coming in at desktop 3070 would actually be fairly impressive. Mobile 3060, much less so.
Sure, no quibbles on that front. Comparisons like this are always best taken with a lot of salt anyhow; the architectures are so different. And it's not like TFLOPS are a great predictor of gaming performance.
On the positive side: having such a solid iGPU is pretty much exactly what I've always wanted in this kind of device. Performance in the PS5 ballpark is clearly enough for a hell of a lot of things. Who really wants something much faster at the cost of much worse battery life?
But the air of incredibly ground-breaking technical greatness that apple manages to weave around its silicon seems a tad overdone. Given the amount of silicon, the process node, and the target tuning - this kind of result seems competitive with rather than outclassing their rivals.
I feel like you've got a chip on your shoulder about this for some reason
I've got a gaming desktop and also a MacBook Pro. If the next time I go on a trip, I'm able to play some games in my hotel room on (gasp) medium settings, with a device I already own, that I probably was already going to bring with me, that's a positive thing!
I also have a gaming desktop and a MacBook Pro, but I wish I didn't have to have a gaming desktop, because I much prefer Macs and macOS and I wish Apple were interested in competing. Then they dedicate a segment to 'gaming on Mac' to brag about porting a 4-year-old game to the Mac.
>it doesn't stand a chance against the rest of the industry
Apple seems to be doing okay. I mean, an order of magnitude more people game on iPhones than game on PCs.
Apple is trying to support ancillary gaming for users who chose their platform for other reasons. That's it. They aren't targeting the 1200 watt, 12-fan 4090 PCMR sorts. And that's okay.
Do they need the most powerful gaming hardware? Maybe just the fact that it might become reasonable to play most games on a MacBook is enough to get people who also want to game to buy a MacBook instead of a Windows laptop. And maybe that is enough to get developers to consider the Mac.
They do not need to compete with Nvidia for the top of the line.
Yeah, high end enthusiast hardware is in fact pretty niche, and I say this as someone with a 5950X/3080Ti tower. The vast majority of people playing games are doing so on pretty old/average hardware, a bar which is met and exceeded by several M-series Macs.
High-end hardware is niche, but the state of AAA game performance these days ... ugh. I can't even get a stable 60fps on Jedi Survivor with my 5800x/3080 Ti rig, even with lower settings. How bad is it on something like a 1060?
I paid a ton of money for my top-of-the-line M2 Max because I build large Rust systems that are high bandwidth and low latency, and it's as good as it gets for that. I also like to play games, but I don't need the cutting edge. I would rather eat glass than buy a Windows Bing-advertisement device and find another footprint in my home to install it just so I can get a higher frame rate than my eye can see. In fact, my strategy for gaming over the last 20 years has been to buy games that are 3 years old and devices that run them at their top end. No bugs, tons of reviews to guide my purchases, tons of mods, full DLC sets, always on sale. As long as I don't sit around feeling envy looking at what's cutting edge, following the 3-year wave front gives me precisely the experience folks had 3 years ago - but better. I'll have their current experience in 3 years, so long as I don't die, without all the bleeding-edge problems.
So, great. Apple lets me stay away from Bard-directed Bing advertising and OS-level spyware on my desktop, lets me simplify my computing footprint in my household, and provides me with games from a few years ago. Seems like a win-win.
Source: I am someone who paid significantly more money for a Max M2 Max
No corporation gets to take the moral high ground; they're amoral entities. But Windows is pervasively spammy now - the Start menu hosting ads was bad enough, but now it's the taskbar too. I'm trying to remember when I last saw a cross-sell in Apple's stuff - my memory is only when I'm in something like the TV app or the Wallet, or some place where the cross-sell is contextually relevant.
The more Microsoft and Google tilt towards becoming persistent privacy threats and advertising companies, the more Apple will see being the opposite as a differentiator - a hardware company with software services. I'm good with that dynamic, but I think it's useful to acknowledge that that's the case. Pretending Windows isn't a persistent adware/spyware bundle doesn't help the situation.
I think you'd be surprised. I just checked, and my desktop gaming GPU is only 31% faster than that nvidia, and I'm typing this on a MacBook Pro M2 Max.
I bought the laptop and the video card because they are quiet and their price/performance is better than the high end stuff anyway.
I just tried running steam on the macbook, and was very disappointed. My Linux gaming desktop will live on for another few years, I guess.
edit: I think I got the GPU in ~ 2019, though it was released in 2015.
Rosetta translation + D3D12OnMetal (which developers aren't allowed to use to publish their games, so you'll have to do it on your own and live with a subpar version) will happily eat that 30% difference. Not to mention the massive difference that drivers make, where Apple will never want, or be able, to do as much work as Nvidia does.
The 30% faster hardware is running Linux. The main reason I'm disappointed with steam on MacOS is that only a third of my library works at all, and that the stuff that does run is hit or miss, performance wise (especially the indie / casual games, which this hardware should laugh at).
Also, another 25% of my library actually was ported to MacOS, but it is 32 bit only, so it won't run on an M2. (Also, typing that sentence was painful.)
I guess if I want to run the vast majority of the MacOS software that I have ever purchased on an M2, my best bet is to install Asahi, and use the Windows ports under proton. Lame.
Steam has mislabeled many of the older Mac games as being incompatible. Several of them will work just fine. It's worth double checking on one of the Mac gaming wikis if you want a particular game.
Have no idea why the mislabeling happened. Maybe Steam is working solely off dates despite many games being 64 bit before the 32 bit cutoff.
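If you'd rather check a specific game yourself than trust Steam's label, lipo will tell you which architectures its binary actually contains. Rough sketch - the game path at the bottom is a placeholder you'd swap for the real .app bundle:

    import Foundation

    // Ask /usr/bin/lipo which architectures a Mach-O game binary contains.
    // "i386" only = 32-bit-only port (won't run on Apple silicon);
    // "x86_64" = runs under Rosetta; "arm64" = native.
    func architectures(of binaryPath: String) throws -> [String] {
        let lipo = Process()
        lipo.executableURL = URL(fileURLWithPath: "/usr/bin/lipo")
        lipo.arguments = ["-archs", binaryPath]
        let pipe = Pipe()
        lipo.standardOutput = pipe
        try lipo.run()
        lipo.waitUntilExit()
        let out = pipe.fileHandleForReading.readDataToEndOfFile()
        return String(decoding: out, as: UTF8.self)
            .split(whereSeparator: \.isWhitespace)
            .map(String.init)
    }

    // Example with a placeholder path, not a real install:
    // print(try architectures(of: "/Applications/SomeGame.app/Contents/MacOS/SomeGame"))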
The latest consoles are also running mid-range cards from a few years ago and are doing just fine. They run games at medium settings at lower FPS/resolution than PC, so games will mostly continue to target and work well on medium hardware. High-end PC gaming is the exception, not the norm.
The Mac audience is not trivial and has deep pockets, so as long as porting games is fairly easy, it's an obvious choice.