That is on an M1 laptop. No dedicated GPU, not even an M1 Max. Just plain M1. Of course, it's not running super fast on Ultra settings, but imagine how slow it would be on a comparable Intel laptop with onboard graphics eating through your battery. Especially considering it only costs $1,299, which is not a lot for this kind of performance. On top of that, it's not even an x86 CPU, which is what Cyberpunk was developed for. So yes, it's fantastic.
If you want to set the bar that low for fantastic, be my guest. FPS per dollar, or any other metric I can think of, is lousy. You've not convinced me. I was impressed by Rosetta, but not by this game porting toolkit, for the record.
This not only does DX12 -> Metal, it also does x86 -> ARM, and still manages to give you decent performance. If you lower the graphics settings you can easily manage 30fps, which is enough on a laptop, considering it runs on battery.
So for a 2020 laptop chip, it's a pretty great achievement I'd say!
I don't know of any project that does x86 -> ARM this well
Well, if there are no changes made to the game whatsoever, then it has to be using Rosetta. Fantastic would be a performance hit at roughly DXVK levels. This is far short of that.
Rosetta is a known quantity: approximately a 20% drop. DXVK can do about a 20% performance drop in certain situations, and performs better than that in others.
This is about a 50% performance drop from translating DX12 to Metal, on top of the drop from Rosetta.
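To make the compounding explicit, here is a back-of-the-envelope sketch. The 20% and 50% figures are just the claims above, not measurements, and treating them as independent multipliers is a simplification, since the real cost depends on whether the game is CPU- or GPU-bound:

    # Assumed figures from this thread, not benchmarks:
    # ~20% hit from Rosetta 2 (x86 -> ARM), ~50% hit from DX12 -> Metal.
    rosetta_factor = 0.80        # keeps ~80% of native CPU throughput
    dx12_to_metal_factor = 0.50  # keeps ~50% of native rendering throughput

    combined = rosetta_factor * dx12_to_metal_factor
    print(f"{combined:.0%} of a hypothetical native port")  # -> 40%

So, under those assumptions, you would expect roughly 40% of what a native macOS port could manage, which is the gap being argued about here.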
50% drop of what? Where are you getting your baseline from? Because, as far as I know, there is no native macOS port of Cyberpunk 2077.
At 1440p, and if we take this [0] at face value, it would be 50% of the performance of an RX 6700 XT paired with a 5950X, both desktop parts, which I think is pretty good.
The m1 gpu is broadly equivalent to a gtx 1650 in a host of benchmarks. This is getting less than half the fps a gtx 1650 does at these settings, and I am being charitable.
I don't know why you are looking at 1440p (2560 x 1440) as the m1 here is running at 1440x900. While I'm at it, that 6700 XT posts 50fps, half of which is 25fps, which would indeed be alright. However, this is putting out less than 15fps most of the time.
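To put a number on that gap, taking both figures in this thread at face value (illustrative only, and note the two results are not even at the same resolution):

    # Figures quoted in this thread, taken at face value:
    m1_fps = 15          # rough upper bound of what the M1 is putting out
    rx6700xt_fps = 50    # the desktop RX 6700 XT result from [0]

    ratio = m1_fps / rx6700xt_fps
    print(f"{ratio:.0%} of the 6700 XT's framerate")  # -> 30%, well under half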
> I don't know why you are looking at 1440p (2560 x 1440) as the m1 here is running at 1440x900.
Ah, you are right, I messed up the resolution.
> The m1 gpu is broadly equivalent to a gtx 1650 in a host of benchmarks.
Which ones?
> This is getting less than half the fps a gtx 1650 does at these settings, and I am being charitable.
... without having to emulate both the CPU architecture and the graphics layer.
I mean, this is not really a debate. DXVK is not comparable, and Wine does not do the same thing either. We are talking about translating both CPU instructions and graphics API calls in real time, well enough that a triple-A game runs without any modification on a laptop with 16GB of shared memory.