Expert overclockers successfully push DOOM Eternal past 1k frames per second (bethesda.net)
78 points by austinprete on Aug 28, 2020 | 67 comments



TIL that there's such a thing as a 'grandmaster overclocker.'

I also learned that literally pouring liquid nitrogen over a CPU from a cup is a grandmaster overclocker move.


If that's true, how would one rank pumping supercritical liquid nitrogen at high rates through a heatsink? Super-grandmaster?

Seems like the heat flow would be substantially impeded by any boiling of the LN2.

Or, for that matter, simply using a chilled copper ingot as the heat sink? There must be some threshold at which the limiting problem is getting the heat out of the die, not getting the heat out of the chip's package.


Exposing the die directly and removing heat spreaders is increasingly common in "grandmaster" overclocking circles.


Quantum computer research might be super-grandmaster

https://www.qats.com/cms/2019/08/02/quantum-computing-coolin...


They also screwed a cool little tower on top of the CPU and are wearing a Doom helmet. Don't underrepresent their mastery.


With no gloves, to boot!

Of course, the real grandmaster move is to use liquid helium - its boiling point is about 70C colder than nitrogen :)


Carrier freeze-out actually makes this non-workable -- there's a limit to how cold you can make CMOS devices before they stop functioning. To say nothing of the specific heat of liquid helium, which is minuscule compared to LN2.


Using a blowtorch as a heat source is common in extreme overclocking, to heat VRMs and memory modules up enough to boot the machine and keep it working.

Liquid helium might get into the device itself and cause other problems (e.g. iPhones stop working in the presence of helium).


Yeah, my favorite part of the article was the warning to please use proper safety precautions with liquid N, right under the no-gloves picture...


Due to the Leidenfrost effect, incidental contact of bare skin and LN2 is not usually a problem (it boils off locally and creates a gaseous N2 barrier). Other things superchilled by it will not be as forgiving, however.

http://cookingissues.com/primers/liquid-nitrogen-primer/#sec...


Then again, pouring a liquid out of a cup without touching the liquid isn't exactly high level coordination.


How does even N2 not crack and warp the packages and boards? If I plunged my TV into liquid nitrogen I would expect it to not work - in a violent way.


You insulate the motherboard. The aim is to have only the die in contact with the cup your LN2 is in.

There are still problems like cpu/board/components not working due to low temperature or condensation. Generally they are solved by better insulation and trying different components until you find ones that work.

Overclocking is 25% cooling and 75% picking parts that can operate at low temperatures and high frequencies.


Don't forget the crazy binning for ram sticks, cpu dies, and graphics cards. There's a reason all the top guys have major sponsorships, it takes a lot of money to find the top 1% of parts.


TIL processors can actually hit a stable 6.6 GHz if you just pump liquid nitrogen through them…


Today we didn't learn actually. They don't mention it being stable:

“It's a lot easier to control a benchmark which is always the same”, explains Rywak. "A game makes the whole process less predictable in terms of hardware load which could lead to stability issues while working on frozen hardware.”


Is it overly picky of me to wish they specified resolution and game settings?


You can see on the photo at the top that they were running at 1280x720. As for graphics settings I would guess minimum.

Edit: seems mobile and desktop have a different crop of that image. Here is the image that shows 720: https://images.ctfassets.net/rporu91m20dc/1XYHhlYZzNI1NxRRJl...


So, does it count if I push 5000fps on 320x200?


1x1px seems simpler. Or a 4k static background. I’d say they all count, though some are less exciting.


Well, there's always the 3990x humming along at 2.3 trillion instructions per second.


Looks like the benchmark was cpu bound, so I doubt going to an even lower resolution would help much.


Thanks - I even looked in the images for it.

That would make sense as well in their discussion about CPU power - that resolution would require a lot of it compared to, say, 4k.


Can't wait for my 1000Hz monitor!


My last CRT Iiyama monitor had quite a refresh rate - 120Hz or more - and the picture looked absolutely gorgeous.

It weighed a tonne and took up quite a lot of desk.


Monitors with high refresh rates (usually 144Hz, but up to 240Hz) are quite common among gamers these days.


And worse than CRT in almost every way.


Mass? Real estate? Ergonomics? Power consumption?


Also no x-rays


Any better than a CRT?


Not every high-hertz LCD will be better than a CRT in every dimension - black levels and off-angle viewing being the two points that a CRT could hope to lead on - but a 120hz OLED, on the other hand...


I have an MSI 32" 2K with 1ms gray-to-gray 144hz refresh rate. My old Philips CRT was 120hz. The new tech is absolutely better.


1ms gray-to-gray means nothing, since the transition from one color to another is not linear in LCDs.


At least it is a mouthful.


I'd think so. CRTs could never match the resolution or size of what we see in LCDs today.


CRTs did have a blurry and distorted image.


Can't wait for my 1000hz eyeballs!


Human eyes don't have a refresh rate, per se. I recall hearing that they can discern differences in frequency up to 3 kHz or so; though obviously you can go much slower than that and still feel smooth and relatively responsive.


Fighter pilots can recognize an object flashed at them for ... 1/240th a second? Something like that?

I would consider that approximately the upper bound.


Object recognition requires you to see an insane amount of detail. You can reconstruct a lot, but you still have to see a lot before you can recognize an entire object.

That puts the threshold for basic perception much higher; you may not be able to see the details of the objects flashed at you for 1/500 of a second, but you can tell that something was flashed. If a bright light is strobing at 500Hz, you can tell it's strobing, not just on.



What is interesting, from the screenshot, is that the game is actually CPU-bound, contrary to a commonly held belief in high-end video game optimization circles.


You can just lower graphics settings to get more out of the GPU, but there is no CPU equivalent to that like "lowering AI quality".


I’d want motion blur on then for super natural looking motion


But can it run Crysis?


No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.


No one is gonna make 12k at 1000hz for decades, at least. And if they do, it would be irrelevant without a comparable GPU.

An RTX 2080 Ti can't even push 144fps at 1440p on max settings, much less 4k.


Just the monitor cables alone would never handle the bandwidth required.


It easily pushes more in doom.


No you're right, I'm getting like 350fps on Doom Eternal. Impressive optimization.


You aren't obligated to own the highest spec hardware in existence.


Something is always coming down the pipe. Fortunately you have no need to walk the treadmill, and if you're only just getting a 4K screen now, then you're probably not the kind of insane early adopter who is.

12k is possible but it's hardly around the corner. Even 4K has both content and hardware issues around it. 8K is going to be the next mass market push but we're not even done with the 4K party.

Also, 4K display devices are available at a modest price point now. The bigger issue is content. We're mostly there with mass-market media, but if you want to drive a AAA video game at 4K resolution, you're having to make compromises and spend a lot on the hardware to drive it.

They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.


Frames per second is not the same thing as your display refresh rate. Further, I'm not aware of any monitor capable of 1000hz operation.


The fastest consumer ones seem to max out at 360Hz right now:

https://www.digitaltrends.com/computing/asus-rog-swift-360hz...


12k? Never heard of it. Let's go with 8k, which is 7680x4320.

Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care so much about your monitor, your required bandwidth becomes:

7680 * 4320 * 3 * 1000 = 99532800000

99532800000 / 1024 / 1024 / 1024 = 92 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!

To give a comparison, here's 4k@60hz:

3840 * 2160 * 3 * 60 = 1492992000

1492992000 / 1024 / 1024 = 1423 MB/s.

Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression but they think you won't notice) and other stuff aimed at trying to not really submit full frames all the time.

Forget your 12k. It will only be useful to increase your energy bill.

Edit: fix calculations.
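
For anyone who wants to replay the arithmetic, here's a minimal sketch in Python of the same back-of-the-envelope calculation (the function name is just for illustration; blanking intervals and link overhead are ignored):

    # Uncompressed display bandwidth: width * height * bytes_per_pixel * refresh_rate
    def display_bandwidth_gib(width, height, bytes_per_pixel, hz):
        return width * height * bytes_per_pixel * hz / 1024**3

    print(display_bandwidth_gib(7680, 4320, 3, 1000))  # ~92.7 GiB/s for 8k @ 1000Hz
    print(display_bandwidth_gib(3840, 2160, 3, 60))    # ~1.39 GiB/s (~1423 MiB/s) for 4k @ 60Hz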


In real life it would be subsampled 4:2:0 and 1000hz is nonsense because it's not divisible by 24 or 30. So a more reasonable 8k@960hz 4:2:0 is (7680 * 4320 * 960) * (8 + 2 + 2) bits = ~356Gb/s or only 35.6Gb/s if you pick a reasonable 96Hz.

By 960Hz even a lossless delta coding scheme on the wire could reduce the bandwidth by over 10X for any normal footage.
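
A quick sketch of those sums in Python, keeping the parent's 1024-based units:

    # 4:2:0 keeps full-resolution luma but shares chroma across 2x2 blocks,
    # i.e. 8 + 2 + 2 = 12 bits per pixel on average.
    bits_per_second = 7680 * 4320 * 960 * (8 + 2 + 2)
    print(bits_per_second / 1024**3)  # ~356 Gibit/s at 960Hz; a tenth of that at 96Hz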


But then if you add some kind of encoding to reduce the bandwidth you sacrifice the monitor response time that a gamer who wants to run doom @ 1000fps would demand, because decoding takes some time.


> 12k? Never heard of. Let's go with 8k, which is 7680x4320.

If you have a few million dollars to spare, you could jump to 16k: https://www.techradar.com/news/sonys-16k-crystal-led-display...


> 92 gigabytes per second of bandwidth

For comparison, Netflix was just barely able to saturate a 100Gbps (that's gigabits per second, so only 12.5 gigabytes per second) network link from one computer, and that's just pumping data without having to render anything.


At some point, it's not worth upgrading resolution. I don't know what that point is for you; but eyes only have a certain angular resolution, beyond which any additional resolution is meaningless.

For me, that’s a bit more than 1440p at 3 feet at 27”.


Neither your eyes nor your brain would be capable of coping with that.


I had a quick look around to see what the eye/brain can actually perceive, and the links below are interesting. We can appreciate frame rates far higher than I thought. A pilot identifying a plane displayed for 1/220th of a second (reddit link) is pretty impressive.

https://www.quora.com/What-is-the-highest-frame-rate-fps-tha...

https://www.reddit.com/r/askscience/comments/1vy3qe/how_many...


Yeah, and those tests were to comprehend the image to the point of identifying an aircraft. If you're just trying to identify motion, you could probably perceive much higher frame rates.


There is no proven limit of how many frames per second our eyes can see, and I'm sure you would be able to discern a difference between 144hz and 1khz. You may not be able to fully comprehend each still image, but the procession would almost certainly appear smoother, especially for fast moving objects.


You would easily be able to tell the difference. 500hz vs 1000hz, I'm not so sure. And I don't think anyone knows, like you said.


1000fps on a 1000Hz display gives you blurless motion without needing flicker:

https://blurbusters.com/blur-busters-law-amazing-journey-to-...

This is probably good enough in practice, although you can see differences even beyond 1000Hz by observing the phantom array effect of flickering signals during fast eye movement.


Nonsense. We wouldn't cope with it in the same sense we can't cope with the millions of colors in a 24-bit color space. Do we distinguish each individual color? No, but the full spectrum enables continuous color flow.

When it comes to such a high framerate, the upgrade is akin to going from 256-color palette-swapping VGA to 24-bit HD, or rather from stop motion to realistic motion blur.



