
No. I just bought a 4k screen only to find out that 12k is coming down the pipe. I do not want to think about what 1000hz 12k screens will cost. Stop this madness now.



No one is gonna make 12k at 1000hz for decades, at least. And if they did, it would be irrelevant without a comparable GPU.

An RTX 2080 Ti can't even push 144fps on 1440p at max, much less 4k.


The monitor cables alone would never handle the bandwidth required.


It easily pushes more in Doom.


No, you're right, I'm getting like 350fps on Doom Eternal. Impressive optimization.


You aren't obligated to own the highest spec hardware in existence.


Something is always coming down the pipe. Fortunately you have no need to walk the treadmill, and if you're only just getting a 4K screen now, then you're probably not the kind of insane early adopter who would anyway.

12k is possible but it's hardly around the corner. Even 4K has both content and hardware issues around it. 8K is going to be the next mass market push but we're not even done with the 4K party.

Also, 4K display devices are available at a modest price point now. The bigger issue is content. We're mostly there with mass-market media, but if you want to drive an AAA video game at 4K resolution you're having to make compromises and spend a lot on the hardware to drive it.

They're going to keep making new things. And the new things are going to have bigger numbers. It's okay.


Frames per second is not the same thing as your display refresh rate. Further, I'm not aware of any monitor capable of 1000hz operation.


The fastest consumer ones seem to max out at 360Hz right now:

https://www.digitaltrends.com/computing/asus-rog-swift-360hz...


12k? Never heard of it. Let's go with 8k, which is 7680x4320.

Assuming our current standard of 8 bits per color with no alpha (3 bytes per pixel), which may be too low if you care so much about your monitor, your required bandwidth becomes:

7680 * 4320 * 3 * 1000 = 99532800000

99532800000 / 1024 / 1024 / 1024 = 92 gigabytes per second of bandwidth you will consume just to pump stuff to your monitor. Better not use integrated graphics!

To give a comparison, here's 4k@60hz:

3840 * 2160 * 3 * 60 = 1492992000

1492992000 / 1024 / 1024 = 1423 MB/s.
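
As a quick sketch of the same arithmetic (assuming packed 24-bit RGB and ignoring blanking and link overhead; Python only for the math):

    def gbytes_per_s(width, height, bytes_per_pixel, hz):
        # raw pixel payload only; real links add blanking and protocol overhead
        return width * height * bytes_per_pixel * hz / 1024**3

    print(gbytes_per_s(7680, 4320, 3, 1000))  # ~92.7 GB/s for 8k @ 1000hz
    print(gbytes_per_s(3840, 2160, 3, 60))    # ~1.39 GB/s (~1423 MB/s) for 4k @ 60hz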

Also notice that 8k monitors already employ tactics such as "visually lossless compression" (which means: lossy compression, but they think you won't notice) and other tricks aimed at not actually submitting full frames all the time.

Forget your 12k. It will only be useful for increasing your energy bill.

Edit: fix calculations.


In real life it would be subsampled 4:2:0, and 1000Hz is nonsense because it's not divisible by 24 or 30. So a more reasonable 8k@960Hz 4:2:0 is (7680 * 4320 * 960) * (8 + 2 + 2) = ~356 Gb/s, or only 35.6 Gb/s if you pick a reasonable 96Hz.
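
Those figures spelled out (a sketch; assumes 8-bit samples, with 4:2:0 averaging to 8 + 2 + 2 = 12 bits per pixel, and binary gigabits as above):

    def gbits_per_s(width, height, hz, bits_per_pixel):
        return width * height * hz * bits_per_pixel / 1024**3

    print(gbits_per_s(7680, 4320, 960, 8 + 2 + 2))  # ~356 Gb/s
    print(gbits_per_s(7680, 4320, 96, 8 + 2 + 2))   # ~35.6 Gb/s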

At 960Hz, even a lossless delta coding scheme on the wire could reduce the bandwidth by over 10X for any normal footage.
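
A toy way to see why (not a real codec, just counting how much of a frame actually changes between two consecutive frames; the small moving patch is a made-up example):

    import numpy as np

    # Two consecutive 8k frames that differ only in a small region.
    prev = np.zeros((4320, 7680), dtype=np.uint8)
    curr = prev.copy()
    curr[0:100, 0:100] = 255           # pretend only this patch changed since the last frame

    diff = curr ^ prev                 # XOR delta: zero wherever nothing changed
    changed = np.count_nonzero(diff) / diff.size
    print(f"{changed:.4%} of samples changed")   # a lossless coder only spends bits on this fraction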


But then, if you add some kind of encoding to reduce the bandwidth, you sacrifice the monitor response time that a gamer who wants to run Doom @ 1000fps would demand, because decoding takes some time.


> 12k? Never heard of it. Let's go with 8k, which is 7680x4320.

If you have a few million dollars to spare, you could jump to 16k: https://www.techradar.com/news/sonys-16k-crystal-led-display...


> 92 gigabytes per second of bandwidth

For comparison, Netflix was just barely able to saturate a 100Gbps network link (that's gigabits per second, so only 12.5 gigabytes per second) from one computer, and that's just pumping data without having to render anything.


At some point, it’s not worth upgrading resolution. I don’t know what that point is for you; but eyes only have a certain angular resolution, beyond which any additional pixel density is meaningless.

For me, that’s a bit more than 1440p at 3 feet at 27”.
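
Back-of-the-envelope check (a sketch; ~60 pixels per degree, i.e. 1 arcminute per pixel, is the usual 20/20 rule of thumb, and the 27"/1440p/3-foot numbers are from the comment above):

    import math

    diag_in, h_px, dist_in = 27, 2560, 36         # 27" 16:9 panel at 2560x1440, viewed from 3 feet
    width_in = diag_in * 16 / math.hypot(16, 9)   # ~23.5" wide
    ppi = h_px / width_in                         # ~109 pixels per inch
    px_per_degree = ppi * dist_in * math.tan(math.radians(1))
    print(px_per_degree)                          # ~68 px/degree, already past the ~60 threshold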


Neither your eyes nor your brain would be capable of coping with that.


I had a quick look around to see what the eye/brain can actually perceive, and the links below are interesting. We can appreciate frame rates far higher than I thought. A pilot identifying a plane displayed for 1/220th of a second (reddit link) is pretty impressive.

https://www.quora.com/What-is-the-highest-frame-rate-fps-tha...

https://www.reddit.com/r/askscience/comments/1vy3qe/how_many...


Yeah, and those tests were about comprehending the image to the point of identifying an aircraft. If you're just trying to detect motion, you could probably perceive much higher frame rates.


There is no proven limit on how many frames per second our eyes can see, and I'm sure you would be able to discern a difference between 144Hz and 1kHz. You may not be able to fully comprehend each still image, but the motion would almost certainly appear smoother, especially for fast-moving objects.


You would easily be able to tell the difference. 500hz vs 1000hz, I'm not so sure. And I don't think anyone knows, like you said.


1000fps on a 1000Hz display gives you blurless motion without needing flicker:

https://blurbusters.com/blur-busters-law-amazing-journey-to-...

This is probably good enough in practice, although you can see differences even beyond 1000Hz by observing the phantom array effect of flickering signals during fast eye movement.
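
The rule of thumb behind that link, as a sketch (assumes a full-persistence display and perfect eye tracking; the 2000 px/s pan speed is just an example):

    def blur_px(speed_px_per_s, refresh_hz):
        # eye-tracked smear on a sample-and-hold display: one frame's worth of motion
        return speed_px_per_s / refresh_hz

    for hz in (60, 144, 360, 1000):
        print(hz, blur_px(2000, hz))  # ~33, ~14, ~5.6 and 2 pixels of smear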


Nonsense. We wouldn't cope with it only in the same sense that we can't cope with the millions of colors in a 24-bit color space. Do we distinguish each individual color? No, but the full spectrum enables continuous color flow.

When it comes to such a high framerate, the upgrade is akin to going from 256-color palette-swapping VGA to 24-bit HD, or rather from stop motion to realistic motion blur.



