
All of the actual work can easily happen in under a millisecond. What makes input-to-screen latency slow is anything asynchronous: polling on the input side (e.g. in the USB protocol itself, which polls the device even though the USB-to-PCI path uses interrupts!) and waiting for the next frame on the output side. Let's say the console waits for vsync to render its next frame, then the compositor grabs that frame and renders it on the next frame. That is one frame of gratuitous latency.
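To put rough numbers on that, here is a minimal sketch (Python; every figure is an illustrative assumption, not a measurement) adding up a worst-case budget from USB polling plus the two frame waits described above, at 60 Hz:

    # Hypothetical worst-case latency budget for one key press at 60 Hz.
    # All numbers are illustrative assumptions, not measurements.
    USB_POLL_INTERVAL_MS = 8.0     # full-speed USB keyboards are often polled every 8 ms
    FRAME_TIME_MS = 1000.0 / 60.0  # ~16.7 ms per refresh at 60 Hz

    budget = {
        "USB polling (worst case)": USB_POLL_INTERVAL_MS,
        "app waits for next vsync": FRAME_TIME_MS,
        "compositor adds one more frame": FRAME_TIME_MS,
    }

    for stage, ms in budget.items():
        print(f"{stage:32s} {ms:5.1f} ms")
    print(f"{'total':32s} {sum(budget.values()):5.1f} ms")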



Then there's also the input lag of your monitor, which is really important.

A lot of monitors have really bad input lag, in the 50-60ms range, and it's highly variable. This spec is usually not listed by the manufacturer, and it's not the same thing as response time, which is typically 1-10ms in most modern LCDs.

Your monitor's input lag plays a very big role in how fast key presses are perceived, because what makes something feel fast and snappy is ultimately the end-to-end time from you pressing a key to your eyeballs registering the result on screen.

The monitor I picked has about 10-14ms of input lag, which is very good compared to the average. That's running at 2560x1440 (60 Hz) at 1:1 scaling too.
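As a rough illustration of why the monitor's contribution matters so much (all numbers below are assumptions, not measurements), compare two hypothetical monitors in front of the same pipeline:

    # End-to-end "key press to photons" estimate with two hypothetical monitors.
    # PIPELINE_MS is an assumed input + OS + rendering latency, identical in both cases.
    PIPELINE_MS = 40.0

    monitors = {
        "average monitor (~55 ms input lag)": 55.0,
        "low-lag monitor (~12 ms input lag)": 12.0,
    }

    for name, lag_ms in monitors.items():
        print(f"{name}: ~{PIPELINE_MS + lag_ms:.0f} ms end to end")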

If anyone is interested in that sort of thing, a while back I put together a very detailed post on picking a good monitor for software development at: https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo...

I still use the same monitor today and I would buy it again if I were thinking about upgrading. Although I kind of regret writing that blog post now, because the monitor is almost twice as expensive today as it was 3 years ago.


> Physical size doesn’t constitute how much you can fit on a monitor. For example my mom thinks that a 25” 1080p monitor is going to let her fit more things on her screen than a 22” 1080p monitor. Don’t be my mom!

> The only thing that matters for “fitting more stuff on the screen” is the resolution of the monitor.

This is only true under the assumption that your eyes have infinite resolution. In the more likely case that they don't, the larger pixels on a physically bigger screen mean each UI element needs fewer pixels to stay legible, with the result that you can indeed fit more stuff on the screen at the same resolution.
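A quick back-of-the-envelope check of the pixel-density side of this (16:9 panels assumed, sizes taken from the quoted example):

    # Pixel density for a 22" vs a 25" 1080p panel, and the physical height
    # of a 12 px glyph on each (16:9 aspect ratio assumed).
    import math

    def ppi(diagonal_inches, width_px, height_px):
        return math.hypot(width_px, height_px) / diagonal_inches

    for size in (22.0, 25.0):
        density = ppi(size, 1920, 1080)
        glyph_mm = 12 / density * 25.4
        print(f'{size:.0f}" 1080p: {density:.0f} PPI, a 12 px glyph is ~{glyph_mm:.1f} mm tall')

The 25" panel comes out around 88 PPI versus roughly 100 PPI for the 22" one, so the same 12 px glyph is physically larger, which is why smaller (fewer-pixel) text stays legible on the bigger screen.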


It's not just eye resolution, it's how the application is designed. An older WinForms application will have a very compact UI and mad information density, but some modern web apps will dedicate an entire 1080p window to displaying one icon and two buttons.

I would say that, combined with DPI scaling, his post provides a reasonable rule of thumb.


One frame of latency is not gratuitous; it's perfectly reasonable to not draw directly "ahead of the beam" and instead double-buffer. Otherwise you get a lot of tearing.

I very much doubt the console is watching vsync either.


The extra frame I mean is: the console renders, vsync, the compositor grabs the frame and renders, vsync. That is what happens on X11 with compositing, AFAIK. Only one frame time / one vsync is really required before you consider the specifics of the software involved. I've read somewhere that Windows also has an extra frame of latency for similar reasons as X11.
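A sketch of that timeline at 60 Hz, for an update that just misses a vsync (illustrative numbers, not measurements of X11 or Windows):

    # Extra compositor frame at 60 Hz for an update arriving just after a vsync.
    FRAME_MS = 1000.0 / 60.0

    update_arrives = 0.0                               # just missed a vsync
    client_presents = update_arrives + FRAME_MS        # client's buffer is ready at the next vsync
    compositor_displays = client_presents + FRAME_MS   # compositor shows it one vsync later

    print(f"client presents at ~{client_presents:.1f} ms")
    print(f"pixels change at   ~{compositor_displays:.1f} ms (one extra frame vs. no compositor)")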


The answer is triple buffering and not using a compositor, or using triple buffering with a compositor that plays nice with it.
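For what it's worth, here's a toy model of what triple buffering buys you: the renderer never blocks on vsync, and the display always picks up the newest completed frame. This is my own simplified sketch, not any particular driver's or compositor's implementation:

    # Toy model of triple buffering: three buffers rotate between
    # "on screen", "newest completed frame", and "being rendered".
    from collections import deque

    spare = deque(["A", "B", "C"])   # buffers not currently holding a role
    front = spare.popleft()          # currently on screen
    ready = None                     # most recently completed, not yet shown
    back = spare.popleft()           # being rendered into right now

    def render_finished():
        """Renderer finished a frame; keep rendering without waiting for vsync."""
        global ready, back
        if ready is not None:
            spare.append(ready)      # an older never-shown frame is simply reused
        ready, back = back, spare.popleft()

    def vsync():
        """At vsync, flip to the newest completed frame, if any."""
        global front, ready
        if ready is not None:
            spare.append(front)
            front, ready = ready, None
        print("displaying", front)

    # Two frames finish between vsyncs; only the newest one is shown.
    render_finished(); render_finished(); vsync()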



