
I definitely hear you. As a heavy gamer, and someone who likes to do things fast to avoid breaking my train of thought, I find our current tools insanely slow.

The researchers telling me I don't notice 100ms delays are smoking something. Yes, human reaction time is around 200ms on average, but we perceive changes much faster than that. Moreover, the delays make it impossible to do "learned" chains of actions because of the constant interruptions.

Hackers typing insanely fast and windows popping up everywhere in movies? The reason that looks so unrealistic is simply that our tools don't behave like that at all.




Those researchers never played Quake2 / Quake3 / Unreal Tournament.

You can absolutely detect when your ping gets above even 25ms. It can't be missed.

> Hackers typing insanely fast and windows popping up everywhere in movies? The reason that looks so unrealistic is simply that our tools don't behave like that at all.

Right on. That's why, even though I have an insanely pretty Apple display (on the iMac Pro), I move more and more of my daily work to the terminal. Those movie UIs are achievable.

Related: I invest a lot of time and energy into learning every one of my tools' keyboard shortcuts. It pays off in productivity.


I would argue that it's more noticeable in those older games where they weren't using lag compensation and you had to lead your shots in order to hit other players. If you're testing on a game which has rollback netcode then lag matters less because the game is literally hiding it from you.

What task is actually being measured here matters, too. For example, while it is true that humans cannot generally react faster than 100ms or so, most actual skills tested by competitive gameplay are not pure reaction tests. They are usually some amount of telegraphed stimulus (noticing an approaching player, an oncoming platform, etc.) followed by an anticipated response. Humans are extremely sensitive to latency specifically because they need to time responses to those stimuli - not because they score well on snap reaction tests.

Concrete example: the window to L-cancel in Melee is really small - far smaller than humanly possible to hit if this were purely a matter of reaction time. Of course, no player actually hits that window by reacting. They don't see their character hit the ground and then press L. They instead press L several frames in advance, so that by the time their finger presses the trigger, their character has just hit the ground and the press lands inside the window. Now, if I go ahead and add two frames of total lag to the display chain, all of their anticipated inputs will be too late and they'll have to retrain for that particular display.
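The anticipation effect is easy to sketch in a few lines. This is a toy model with made-up frame numbers (not Melee's actual frame data): the player times their press against the landing *as seen on screen*, so any display lag shifts the real press late by exactly that many frames.

```python
FRAME_MS = 1000 / 60  # one frame at 60 fps, ~16.7ms

def real_press_frame(landing_frame, anticipation_frames, display_lag_frames):
    """Frame at which the press actually happens, given that the player
    presses `anticipation_frames` before the landing they *perceive*.
    Display lag delays the perceived landing, so muscle memory trained
    on one display slips late on a laggier one."""
    perceived_landing = landing_frame + display_lag_frames
    return perceived_landing - anticipation_frames

# Trained on a zero-lag display: press arrives 3 frames early - in the window.
assert real_press_frame(100, 3, 0) == 97
# Same muscle memory with 2 frames of display lag: press is 2 frames later.
assert real_press_frame(100, 3, 2) == 99
```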


All true. IMO the point is that people actually made an effort for things to both be fast and seem fast. Unlike today.


And input lag (eg. local, mouse-to-screen lag) gets you before that.


>> Moreover, the delays make it impossible to do "learned" chains of actions

Yeah, this resonates for sure. Multiple times per day I send Citrix ctrl+alt+break, down arrow, return (minimise the full-screen Citrix session, go to my personal desktop), and about 50% of the time an app inside the Citrix session gets delivered the down arrow and return keystrokes :-/


This. Any application that doesn't properly queue user inputs gets my eternal hatred. Either your application works at the speed of thought, or it properly queues things so that when it catches up it executes my commands in order.
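The queue-or-keep-up contract is simple to state in code. A minimal sketch, with a hypothetical API (no real toolkit names things this way): keystrokes that arrive while the app is busy get buffered, then replayed in arrival order once it catches up - nothing dropped, nothing reordered.

```python
from collections import deque

class InputQueue:
    """Order-preserving input buffering: the property the parent
    comment is asking for."""

    def __init__(self):
        self._pending = deque()
        self.handled = []                 # stand-in for "the app acted on this"

    def key_event(self, key, app_busy):
        if app_busy:
            self._pending.append(key)     # never drop input while blocked
        else:
            self.handled.append(key)

    def catch_up(self):
        while self._pending:              # FIFO: same order the user typed
            self.handled.append(self._pending.popleft())

# The Citrix chain from the comment above: the app freezes mid-sequence,
# but the remaining keystrokes still land, in order, after catch_up().
q = InputQueue()
q.key_event("ctrl+alt+break", app_busy=False)
q.key_event("down", app_busy=True)
q.key_event("return", app_busy=True)
q.catch_up()
assert q.handled == ["ctrl+alt+break", "down", "return"]
```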

Surprisingly, I find MS Windows native stuff to be head-and-shoulders the best at this queuing.


The Start menu itself seems to fail at this. And PIN entry on a locked Windows machine seems random as to whether it accepts the first keystroke as part of the PIN or not.



