Hacker News

I disagree. The fastest human reaction time is around 100ms (https://humanbenchmark.com/tests/reactiontime). I've just measured mine, and it's 230ms. As such, 10ms of lag wouldn't make any difference. I used a Shadow Tech PC for a while during the pandemic. With good upstream and downstream bandwidth it was a fairly decent experience, even for playing something like competitive Overwatch. I noticed the difference from a normal gaming PC due to other factors (quality of sound, etc.). Standard accessories worked seamlessly thanks to USB-over-UDP.



You can easily observe how significant latency is in videos like this: https://youtu.be/vOvQCPLkPt4?t=80 (Microsoft Research presenting its ultra low latency displays for touch interactions). Many mobile games have you drag and drop things, so it's not like it's just first person shooters that suffer from latency.

You're a layperson, you couldn't have known this: you're using words with a very specific meaning in streaming (like latency) and comparing them to human reaction times, which measure something else entirely. You kind of reasoned from first principles in a very Paulgrahamarian way, and it led you deeply astray. That happens. And you're not the only person doing this; this comment section is full of people who play games and parrot stuff they've seen on YouTube without a concrete grasp of what they're even talking about, so when it's laypeople shouting at laypeople, it's understandably just a bunch of blah.

One of the reasons I hate HN and write in throwaways nowadays is that the comments section is a better example of Knoll's law than actual journalism.


Thanks for posting this. This comment section has been particularly frustrating to read, since it's a mirror of what I've seen in the real world. There are teams at big tech companies making TERRIBLE decisions about the future of gaming because they don't actually understand how latency affects games, and they aren't hardcore gamers so they can't feel the effects themselves.

Even the ~50ms total latency you get from locally streaming over a 1ms wired network (from buffering/inappropriate firmware design) ruins whole genres of high level gameplay. You miss tricky shots in FPS games, you can't confirm/link in fighters, etc.


Wow. Condescending much?


A bit condescending yes but he showed a really good example of how input lag is noticeable.


Appropriate in response to the breathtaking arrogance-in-ignorance of what it was responding to.


This is the most factually true comment that I've ever downvoted.


That comment should be sent out as a blanket text message to everyone who commented on this post about latency IMO

The word needs to get out


Human reaction times have nothing to do with perceived input latency. There is a latency budget, different for every individual, that determines whether or not something will be an acceptable experience. This budget is divided among everything in the signal chain: the input devices, the computer/console, the monitor/TV, and any other processors along the signal path. Streaming games adds additional latency to that chain. Generously, if your target is 60fps and you have an 8ms round trip to their server, that's half a frame of added latency. On its own it's almost certainly imperceptible to most people, but it's not working in a vacuum, and most people don't live right next to the datacenter. It can very easily push the total over the threshold of what is acceptable to most people.


Humans can detect 10ms of latency easily. The problem is more than just reacting slightly later to events; it's also how quickly the game/system responds to your inputs, because it's a round-trip interaction. This usually ends up being where the latency becomes most noticeable to people. People can generally adjust for consistent latency, but any added latency is pretty noticeable once you get used to looking for it.

Also, 10ms ends up being close to the average added input latency from a single extra frame at 60fps, and you just have to look at the efforts that have gone into Super Smash Bros. Melee (especially in netplay) to see how far people will go for a single frame.


Practiced musicians begin to feel discrepancies in time starting at latencies as low as 10ms. I learned this when investigating whether bands could practice live over the internet (spoiler: most of them can't). Turns out that due to limitations of physics, even absolutely optimal connections still have enough lag/jitter to ruin it for professional instrumentalists.
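The ~10ms figure maps neatly onto room acoustics: sound travels about 343 m/s at room temperature, so acoustic delay alone reaches that threshold once players stand a few meters apart. A quick sanity check:

```python
# At what distance does acoustic travel time alone reach the
# ~10 ms threshold practiced musicians begin to feel?
SPEED_OF_SOUND = 343.0  # m/s at room temperature
threshold_s = 0.010     # ~10 ms

max_distance_m = SPEED_OF_SOUND * threshold_s
print(f"{max_distance_m:.2f} m")  # → 3.43 m
```

In other words, bands already work near that budget in a single room; a network round trip measured in tens of milliseconds blows well past it.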


The bigger issue is jitter. People can compensate for consistent delay (e.g. by leading shots in an FPS game). But when the delay is inconsistent and varies quickly, it becomes much more difficult to anticipate movements and execute time-sensitive maneuvers.


You're definitely right for most people, but even 10-20ms is noticeable by experienced players and can be very impactful at pro-level -- e.g. some high-level LoL players feel 35ms ping is unacceptably high for competitive play: https://afkgaming.com/esports/news/ls-talks-about-why-35-pin... (though it probably doesn't matter much for Stadia's use cases)


Doubtful. Pro gamers are known prima donnas. If anyone ever tested them with synthetic lag added in a double-blind study, I suspect they wouldn't identify it more accurately than random chance would dictate. Sorry, but the sheer speed of electrical signals/chemicals traveling through the body puts a constraint on that.


It's not even just "pro gamers", the most popular fighting game in the world (Smash Bros U.) is enjoyed by casual players and pros, and has an entire mechanic based on "two-framing" for edge guarding.

One absolutely does not need to be a pro to pull it off, and the whole interaction window for that mechanic is built around being able to react within ~33ms (1/30th of a second) to edge guard an opponent. It is dramatically harder to pull off in online play.
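The two-frame window is just frame-time arithmetic; the one extra frame of online delay used below is an assumed, typical value, not a quoted figure.

```python
# The "two-frame" reaction window expressed in milliseconds at 60 fps.
FRAME_MS = 1000 / 60          # ~16.7 ms per frame
window_ms = 2 * FRAME_MS      # two frames
print(f"window: {window_ms:.1f} ms")  # → window: 33.3 ms (1/30th of a second)

# A single extra frame of input delay from online play (an assumed,
# typical value) consumes half the window before the player even acts.
added_delay_ms = 1 * FRAME_MS
print(f"remaining: {window_ms - added_delay_ms:.1f} ms")  # → remaining: 16.7 ms
```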


https://www.youtube.com/watch?v=vOvQCPLkPt4&t=90s you can literally see the difference.


Unless cloud gaming companies intend to put servers in every single city across the globe, it's not going to work. Even in Boston with good fiber internet, streamed games have too much lag, and the compression artifacts are horrible.

When there is fast movement, the compression is much more noticeable, worse than the lag. Many reviewers doing graphical comparisons use static images. It's quite common for the whole screen to become a blur of compressed, pixelated blocks at the slightest network hiccup.

Also, you are misunderstanding what a "reaction time of 100ms" means. It does not mean that any event shorter than 100ms is imperceptible; it absolutely does not. The sound of a clap lasts about 22ms, and you are able to hear even shorter sounds. You can see light pulses of arbitrarily short duration so long as they are bright enough.

What a 100ms reaction time means is that you can't react to a given stimulus in less than that. Here's the important distinction: you don't react to lag, you perceive it.

To experience this for yourself, go to this lag simulator webpage [1] and experiment with various lag times. You will quite easily be able to feel the difference between 0ms, 100ms, and 200ms of added latency. Keep in mind this is on top of whatever latency the OS layers and browser sandboxing introduce.

1. https://www.skytopia.com/stuff/lag.html


Moore's law is our friend and in the future seamless cloud gaming will be possible.


You cannot reduce streaming latency with smaller transistors, nor with higher transistor density. Or is there a new Moore's law interpretation I'm not aware of that makes the speed of light in the transmission medium faster?
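The speed-of-light point is easy to put numbers on. Light in optical fiber travels at roughly two thirds of c (refractive index around 1.5 is the assumption here), and real networks are slower still due to routing and queuing:

```python
# Physics-imposed lower bound on round-trip time over optical fiber.
C_KM_PER_S = 300_000
FIBER_KM_PER_S = C_KM_PER_S * 2 / 3  # ~200,000 km/s in fiber

def min_rtt_ms(distance_km: float) -> float:
    """One-way distance in km -> minimum round-trip time in ms, fiber only."""
    return 2 * distance_km / FIBER_KM_PER_S * 1000

for km in (100, 500, 1000):
    print(f"{km:>5} km -> {min_rtt_ms(km):.1f} ms minimum RTT")
# → 100 km -> 1.0 ms, 500 km -> 5.0 ms, 1000 km -> 10.0 ms
```

No amount of transistor scaling touches this floor; only moving the server closer does.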


This is confused. Reaction time is irrelevant; you can still notice very short delays between two events. The fundamental issue is that when you make an input that corresponds to an action in a game, you expect that action to happen near-immediately, and anything else feels terrible.


I tried Shadow, and, well, you could really tell they host in a budget datacenter from how often there was stutter or missing keyframes (they host with OVH in Europe). I never had such issues with GeForce Now.

Also, I found it kind of scummy that they won't actually tell you what hardware you'll be getting beyond "4c/8t". Mine turned out to be a low-clocked Haswell, a CPU so outdated that Steam downloads were CPU-throttled. I used it for about an afternoon and then immediately cancelled.


Maybe not 10ms, but anything above 50ms is known and proven to degrade pro players' performance in competitive FPS games.


It matters for MMOs. If two pro gamers A and B both have a 100ms reaction time, but gamer A has 10ms ping while gamer B has 30ms ping, gamer A has a consistent advantage. This is not strictly a Stadia problem, but it may be exacerbated if the display stream adds latency on a slower line.


Pro gamers can definitely feel +/- 10ms of lag. It makes a difference at that level.


Agreed. I used Shadow for a bit, something like two years ago, and it was pretty seamless. Sure, not as good as having a local PC, but it was pretty darn close.


A test on a mouse click? Really? Finger travel time is going to dwarf whatever reaction-time differences you have.


100ms times are from cheaters; 200ms is probably closer to the absolute lower bound of human reaction time.


I don't know about that. I'm off form and this was my third try after going wired.

https://imgur.com/a/8inHYSf


Wow, this site is cool. I consistently get ~188ms; my best was 176ms. I wonder what the esports gamers get!

After doing it several times it let me save the score.

Reaction Time 181ms

74.46% percentile


Top-level players don't tend to do much better than slightly above average on these, because reaction time is something you train for a specific task.



