
> my experience is that it's flawless and indistinguishable from native in twitch games like fortnite and counterstrike

This is just your opinion, and it's wrong for at least half of the people who try it. Nobody using a mouse will find Fortnite or CS playable with 40ms of added lag. 40ms is more than double the mere 16.66ms you get from going from uncapped 60 FPS to 60Hz vsync (that is, running the monitor at 60Hz and enabling vsync: the framerate stays the same, but it's much laggier). Back in the 60 FPS days, EVERYONE complained about that mere 16.66ms difference, including people who had started playing their very first video game a day earlier. In twitch games, everyone still notices vsync and turns it off once they figure out it's the cause of the sluggish mouse feel.
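The 16.66ms figure above is just one frame period at 60Hz; a quick sketch of the arithmetic (the one-extra-frame cost of classic double-buffered vsync is the comment's own claim, simplified here):

```python
# One frame period at a given refresh rate, in milliseconds.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

# Classic double-buffered vsync holds each finished frame back
# until the next refresh, i.e. roughly one extra frame period of lag.
vsync_added_lag_ms = frame_time_ms(60)
print(f"{vsync_added_lag_ms:.2f} ms")  # 16.67 ms at 60Hz
```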

Your next idea

> The "trick" here is that native devices already have some amount of latency, just they are in an acceptable range for most people. However nvidia can optimize PC hardware to reduce the device's latency such that even with the network latency added it's still faster than the average person's native device. Hope that makes sense.

Is also utterly wrong. You don't need "le graphics pipeline" to make input lag noticeable. Just plug a monitor with 16ms of input lag into your computer and you'll notice the mouse is annoyingly difficult to position over anything on the Windows or Linux desktop.

OR another demonstration: write a program that moves a small shape around the screen using the mouse position. It waits until the monitor is about to start scanning out a frame from the framebuffer and only then draws the shape's new position at wherever the mouse is at that moment (since it's just a blank image with a small shape, this takes the CPU only a few microseconds). On a monitor without lag (like a CRT) running at 60Hz, the total input lag is now at most 16.66ms from the time between frames, plus perhaps 1ms from the mouse if you run it at 1000Hz like you should, plus some small delay from the OS. Now make the same program render the slow vsync way: draw the frame into temporary memory, do nothing for the next 16.66ms, then swap it into the framebuffer just before the monitor scans out the next frame. That adds another 16.66ms of input lag, and it will feel terrible.
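The worst-case numbers in that experiment can be modeled in a few lines. This is a simplification under the comment's own assumptions (60Hz lag-free display, 1000Hz mouse, OS/driver delay ignored), not a measurement:

```python
# Worst-case input-to-photon latency for the two strategies described:
#  - "race the beam": sample the mouse just before scanout begins
#  - "vsync double buffer": finish a frame, then hold it one full frame
# Assumes a lag-free display (e.g. a CRT); OS/driver delays are ignored.

REFRESH_HZ = 60
MOUSE_HZ = 1000

frame_ms = 1000 / REFRESH_HZ   # ~16.67 ms between scanouts
mouse_ms = 1000 / MOUSE_HZ     # up to 1 ms until the next mouse sample

race_the_beam = frame_ms + mouse_ms        # worst case, draw at scanout
vsync_buffered = race_the_beam + frame_ms  # one extra frame held back

print(f"race the beam : {race_the_beam:.2f} ms")
print(f"vsync buffered: {vsync_buffered:.2f} ms")
```

The second number is roughly double the first, which is why the vsync version of the same trivial program feels so much worse.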

Ergo, the far-greater-than-16.66ms lag added by a network streaming service will NOT magically feel less laggy than native hardware. And the idea that randomly optimizing small isolated components will fix input lag is laughable; it's all about timing. Almost no lag is caused by CPU (or GPU core) bottlenecks, aside from the most basic problem of all: a low framerate.




It's not 40ms of added latency; the chart can't be read that way. I agree 40ms would feel incredibly laggy, like 20fps laggy. That's not how GFN feels: it's perfectly smooth, smoother than native hardware. Just look at any video of people moving around in GFN. Here's a random one I found: https://youtu.be/q-Fzkp-az9Q?t=154

The chart really only says it's smoother than native hardware. Think about it: if you were comparing 16.6ms to 100+, the game would be running at something like 1fps and be completely useless. Come on, that's obviously not what it shows.


The chart says there is a total latency on PC of 80ms. That's unplayable; 80ms is actual garbage. Comparing it to consoles works because consoles are typically garbage these days: FPS/TPS on console sucks. Yes, GeForce Now may be comparable to a console, because you're just running the PC version of the game remotely, and PC versions of games probably have less lag precisely because consoles suck. But GeForce Now is still back at the 80ms a console might get, and that's still terrible.

> Like think about it, if you're comparing 16.6ms to 100+ then the game would be running at like 1fps and is completely useless.

Not how it works: you could have 2000ms of lag and still have a completely smooth game running at 500fps. But why are we talking about smoothness all of a sudden?
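That latency and smoothness are independent can be shown with a toy model: delay every frame by a fixed amount and the frame *intervals* (what smoothness is) don't change at all. The 500fps / 2000ms numbers are from the comment above:

```python
# Toy model: frames rendered at 500 fps but delivered 2000 ms late.
# Every display time is shifted, but the spacing between frames,
# which is what determines smoothness, is identical.

FPS = 500
LAG_MS = 2000.0

render_times = [i * 1000 / FPS for i in range(5)]   # 0, 2, 4, 6, 8 ms
display_times = [t + LAG_MS for t in render_times]  # 2000, 2002, ...

intervals = [b - a for a, b in zip(display_times, display_times[1:])]
print(intervals)  # every gap is still 2 ms, i.e. still 500 fps smooth
```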

That YouTube review is clickbait and bikesheds some trivial detail like tree render distance in one particular scene. I don't think it even discussed input lag. He did claim that "cloud gaming is bad" is a misconception because it feels good to him, though. Also, Genshin Impact is a laggy game. I've played it on a PC: it's framelocked way down to 60FPS, so it only uses a small fraction of the CPU/GPU, and it's still so laggy you can't aim properly with the mouse using an aiming weapon like the longbow. Even just moving around is tedious, which indeed gives me that post-2000s console feel. With vsync off, the game plays far worse than a misconfigured system; even any FPS like UT or Quake with vsync on feels more responsive than Genshin Impact.



