
The latency issue is increasingly disappearing or at least becoming negligible in most population centers. For example, in my country almost every company uses Citrix (no affiliation) or similar workspace solutions, where the entire workstation is virtualized in a data center and you access it only via a thin client. Entire nations of people already work like this.

Cloud gaming will probably be the next frontier in this.




My experience is the opposite. In September 2023 I started my first job where Citrix was used for day-to-day tasks, and I would estimate the latency at around 300-400 ms. It was very noticeable and frustrating, especially when coding: I would type, know I had made a typo, and have to wait for the characters to actually show up on screen before I knew how many characters to backspace over. Switching windows and workspaces felt sluggish. This was with the server and the client in the same country.

It was a bad enough experience that whether the company works through remote desktop solutions is now one of my standard interview questions.


> The latency issue is increasingly disappearing or at least becoming negligible in most population centers

I believe that that's your experience. It's not really because the technology is improving, though; it's because you're growing older.

The latency is absolutely horrendous, and anyone who's used to a decently performing system will not agree with your opinion.

As a simple example: I can easily code for 6+ hours without a break on a good system; with these mainframe-style systems I'm going to take a break at least every hour, because the fatigue builds up so quickly. It's every little interaction: a simple input that doesn't appear for 50+ ms, a workspace switch that's delayed by 150+ ms.
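
Back-of-the-envelope, with made-up but plausible numbers (neither the interaction count nor the added latency is a measurement):

    # All figures here are assumptions, not measurements.
    interactions_per_hour = 5000     # keystrokes, clicks, window switches
    added_latency_s = 0.100          # ~100 ms extra per interaction vs. local

    wasted_min = interactions_per_hour * added_latency_s / 60
    print(f"~{wasted_min:.1f} minutes of raw waiting per hour")
    # -> ~8.3 min/hour, before the cognitive cost of every
    #    single action feeling slightly "off"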


Alternatively, they're young enough that they've never experienced good latency and don't know what they're missing. See:

https://danluu.com/input-lag/


Real-time cloud gaming only works with a very low-latency internet connection, which in practice requires a wired link, ruling out most non-city users (and still some city ones).
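
To put rough numbers on the budget (every figure below is a ballpark assumption, not a measurement):

    # Rough click-to-photon budget for 60 fps cloud gaming.
    budget_ms = 80                # roughly where input starts feeling laggy
    frame_ms = 1000 / 60          # 16.7 ms to render one frame
    encode_ms = 5                 # server-side video encode
    decode_ms = 5                 # client-side decode + display
    network_rtt_ms = budget_ms - frame_ms - encode_ms - decode_ms
    print(f"leaves ~{network_rtt_ms:.0f} ms for the network round trip")
    # -> ~53 ms: fine on wired fiber near a data center,
    #    hopeless on congested Wi-Fi or most rural links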

Not to mention that it's ridiculously wasteful.


What percentage of consoles/gaming PCs are actually in use for gaming in an average hour? 15%?


I'd say it's rather more wasteful for everyone to have an expensive rig to play games a couple of hours a day when we could share computing resources in the cloud.
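
Napkin math (both utilization figures are guesses, not data):

    # Naive sharing math; assumes demand can be pooled.
    local_utilization = 0.15      # a rig gamed on ~3.6 h/day
    cloud_utilization = 0.60      # shared GPUs, ignoring demand peaks
    hardware_ratio = local_utilization / cloud_utilization
    print(f"the cloud needs ~{hardware_ratio:.0%} as many GPUs")
    # -> ~25%, though correlated peak demand (everyone plays in
    #    the evening) claws a lot of that back in practice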


I would love to have this, but it is very naive in the current economic climate. Apple, Google, and Microsoft are already pushing to get rid of general-purpose computing devices. They would immediately use the opportunity to corner the market, lock everything down, and start extorting more money. There is no way I would give up my personal desktop.


By definition, not everyone has an expensive rig.

But you're right, one shouldn't automatically assume that streaming is more wasteful than letting the resources sit idle... (one issue here is the assumption about how quickly computers get replaced for consumerist reasons?)

(And this would still leave the issue of the loss of ownership.)



