Sony and Microsoft set rivalry aside for cloud gaming alliance (nikkei.com)
322 points by jmsflknr on May 16, 2019 | 261 comments



It seems like all the major players are taking game streaming seriously. I'm still not very sure about the viability.

There's the obvious DRM win. Accessibility will be higher with much lower or non-existent hardware costs. These probably look very enticing on a corporate slide deck. Not to mention the subscription model.

On the more technical side, game streaming is only really desirable for games that don't run well on commodity hardware (think AAA 3D action titles). Latency is very important for these titles, and the general rule of thumb is that 50ms of input latency is readily noticeable to humans. Average home connections have ~20-30ms round-trip latencies, and those numbers are neither universal nor consistent. Combined with hardware input, world simulation, and rendering latencies, it can be difficult to get the total low enough to be comfortable.
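Back-of-the-envelope, the budget might break down something like this (all numbers are my own illustrative guesses, not measurements of any real service):

    # Rough, hypothetical latency budget for one streamed frame.
    budget_ms = {
        "controller / input sampling": 8,
        "uplink (input -> server)": 15,
        "server simulation + render": 16,   # one 60 fps frame
        "video encode": 5,
        "downlink (video -> client)": 15,
        "video decode": 5,
        "display refresh / scanout": 8,
    }
    print(sum(budget_ms.values()), "ms end to end")  # 72 ms, already past 50

Even with generous guesses, the network legs alone eat most of a 50ms budget.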

Modern games just have the client run its own world simulation and reconcile with the server later to hide the network latency. That strategy won't be possible with streaming. Is there some alternate optimization that can be made for streaming? The architecture will definitely be simpler with the mainframe paradigm. Maybe, since there's only one client, it could be feasible to send some batteries-included chunk of frames that can be cheaply selected from based on the next set of inputs. Is there any hint that progress can be made in this direction?
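For what it's worth, there has been published research in exactly this direction - Microsoft Research's Outatime project explored speculative frame delivery for cloud gaming. A toy sketch of the idea (everything here is hypothetical, not how any shipping service works):

    # Server renders one candidate frame per plausible next input; the
    # client picks whichever matches what the player actually pressed,
    # hiding one round trip.
    POSSIBLE_INPUTS = ["none", "left", "right", "jump"]

    def simulate(state, inp):
        return state + {"none": 0, "left": -1, "right": 1, "jump": 10}[inp]

    def render(state):
        return f"<frame for state {state}>"

    def server_render_speculative(state):
        return {inp: render(simulate(state, inp)) for inp in POSSIBLE_INPUTS}

    def client_present(candidates, actual_input):
        return candidates.get(actual_input, candidates["none"])

    frames = server_render_speculative(state=0)
    print(client_present(frames, "jump"))

The obvious catch is that bandwidth and render cost multiply with the number of speculated inputs, and analog inputs don't enumerate cleanly.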

If the latency problem can't be solved, I'm definitely bearish on game streaming. Latency-insensitive games, for the most part, are not difficult to run on commodity hardware.


I have no idea how reliable it is, but quoting wikipedia [1]:

"Testing has found that overall "input lag" (from controller input to display response) times of approximately 200 ms are distracting to the user. It also appears that (excluding the monitor/television display lag) 133 ms is an average response time and the most sensitive games (fighting games, first person shooters and rhythm games) achieve response times of 67 ms (excluding display lag)."

These numbers are from the PS3/XBox 360 era, which certainly sold massively well. Since display lag is not included in those 67ms, that is what these streaming services would have to aim for. It doesn't sound completely insane given the ~30ms round-trip latencies of home connections.

On a more subjective note, I think it's going to be really hard for these services to chase down high-end gaming PCs. On the other hand, lots of games don't run particularly smoothly on the consoles these services are directly aiming at (especially the base PS4 and XBox One, rather than the PS4 Pro or XBox One X). I can perfectly imagine a Stadia version of The Witcher 3 running circles around the PS4 version, which lags so much... But I don't think it would hold up against the PC version.

But I don't game much at all anymore, so my knowledge is mostly outdated. The last games I played were 80 Days and A Dark Room on my iPhone (both pretty good games). So I guess I'm now more in the target demographic of Apple Arcade.

[1] https://en.wikipedia.org/wiki/Input_lag


I followed up on the wiki article. The citation was just a Eurogamer article that had little to do with human perception of latency. It did have this nugget: “Criterion said it was aiming for 50ms latency [for Burnout Paradise]”, which agrees with what I’ve heard making handheld electronics. The goal was always <50ms latency.

Personally, I noticed a significant improvement switching to a 165Hz monitor in basically every first-person game. The only difference there is going from 16ms to 6ms frame granularity (10ms off the worst case). I’m not sure how much of it is visual smoothness versus latency, but I’d gladly take refresh rate/fps over UHD.

I’m definitely in the enthusiast group though. I didn’t even realize that console players dealt with such high latency. It will be very interesting to see how the market plays out.


Hm... the topic is quite complex. Human perception is actually much slower than many people want to believe. Parts of the human vision system react no faster than around 1/25th of a second (other parts are faster - it's complicated!). Hearing is similarly divided. AFAIK, auditory and visual stimuli are processed together and merged over a rather large time window (to the point where one system can induce illusions in the other). But we are able to perceive time delays within a complex stimulus with very high precision.

I am not totally convinced that a high refresh rate monitor allows human vision to respond faster. I would rather suspect that most game loops are VSync-locked and a higher refresh rate leads to better input sampling and processing.

Maybe I should set up a simple test where I show a non-interactive sequence at various framerates and ask users to identify the highest framerate. And then repeat with interactive camera controls.


>I am not totally convinced that a high refresh rate monitor allows human vision to respond faster.

Have you ever watched a high-level FPS player? They can aim and shoot in the span of roughly 1/5 of a second, occasionally even faster. That's 200ms total. At 25Hz, a single frame takes 40ms; at 165Hz, it takes 6ms. A big part of reaction time is going to be input processing, but a lot of it is muscle memory at that skill level. With two equally skilled players, one will see the other up to a whole 34ms sooner, in a process that takes about 200ms total. That's going to show up as a highly statistically significant advantage. Of course, the increased sense of connectedness to the game at higher rates is another big advantage, but these effects combine. Pit one skilled shooter against another at equal refresh rates, but with one dealing with 34ms of additional input latency, and they're not going to play as well as they should.
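The frame-time arithmetic, for anyone who wants to check it:

    # Worst-case wait until the next refresh, from the refresh rate alone.
    for hz in (25, 60, 144, 165):
        print(f"{hz:>3} Hz -> up to {1000 / hz:.1f} ms")
    # 25 Hz -> 40.0 ms, 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 165 Hz -> 6.1 ms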

Competitive players explicitly disable vsync, because the higher the game framerate, the faster it processes input. You also see updated information on screen sooner mid-refresh, but that doesn't help as much as a genuinely higher refresh rate.

All of this only matters if cloud gamers were playing against locally-rendered players in the same match though.


There is a reaction delay but it is added on top of all the other delays. If the monitor is delayed by 1/25th of a second and your perception is delayed by 1/25th of a second then the worst case latency is 2/25th of a second. That is a 100% increase in reaction delay. If you think a 100% increase isn't significant what do you think about the 250% (or more) increase that streaming will cause?


I am not arguing against the existence of such an effect. I am loath to state that the cause is your eyes seeing more images per second on screen. I am not convinced that a higher framerate improves perception of the image content above ~50Hz. I rather suspect that the traditional coupling of game simulation and input processing rates to the screen refresh rate is the cause: games seem more responsive because input sampling is more precise when the game loop runs faster.


>I have no idea how reliable it is

This is not reliable at all. You can really feel 100ms latency.

That's why in the world of pro music, barely anything above 10ms keypress-to-sound latency is deemed acceptable.

Quake III will make all these streamed games feel slow.


Console gamers have long been conditioned to accept high input lag with laggy televisions, controllers and low-dexterity games and game mechanics.

Quake III makes all modern games feel slow. But that hasn't had the slightest effect on the typical gamer's mindset. If only certain types of games work with cloud gaming, then AAA publishers will only make those types of games. Exactly like the case with EA and microtransactions.


Well sound uses different neural pathways than audio, so it's not a totally fair comparison.


100ms is pretty bad in a shooter. Around 60-70ms is where I can't distinguish the difference. But I am very far from a great player.

I can see this working very well for games like Total War where hardware specs are super high but response time is not critical


Is this a typo? Anyway, audio feedback is a critical part of games. Pressing the "fire" button and waiting 100ms for the sound of the gun would be insanely noticeable


The inverse is also true. Sound triggers and local animations are often used to mask latency in games like DOTA, which need a server round trip before your commands have any effect on the game world. It works remarkably well to make games feel responsive even when they’re not.

However, with these streaming systems I’m not sure developers will be able to use tricks like that. It seems like these tech stacks are running all code remotely. I feel pretty skeptical about whether they can make the game feel good enough this way.
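The masking trick itself is simple to sketch; here's a toy version (all names hypothetical) where the cosmetic feedback fires instantly and the authoritative result arrives a round trip later:

    # Instant local feedback masks the server round trip.
    def play_sound(name): print(f"[instant] sound: {name}")
    def start_animation(name): print(f"[instant] anim: {name}")
    def send_to_server(cmd): print(f"[~1 RTT later] server resolves: {cmd}")

    def on_fire_button():
        play_sound("muzzle_flash.wav")   # plays before any confirmation
        start_animation("recoil")        # likewise purely cosmetic
        send_to_server("fire")           # hit/miss resolved ~1 RTT later

    on_fire_button()

With full streaming, even that cosmetic layer lives on the far side of the round trip, which is exactly the problem.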


you mean video? "pathways than audio"


Actual video gamers right now go out of their way to get displays with very low input lags. 1080p144hz TN low input lag displays are used quite a bit in esports to shave a few msec. You literally win more often if you react faster. People aim to get TVs with sub 16ms input lag.

However that's not most of the market. The appeal of cloud gaming is for single player non-competitive games. These are the most pirated games and the games where input lag matters the least.


In arena shooters like quake and similar games, input lag of 10ms (1-2 frames) is noticeable. 200ms would be an absolute joke.


You can probably do even better than 20-30ms round trip if you're one of the large Cloud companies with data centers everywhere. Amazon so far doesn't seem interested, but Google and Microsoft definitely have the infrastructure to pull it off.

Sure you won't get high end PC gamers, that's not really the target demographic here. It's the millions of console and mobile gamers.


I can easily notice the difference between 20 and 40 ms in FPS games. I have gotten out of FPSes due to the hassle of keeping up with hardware requirements, but a streaming service like this could make it possible again. That being said, if they can't stay below 50ms I probably wouldn't waste my time.


The Stadia trial convinced me that streaming is a near-term viable option. Latencies were not an issue. Bandwidth was, but when I got to play on a 50Mbps+ connection any issues I had encountered disappeared.

Moreover I was able to play a AAA title on my 2011 Macbook Air and that was fantastic.


What kind of games were played? Whether or not latency is an issue is hugely dependent on the title. A game like Sekiro: Shadows Die Twice where the bulk of the game play revolves around precision timing is going to be killed by any significant amount of latency. Unless it does something smart like trying to compensate for the latency on the client end (e.g. make a block succeed even if it was late, if it suspects that the player blocked at the right time from their perspective when viewing the delayed stream) I can't see how a game like that would be playable on Stadia.


> Unless it does something smart like trying to compensate for the latency on the client end

This is actually already done in many AAA titles. The idea is to basically always be showing a prediction of what will have happened in ~70 ms, and update your state model accordingly. Given a datacenter to run the games on and all of the announcements about being able to take a snapshot of the state, this doesn't sound super far-fetched.


But Stadia can't use these techniques. The game engine and rendering aren't run on the client side. Normally, in a networked game, the client displays the user's actions immediately, and the lag is only seen from other players' perspectives.

On a streamed game, all interactions need a round trip before the user sees any response.


Didn’t Google say something about using AI for this kind of game state prediction at Stadia presentation?


Doing anything where mispredictions let a few frames of the wrong thing escape to the user is going to make games feel really inconsistent. If the game is running at 60fps and you have a system that can predict with 99% accuracy (which seems wildly optimistic), you're still going to see mispredictions at a pretty high rate (at 60fps, 1% works out to a mispredicted frame every couple of seconds).

Not to mention that the game has to be written specifically to allow the simulation to be rolled back and forth. For example, a misprediction triggers a sword swing it turns out you didn’t make, which traps the player in that animation for a second. You end up giving the player a horrible experience where the game seemingly has a mind of its own. Or you can roll back and interpolate from the bad state back to a good one, which would be ideal. The middle ground is to roll back and have a visible cut in the action at every misprediction. Then you realise this is happening every second or two.
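For reference, a bare-bones rollback loop looks something like this (a toy sketch; real rollback netcode is far more involved):

    # Keep per-frame history; on a misprediction, rewind and re-simulate.
    def simulate(state, inp):
        return state + inp

    state, history = 0, []     # history: (state_before, input) per frame

    def advance(predicted_input):
        global state
        history.append((state, predicted_input))
        state = simulate(state, predicted_input)

    def confirm(frame, actual_input):
        global state
        old_state, predicted = history[frame]
        if predicted != actual_input:          # misprediction: the player
            state = simulate(old_state, actual_input)  # sees a snap here
            for _, inp in history[frame + 1:]:  # replay the later frames
                state = simulate(state, inp)

    advance(1); advance(1); advance(0)
    confirm(1, -1)   # frame 1 was wrong; rewind and replay
    print(state)     # 0, as if the right input had been used all along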


Game state prediction doesn't cut it, or rather there's no point to it. The rendering is done server-side - remember they're streaming HD AAA games to televisions and tablets. There's always going to be a round trip between pushing a button and something happening onscreen. Google did say something about device inputs being sent straight from the controller to the server. Maybe this means they will cut down on latency between the input and the server, but it's still a trip back down from the server to the display on top of that.


I believe they're talking about the Assassin's Creed: Odyssey trial.


Yep!


Whether or not latency is an issue is hugely dependent on the title.

It's hugely dependent on particular network conditions. I wonder if SpaceX Starlink might have an option to optimize for latency?


It certainly will but it will be premium for high frequency trading.


I'd be quite satisfied with a reasonably priced tier that guaranteed 40ms round trip or better 99.8% of the time. The HFT levels could shave that time off and start stacking 9's for exponentially more money.


Stadia has a "frame token" that can be sent with the user input for cases like this.


I’ve played a lot of Sekiro on XBox One. I’ve played Bloodborne on PS Now and it works great. They have a free trial; you should check it out.


Bloodborne on PS Now? That sounds like hell. Plenty of later bosses have very narrow dodge windows. I already have a physical PS3 and PS4 so there's no reason for me to get PS Now. A couple co-workers of mine used it and none of them liked it. It works okay for RPGs, especially turn based RPGs where there is zero emphasis on timing or reactions.


> sounds like

This is why I'm suggesting you try it. Because to be more blunt, you do not know what you are talking about.


If you're close enough to the server it's possible for network latency to not add much compared to the combined latency from the tv + console + wireless controller. (most modern tvs add at least 20ms, even on "game mode", some add more than 100ms).

It's a different story on a pc with a decent gaming monitor though.


For a networked multiplayer game, the total amount of network lag is effectively the same, except that the lag is moved from between client and server to between client and "display". Since the server and client are now just one instance, there is zero lag between them.


Considering that pretty much all multiplayer games in existence perform some sort of local prediction / state interpolation to hide the lag on the local machine, cloud-only multiplayer will be considerably worse. After all, you can no longer hide the lag locally, since you're not computing anything on the local machine, so the minimum perceived lag will go from 0 (for movement of your own player character) to the round trip between the datacenter and your PC :/


Not entirely true. Many networked games use client-side prediction and collision detection (at least for collisions against terrain and walls; weapon hits are often computed on the server to curb cheating).

If you have a 100ms one-way latency to a Stadia server, then there's a 200ms delay between pushing forward on the stick and seeing your character move. This is not the case in a networked game; the client starts moving your player immediately (even if there is a 100ms delay to send those updates to the server).
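A minimal sketch of that prediction-plus-reconciliation pattern (names and numbers are illustrative, not from any particular engine):

    # Client applies input instantly, keeps unacknowledged inputs, and
    # replays them on top of each authoritative server position.
    pending, predicted_pos, seq = [], 0.0, 0

    def local_input(move):
        global predicted_pos, seq
        seq += 1
        pending.append((seq, move))
        predicted_pos += move               # visible on screen this frame

    def server_update(acked_seq, server_pos):
        global predicted_pos, pending
        pending = [(s, m) for (s, m) in pending if s > acked_seq]
        predicted_pos = server_pos + sum(m for _, m in pending)

    local_input(1.0); local_input(1.0); local_input(0.5)
    server_update(acked_seq=2, server_pos=2.0)  # server has seen 2 moves
    print(predicted_pos)                        # 2.5, no rubber-banding

None of this can run on a pure streaming client, which is the point.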


Except, that's not actually the case as you still have latency between servers.


This is only relevant if you're deliberately splitting the game across multiple servers (MORE work) so that people living in different cities can play against each other and have low latency to their own server. Other than that there's no reason to run one multiplayer game on more than one server.


That's right. If Stadia does what Google says it does, then the gaming console business faces long-term risk. I was quite impressed with the Stadia demos.


Rather than purely streaming frames, you could push a frontend client (e.g. WebGL + wasm) on demand, cache artifacts locally, and send view-independent texture/mesh streams down. Doing so is technically more complex than a rigid client/server split, but if they start with full streaming they can progressively make the client more intelligent, without being burdened by any specific hardware/client infra choices.

It's also easy to port edge-rendered games to many platforms. It makes sense for MS to ally with Sony and shift their effort to cloud before both the Xbox and PlayStation are little more than rich web clients.


Streaming anything but frames defeats the purpose of a frame streaming service.


Yep, see polystream.com


Running managed clients in a "compute at the edge" architecture that streams video to the end-user might be a good compromise.

CDNs are deploying compute/lambda at the edge capabilities, so "gaming at the edge" could be seen as a reasonable expansion of that.


This seems a lot more feasible. In fact, that's what the Nvidia Shield essentially is (although the edge computation is done with the customer's desktop computer).


I see this as being a more plausible implementation for, say, multiplayer twitch shooters. The market needs to scale up a lot for this to be viable though...


Latency isn't an issue with games that have high time to kill or are purely exploration based, which seems to be something Stadia developers have noticed in the examples.

There will have to be local rendering for nearby objects and particles, but this is feasible.

Airtight DRM isn't totally likely either; REnouveau does exist, and basic reverse engineering tools could be created to document the most common game engine functions.

Twitch shooters will never be replaced by this though.


Streaming based DRM doesn't need cloud rendering to be effective. The client can download 3D models and use prediction and local collision detection while the server is still responsible for the secret sauce like combat rolls, item drops, missions. Reverse engineering is still possible but even the private MMORPG server scene is actually pretty "dead" compared to the actual games they are cloning. Adding content is expensive and replicating content is sometimes impossible so the developers basically end up building what amounts to a custom game which uses the same rendering engine as an existing game. Becoming an indie dev and working for free on your own game ideas is a far better strategy although the chances of success are low.


It's still a bit of an issue; especially if you're controlling a camera with a mouse, even 50ms of latency is likely to be very noticeable.

However, much lower latency to these services is definitely viable in a lot of places, as long as they put enough servers spread out across the country.

In the USA at least there's going to be a lot of areas where the available options for internet are just not good enough though.


Just watch a VR video. They stream in a full 360 degrees.


> On a more technical side game streaming is only really desirable for games that don't run well on commodity hardware (see AAA 3D action titles).

I think it's actually more or less the opposite?

Twitch platformers like Celeste and Cuphead can't afford input lag, as they require very quick and precise movements from players. It so happens that these types of games almost always have low system requirements. Rhythm games and fighting games tend to be similarly lightweight. Street Fighter 5 won't run on a potato, but the hardware required is far from state-of-the-art.

By contrast, Assassin's Creed Odyssey and Red Dead Redemption 2 already contain large amounts of input lag when played locally according to Digital Foundry, and consumers don't seem to care much. This isn't to say that adding more latency is a good idea—and I have no idea if the existing latency could be whittled down to compensate for streaming—but it clearly doesn't bother people much.


Modern games just have the client run its own world simulation and rectify with the server later to hide the network latency. This strategy won't be possible with streaming. Is there some alternate optimization that can be made for streaming?

I'm working on a system that unifies single-player and multiplayer development. Think of it like zeit.co's "Now" for Node.js, but for games, with a local-only configuration option. The way my system relates to streaming is that the client is scaled way down in complexity. Basically, the sync protocol is a stream of positional updates from the server, and the client becomes little more than a screen-entity display. The developer then has the option to add a bit of "own world simulation" to further hide latency. (Or, in the case of the game I've implemented, just "own entity simulation" is enough.)
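To make that concrete, the wire format of such a thin client might look roughly like this (entirely my own hypothetical sketch, not the actual protocol of the system described above):

    import json

    # Server streams per-entity positions; the client just moves sprites.
    packet = json.dumps({"tick": 1042, "entities": [
        {"id": 7,  "x": 12.5, "y": 3.0},
        {"id": 13, "x": -4.0, "y": 9.5},
    ]})

    def apply_update(screen, raw):
        for e in json.loads(raw)["entities"]:
            screen[e["id"]] = (e["x"], e["y"])  # "screen entity display"

    screen = {}
    apply_update(screen, packet)
    print(screen)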


This kind of nuanced discussion of latency is something I only see among really serious gamers.

If you look at things like the Steam hardware survey, it quickly becomes apparent that the hardcore gamer (e.g. someone who has a high-refresh-rate display and cares about this) is likely a very small part of the overall gaming landscape, which is a very broad church now.

I think game streaming will ultimately win out hugely - “ordinary” (for lack of better term) gamers simply don’t care about the technical disadvantages. Huge numbers of console gamers already often add 100ms or more lag via their TV’s slow image processing and don’t even notice or care. I don’t think the average gamer is as latency sensitive as many readers here might be.

Most people are going to see that they can have a high end console/PC-like experience for a fraction of the cost of owning a high end console/PC, for the vast majority that is surely an appealing prospect. Finally, John Carmack himself believes the latency issue can be mitigated well enough for most things - that’s good enough for me ;)

> https://twitter.com/id_aa_carmack/status/1108144741932249088...

My only real concern in all this is the future of video game mods. This is a pretty big thing now and hard to imagine this surviving well in a streaming based future. This would be a loss, creatively and functionally, for some people.


20-30ms seems high?

At least in the UK a typical home broadband latency is 6 to 9ms in my experience.


USA has no (local) competition for internet, it’s almost all regional monopolies or duopolies where one (cable TV company) is expensive and ok and the other (copper wire phone company) is slow and unreliable.


Some places (mine) are, by local ordinance, unable to receive high speed internet because the company that owns the rights/stupid ass poles that they run everything on along the road offers high speed internet access ALREADY. . . .

SO COOL!!

How do they do it? By defining high speed as dial-up speeds in the ordinance they helped the local small, technologically inept government craft!

God Bless The USA


Back in the early 2000's in Ohio, under 80ms round trip latency was once advertised as 'leet gamer stuff.


There's a lot of people who can't get fiber when you get outside the cities. I'd be thrilled if I could get under 35ms.


6-9ms is not fibre here - I was referring to copper ADSL that is available pretty much everywhere in the UK (apart from some really isolated tiny villages/hamlets)


Clown gaming is compelling in every way except gaming.


We care about the 50ms latency because current games are designed for platforms with <50ms latency. If the platform changes, so will the games.

> Latency insensitive games for the most part are not difficult to run on commodity hardware.

Majority of people don't have a PC with a dedicated GPU. Or PC at all. Thanks to the Smartphone.

That said, I don't like the current trend and I hope they fail at streaming video games.


>Majority of people don't have a PC with a dedicated GPU. Or PC at all. Thanks to the Smartphone.

Problem is, smartphones are still shit for input (lol touchscreen) in any precise manner, so what's the point of trying to run intensive games on them via streaming?


Outside of most FPS games and some really tight-action platformers, the kind of streaming pioneered by OnLive is more than enough, especially for the casual gamer.

When OnLive was operational I had no trouble playing Batman (something something Arkham?) on their American servers from a MacBook Air in Sweden. I had a similar experience later with LiquidSky which was running regular games downloaded from Steam in their cloud and streaming video to my laptop.

If they do cooperate on creating a similar service, this will definitely be a net win for the gamers.

However, if that's what happens, and if the service is successful, we might end up seeing fewer games tailored to run on a console's hardware with low latency, and more cloud-optimised games (much like we have mobile-optimised games now). But this is pure speculation at this point.


I remember using Gaikai a bit. It was quite usable with non-reflex-based games like From Dust. But that's quite common these days; I sometimes RDP over wifi to my gaming PC when too lazy to transfer saves for games like Factorio or KSP.

Still, I have a hard time imagining first-person games, since I despise those with significant input lag. Even on a local PC some games have a horrible delay unless you run very high frame rates (quite a few Unreal Engine titles), and I always have a hard time getting into those.

Now add 20-30ms of network latency on top... for this to be close to acceptable, they would have to get the on-server delay close to nothing at the very least. I doubt that will happen while games are still released on multiple platforms.


The casual gamer is perfectly happy with cookie clicker or Fortnite on mobile. 2D games should be able to run on almost any platform and 3D games can be toned down until they run well too. The only type of gamer that isn't served by the things above is the "core" gamer that wants high fidelity 3D graphics and real time experiences. Really the only need that streaming serves is to make DRM possible which means the companies want to have complete control over the user experience and they certainly are going to abuse that control as much as they can.


Most 3D games don't even have to be toned down. Because if you stream video, you don't care if you send a video of a 2D game or a 3D game. And that's the main appeal of such services: you don't need to spend large sums of money to be able to play modern games in high or ultra high settings. All you need is a beefy internet connection with low enough latency.


Also keep in mind how important multiplayer is for most big titles now. If you add up player-server-player latency it's actually the same. Given more consistent performance on the rendering/frontend side, it's quite possible the overall experience will turn out better.


Client-side prediction does a very good job of hiding the network latency. Granted, a lot of games run their servers at lowish tick rates (people do complain), which is ~31ms worst case at 32 ticks/s.
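The tick-rate arithmetic:

    # Worst-case wait before the server even processes your input.
    for tps in (20, 32, 64, 128):
        print(f"{tps:>3} ticks/s -> up to {1000 / tps:.0f} ms")
    # 20 -> 50 ms, 32 -> 31 ms, 64 -> 16 ms, 128 -> 8 ms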


You can do similar tricks in a cloud setup too; notably, you can do hit detection based on what the user was seeing when they entered the input rather than what the server state was later. This is pretty similar to client-side prediction.
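That server-side trick is usually called lag compensation: the server rewinds its own history to the moment the client actually saw. A toy sketch with made-up numbers:

    # Server keeps a short position history and validates a shot against
    # the snapshot the shooter was looking at (~1 RTT in the past).
    history = {1000: 5.0, 1016: 5.3, 1033: 5.6, 1050: 6.0}  # t_ms -> pos

    def resolve_shot(aim_pos, client_view_time, tolerance=0.25):
        snap = min(history, key=lambda t: abs(t - client_view_time))
        return abs(history[snap] - aim_pos) <= tolerance

    # Target is at 6.0 "now" (t=1050), but the shooter saw it at 5.3
    # (t=1016); the rewound check still counts the hit.
    print(resolve_shot(aim_pos=5.3, client_view_time=1016))  # True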


When playing online against other opponents, having the gaming machines be very close to each other might make up part of the latency introduced by the distance between the player and the gaming machine. Maybe.

Your first point is key though. Piracy is a big deal. People can say until they're blue in the face that it "helps", or that "they wouldn't have bought the game anyway", and for some unknown percentage it's true, and for some other unknown percentage it's a straight lie.

If they go from "optionally play a game via streaming", to "this game MUST be played via streaming" eventually, they get the second group (the ones who WOULD pay if they couldn't get it for free) instantly.

Only if it doesn't suck, obviously.


They also lose everyone not interested in streaming games (like myself). I'm sure they've run the numbers, but I know plenty of people who actively purchase games and are not at all interested in streaming them.


For sure, I'm in that camp too. I was in the anti-Steam camp back in the day too, and well... that didn't work out for me.


The other consideration is bandwidth. Streaming a game is far more intensive than just playing it and many US households are under tight data caps (not to mention throttling under peak hours). I just am not convinced by the advantages, for much of the gaming demographic.


Additionally the market share of AAA-games is currently decreasing.

People with sensitive hearing can tell the difference if a sound arrives <1ms too late and the sound system needs adjustment.

All my bets are on "bubble" for this one. Maybe it could be resurrected as a zombie to offer sandboxed streaming for all those crappy mobile apps.


I’ve played Bloodborne on PS Now and it works great. This game is really dependent on fast reactions and smooth camera panning. I think if you try it out you might learn something.


They will just host turn based strategy games, like yahoo chess.


I recall we all said the same crap about streaming HD movies with a monthly fee too. I think the only true problem with streaming gaming is that the quality of games will diminish even more.


Without net neutrality, the companies running game streaming services will be able to pay for prioritized transmission to reduce latency and jitter.


It won't have much to do with net neutrality, really, since they'll want to get the traffic onto non-public fiber as fast as possible for more direct routes, which will have a bigger effect.


Yeah, the bandwidth needs are going to be high enough that laying dedicated fiber is not only an advantage in terms of latency but absolutely necessary: the regular lines will be at maximum capacity if everyone streams video games at 20Mbit/s, and ISPs will still refuse to upgrade their infrastructure.


Honestly, I just don't get why every single time there is a discussion about game streaming, the topic of lag comes up as a potential show-stopper, and every time it's a purely hypothetical discussion. That doesn't make sense at all - the solutions are out there! Don't assume - just verify.

And when you do, you will find out that there isn’t any actual issue. I’ve played games like The Witcher 3 via streaming over three years ago.

Input/output lag simply isn’t an issue, and I’m your average German VDSL user, nothing special.

From my point of view, there is no sense in any further discussion - the empirical data is already in.


You've specifically chosen a game that isn't sensitive to lag.

It's like saying "Switching from a car to a bike just isn't an issue - I once rode all the way to my neighbour's house!"

Try playing a multiplayer first person shooter on the same service, you'll find latency is an issue.

By all accounts, the Doom demo that Google gave was plagued with lag.


Yeah, speed of light is actually a lot faster than people think. Honestly things such as controller and display lag are probably bigger than the round trip latency. And experimentally, anyone who has tried Stadia at GDC or I/O has said that latency wasn't an issue.

The bigger limiting factor I think will be bandwidth, especially in America with the awful internet provider situation.


"Yeah, speed of light is actually a lot faster than people think"

This would only be relevant if you had a direct fibre link in a straight line from your house to the service. It's packet switching/routing etc. that introduce latency. And even if you DID have direct fibre, the worst-case scenario for distance - a server on the other side of the world, with no other latency accounted for - is about 200ms, even half of which would be noticeable in many games.

I'm not saying it's impossible to do game streaming, but you can't just divide the distance by the speed of light and say it's low. That's not really the issue.
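For concreteness, here is the idealized floor the parent is describing (straight-line fiber, light at roughly 2/3 c in glass, every real-world delay ignored):

    # Idealized best-case round-trip times over fiber.
    C_FIBER_KM_S = 300_000 * 2 / 3          # ~200,000 km/s in glass
    for label, km in [("same metro", 50),
                      ("cross-country (US)", 4_000),
                      ("antipode", 20_000)]:
        print(f"{label}: {2 * km / C_FIBER_KM_S * 1000:.1f} ms RTT, best case")
    # same metro: 0.5 ms, cross-country: 40.0 ms, antipode: 200.0 ms

Real routes add switching, queueing, and indirect paths on top of these floors.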


Most of those hops would be on ISP equipment with 10G or 100G interfaces. Modern routers forward packets in tens to hundreds of microseconds, and switches can do it in a few microseconds, so as long as there is a bit of distance, the distance itself will account for most of the delay (the speed of light in fiber is about 2/3 of its speed in vacuum). Switches can even send a packet out on a port before it has been fully received on the incoming port.


> modern routers forward packets in tens to hundreds of microseconds

Only when network is underutilized, and there is no throttling/bufferbloat.


True, though routers with 100G interfaces do not have a lot of buffer space. But most ISPs make sure that isn't happening, as it's pretty bad for their customers' experience. I know some ISPs don't put enough effort into that, but every single one I worked for did (Europe). In this case Google can put effort into direct peering relationships to make sure that their customers have a good experience.


Right, which is exactly why big cloud companies such as Azure and GCP have a huge advantage over most smaller companies: they have their own dedicated backbones, allowing them to minimize the number of hops.


> server on the other side of the world, but with no other latency accounted for - is about 200ms

In practice, it’s a lot worse than 200ms. 500-700ms are typical for servers on the other side of the planet.


That definitely varies greatly from provider to provider. I usually see <300ms for world-wide pings from Texas from a FTTH residential connection.

Average pings from DigitalOcean regions: BLR1 251ms, FRA1 129ms, SGP1 206ms.

Other sites: yahoo.co.jp 179ms, rakuten.co.jp 138ms, belarus.by 167ms.

As most things go, YMMV.


> Honestly things such as controller and display lag are probably bigger than the round trip latency.

The problem with latency is that it adds up, lag from the server, lag from the TV, lag from my brain, lag to my fingers, lag from the controller to the machine, lag back to the server, lag in processing and then start again.

Client-Server lag might be acceptable in isolation but not as part of a system.


There are press articles talking about how latency is an issue when they tried it out: https://www.pcgamer.com/i-tried-googles-stadia-and-latency-g...

So, not really "anyone". And that's in an extremely latency-friendly environment where the servers are just up the road.


Yep, I played a couple of the Arkham games on streaming services, the first one way back on OnLive before Sony bought them. It was fine.

I now play games with some friends and while I'm using a local PC, they use Shadow PCs (ie a VM with a GPU in a datacentre vaguely near them) and it's fine.

Maybe it'd be worse on an LTE/5G connection, but my phone is getting 25ms to the nearest Speedtest.net instance right now, which doesn't seem substantially terrible.

I would be much more concerned about the pricing of all of these things, than the technical issues. How many gaming subscription services am I going to need to add to my video subscription services?


Mind sharing what solution you referenced?

From a hard network-theory perspective, higher bandwidth, lower baseline latency, a lower error rate on the transmission path, and closer geographical proximity to servers all seem to help mitigate increases in latency.

However, if you argue the problem is solved by your empirical evidence gained by playing The Witcher 3, of all games, I disagree. Play a real-time strategy game, a MOBA, or a first-person shooter. ~70ms of lag (one report for Stadia measured 70-130ms; that may be stale data) is noticeable to people. On the other hand, I don't mind streaming a turn-based game.


I had rolled my own setup with a GPU-powered AWS EC2 instance, streaming the video signal via NVENC etc.


Witcher 3 is a single player game and the users who care about lag by far the most are multiplayer gamers.

Certain games like rhythm games would also be pretty miserable with jitter...


I am saddened by this thread's discussion. Or I'm old. This is a logical step in lock-in disguised as user experience.

Back in the day, if you could maintain the data, my 15 physical 3.5-inch rather unfloppy floppies were Monkey Island. I lost my personal copy, but if I had moved the data, I could have my own copy of Monkey Island right here. I personally failed on my own maintenance there.

Then we had - skipping some steps - something like private WoW servers. They worked 90% of the way. But Blizzard kept some code, and that part never worked privately. Most stuff on the client side worked; most stuff on the server side could be reverse engineered.

And now we're planning to stream games via something similar to TeamViewer / Citrix, put bluntly. At this point, what difference is there from an interactive movie? I can replay several single-player games I acquired on Steam offline as long as I have the binary and a Windows instance. I'm sure I can play FTL or Slay the Spire in 2040 without any developer interference.

If you stream those games, they will die harder than the multitude of multiplayer-only games with dead servers, once the streaming servers get shut down due to business decisions. Which can and will happen <1 year after a failed launch, in my book.


We already see this problem with other streaming media services; just because something is available today doesn’t mean it will be available tomorrow.

I don’t want to give that much control over to someone else. I don’t need all-you-can-eat content. Just let me pay for the music/movie/book/game that I want and enjoy it for what it is, and know I always can just use my copy without needing to be plugged into the internet.

I guess this view probably makes me look like an old man yelling at a cloud, but I’ve managed to find a way to survive without signing up for Netflix, and I hope I can still play games in the future without needing to sign up for a streaming service.


I have lost dearly held content to the internet deciding to not host it anymore, including spotify.

At this point, I trust my backup 2x SSD mirror more than some cloud. So I guess we're two old guys yelling at clouds.

And this will get even more fucked once Article 13 of the EU copyright directive gets weaponized in two years.


I remember an xkcd that basically said if you really want your files to be portable and reliably available until the end of time, the only solution is to pirate them.



Yes, it really sucks when you add a movie to 'my list' and then you think about watching it tonight and poof, it's as if it was never there. It's adding insult to injury that it just vanished off '_my_ list' too.


It’s not all doom and gloom, that sort of mentality is still alive and well in the indie scene. Specifically people who make games like FTL (Rimworld and Factorio are two more recent examples of fantastic games that don’t require internet connectivity) are keeping the dream alive. My rule of thumb is to never buy games on Steam and only get them through GoG or directly from the developer. Sometimes there are a few titles that I really really want to play but are too tied to Steam and not on any other platform (I’m looking at you, Tabletop Simulator!) but for the most part as long as you are ok with staying away from AAA titles you can avoid always-online lock in and still have an amazing gaming experience.


I am not so worried about games and movies — having access to them is a plus but I wouldn’t be too upset over not being able to revisit a specific game or rewatch a specific movie. Music bothers me a bit more because I like to listen to a lot of my favorite music again.

What worries me the most about this trend is what impact it will have on the computers we use.

Today you can buy a powerful laptop or desktop. You can write software on it, for it, for any purpose at any time. You can distribute that software to others freely, and you can permit others to run your software for any purpose under permissive terms of your liking that still protect the rights you have over your intellectual property, the software that you wrote.

I might be falling trap to the “slippery slope” fallacy here, but the way I see it, once the majority of new games, movies, music and other entertainment is all accessed via streaming, and all of the big players in software move the applications that are used in business and in education to the cloud, the computers we use may easily:

1) Become too weak to do intensive work locally, because everyone is using them as thin clients only, for interacting with software that runs in the cloud.

and/or

2) Become so locked down that we lose the ability to distribute software outside of cloud delivery mechanisms controlled by corporate entities.

I am not anti-corporate. But we cannot allow corporations to be in control of the software that we use in this way. Not because they are corporations or because they are capitalists or anything like that. The same would be true of any entity, whether that entity be a business, a state, a body of the government, a non-profit organization or even just a person.

When we allow someone else to gatekeep access to the platforms we use, we are handing our freedoms over to them.

I use an iPhone that is running iOS. I can afford to allow my phone to be this restricted because I have other computing devices that allow me to develop software for them on them, and to run software written by others on the operating systems that those other devices are running.

But if none of my devices allowed me to do this, and no devices on the market allowed me to do it either, then where would that leave me?

And where would it leave us, humanity as a whole?

In my opinion, the ability to develop software freely, to distribute that software to others freely, and for them to be able to run that software freely, is so essential to our society and to the future, that if we lose this then we need to take drastic measures to regain the control that we lost as quickly as possible. We would need to crowdfund the development of computers that allow us to do what these future locked down computers don’t. And we would need to ensure that these freedom-friendly computers become used by so many people that we could continue to develop these machines and be able to continue to produce them, at a reasonable price. So that the ability to develop and run any software for any purpose will always be possible.


I can understand that you dislike the idea of being beholden to a 3rd party in order to run your game but you lost me with

>At this point, what difference is there to an interactive movie?

What is an interactive movie and how does having a game stream from a remote server change a video game into one of these things?


This has been a complaint especially in the adventure game community. If I go to the cinema to watch Endgame or a John Wick movie, I pay 10 bucks for 2 hours of movie. A lot of AAA games are a bunch of boring filler with some interactivity, but you end up with a much worse money-per-minute ratio. If you discount cutscenes, I've spent 60 bucks on less than two hours of gameplay in some games.

Now, I could replay those games after some time. That improves my overall cost-benefit ratio from 60 bucks -> 2 hours of gameplay to 60 bucks -> 4 hours of gameplay. Or, if the game is good, even more. I have a bunch of adventure games I paid 10 bucks for, and I have replayed them a lot and I will replay them a lot.

However, an AAA studio will consider a game a failure shortly after release. You get one play-through for $60 and that's all you get, because then they shut down the servers for their next thing. And of course they will "charge less" and "run updates" and "keep expansions going" and such, so it's less obvious. I'm jaded at this point.


> Now, I could replay those games after some time. That improves my overall cost-benefit ratio from 60 bucks -> 2 hours of gameplay to 60 bucks -> 4 hours of gameplay.

Or someone has modded it, and you can play it again with different content, or altered content, or altered gameplay, or some combination thereof. And if it's owned and local, there's nothing they can do to really stop someone from doing that definitively. The first mods for games weren't because developers decided they wanted people to be able to mod the games, they were from fans diving in, reverse engineering and making changes. Developers embracing modding came later.

With a cloud based game or digitally verified synced content, some level of consumer control is definitely lost, and that's a shame.


I do think it's possible that streaming game services and what you are talking about wind up a little different. I could imagine that streaming game services could spin up and down servers more dynamically depending on demand. There would be a lot of motivation to do this in a more automated fashion than regular game studios because the streaming service will be dealing with 1000's of titles not just a handful. I think generally you keep your saved game info locally and then it is fed to a server which starts you at the appropriate place? I'm making a lot of assumptions but I do think the basis of my argument 1000's of titles vs ~10 is a pretty good reason to think this won't be handled in the same way.

Also, games like GTA V have been supported for a crazy amount of time and transitioned into online play. People have probably gotten 1000's of hours of gameplay from a $60 purchase. I play things like Fallout and Minecraft, so I also get a huge gameplay-time-to-cost ratio, which means I'm not very sensitive to this from my personal experience.


What AAA $60 game only has 2 hours of gameplay?


I took it to be a comparison to Netflix. Cloud gaming is analogous to streaming movies with similar tradeoffs, the difference being that games are interactive.


Cloud gaming probably isn't targeted to people like you, then. When the choice is between not being able to play the game at all (due to lacking sufficient hardware) and cloud gaming, the potential downsides of cloud gaming are worth it.


You're right, it's not for people like him. It's not for people that have been around the block and have learned a thing or two. They're hoping to take advantage of the young and naive.


Some people don't care about hoarding game discs or cartridges. I have drawers full of Nintendo cartridges and PlayStation 1, 2 and 3 discs, plus a few older Xbox games, that I can't play anymore because the required console died, or they are back in the childhood home where I grew up. To be honest, I don't even want to play any of those games anymore, but if for some nostalgia-infused reason I wanted to, I'd much rather they just be automatically available to stream on my TV or my laptop than have to spend hours tracking down all the required hardware to play a game that I'll most likely get bored with within an hour or two.

The same thing happened with movies, with some people wanting to hoard boxes and boxes of DVDs, VHS tapes or Blu-rays, thinking that they are somehow smarter than people who just pay some subscription (or digital rental) to stream whatever movie they want to watch, when they want it, on the device they want to watch it on.


How is it only for young and naive? If they don’t stop selling physical versions with the arrival of video game streaming (just like they didn’t stop selling boxed versions of movies), I will just stream a lot of stuff first and then buy boxed versions of the games i end up liking a lot. Just like I already do with movies, music vinyls, and digital versions of video games.


> "If they don’t stop selling physical versions with the arrival of video game streaming (just like they didn’t stop selling boxed versions of movies)"

They will once it becomes normalized. DRM for software has long been far more odious than seen in the movie industry.


Oh, they will stop selling physical versions if it gets successful enough.


What I meant by "people like tetha" is people who can't play games on their own hardware, don't want to buy their own gaming hardware, and/or don't care about these rights.


I know what you meant and I think you're whitewashing the intentions of these corporations. They aren't motivated by good will, but rather by greed. They want to seize more control from consumers and they are going to normalize this first with the youngest consumers who don't know any better. Once these systems have proven technically feasible and consumers become compliant, the option to download the software and run it yourself on your own hardware will be taken away. Before you know it, there will exist games that only exist in "cloud" form.


Agreed. If we go full in on game streaming we can kiss actually preserving the art goodbye entirely. At least with streaming of non interactive media we can rip it to a hard drive somewhere.


I agree with you. I dislike digital distribution in general, largely for this reason. I believe it's likely that as long as I can find (say) a working PS4, my physical media PS4 games will continue to work as long as I would like them to.

Unfortunately, the general public wants convenience over these benefits. Instant gratification.


MS is contributing Azure resources, Sony is providing the camera sensors, and they're both working on AI. So they're collaborating on enhancing the panopticon and going to charge people to train it.

I wish I was kidding, but it's right there in the press release minimaxir linked[0]

[0]https://news.ycombinator.com/item?id=19931405


Google finally advances Linux+Vulkan gaming using a side door that bypasses the market share blocker altogether. MS and Sony don't like it and plan to mirror Google. Instead, they should stop being lock-in jerks, and should support Vulkan as well.


AMD is the big winner here, being behind the MS and Sony consoles for the last (and likely future) console generations and the Google Stadia service.

And with their GPUs going into datacenters and a move away from CUDA lock-in to OpenCL (example: https://www.openwall.com/lists/announce/2019/05/14/1 ), they're in a great position to increase their share of GPU-accelerated tasks like AI and deep learning.


Isn't it already confirmed that the next-gen Sony and Microsoft consoles are still with AMD?


Yes


I'm a huge fan of the fact that Google is embracing a Linux + Vulkan stack for gaming, but I also want them to have meaningful competition. Yes it would be a dream scenario if MS & Sony started supporting Vulkan, but it would also be a nightmare scenario if Stadia started dominating the gaming market, and gaming technology stagnated for a decade at Vega56-class GPU's because there is no competitive pressure to push the envelope.


Sure, competition is good. But it doesn't mean it should come with lock-in problems. MS learned not to mess things up on the Web (remember ActiveX and Silverlight?). But when it comes to gaming, they are still as nasty as in the '90s. And Sony is even worse in this regard.

I agree about Stadia in general. I wouldn't want it to become too dominant, especially since it's like DRM on steroids - you can't back up anything from there. In general, I don't think regular gaming stores that sell rather than rent games are going anywhere. I.e. GOG, Steam and others are going to stay around. As long as Stadia won't be pushing exclusives, things will be good, since some will always prefer to run games locally.


You hit on my biggest concern, the fact that Stadia is DRM on steroids. Relevant further reading for anybody that has questions about DRM and its effects[1][2].

[1] https://www.defectivebydesign.org/

[2] https://www.defectivebydesign.org/what_is_drm_digital_restri...


I think a lot of people are going to prefer to run games locally. I'm looking forward to playing strategy games on Stadia or games where lag isn't too critical. For quick reflex games, lag is important. And if VR games ever take off, lag is going to be critical.


Nobody cares what people think they want. In 5 years you'll be able to play AAA games on a $500 pc, pay by minute, and it'll be the only way to play them, and people will. Grudgingly, maybe, but they will. Everything on the server ('cloud gaming') is the only air tight DRM method, and publishers will jump on it as soon as it becomes viable at scale.


VR is the streaming killer. I was not entirely sure they'd even be able to get 2D video streaming working as well as they have, because that already pretty much taps out the technical prowess we have. Heck, just running it over a network at all means you're going to experience pauses long enough to drop frames.

Since I assume Sony & Microsoft know this, and they aren't countering with a VR push but instead trying to match them with streaming, I suspect they judge the market either isn't there yet, or may never be there (on any business-realistic time scale). Or they just can't resist the DRM and lockin.


In the long term, streamed games (Stadia or similar) are definitely going to become dominant. Steam, et al, can still be storefronts, if they position themselves as such – we still need content curation.

All one needs is a screen with support (e.g. smart TV, Chromecast, Steam box), internet, and controllers. Classic game systems might be needed for another 5-10 years. Controllers & form factors matter (e.g. Switch, VR), but the "power" of PCs & game systems is no longer a differentiator.


I suppose if Google starts kicking their butts in cloud gaming, they might loosen their stances on lock-in technologies in an effort to attract developers.


That's what I hope for. MS stopped being lock-in jerks on the Web only when they practically lost the browser wars. They need a stronger kick in gaming too, to start using cross platform technologies.


Do we feel like Google's efforts are a ... viable thing yet?

I'm not really sure the Sony and MS thing is a reaction, and I'm not sure Google's thing is so much of a thing.

Some of Google's PR about how whatever it is they're doing is going to be all things (their statements have been strange IMO) could just be unfocused PR, or just an unfocused product.


How is that Google streaming gaming thing not Google being "lock-in jerks" then?


Presumably if Google makes game development for Linux commercially viable, that will make for much more vibrant community support (tooling etc.) for local Linux as well. That plus what Valve is doing with Proton could potentially break the stranglehold Windows has on PC gaming. I personally would go 100% linux on my media PC if I did not occasionally play games on Windows.


I think a lot of people don't expect games released on Stadia to ever be downloadable. It won't help linux at all.


They most likely won't be released, but the hope is that Google contributes to open source like Vulkan, Linux video drivers, and graphics rendering, as well as potentially releasing their own tooling.


Some of them won't (the ones from legacy publishers). Others, from normal publishers, will, especially since they'll see it as "why not reach more users, we already did all the heavy lifting anyway", instead of the legacy publishers' scornful "why do it, who is using Linux?".


In their context, lock-in could happen if they start pushing exclusives. But in the technology sense, they are advancing a common stack (Linux / Vulkan), unlike MS and Sony, who use tools lock-in to discourage cross-platform development by making it more costly.


Can you explain how it's not lock-in? It seems to require a Google account and is limited to games that are blessed by Google. AFAIK, I can't download Stadia and run it on a home server and stream games to multiple PCs in my own home. If Google made it open source, then yes, I'm 100% with you...


Lock-in as in requiring to use Stadia to get certain games. Store lock-in (exclusivity) isn't much better than technology lock-in.


The graphics API isn't that big a deal anymore. It's often a pretty small part of the engine codebase. Many games, especially those built on the popular engines like Unity, support both DirectX and Vulkan (and Sony's API too).


It is a lot bigger deal than you think, including for the very engine developers who work on Unity and the like. Having to deal with all the extra lock-in APIs doesn't come for free. It takes time and resources and slows down releasing features to their users.


Vulkan _is_ a lock-in API, locking you into Google platforms; the one commercially viable platform using it is [recent versions of] Google's Android, with another one on the horizon: Google Stadia. It's not necessary on Windows, not available on two of the three consoles, and not available without a wrapper on iOS and OSX.

(Technically, it is also available on Nintendo Switch, but there's also an arguably better alternative there.)


What? Vulkan is cross platform, not limited to Google or anyone else. You want Vulkan on your hardware and your OS? No one is stopping you from making a driver for it; it's an open API. For example, radv was made as a Vulkan driver for AMD on Linux without AMD even being directly involved (with big contributions from Valve).

If MS and Sony wanted, they could have Vulkan driver for Xbox and PlayStation tomorrow, since they are all using AMD GPUs. And no one stopped Apple from supporting Vulkan on their systems either. They don't, since they are also lock-in jerks, which is exactly the point I was making above.


That's just plain incorrect. There are already mainstream AAA titles targeting Vulkan published for Windows and macOS (via MoltenVK). Doom Eternal will only target Vulkan on PC: https://en.wikipedia.org/wiki/Id_Tech_7


And OpenGL and Metal, and even the web! With a sweeping generalisation I'm going to say that a games company is either too small to care about the graphics API, or too big to care.


Tell that to graphics programmers. Imagine how much better performance you could expect from games if they could spend their time optimizing for one API, and not making compromises to target a half dozen.


> Imagine how much better performance you could expect from games if they could spend their time optimizing for one API, and not making compromises to target a half dozen.

That is not really an accurate statement. Graphics performance is heavily dependent on the actual hardware design of the underlying platform. You can't wave a magic API-wand and have your executable or shader be magically fast on every hardware platform.


It also depends on the design of the engine, which can be improved instead of spending time on supporting a lot of redundant APIs. Support for lock-in is essentially a tax put on developers, and it's exactly the way lock-in proponents want it.


Why couldn't it? You don't think software updates can improve performance across the board? But his point was that developers could focus on a single API rather than having to deal with quirks in DirectX, OpenGL, Vulkan, WebGL, etc.


>But his point was developers could focus on a single api rather than having to deal with quirks in directx, opengl, vulkan, webgl, etc.

That is really an imagined scenario. An extremely tiny minority of games are targeted at such a wide variety of platforms. Real-time high-performance code is always tightly coupled with the hardware - if you want to squeeze performance out of hardware, you want to minimize abstractions to such a level that they don't hurt your productivity. If you're targeting consoles, you can maybe target the PC platform. But there is no way you're targeting the Nintendo DS or a smartphone with the same codebase without major modifications to the graphics code.


> An extremely tiny minority of games are targeted for such a wide variety of platforms.

On the contrary, most games are targeted at most platforms, to increase reach and sales as a result. Only some exclusives from console makers are not doing that, and those are clearly outliers.

> Real-time high-performance code is always tightly coupled with the hardware

That's not common at all. Unless you want to beat modern compilers with assembly code, you will only produce something worse. Sure, there are rare cases when using hardware-specific features yields extra performance. Codec developers do that, for example. But for games? Not usually needed. Shaders are provided in a cross-platform fashion (such as SPIR-V) and compiled into GPU machine code by the driver. And actual game code commonly uses some high-level systems language (C++, etc.) plus a scripting one.

Good performance is achieved by parallelizing the engine properly, since modern hardware is increasingly multicore.


How does Stadia advance Linux+Vulkan? The games run in a data center (and they're not running a standard Linux port; the presentations I saw made it sound like they were basically running another console port).


More game developers becoming familiar with Linux+Vulkan, more games ported to Stadia's Linux - all that means more potential releases for normal desktop Linux as well. Clearly not everyone who targets Stadia will do it, but the potential is surely better than without Stadia. So I see it as a net positive for Linux gaming.


Still, the games will have a Vulkan renderer and run on some form of Linux, making an official port to the actual OS more likely.


The Xbox One has an NT kernel and DirectX as the graphics API. Stadia's use of Linux and Vulkan are probably about as helpful for a port to desktop Linux as an Xbox One game is for a Windows port.


If the analogy holds, that's the biggest thing to happen for Linux porting since Valve's Proton: all future MS-funded Xbox One games do come to Windows (albeit as UWPs) and run great.


I think "MS-funded" is doing a lot more of the work there than the kernel and graphics API are.


You mean, the project Google will cancel in 3 years? Yeah, I'm sure the two gaming behemoths that have been demoing the cloud working in conjunction with games for the past decade made their decision based on that. Microsoft demoed cloud compute doing physics calculations in games...what? 8 years ago? Sony already has a game streaming service. Microsoft is expected to make an announcement this E3 on the topic.

Microsoft and Sony want to be relevant in a mobile world too... that's what this is all about. They both were shut out of the mobile piece of the gaming pie, and they are coming back for it. Microsoft and Sony want their gaming ecosystems on mobile devices, with the ability to cross-play with wired devices, powered by their SaaS models, Xbox Live and PlayStation Online. That market is much bigger than the market they've already captured.


I'd be happy if Valve became a middleperson for AAA titles native on Linux+Vulkan. They have SteamOS and a decent controller ready to resurrect, and partners ready to slap together PC-hardware consoles again, given a makeover.

(Things like Wine are effectively sabotage.)


I'm not convinced they have the core competencies required to pull it off. It seemed like they had the hype and market positioning to make it work the first time around, but they just expected 3rd-party hardware vendors to do the work and make SteamBox a thing via the magic of the free market. I think it might have worked if they had put together an excellent 1st-party product and given it a real marketing push.


It would've helped if the top titles were there. (If you bought/built a SteamOS box, you have a rough-looking thing that doesn't play top games, when you could instead just run Windows+Steam, or get a Sony or MS console, any of which would play most things.)

Valve never seemed to make the final marketing push on Steam Machines, that I could see. Maybe because they realized it wasn't coming together. And/or maybe it was intended as a warning shot, so that Valve wouldn't be forced out while MS was grabbing the app store.

Your idea of Valve making a console themselves is interesting. They did get some limited experience with hardware, with the controller and the thin-client device. I don't know what all would go into some kind of manufacturing and branding partnership.

One thing that I think didn't happen is an initial loss-leader on the console, to bring people into the lock-in ecosystem, like the console companies might do. I don't see how the third parties had that incentive. And I don't know how a loss-leader works if whatever anyone builds is just a commodity PC.


Wine and Proton are exactly the type of solution that I want. It works and it will get better every year. Linux users can run software without explicit vendor support (software freedom 0 [0]). It doesn't burden corporations with expensive Linux support. Everyone is happy.

[0] https://www.gnu.org/philosophy/free-sw.html


The FSF is frequently misled by volunteers into bad strategy moves for its goals. RMS's criterion of "is it Free" is insufficient, and also easily exploited, just by spinning your thing to check off the checklist boxes.


Very interesting talk about Stadia, latency, Linux, Vulkan

https://www.youtube.com/watch?v=qdz4b5psrhE

> The inside story of how DOOM came to life on Stadia. id Software delivers a bird’s eye view of real-world Stadia development from conception to execution. Learn how high-performance games are made on Google’s new streaming platform. Recorded at GDC 2019.


Microsoft has been working on this since ~2008. Google only started a few years ago.


Yeah, but Google has serious bona fides as far as delivering high-res video at scale. Stadia is just YouTube with input handling and a game renderer instead of video files producing the frames.


'Just'? I'd say game streaming is an order of magnitude more difficult a problem. YouTube has high throughput, yes, but latency is the far more difficult problem, and potentially unsolvable with American ISPs the way they are.


I'm going to bet that YouTube's experience with interframe compression will help them out a lot in this regard, as it reduces the amount of data they need to transmit.


Wrong. Microsoft perfected this years ago with their Azure hosted desktop product. They'd been working on high-res compression of GPU frames before Google even put a dedicated GPU cluster in their data centers.


Yet MS is clearly uneasy now, since Google kicked them with the Linux+Vulkan choice, which will erode MS's grip on game development.


Absolutely not. Game developers have deep, deep knowledge of making games for Windows- and PlayStation-type OSes and hardware. Studios aren't going to spend years having their devs relearn nuances specific to Google's Linux implementation.


They totally will, if Google waves a huge user base in front of them (which Google already did). So MS's grip on game developers' mind share will fly out the window. That's the best thing I see coming out of Stadia, and it's something MS is clearly scared of, since when it comes to their gaming business, they are used to lock-in and years of mind-share domination, rather than competition on merit.


>While details have yet to be hammered out, the partnership will center on artificial intelligence and the cloud. The latter category includes plans for joint development of cloud gaming technology.

So this agreement mostly focuses on Sony using Azure infrastructure for cloud gaming. I wonder how Xbox will compete in this scenario; it's kind of weird hosting your competitor.


It's not like the tech industry is exactly new to using competing companies' products.

Microsoft using Macs (not even discreetly).

Apple using AWS, Google Cloud, Azure.

Amazon selling competing products.

Etc, etc.

This may be a bit of a specialized case, but if you run a cloud hosting provider that is not your primary (or original) business, you need to accept that you'll likely be involved in running a competing service. (I am sure AWS hosts a fair number of retailers and related technology.)


The cloud changed everything from Microsoft's perspective. They don't care what language or OS or tool you use anymore; as long as you host it in their cloud, they are making money.


Apple relying on Samsung to build components for iPhones is a huge one as well.


Spotify is using Google's cloud as well. They used AWS previously.


First normal assessment on this thread, spot on!


Seems like Netflix and Amazon get along fine.


Doesn't iCloud use Google Cloud?


Yes


Amazon Prime Video is not any real sort of revenue source. But yeah, giving all your game analytics info to your competitor is sort of stupid. Then again, Microsoft probably has access to much of that info anyway.


It's a great selling point for Prime which is more than enough.


It kind of functions as a hedge: this way they can monetize gaming even if PlayStation is winning.

Also Sony and Microsoft have a shared interest in having viable competition in the market against Stadia in case it takes off.


>it's kind of weird hosting your competitor.

Same as Apple using Samsung's tech in one way or another. Money talks after all. Different divisions, different profits, etc.


I suppose the risk is totally coincidental outages for Sony during critical moments in the product cycle. They'd better have a solid SLA in place!


Finally, someone listens! I've been saying for years that not enough services are dependent on unreliable internet infrastructure! I can't wait to add single player games to the list of things I am unable to do when my internet provider has yet another massive multi-day outage.


Would be comical if Sony still claimed that they can't allow crossplay.


There's something depressing to me about this, more than other things moving to the cloud, irrational as it may be. Yeah, they already could track you, and you already didn't own the game in many senses, but this just seems extra "casino"-like. PC games in particular are like the last bastion of desktop software people actually care about; it seems like the end of PCs and the desktop.



Even trying to stream via LAN is unplayable. It's much worse streaming from a server that's thousands of miles away from your home.

I've got 200 Mbps download and half that upload. I've played plenty of PS Now titles, and the only problem I have with it is the input lag. There's no escaping it. You cannot play fast-paced games with such delay, especially multiplayer FPS games; it's just terrible. It's better to have the hardware to play than to play remotely. If you don't mind 1-3 seconds of input lag (plus more because you're going over the internet), then game streaming is just for you.


I've tried to like my Steam Link, but it just isn't worth the hassle - wired with cat6 from one room to the other, I've always had lags and glitches and disconnects.

Throw in ping times of 50-100 ms to some datacenter, not to mention whatever traffic-shaping shenanigans Comcast will inevitably apply, and I don't have a warm fuzzy feeling about the viability of this rash of streaming services.


Does your HDMI cable also have "lags and disconnects"? I mean, when you have hardware compression (presumably?), it's not all that different sending it over ethernet than over HDMI.

Compression latency should be in the low milliseconds (1 ms should be plenty). The latency over ethernet is going to be measured in hundreds of microseconds. Sending a compressed 100 kB frame over gigabit ethernet takes 800 microseconds, which can partially overlap with compression. Noticing 2-3 milliseconds of extra latency is going to be pretty hard.
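
A quick back-of-envelope in Python (the 100 kB frame and 1 ms encode figures are ballpark assumptions):

    # Rough numbers behind the claim: gigabit ethernet, ~100 kB frame.
    link_bps = 1e9            # gigabit ethernet
    frame_bytes = 100e3       # one compressed frame, ballpark

    transmit_s = frame_bytes * 8 / link_bps
    print(f"transmit: {transmit_s * 1e6:.0f} us")            # 800 us

    encode_ms = 1.0           # hardware encoder, low-latency mode
    print(f"total: {encode_ms + transmit_s * 1e3:.1f} ms")   # ~1.8 ms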

Something is badly wrong with your setup or with Steam Link.


If you find game streaming on your LAN unplayable, you probably have a setup problem. If you have a solid router and you're only going through ethernet, no wifi, then stuff like Steam In-Home Streaming should be basically flawless as far as framerate and input lag go (I get ~3 ms added from the streaming, IIRC). You will notice video compression artifacts, though, of course.

I do always use it with an Xbox controller rather than mouse+keyboard though, so it's less latency sensitive.


Pricing is key. As far as I know, game streaming currently doesn't have a viable business model.

PlayStation Now (formerly Gaikai) was a startup that wanted to sell game ads (try before you buy), and after expanding into actual game streaming it struggled to make money and attract users. It is just too expensive.

To me, the biggest difference between movie streaming and game streaming is that games are much more resource-hungry. Video can be watched on a cheap $30 USB stick; for gaming you need a $400 console.


Of course they are. Google is better at cloud and better at Linux, and Steam/Proton are already turning the tide for big games being available on Linux. If Stadia forces all triple-A titles to be Vulkan-compatible, and more likely Linux-compatible, Windows's stranglehold on desktop gaming further weakens.

As a Linux user, cloud gaming excites me even though I have no intention of ever using it.


Thanks for the constructive feedback as always.


Is the Akamai (largest CDN in the world) partnership with Microsoft announced last month related?

https://www.akamai.com/us/en/about/news/press/2019-press/aka...


No.


Having tried several video game streaming services over the years, I am not optimistic that streaming will work for fast-action games like shooters. The lag between a controller move and the screen response is just too high, even when the server is only a few hops away with low ping values on an excellent network connection.


I wonder if these cloud gaming efforts will also lead to latency improvements in remote application usage such as RDP and/or VNC. We are considering putting 'heavy' software dev and CAD/CAM workstations on servers in our server room, or maybe even in the cloud, so that we can use lightweight machines locally.


OnLive envisioned cloud gaming really early and was then shut down by Sony. My internet was not the best, but the service was quite promising. It was about time to hear about cloud gaming again. I feel like OnLive was more than 10 years ahead of its time and Sony 10 years late.


I'm curious to see what the long term implications for this technology are. While game streaming is a big market, I can't imagine it being that big of a market, so it'll be interesting to see what sort of longer term plans they're aiming for.


Will the most powerful titles white-label or use their own custom cloud gaming? For example, Dota 2, or what PUBG did with its App Store.

Removing the end user capital expense of a gaming rig sounds like a huge change. Would drastically reduce the cost of VR setups as well.


I can't imagine cloud gaming will be viable for VR anytime soon. It's far too sensitive to latency.


An alliance like this is a decades-long proposition.


Recently I saw a talk where John Carmack said he is working with camera manufacturers to find ways to reduce firmware overhead on the image coming out of the sensor for VR/AR applications to reduce latency. A lot can happen in a few decades, but this is a very hard problem to solve.


It's possible.

The trick is to re-render the picture using the latest position of the headset. Include depth maps to make the rendering more accurate. Use ML to in-paint the gaps.
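
Roughly, the client-side warp could look like this minimal numpy sketch (pinhole camera model assumed; the function name and the nearest-pixel splatting are illustrative, not any shipping implementation):

    # Minimal late-warp sketch: reproject a server-rendered frame to the
    # headset's latest pose using a per-pixel depth map. Disocclusions are
    # left black, which is where the ML in-painting would run.
    import numpy as np

    def reproject(frame, depth, K, pose_render, pose_now):
        H, W = depth.shape
        ys, xs = np.mgrid[0:H, 0:W]
        pix = np.stack([xs, ys, np.ones_like(xs)], -1).reshape(-1, 3)
        # Back-project pixels into 3D camera space using the depth map.
        pts = (pix @ np.linalg.inv(K).T) * depth.reshape(-1, 1)
        pts = np.concatenate([pts, np.ones((len(pts), 1))], 1)
        # Rendered-camera space -> world -> latest-camera space.
        pts = pts @ pose_render.T @ np.linalg.inv(pose_now).T
        # Project back onto the image plane.
        proj = pts[:, :3] @ K.T
        uv = proj[:, :2] / proj[:, 2:3]
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        ok = (u >= 0) & (u < W) & (v >= 0) & (v < H) & (pts[:, 2] > 0)
        out = np.zeros_like(frame)
        out[v[ok], u[ok]] = frame.reshape(-1, 3)[ok]
        return out

(A real implementation would z-order the splats and filter small holes; this only shows the geometry.)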


But the rendering happens at the server. You'd still need a local GPU to do reprojection for the headset.


Yes, but you need a GPU to decode 4K video anyway.


Reprojection for VR is typically done on the most recently rendered frame, so the additional camera movement the reprojection has to account for spans between 1/120th and 1/60th of a second. Here you would be trying to compensate for 30+ ms, so the result is going to be a lot lower in quality.
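
Putting rough numbers on that gap (the refresh rates and head-turn rate are illustrative assumptions):

    # Motion window the warp must cover: local reprojection vs. streaming.
    local_ms = 1000 / 120         # one frame at 120 Hz: ~8.3 ms
    cloud_ms = 30 + 1000 / 60     # assumed 30 ms RTT + one 60 Hz frame: ~46.7 ms

    turn = 200 / 1000             # a brisk 200 deg/s head turn, in deg/ms
    print(turn * local_ms)        # ~1.7 degrees to correct
    print(turn * cloud_ms)        # ~9.3 degrees to correct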

Also, for a moving scene, you'd need to send a depth or position buffer and a velocity buffer with each frame, meaning you'd need something like twice the bandwidth of your video. Probably more, since I can't imagine how you would compress that information: any artifacts are going to give weird results in the reprojected image.


Viable interactive VR will not happen over streaming (video sure). It's already tough to reduce latency to an acceptable range when games are played locally.

John Carmack has some good blog posts about this if you're interested in the details.


I didn't think this would ever happen. Sony execs presumably know the history of how alliances like this have gone (i.e., you almost invariably get backstabbed, sometimes after first being given an offer you can't refuse, like a buyout).


Sony is the biggest backstabber of them all


I haven't had that impression, relative to some in the industry. My external (and non-MBA) view is that Sony is definitely money-driven, like any other big company, and they aren't stupid, but, when it comes to excesses beyond the norm, they might be more authoritarian than malevolent.

For example, consider that the PS3 Other OS mess might've been more a Japanese "we can no longer have this, and must discontinue it, for the good of the company/platform" (not that I agree that was OK), than "now is the time to screw over those other guys, on things we never intended to honor in the first place" (which has been well-known SOP of some other companies).


I for one hope it works out; I like the idea of instantly loading any game I feel like playing for an hour. So what if I don't own it, it's a game. I don't own any of the movies I watch on TV either.


Cloud stuff aside, the title can also read "two massive media companies will explicitly coordinate and not try to compete with one another in an otherwise practically unpopulated product space"


Why is it that these articles always mention Google's new Stadia service but never mention Nvidia's Geforce NOW, which has been in the cloud gaming space for years now?


Nvidia doesn't have the scale to endure years of losses while making enemies of half the gaming industry in order to pursue this. AFAIK they have 0 exclusives lined up, meaning DOA.


It's been operating for years already. They already have customers. In other words, the "A" has already passed.


This marks the beginning of the end of the golden age of gaming.

Two historic foes setting aside their differences to target their common enemy - the gamer.


Sony's got the game connections, M$'s got the clouds.


A console that doubles, when unused, as a cloud gaming streaming server could reduce lag.

That's what I'm thinking about right now.


I assume this is in response to Stadia?


Safe assumption, since the first paragraph says they are doing this to better compete against Google.


And Nvidia GeForce Now. I'm wildly impressed by Nvidia's streaming solution.


By the way, Nvidia GeForce Now is free right now. Go sign up for the free beta and a couple of months later you'll get an invitation.

https://www.nvidia.com/en-us/geforce/products/geforce-now/wa...


Is it available for Linux? I can only find info about Windows PC and Mac. One of the main appeals to me of streaming services is I don't need Windows to play the latest games, in theory at least...


It's not and probably won't be any time soon, if ever.

https://forums.geforce.com/default/topic/1034029/pc/geforce-...


Is it naive to think these services might ever work through the browser and be truly cross-platform? Like I'm just streaming a video almost?


Probably when Stadia is released, you'll be able to play directly through Chrome.

Especially with how ubiquitous Chromebooks are, and how the original beta for Stadia was given to players through Chrome, it’s pretty much a certainty.

Since Microsoft and Sony have a more vested interest in keeping their traditional consoles afloat, you might have less luck with them.


I remember when Nvidia first launched GeForce Now in 2015 with their attempt at mobile gaming using the Android-based Shield. It was a good idea in principle, so I was sad to see it not do well.

I'm glad that Nvidia is pivoting the service back to a full computer setting. The Shield, while cool, really could have done with a better screen.


One of my friends has a Shield and let me play with it for a while (on and off). I really liked it - everything was so snappy and it was so powerful.

I wish nVidia would bring it back, TBH.


The only thing they pivoted on was their pricing model. GeForce Now was just them beta-testing on Shield users. Once they were feature-complete, they flipped the switch for PC and started charging more money.


They're charging money? I'm in a closed beta for free.


Perhaps this was mentioned on a different HN post/thread, but Azure facilities/farms are more widely distributed from a global-geographic perspective than either GCP or AWS. For game streaming, this should - in theory - provide a slight edge in speed.


I assume if they're announcing now they've been working on the deal for a lot longer than Stadia has been around.


Stadia has overcome the speed of light, so yes it's a viable product now and needs a response.

/s


There might be cases where Stadia will be too high-latency for some games/gamers, but I would wager it will make huge inroads in the untapped market of people who want to play games but can't or won't invest in gaming hardware.


I think it isn't the technical part of Stadia that will determine its success, but its business model. Without the pricing information, not much can be said about it. I'm a little surprised that Google hasn't been more up-front about how much it's going to cost.

PS Now accounts for about half of all game subscription service revenue, so half of 273 million (quarterly).

Doing the math, that leaves you with somewhere in the realm of 2.38 million active subscribers (273 million × 0.52 / $60).

The PS4 installed base is about 86 million consoles. That means only about 2 or 3 percent are active PS Now subscribers.

Source: https://www.gamespot.com/forums/system-wars-314159282/ps-now...
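
For what it's worth, the arithmetic checks out (assuming ~$60 of revenue per subscriber per quarter, i.e. roughly PS Now's $19.99/month over three months):

    # Estimate PS Now's active subscribers from quarterly revenue.
    quarterly_revenue = 273e6   # all game-subscription revenue, per quarter
    ps_now_share = 0.52         # PS Now's share of that revenue
    per_sub = 60.0              # assumed $/subscriber/quarter

    subs = quarterly_revenue * ps_now_share / per_sub
    print(f"{subs / 1e6:.2f}M subscribers")            # ~2.37M

    install_base = 86e6         # PS4 consoles sold
    print(f"{subs / install_base:.1%} of PS4 owners")  # ~2.8%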

If Stadia comes out with a massive Steam or PS Store type of catalog where you can access any game, I think it'll be a runaway success.

If it's just PS Now (along with a similarly small catalog) without the need to spend as much on hardware, I don't see it being anything more than a ~2-5 million user subscription service.

It's a nice pile of revenue, but it's not "overtake the rest of the market as the preferred way to play games" type of numbers.


I wouldn't be surprised if it's not a subscription model, and they monetize like any cloud service. As a user you just pay for the games, and the publisher has to figure out how to pay for the bandwidth.


I think I'm starting to see why Google hasn't said anything about pricing:

> The tech giant believes that game developers will no longer be limited to computing and will be able to create games with "nearly unlimited resources". [1]

So the paradigm shift is that, as a developer, your game simply consumes the amount of cloud resources it requires to run, and nothing more - e.g. Stardew Valley is cheaper to run ("publish") than Red Dead Redemption.

So perhaps the idea is that the game publisher has to pay for the compute time and price the game accordingly.
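
If so, the unit economics might sketch out like this; every number below is a made-up assumption for illustration, not anything Google has announced:

    # Hypothetical publisher-pays cost model; all figures are assumptions.
    gpu_hour = 0.90               # assumed cloud GPU price, $/hr
    session_hours = 2.0

    games = {"Stardew-class": 0.15,   # fraction of a GPU consumed
             "RDR2-class": 1.00}

    for name, util in games.items():
        print(f"{name}: ${gpu_hour * util * session_hours:.2f}/session")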


Let’s revisit this in a year.


Sure thing


Yes; instead they buy phones for $1k.


Exactly, and to address the other child comment, those $1k phones have the hardware to render AAA games now. Google doesn't realize they don't actually have a sustainable audience. It's sad.


I predict the audience will be huge. The audience for watching Let's Plays on Twitch and YouTube is enormous. I would be surprised if a significant portion of that audience doesn't convert when the barrier to entry is clicking a "play" button on YouTube.

It's not just the people with $1k phones: it's all the students whose parents got them a MacBook Air for college, and all the kids and teenagers whose parents won't buy them a console for Christmas. Not to mention markets where gaming hardware is several times more expensive than in the US or Europe.

edit: if the experience is decent


And now they will be able to play AAA games on those phones.


Just saying that for $1k you get a decent PC too.

Besides, it's not the "real" experience.


Most people would rather buy a $1k phone they can take with them than a $1k PC they can't.


And yet, the average American will never be able to use cloud gaming comfortably.


Apple is not doing a game streaming service.


No, but it announced Apple Arcade which is an iOS/TVOS/macOS gaming subscription service.


Right, it's a new business model, but it's odd to lump it in with Stadia etc.


Their subscription service is looking pretty anaemic compared to this and Stadia.





