More likely millions per hour. But even then it's not like all those sales will be lost. Most people will probably just come back later to make their purchase.
I'd be interested to know the numbers - with the amount of impulse buys or marketing-driven links, there's probably a lot of ads being paid for that redirect to nothing, which a user might just give up on.
I think peak Christmas time shopping was a couple million per hour back in 2006.
I know we used to contrast that to million per minute (or second) kinds of outages that NYC financial firms could have (along with the SEC escorting you out in handcuffs if you really screwed up).
Having played both, my opinion is that they are very different games. Satisfactory is much simpler in its mechanics.
Factorio is built on a custom engine and heavily optimized. I doubt you can build a factorio-style mega base in satisfactory and keep things performant.
Factorio has procedural map generation, enemies that build bases and attack you, blueprints, construction and logistics robots, logic circuits, incredible mod support, and you can play with dozens of people in multiplayer and things stay performant and enjoyable.
The two games chose very different development cycles. Factorio released early and often and listened to its community, adding and changing what players were asking for. Satisfactory first released a game that was pretty close to finished, and has released, so far, very few updates.
I will say that I enjoy both games, but for different reasons.
Actually, I'm pretty sure those are generated by a neural net. There are clear artifacts on the images that give it away. Maybe the author experimented with different source material before going for abstract paintings.
That's one definition. Another valid definition of a framework would be something that is a fundamental part of your project, or something you build upon, whereas a library would be something that you use to enhance a particular part of a project.
Bootstrap would seem to fit that definition in most cases.
Or a PC game can just slap a minimum RAM requirement on and be done with it.
I’m impressed with the tech, but it seems like the end goal was to keep console manufacturing costs down. Now it’s being sold as a gameplay-enabling feature and a reason to upgrade. The upgrade only looks impressive because the PS4 is so old by now.
Relying on cheap SSD storage instead of expensive RAM, and relieving CPU effort via the storage streaming chip is a cool trick. But that tech alone enables absolutely zero gameplay experiences.
It hasn’t been shown whether a typical gaming PC’s larger memory simply overcomes the need for this tech. If I have a PC with 32GB of RAM and my GPU has 8GB of its own RAM, I’m not convinced that a PS5 with 16GB of shared RAM will do anything that the PC setup can’t.
Desktop computers eclipsed the performance of current-gen consoles so long ago that I am still skeptical: my prediction is that a decent mid-range gaming computer is completely capable of playing any PS5 game.
Most tests I've seen of real-time game asset loading between the various types of SSDs on PC are incredibly inconsistent - for most games the difference is hardly noticeable. I'd be excited to be proven wrong, but this really feels like the typical console hype ramp-up to Black Friday that South Park portrays so well...
This is not great logic, as those games are not designed around taking advantage of the faster SSD storage like new games will be with the launch of the new consoles. Most games are built with an HDD in mind and thus the ssd is not the bottleneck.
All AAA games need to be cross platform to maximize revenue with a long tail, so they target the lowest common denominator for hardware requirements.
No one is going to design gameplay for a special hardware constraint unless the gameplay can degrade to lowest common denominator. Which of course, makes needing the special hardware optional.
There are a few exceptions to this rule. Some platforms pay for exclusivity, effectively covering lost revenue from other platform streams. And Nintendo alone makes a profit on hardware, so they can produce platform exclusives to drive additional revenue from hardware sales.
Special SSD pipelines, while PC gamers are still using 7200rpm HDDs, are about as appetizing to game devs as waggle controls or Kinect sensor games.
The new consoles include these SSDs not to make something possible now, but to remain relevant in ten years time when PCs may have caught up.
This is the game industry's equivalent of supporting IE 11.
This is simply not true. That's like saying no game on PC is possible because not everyone has a good enough graphics card. Just have a minimum spec for required storage speed and you're golden.
I have had a similar experience when first using a 144hz display. I was amazed how responsive the mouse was. How "in control" I felt.
Then, going back to a 60hz display, I couldn't NOT see the gaps left by the cursor's movement. I had never before seen this as a problem, but seeing something better ruined 60hz for me.
I'm French Canadian and I usually avoid the space before those characters because they have a tendency to wrap on a new line by themselves.
You are supposed to use non breaking spaces but most tools won't do it automatically, and they are not trivial to type.
IIRC alt-space inserts an NBSP (U+00A0) character on macOS and Linux basically anywhere. It’s been too long since I used Windows, but I recall something like that working there too (maybe only in some select software).
The visual feedback is another matter, though: an NBSP usually looks just like a regular space.
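Since most tools won't insert them for you, the fix is easy to script. Here's a minimal sketch (the function name and the exact set of punctuation marks are my own choices, not an official rule set) that replaces an ordinary space before French two-part punctuation with a non-breaking space:

```python
import re

NBSP = "\u00a0"  # non-breaking space

def fix_french_spacing(text: str) -> str:
    """Put an NBSP before ; : ! ? and closing guillemets.

    Simplified sketch: it also inserts the NBSP when no space was
    typed, and it doesn't special-case things like URLs (where the
    ':' should be left alone).
    """
    return re.sub(r"\s*([;:!?\u00bb])", NBSP + r"\1", text)
```

Usage: `fix_french_spacing("Quoi ?")` yields `"Quoi\u00a0?"`, which renders identically but can no longer wrap the `?` onto its own line.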
In our app the JWTs have a 5m lifetime. When a JWT is generated we also generate a single-use refresh token.
When a client tries to use an expired JWT the request fails, and the client will then exchange the refresh token for a new JWT/refresh token pair, and finally retry the request with the new token.
The refresh operation can reject the request if the user has been deactivated (it's basically a new login request, using the expired JWT as the username and the single-use token as the password).
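The flow above can be sketched roughly like this (a toy in-memory version - the names, the dict-based token store, and the unsigned stand-in "JWT" are all illustrative, not the actual app's implementation, which would sign tokens and persist the refresh tokens server-side):

```python
import secrets
import time

JWT_LIFETIME = 300  # 5 minutes, per the scheme described above

active_refresh_tokens = {}  # refresh_token -> user_id (single-use store)
deactivated_users = set()

def issue_tokens(user_id):
    # Stand-in for a signed JWT: just the claims we need for the sketch.
    jwt = {"sub": user_id, "exp": time.time() + JWT_LIFETIME}
    refresh = secrets.token_urlsafe(32)
    active_refresh_tokens[refresh] = user_id
    return jwt, refresh

def refresh_session(expired_jwt, refresh_token):
    # pop() makes the refresh token single-use: a second attempt fails.
    user_id = active_refresh_tokens.pop(refresh_token, None)
    if user_id is None or user_id != expired_jwt["sub"]:
        raise PermissionError("invalid or already-used refresh token")
    if user_id in deactivated_users:
        raise PermissionError("user deactivated")
    return issue_tokens(user_id)  # rotate: new JWT + new refresh token
```

The `pop()` is what gives the "logged out on second use" behavior discussed below: whoever redeems the refresh token second - victim or attacker - gets rejected.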
Perhaps I'm misunderstanding, but how does this help in the case of a compromised token? Doesn't this assume the attacker hasn't also compromised the refresh token?
Presumably, the token and refresh token are both stored in the client-side app. If that gets compromised, the attacker now has the username/password combo they need to restore the session after the 5-minute JWT has expired.
Since the refresh-token is single-use, the user will be logged out when trying to refresh their own token, and will presumably then login again, which should invalidate the attacker's refresh token.
But I agree it's not a perfect system. This is meant to specifically address the problems of long-lived tokens, since JWTs are hard to revoke without checking a blacklist on the server-side.
The main problem is that localStorage is more vulnerable to some classes of attacks than secure, http-only cookies.