
More likely millions per hour. But even then it's not like all those sales will be lost. Most people will probably just come back later to make their purchase.


I'd be interested to know the numbers - with the amount of impulse buys or marketing-driven links, there are probably a lot of paid ads redirecting to nothing that a user might just give up on.


It's back


I think peak Christmas time shopping was a couple million per hour back in 2006.

I know we used to contrast that to the millions-per-minute (or per-second) kinds of outages that NYC financial firms could have (along with the SEC escorting you out in handcuffs if you really screwed up).


Yeah, but it's been 15 years. Amazon traffic has probably grown since then, most likely a lot.


Do you think that might have been why I specifically cited the year in my comment?


You can see on the photo at the top that they were running at 1280x720. As for graphics settings I would guess minimum.

Edit: seems mobile and desktop have a different crop of that image. Here is the image that shows 720: https://images.ctfassets.net/rporu91m20dc/1XYHhlYZzNI1NxRRJl...


So, does it count if I push 5000fps on 320x200?


1x1px seems simpler. Or a 4k static background. I’d say they all count, though some are less exciting.


Well, there's always the 3990x humming along at 2.3 trillion instructions per second.


Looks like the benchmark was cpu bound, so I doubt going to an even lower resolution would help much.


Thanks - I even looked in the images for it.

That would also fit with their discussion about CPU power - at that resolution the CPU ends up doing a lot more work than it would at, say, 4k.


Having played both, my opinion is that they are very different games. Satisfactory is much simpler in its mechanics.

Factorio is built on a custom engine and heavily optimized. I doubt you can build a Factorio-style mega base in Satisfactory and keep things performant.

Factorio has procedural map generation, enemies that build bases and attack you, blueprints, construction and logistics robots, logic circuits, incredible mod support, and you can play with dozens of people in multiplayer and things stay performant and enjoyable.

In the end, the two games chose very different development cycles. Factorio released early and often, and listened to its community by adding and changing what players were asking for. Satisfactory's first release was already pretty close to finished, and it has seen, so far, very few updates.

I will say that I enjoy both games, but for different reasons.


Actually, I'm pretty sure those are generated by a neural net. There are clear artifacts on the images that give it away. Maybe the author experimented with different source material before going for abstract paintings.


That's one definition. Another valid definition of a framework would be something that is a fundamental part of your project, or something you build upon, whereas a library would be something that you use to enhance a particular part of a project. Bootstrap would seem to fit that definition in most cases.


The difference with PCs is that since the hardware is standard, developers can now create gameplay that depends on those capabilities.

Until all (or most) PCs are equipped with high-performance NVMe SSDs, those kinds of features won't be possible anywhere other than on consoles.

Also, the PS5's architecture is optimized end-to-end for faster loading times, it's more than just faster storage.


Or a PC game can just slap a minimum RAM requirement on and be done with it.

I’m impressed with the tech but it seems like the end goal was to keep console manufacturing costs down. Now it’s being sold as a gameplay-enabling feature and reason to upgrade. The upgrade only looks impressive because the PS4 by now is so old.

Relying on cheap SSD storage instead of expensive RAM, and relieving CPU effort via the storage streaming chip is a cool trick. But that tech alone enables absolutely zero gameplay experiences.

It hasn’t been proven to us whether or not a typical gaming PC’s increased memory just overcomes the need for this tech. If I have a PC with 32GB of RAM and my GPU has 8GB of its own RAM I’m not convinced that a PS5 with 16GB of shared RAM will do anything that the PC setup can’t.

Desktop computers eclipsed the performance of current-gen consoles so long ago that I am still skeptical: my prediction is that a decent mid-range gaming computer is completely capable of playing any PS5 game.


Both worlds win here:

Finally the gamer gets low/no loading times, which is nice to have.

But it also becomes much easier for game developers.

I'm still looking forward to it; after all, it is a huge improvement over the current gen, regardless of how long it took and how old the PS4 is.

And I'm not 100% sure this doesn't affect PC gaming too. After all, DirectStorage will hopefully fix the small SSD issues you also have on PC right now.


Most tests I've seen of real-time game asset loading between the various types of SSDs on PC are incredibly inconsistent - for most games it is hardly noticeable. I'd be excited to be proven wrong, but this really feels like the typical console hype ramp-up to Black Friday that South Park portrays so well...


This is not great logic, as those games are not designed to take advantage of faster SSD storage the way new games will be with the launch of the new consoles. Most games are built with an HDD in mind, and thus the SSD is not the bottleneck.


All AAA games need to be cross platform to maximize revenue with a long tail, so they target the lowest common denominator for hardware requirements.

No one is going to design gameplay for a special hardware constraint unless the gameplay can degrade to lowest common denominator. Which of course, makes needing the special hardware optional.

There are a few exceptions to this rule. Some platforms pay for exclusivity, effectively covering lost revenue from other platform streams. And Nintendo alone makes a profit on hardware, so they can produce platform exclusives to drive additional revenue from hardware sales.

Special SSD pipelines, while PC gamers are still using 7200rpm HDDs, are about as appetizing to game devs as waggle controls or Kinect sensor games.

The new consoles include these SSDs not to make something possible now, but to remain relevant in ten years' time, when PCs may have caught up.

This is the game industry's equivalent of supporting IE 11.


This is simply not true. That's like saying no game on PC is possible because not everyone has a good enough graphics card. Just have a minimum spec for required storage speed and you're golden.


I imagine it will not take long for NVMe to be part of the required or recommended specs for gaming.


Gaming PCs are standardized, in practice. People who have insufficient hardware don't play game X, or they upgrade.


Depends on what packages you install, but my VS 2019 install folder is 2.5GB.


My VS folder itself is 4.1GB, but tons of other stuff gets installed too. I know I downloaded about 28GB on my first install.


I have had a similar experience when first using a 144hz display. I was amazed how responsive the mouse was. How "in control" I felt.

Then, going back to a 60hz display, I couldn't NOT see the gaps left by the cursor's movement. I had never before seen this as a problem, but seeing something better ruined 60hz for me.


I'm French Canadian and I usually avoid the space before those characters because they have a tendency to wrap onto a new line by themselves. You are supposed to use non-breaking spaces, but most tools won't do it automatically, and they are not trivial to type.
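
For what it's worth, here's a rough sketch of what an automatic fix could look like (plain TypeScript; the function name and regex are mine, not taken from any particular tool):

    // Rough sketch: insert a non-breaking space (U+00A0) before French
    // "double" punctuation (: ; ! ? and closing guillemets) when a regular
    // space was typed, so the punctuation never wraps onto its own line.
    function fixFrenchSpacing(text: string): string {
      const NBSP = "\u00A0"; // some style guides prefer the narrow no-break space, U+202F
      return text.replace(/ ([:;!?»])/g, `${NBSP}$1`);
    }

    console.log(fixFrenchSpacing("Voici la liste : pommes, poires !"));
    // -> "Voici la liste\u00A0: pommes, poires\u00A0!"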


IIRC alt-space inserts an NBSP character on macOS and Linux basically anywhere. It's been too long since I used Windows, but I recall something like that working (maybe only in some select software though).

Now the visual feedback though... It usually appears as a regular space.


> ... Linux basically anywhere.

Probably not in KDE, as that (by default) has alt-space bring up the Plasma "Search Bar".


In our app the JWTs have a 5m lifetime. When a JWT is generated we also generate a single-use refresh token.

When a client tries to use an expired JWT the request fails, and the client will then exchange the refresh token for a new JWT/refresh token pair, and finally retry the request with the new token.

The refresh operation can reject the request if the user has been deactivated (it's basically a new login request, using the expired JWT as the username and the single-use token as the password).
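
Roughly, the client side of that flow looks like this (a minimal sketch in TypeScript; the endpoint path, field names and in-memory token storage are placeholders, not our actual code):

    // Minimal sketch of the client-side retry flow described above.
    let accessToken = "";   // short-lived JWT (~5 minute lifetime)
    let refreshToken = "";  // single-use token issued alongside it

    async function apiGet(url: string): Promise<Response> {
      const attempt = () =>
        fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });

      let res = await attempt();
      if (res.status !== 401) return res;

      // JWT expired: exchange the expired JWT + single-use refresh token
      // for a new pair, then retry the original request once.
      const refresh = await fetch("/auth/refresh", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ expiredJwt: accessToken, refreshToken }),
      });
      if (!refresh.ok) throw new Error("Refresh rejected, please log in again");

      ({ accessToken, refreshToken } = await refresh.json());
      return attempt();
    }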


Perhaps I'm misunderstanding, but how does this help in the case of a compromised token? Doesn't this assume the attacker hasn't also compromised the refresh token?

Presumably, the token and refresh token are both stored in the client-side app. If that gets compromised, the attacker now has the username/password combo they need to restore the session after the 5-minute token has expired.


Since the refresh-token is single-use, the user will be logged out when trying to refresh their own token, and will presumably then login again, which should invalidate the attacker's refresh token.

But I agree it's not a perfect system. This is meant to specifically address the problems of long-lived tokens, since JWTs are hard to revoke without checking a blacklist on the server-side.

The main problem is that localStorage is more vulnerable to some classes of attacks than secure, http-only cookies.
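
For illustration, the refresh endpoint itself could look roughly like this (a sketch using Express and jsonwebtoken; the in-memory stores and names are made-up placeholders, not our actual implementation):

    import express from "express";
    import jwt from "jsonwebtoken";
    import { randomBytes } from "crypto";

    const SECRET = process.env.JWT_SECRET ?? "dev-only-secret";

    // In-memory stand-ins for real storage (placeholders only).
    const refreshTokens = new Map<string, string>(); // userId -> current single-use token
    const deactivatedUsers = new Set<string>();

    function issueRefreshToken(userId: string): string {
      const token = randomBytes(32).toString("hex");
      refreshTokens.set(userId, token); // replaces any previous token
      return token;
    }

    const app = express();
    app.use(express.json());

    app.post("/auth/refresh", (req, res) => {
      const { expiredJwt, refreshToken } = req.body;
      try {
        // Verify the signature but allow the JWT itself to be expired:
        // it only plays the "username" role in this exchange.
        const claims = jwt.verify(expiredJwt, SECRET, { ignoreExpiration: true }) as { sub: string };

        const stored = refreshTokens.get(claims.sub);
        refreshTokens.delete(claims.sub); // single-use: consumed whether it matches or not
        if (stored !== refreshToken || deactivatedUsers.has(claims.sub)) {
          return res.status(401).json({ error: "login required" });
        }

        return res.json({
          accessToken: jwt.sign({ sub: claims.sub }, SECRET, { expiresIn: "5m" }),
          refreshToken: issueRefreshToken(claims.sub),
        });
      } catch {
        return res.status(401).json({ error: "invalid token" });
      }
    });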


IMHO this is a useless safety 'feature'. It only saves you from a MITM attack, and even then you have to hope that the MITM did not see your refresh token.

The moment you store data with JavaScript, it will be visible to any JavaScript.

