Gorgeous! It's a shame that CGI is so prevalent, because without context I would totally assume this was CGI. Some of the shots would fit right in with a Portal cutscene.
Imagine playing on the one in the center of the orb. It's the ultimate Steam Deck gaming throne.
I am continuously amazed at how awesome Valve is and what awesome products it makes. Also, Steam supporting Linux single-handedly advanced adoption of that OS. Apparently the best types of companies are those founded by deeply technical people, still owned and run by them, with no venture capital. In an ideal world we would favour such companies over toxic ones.
Without taking away from the great things they are also doing, I'm mostly amazed at how bad and laggy the Steam app is, and has been for years, on all 3 platforms I've used it on.
(I'm sure there are loads of people who have never had any issues but to me that's like people saying there are no problems with Linux on the desktop because they don't have any problems.)
It could be improved, yes, but that's an acceptable kind of issue. Steam's important functionality - package management for games - works well and is reliable. It doesn't spy on me, doesn't push crap, doesn't use dark patterns. We need more apps like Steam tbh.
I wouldn't say it works well. On some machines it takes ages to start up and will always say it's updating even though it isn't. The in-game browser that is used to view the store breaks often, and then I can't open it inside Steam at all. On macOS, the system menu completely stops working most of the time and I can't exit the app normally; I have to kill it instead.
OK, the core functionality of downloading and running games works. But I think we can expect a little bit more from a billion dollar company.
It wasn't acceptable when other stores did it. I guess that's my problem with the Steam fanbase at times. It's almost cult-like how it plays up the benefits and downplays the flaws.
I'd prefer à la carte TBH. If you just need a place to host your game, and having a mod workshop, item marketplace, multiplayer servers, input management, etc. doesn't matter to you, then why pay for it? People who want that can negotiate it. For base store hosting, 10-15% is probably fair.
But to be honest, I think the pricing parity is what goes too far. You want to charge 30%? Okay, then I should have the ability to pass that on to the customer: the game is $15, but if you go to Itch or GOG you can get it for $10. But that's against Steam's TOS.
No one else in the PC space charges 30%. GOG does, but at least offers (or used to) an upfront payment in exchange for taking a larger share until it's paid back. Yeah, I think if they take that literal financial risk on me, it's worth giving them a larger share.
Personally I've been pretty disappointed with Valve over the last 10-15 years because of the almost complete lack of output on their games and the way they abandon others (TF2). They'd rather squeeze more money out of Steam than release new games.
Adoption on desktop is 100% more related to Steam support, at least in the gaming segment.
Anecdotally, among friends and colleagues, people are only staying with Windows for gaming support.
People generally dislike Windows but are forced to stay there for gaming. As support for Linux improves, they'll be less willing to put up with Windows' BS.
Android is a Linux distribution unlike any other. It uses a completely different framework for hardware drivers, has its own patches for binder and other things that see no use outside of Android and will never get upstreamed, and has its own libc which is not used in any other distro. Very little of the work that has been done to make Linux work better on Android has benefited the rest of the Linux ecosystem. Even the WiFi/Bluetooth drivers haven't, which is a massive shame.
Binder has been part of the mainline kernel for quite a while.
Also, drivers for things on Android are for the most part made by third-party manufacturers, so it's up to those third parties to upstream them. No different from any other driver used on more generic PC or server HW.
Also, you realize that there are like 3 or 4 different libc projects for Linux distros to use, right? Bionic isn't special in that regard.
>They use a completely different framework for drivers for hardware
No, both use kernel modules or statically compiled code for the part of the driver that actually talks to the hardware.
>they have their own patches for binder
Binder is part of mainline Linux. Yes, I guess technically there are some patches related to binder, but remember that Android works on a mainline kernel.
Android uses the Linux kernel and keeps up with upstream to some extent.
macOS uses the XNU kernel.
Though as a user that likes having control over the software, I recognize that GNU/Linux not being number one is a bit of a waste. (Though one weekend of fighting NVIDIA and Wayland tamed that quite a bit. Somehow my DE does not load with the proprietary driver unless I also load nouveau, for some strange reason.)
> though one weekend of fighting NVIDIA and Wayland tamed that quite a bit
Looking back on that, this was mostly self-inflicted (I used Debian testing rather than stable, and upgraded from Bullseye to Bookworm and then to testing rather than doing a clean install). I like tinkering a bit so I don't mind the pain that much, but this is absolutely not representative of what new users would experience: my comment was clearly wrong (I cannot edit/remove it, unfortunately).
This says nothing about regular stable distributions (Debian stable, Fedora). In general I've had pretty good experiences with clean installations of those.
This is awesome. Video production just takes so many hours for each second and I don't think people know how much work it takes until you help out on a shoot.
It's why AI-generated movie shots would make a lot of sense. Hollywood spends billions of dollars, builds and blows up elaborate sets, and hires hundreds of thousands of people... just to be able to have pixels move in a pleasing way. How much of that will be cut out when we just go straight to generating pixels? CGI goes in that general direction, but it's still very labor-intensive.
This is a surface-level argument that could be applied to literally every industry if you trivialize the product produced.
For example: NASA spends billions on research and development just to be able to push a rocket upwards into space? How much of that will be cut out when we just go straight to AI-generated plans and schematics?
It only makes “a lot of sense” if you’re willing to compromise your standards and eliminate labor no matter the consequences.
I thought about this as well, but I think research activities and such might be qualitatively different, at least for now. Movies are merely looking to please our senses; researchers, on the other hand, look for truths and facts about the world. It's okay to have movie visuals not based on reality, but it's usually not okay to have non-factual research.
Also, no one is going to force actors and movie studios to go full AI and those who wish can stay with the old paradigm. It will be up to customers to decide where their money goes. However, if AI allows for the same or higher quality and more creative freedom at lower costs, I'm pretty sure the old ways will get outcompeted.
... and taking it one step further, humans spend millions of hours watching those shows and movies, just to be able to consume movements of some pixels and parse it into some storylines.
What an inefficient way to do it. How much time will be saved when we go straight to computers watching that content for us and summarising it in 140 characters?
I’ve actually gotten hooked on watching movie recaps on youtube. Even boring mid tier movies are interesting when they’re condensed to ten minutes. It also exposes me to Spanish and Russian movies, because the narrator recaps everything in English, and I don’t have to pay for any subscriptions anymore.
Would it be wrong and not art if one wouldn't need a $100 million budget to fully execute their vision?
And art is a term that's quite ambiguous and often means different things to different people. I think you would agree that a lot of people consume movies simply as entertainment. From that perspective, it matters less where the data about pixels came from. This is coming from a person who loves movies, artsy, not artsy, and everything in between.
Rereading what I wrote... It does sound like satire - but no, I'm serious haha. If the tech is able to deliver without compromises in quality, I think the transition would be inevitable.
When I see things like that I always think about their lifecycle. The frame will be stripped of the devices for another display. It'll still hang around in some honored spot, then they'll get new staff and it will be shunted into storage. In 3 years it will be disassembled and tossed into the dumpster because no one will have room for it at home.
I work for a company which has made a huge satellite constellation. After a revision of a hardware component has run its course, thousands of units have made their way to space, incomplete/failing ones are shredded, and only a handful are kept. When the satellites eventually burn up in the atmosphere, those ~3 builds are all that remain of the entire project: thousands of man-hours to engineer and build hardware that the world will never see. I think about those few survivors a lot.
Props, sets and jigs are always like this. In reality, a small metal structure like this is not really so precious that it needs to be cherished or repurposed. Metal is very recyclable. Its function was brief in time, but relatively high in impact.
I love how each of the decks being a proper computer made this much easier to pull off. Like, driving a display in a dummy device would probably get you the same effect, and take longer. But that's likely the route a third-party production company would take.
On the note of OLED increasing battery life, does anyone know the max number of hours of gameplay you can get on a single charge when the game is as simple as "snake"?
I have searched for answers on Reddit and Google etc. but the only number I have found is 8 hrs, which is not for low-power games. Given the Steam Deck runs Linux and is hacker-friendly, one should be able to juice it for much, much longer if only early retro games are played.
8 hr sounds accurate. There's a wattage slider, and you can only set it so low. You might be able to get more out of using the desktop and forcing some idle modes; however, it's impressive enough as is, at a few watts and many hours on a portable Linux machine occupying less total space than a 13" laptop.
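For a back-of-envelope sense of the ceiling here, runtime is just battery capacity divided by average draw. The 50 Wh figure below is the commonly cited capacity for the Steam Deck OLED, and the draw numbers are assumptions for illustration, not measurements:

```python
def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Estimated hours of play from battery capacity and average system draw."""
    return battery_wh / avg_draw_w

# Assumed draws: ~6 W for a light game, ~4 W if you could idle most of the SoC.
print(round(runtime_hours(50, 6), 1))  # -> 8.3, roughly the quoted 8 hr figure
print(round(runtime_hours(50, 4), 1))  # -> 12.5, best case for a snake-tier game
```

So even halving the draw of an already-light workload only buys you a few extra hours; the display and idle platform power set a floor.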
This ad brings out the inner kid in me (I'm 44 so, feeling a bit middle-aged here). I so want one badly but can't afford it and there's no way I'd be able to justify the purchase. Sad.
Great to see OBS also being used on this one - although I imagine setting up and configuring that many instances of OBS must have been a bit of a pain.
Only if you drive the display at the top end. If you end up driving it in the middle, it helps prevent burn-in.
> By counting the time each subpixel is displayed and at what brightness, a "wear level" can be determined for each pixel, using an algorithm to estimate the luminance degradation this can be compensated for. However, to do this, you must have some spare luminance headroom that gets utilized as the display gets older. Or alternatively, if the display unlocks full maximum luminance when new without saving any headroom, the algorithm would dim the other pixels over time to bring them down to the level of the burned-in pixels, so the peak luminance of the display would diminish over time as the burn-in occurs.
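The quoted scheme boils down to two steps: accumulate a per-subpixel wear counter, then over-drive worn subpixels out of reserved luminance headroom. Here is a minimal sketch of that idea, assuming a simple linear degradation model; the rate and headroom constants, and the function names, are made up for illustration and are not any vendor's actual algorithm:

```python
DEGRADATION_RATE = 1e-6  # assumed fractional luminance loss per wear unit
HEADROOM = 0.10          # assumed fraction of max luminance held in reserve

def accumulate_wear(wear, frame, dt):
    """Add this frame's contribution (drive level x on-time) to each counter."""
    for i, level in enumerate(frame):   # level is the drive level in 0.0..1.0
        wear[i] += level * dt
    return wear

def compensate(frame, wear):
    """Over-drive worn subpixels to mask burn-in, spending the headroom."""
    out = []
    for level, w in zip(frame, wear):
        est_loss = min(DEGRADATION_RATE * w, HEADROOM)  # estimated dimming
        # drive harder to restore the original luminance, capped at panel max
        out.append(min(level / (1.0 - est_loss), 1.0))
    return out
```

Once a subpixel's estimated loss exceeds the headroom, compensation saturates, which is exactly the trade-off the quote describes: either reserve headroom up front or let peak luminance sink toward the burned-in pixels over time.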
Fortunately, in the Steam Deck's case it's possible to replace the screen if this becomes a problem in a couple of years, and new parts won't cost as much as a new device.
I hope replacement is easier than with the OG Steam Deck, because there you have to completely disassemble the device and apply heat to unglue the screen...
So this was actually on an open-source Android keyboard (FlorisBoard), which is made by one guy and does, in fact, have terrible autocorrect. Hilarious that it is in the same league as Apple.
It’s probably better than Apple’s at this point. Apple’s has gotten so bad I disabled autocorrect. Seriously it doesn’t autocorrect Im to I’m, it leaves it as Im. Where is the QC?
Burn-in on OLEDs is really just uneven wear. I don’t think it really matters how bright they get for that, unless panel heat is an issue at higher brightnesses.
I remember encountering CRTs burnt in so badly it was hard to read stuff in the worst areas (e.g. the taskbar clock or login prompt) but I haven’t encountered anything remotely close to that with current OLEDs. My iPhone and TV have no signs at all, and the last device I used that had legitimately easy-to-detect burn in was a Nexus One test device that sat on my desk with the screen on all day every day while I built an Android app in 2012.
I assume that's what "some Steam Deck clips made by an accessory designer on Reddit" is referring to? Going back and watching the video again, yeah, it definitely looks like Deckmate grips holding all the decks to the orb frame.