The Steam Deck OLED spot ad was made with Steam Deck OLEDs (idlethumbs.social)
444 points by neffo 10 months ago | 87 comments



Gorgeous! It's a shame that CGI is so prevalent, because without context I would totally assume this was CGI. Some of the shots would fit right in with a Portal cutscene.

Imagine playing on the one in the center of the orb. It's the ultimate Steam Deck gaming throne.


> Some of the shots would fit right in with a Portal cutscene.

The music really helps sell that idea.


I am continuously amazed at how awesome Valve is and what awesome products it makes. Also, Steam supporting Linux single-handedly advanced adoption of that OS. Apparently the best types of companies are those founded by deeply technical people, still owned and run by them, with no venture capital. In an ideal world we would favour such companies over toxic ones.


Without taking away from the great things they are also doing, I'm mostly amazed at how bad and laggy the Steam app is, and has been for years, on all 3 platforms I've used it on.

(I'm sure there are loads of people who have never had any issues but to me that's like people saying there are no problems with Linux on the desktop because they don't have any problems.)


It could be improved, yes, but that's an acceptable kind of issue. Steam's important functionality, package management for games, works well and is reliable. It doesn't spy on me, doesn't push crap, doesn't use dark patterns. We need more apps like Steam, tbh.


I wouldn't say it works well. On some machines it takes ages to start up and will always say it's updating even though it isn't. The in-game browser that is used to view the store breaks often, and then I can't open it inside Steam at all. On macOS, the system menu completely stops working most of the time and I can't exit the app normally, I have to kill it instead.

OK, the core functionality of downloading and running games works. But I think we can expect a little bit more from a billion-dollar company.


It wasn't acceptable when other stores did it. I guess that's my problem with the Steam fanbase at times: it's almost cult-like how it plays up the benefits and downplays the flaws.


And they only take a 30% cut! Practically altruism.


What is an acceptable fee structure for the service they provide?


I'd prefer à la carte, TBH. If you just need a place to host your game, and a mod workshop, item marketplace, multiplayer servers, input management, etc. don't matter to you, then why pay for them? People who want that can negotiate for it. For basic store hosting, 10-15% is probably fair.

But to be really honest, I think the pricing parity is what goes too far. You want to charge 30%? Okay, then I should have the ability to pass that on to the customer: the game is $15, but if you go to Itch or GOG you can get it for $10. But that's against Steam's TOS.


That’s messed up indeed.


They should do like everybody else, take 30% cut and NOT improve Linux.


No one else in the PC space charges 30%. GOG does, but at least it offers (or used to offer) an upfront payment in exchange for taking a larger share until it's paid back. Yeah, I think if they take that literal financial risk on my behalf, it's worth giving them a larger share.


AND spy on people, steal their data, and force-sell a couple of broken games.


Personally, I've been pretty disappointed with Valve over the last 10-15 years because of the almost complete lack of output on their games and the way they abandon others (TF2). They'd rather squeeze more money out of Steam than release new games.


Considering how they turn the games they do maintain into casinos, maybe that's for the best.


>Also steam supporting linux single handedly advanced adoption of that os.

Android using Linux is what single-handedly advanced the adoption of Linux among consumers.


Adoption on the desktop is far more related to Steam support, at least in the gaming segment.

Anecdotally, among friends and colleagues, people are only staying with Windows for gaming support.

People generally dislike Windows but are forced to stay there for gaming. As support for Linux improves, they'll be less willing to put up with Windows' BS.


I'm one of the rare people on HN who likes using Windows as a software development platform.


Saying Android is Linux is like saying macOS is BSD.


No it isn't.

Android is as much a Linux distro as any other.


Android is a Linux distribution unlike any other. It uses a completely different framework for hardware drivers, it carries its own patches for binder and other things that see no use outside of Android and will never get upstreamed, and it has its own libc, which is not used in any other distro. Very little of the work that has been done to make Linux work better on Android has benefited the rest of the Linux ecosystem. Even the WiFi/Bluetooth drivers, which is a massive shame.


Binder has been part of the mainline kernel for quite a while.

Also, drivers for things on Android are for the most part done by third-party manufacturers, so it's up to those third parties to upstream them. No different from any other driver used on more generic PC or server HW.

Also, you realize that there are like 3 or 4 different libc projects for Linux distros to use, right? Bionic isn't special in that regard.


>They use a completely different framework for drivers for hardware

No, both use kernel modules or statically compiled code for the part of the driver that actually talks to the hardware.

>they have their own patches for binder

Binder is part of mainline Linux, but yes, I guess technically there are some patches related to binder; remember, though, that Android works on a mainline kernel.


It's a Linux distro but not a GNU/Linux distro


Android uses the Linux kernel and keeps up with upstream to some extent.

macOS uses the XNU kernel.

Though as a user who likes having control over the software, I recognize that not having GNU/Linux be number one is a bit of a waste. (Though one weekend of fighting NVIDIA and Wayland tamed that quite a bit: somehow my DE does not load with the proprietary driver unless I also load nouveau, for some strange reason.)


> though one weekend of fighting NVIDIA and wayland tamed that quite a bit

Looking back on that, this was mostly self-inflicted (I used Debian testing rather than stable, and upgraded from Bullseye to Bookworm and then to testing rather than doing a clean install). I like tinkering a bit, so I don't mind the pain that much, but this is absolutely not representative of what new users would experience: my comment was clearly wrong (cannot edit/remove it, unfortunately).

This is not telling at all for regular stable distributions (Debian stable, Fedora). In general I had pretty good experiences with clean installations of those.


Android uses the latest LTS kernel and works using a mainline kernel provided mainline supports the hardware you are on.

>though one weekend of fighting NVIDIA and wayland

Wayland is freedesktop software, which is different from GNU.


Not as a desktop.


This is awesome. Video production just takes so many hours for each second and I don't think people know how much work it takes until you help out on a shoot.


It's why AI-generated movie shots would make a lot of sense. Hollywood spends billions of dollars, builds and blows up elaborate sets, and hires hundreds of thousands of people... just to be able to have pixels move in a pleasing way. How much of that will be cut out when we just go straight to generating pixels? CGI goes in that general direction, but it's still very labor-intensive.


This is a surface-level argument that could be applied to literally every industry if you trivialize the product produced.

For example: NASA spends billions on research and development just to be able to push a rocket upwards into space? How much of that will be cut out when we just go straight to AI-generated plans and schematics?

It only makes “a lot of sense” if you’re willing to compromise your standards and eliminate labor no matter the consequences.


I thought about this as well, but I think research activities and such might be qualitatively different, at least for now. Movies are merely looking to please our senses; researchers, on the other hand, look for truths and facts about the world. It's okay to have movie visuals not based on reality, but it's usually not okay to have non-factual research.

Also, no one is going to force actors and movie studios to go full AI and those who wish can stay with the old paradigm. It will be up to customers to decide where their money goes. However, if AI allows for the same or higher quality and more creative freedom at lower costs, I'm pretty sure the old ways will get outcompeted.


I agree. If AI can reduce labor costs at NASA, I'm pretty sure the old ways will get outcompeted.


... and taking it one step further, humans spend millions of hours watching those shows and movies, just to be able to consume movements of some pixels and parse it into some storylines.

What an inefficient way. How much time will be saved when we go straight to computers watching that content for us and summarising it in 140 characters.


I've actually gotten hooked on watching movie recaps on YouTube. Even boring mid-tier movies are interesting when they're condensed to ten minutes. It also exposes me to Spanish and Russian movies, because the narrator recaps everything in English, and I don't have to pay for any subscriptions anymore.


The explosion is even more vivid in one's own imagination!


Only on HN would you have someone miss the point of art this hard. That's not art anymore; that's not creative.


Would it be wrong and not art if one wouldn't need a $100 million budget to fully execute their vision?

And art is a term that's quite ambiguous and often means different things to different people. I think you would agree that a lot of people consume movies simply as entertainment. From that perspective, it matters less where the data about pixels came from. This is coming from a person who loves movies, artsy, not artsy, and everything in between.


What a sad world that would be


No.


Calm down, Satam.

Unless that's satire. But I'm not sure it is.


Rereading what I wrote... It does sound like satire - but no, I'm serious haha. If the tech is able to deliver without compromises in quality, I think the transition would be inevitable.


Inevitable sure, but sad.


I do a lot of YouTubing and the consensus is that every minute of video takes at least 1 hour of work - and this is just amateurs messing around.


In the pro world it's common for one person to spend hours on seconds, multiplied over dozens of people working on those same seconds.


When I see things like that I always think about their lifecycle. The frame will be stripped of the devices for another display. It'll still hang around in some honored spot, then they'll get new staff and it will be shunted into storage. In 3 years it will be disassembled and tossed into the dumpster because no one will have room for it at home.


I work for a company that has made a huge satellite constellation. After a revision of a hardware component has run its course, thousands of units have made their way to space, incomplete/failing units are shredded, and only a handful are kept. When the satellites eventually burn up in the atmosphere, those ~3 builds are all that remain of the entire project: thousands of man-hours to engineer and build hardware that the world will never see. I think about those few survivors a lot.


Oh the pain


We know you're talking about spacex


Props, acts and jigs are always like this. In reality, a small metal structure like this is not really so precious that it need be cherished or repurposed. Metal is very recyclable. Its function was brief in time, but relatively high in impact.


Is there something inherently bad in this? Do you feel something is lost?


I do feel a twinge sometimes. I tend to get a bit too sentimental about artifacts.


I love how each of the decks being a proper computer made this much easier to pull off. Like, driving a display in a dummy device would probably get you the same effect, and take longer. But that's likely the route a third-party production company would take.


Norm from Tested has a short vid on Twitter of him in the sphere:

https://twitter.com/nchan/status/1722688222713749881


On the note of OLED increasing battery life, does anyone know the max number of hours of gameplay you can get on a single charge when the game is as simple as "snake"?

I have searched for answers on Reddit, Google, etc., but the only number I've found is 8 hrs, which is not for low-power games. Given the Steam Deck runs Linux and is hacker-friendly, one should be able to juice it for much, much longer if only early retro games are played.


8 hr sounds accurate. There's a wattage slider and you can only set it so low. You might be able to get more out of using the desktop and forcing some idle modes; however, it's impressive enough as is, at a few watts and many hours, on a portable Linux machine occupying less total space than a 13" laptop.
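For scale, that figure follows from simple division. A back-of-the-envelope sketch, assuming the OLED model's 50 Wh battery (Valve's published spec, stated here as an assumption; the wattage values are illustrative guesses, not measurements):

```python
# Idealized runtime estimate: battery capacity over average system draw.
# Ignores discharge inefficiency, battery aging, and screen-off savings.
def runtime_hours(battery_wh: float, draw_watts: float) -> float:
    return battery_wh / draw_watts

for watts in (3, 4, 6, 9):
    print(f"{watts:>2} W -> {runtime_hours(50, watts):.1f} h")
```

At ~6 W total draw you land right around the reported ~8 h; even at a floor of a few watts you'd only roughly double that, which matches the point above that the slider can only go so low.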


Part of me thinks this is crazy; the other part wants to see this running Doom with 360° vision.


This ad brings out the inner kid in me (I'm 44 so, feeling a bit middle-aged here). I so want one badly but can't afford it and there's no way I'd be able to justify the purchase. Sad.


Do we know who the supplier of the OLED panel is? Samsung? LG? BOE?


Samsung, but BOE may be dual-supplying it: https://twitter.com/SadlyItsBradley/status/17227592388066431...


Last I heard, it's the same supplier as the Switch OLED; that's why it uses MIPI over eDP. Samsung is the manufacturer I think


That's what I've heard too. And yeah, Samsung makes the Switch's panel according to iFixit's teardown.

https://www.ifixit.com/News/53272/nintendo-switch-oled-teard...


(GP)> "same supplier as the Switch OLED"

> That's what I've heard too

Is any of this confirmed or just rumor from the LTT video?


The other comment chain (2 levels above but in the same first-level thread) links to this tweet with a code screenshot that seems to confirm it: https://twitter.com/SadlyItsBradley/status/17227592388066431...

    if ((vendor_product->product == GALILEO_SDC_PID) || (vendor_product->product == GELILEO_BOE_PID)) {
        // ...
    }
I'm assuming SDC ("GALILEO_SDC_PID") is Samsung Display [Corp?] (a search seems to confirm that's their acronym).


Great to see OBS also being used on this one, although I imagine setting up and configuring that many instances of OBS must have been a bit of a pain.


I got pretty excited that it was Jake Rodkin posting this on something with the Idle Thumbs name.


Ultra-bright means higher burn-in risk, or am I wrong on this one?

Edit: thanks for clarifying


Only if you drive the display at the top end. If you end up driving it in the middle, that helps prevent burn-in.

> By counting the time each subpixel is displayed and at what brightness, a "wear level" can be determined for each pixel, using an algorithm to estimate the luminance degradation this can be compensated for. However, to do this, you must have some spare luminance headroom that gets utilized as the display gets older. Or alternatively, if the display unlocks full maximum luminance when new without saving any headroom, the algorithm would dim the other pixels over time to bring them down to the level of the burned-in pixels, so the peak luminance of the display would diminish over time as the burn-in occurs.

https://arstechnica.com/gadgets/2023/11/why-oled-monitor-bur...
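A toy sketch of the bookkeeping that quote describes: count how long each subpixel is lit and at what brightness, then dim everything toward the most-worn subpixel so degradation stays visually uniform. The linear wear model, the constant `k`, and all the names here are illustrative assumptions, not any vendor's actual algorithm.

```python
def accumulate_wear(wear, frame_brightness, dt):
    """Add this frame's contribution: brightness (0..1) times seconds lit."""
    return [w + b * dt for w, b in zip(wear, frame_brightness)]

def compensation_gains(wear, k):
    """Per-subpixel gain that hides uneven wear.

    Model remaining luminance as (1 - k * wear). With no spare headroom,
    the only option is to scale every subpixel down to the most-worn one,
    which is why peak brightness drops as burn-in accrues.
    """
    remaining = [max(0.0, 1.0 - k * w) for w in wear]
    floor = min(remaining)
    return [floor / r if r > 0 else 0.0 for r in remaining]

# Subpixel 0 shows a static bright logo; the others see varied content.
wear = [0.0, 0.0, 0.0]
for _ in range(1000):
    wear = accumulate_wear(wear, [1.0, 0.4, 0.1], dt=1.0)

gains = compensation_gains(wear, k=1e-4)
# The static-logo subpixel keeps full drive (gain 1.0); the fresher
# subpixels get dimmed down to match it.
```

With spare headroom, the controller could instead boost the worn subpixel rather than dim the rest, which is the trade-off the article mentions.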


Do we know if the steam deck has this feature?


Just about any OLED made in the past 5 years does.


Fortunately, in the Steam Deck's case it's possible to replace the screen if this becomes a problem in a couple of years, and new parts won't cost as much as a new device.


I hope replacement is easier than with the OG Steam Deck, because to replace that screen you have to completely disassemble the device, heat it to unglue the screen...

https://www.ifixit.com/Guide/Steam+Deck+Screen+Replacement/1...


It's one of the improvements they listed regarding repairability:

> Improved display repair/replacement to not require taking rear cover off


Does it still work as a very efficient SD card cutter, though?


It does.


They have made a number of changes to makebit easier to fix, not sure about the screen though.


> makebit

I make this same mistake on iPhone all the time. Is it just me or does Apple need to step up their keyboard and autocorrect game?


So this was actually on an open-source Android keyboard (FlorisBoard), which is made by one guy and does, in fact, have terrible autocorrect. Hilarious that it's in the same league as Apple.


It's probably better than Apple's at this point. Apple's has gotten so bad I disabled autocorrect. Seriously, it doesn't autocorrect Im to I'm; it leaves it as Im. Where is the QC?


Burn-in on OLEDs is really just uneven wear. I don’t think it really matters how bright they get for that, unless panel heat is an issue at higher brightnesses.


Higher brightness leads to faster wear. If the wear is uneven, that means faster burn-in at higher brightness.


Burn-in on CRTs was just uneven wear, too, but still a pain in the ass.


I remember encountering CRTs burnt in so badly it was hard to read stuff in the worst areas (e.g. the taskbar clock or login prompt) but I haven’t encountered anything remotely close to that with current OLEDs. My iPhone and TV have no signs at all, and the last device I used that had legitimately easy-to-detect burn in was a Nexus One test device that sat on my desk with the screen on all day every day while I built an Android app in 2012.


So, how much for the orb, and when can we order one?


Damn, no call out to DeckMate by name in the thread? Sad


I assume that's what "some Steam Deck clips made by an accessory designer on Reddit" is referring to? Going back and watching the video again, yeah, it definitely looks like Deckmate grips holding all the decks to the orb frame.



