Plasma Bigscreen (plasma-bigscreen.org)
640 points by sudenmorsian on Sept 15, 2022 | 269 comments



The killer feature of Kodi (on most hardware, for example the Raspberry Pi) is that it can switch the output resolution to match the video you are playing. It also bypasses things like ALSA in order to be able to bit-bang audio in supported formats directly to an external receiver. These are basic functions of normal video playback devices like blu-ray players or cable boxes.

A Plasma skin can't do any of this. So while it's probably very suitable for things like games and photo frames on a big screen, I fear that it is disadvantaged from the start for video and audio.


You typically don't want to switch the output resolution to match whatever you're outputting on modern displays; instead you want to match the panel's native resolution and scale the content in software. You're far too likely to confuse the monitor by trying to match resolutions. There's something to be said for matching frame rate, but in practice there are even more dragons there.


Doesn't that depend on how good the implementation is? Some modern TVs and receivers have exceptionally good scaling built-in.

I'd also bet with certain types of displays (eg: OLED, HDR and/or local dimming) there are benefits when there's a different aspect ratio (eg 21:9), letting the TV do something different with the backlight when it knows there is no content in parts of the display vs getting a black signal.


Actually, matching framerates is a lot more common, for example to provide smooth playback of 23.976/24 or 25 fps content when the native refresh is 60hz (not cleanly divisible: 60/24 = 2.5, so frames alternate between being shown for 2 and 3 refreshes, while 72/24 = 3 exactly). Usually this is done by going up to 72hz or 75hz rather than down, but it's a lot better than staying at 60hz and stuttering. Previously this was accomplished with something called telecining, which... isn't great.


Shouldn't the optimal solution be to adjust the display refresh rate using VESA Adaptive-Sync these days? No need for shitty interpolation when most panels can run at variable refresh rates.


adaptive sync isn't well supported yet


You can have multiple displays connected to an HDMI hub with different refresh rates. I have this problem with a projector that runs at 240 Hz and a TV that runs at 60 Hz. It confuses my receiver.


Truly modern TVs have much better scaling and frame interpolation than whatever your OS or media player can do.


Most people hang on to their TV for 10 years or way more than that. Expecting people to have truly modern TVs is a crapshoot. Only enthusiasts upgrade to the latest and greatest.


And unless you're a very informed, technically savvy consumer who does his research and knows what he's buying, upgrading your TV might end up being a downgrade, as you could end up replacing a perfectly good dumb TV with a new WiFi "smart" TV that spies on you and serves you ads.

No thanks, modern TVs!


Unlikely. I doubt that state of the art video processing is available in hardware. There are custom finetuned neural upscalers and frame interpolators available for every kind of content.


TVs have software too. My LG OLED gets updates to the upscaling software relatively frequently.


At the cost of seeing the lips move a little after the actor said the words :D


What makes the TV software better than OS software?


Usually it's software designed explicitly for the hardware it runs on, while OS software is designed to run on general hardware. If there are hardware-specific fixes, the OS might skip them because they would hurt compatibility, whereas software designed for exact hardware doesn't have to make tradeoffs like that at all.


I don’t think the actual hardware components inside a TV are all that different from an SOC because they’re essentially doing the same thing.


Changing the monitor resolution doesn't seem good. A software upscaling (like lanczos, or ML upscaling if you like that) can be much higher quality than whatever upscaling (probably bilinear) the display does.
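
As a concrete sketch of the kind of software scaling meant here (mpv's high-quality EWA Lanczos scalers are just one option; video.mkv is a placeholder):

  # scale in software on the GPU to the panel's native resolution,
  # instead of letting the display's (usually bilinear) scaler do it
  mpv --scale=ewa_lanczossharp --cscale=ewa_lanczossharp video.mkv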


Audio passthrough on the pi is done using ALSA. There is no bitbanging of anything going on here.


If it's using mpv, the xrandr plugin could work for resolution or frame rate switching.

https://gitlab.com/lvml/mpv-plugin-xrandr
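
Roughly speaking, what a plugin like that does is issue an xrandr call when playback starts, something like the following (the output name and mode are assumptions - check `xrandr -q` for what your setup actually exposes):

  # switch the HDMI output to a 24 Hz mode for a 24 fps film
  xrandr --output HDMI-1 --mode 3840x2160 --rate 24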


I have an Nvidia Shield and the latest Apple TV for media playback. These pair with a pretty expensive home 7.1.2 theatre system.

I’ve made my peace with software upscaling of video to match resolution/frame rate. At least the Apple TV does it pretty well. If you don’t, you get a really annoying flicker as the display changes resolution. The Shield on the other hand has really bad upscaling by comparison, particularly with frame rates, so I just leave it on.

Far more important in my book is properly sending the raw audio out. I have a very expensive audio unit that will do a far better job upmixing the sound. Unfortunately the Apple TV doesn’t let me send the raw audio out.


I don't think this has anything to do with Kodi. Kodi does not seem to have a browser or allow running external applications; you can instead run Kodi as an app on this.


I'm sure you could run Kodi on top of this, but then it would be just as restricted as any other app, and not be able to do the things I mentioned compared to running it directly via something like LibreELEC.

Kodi can actually run third party apps, for what it's worth: https://kodi.tv/addons/matrix/


But Kodi is still an application with addons; Kodi is not going to allow you to run native applications on top of it.


You can run native apps from Kodi.

I used to launch Steam games in Big picture mode from Kodi just fine.

Actually I was waiting for a desktop environment like this Plasma Bigscreen for a while but then got a new TV with Android and use that instead.

I would still prefer to use Kodi as the main app launcher on Android but it gets killed to save memory and takes a while to start up again.

And using Kodi over HDMI is not an option anymore since a lot of content I watch comes in 4K 10-bit HDR or 4K Dolby Vision, and sadly desktop Linux does not support HDR and variable refresh rate over HDMI (so the Linux Steam experience is also bad).


Indeed, Kodi shines because it has native addons that make a consistent experience.

Everything is easy to drive via a TV remote.

The skin system lets you have as much UI eye candy as you like.

Chances are someone has made an addon for whatever you want to consume.

You're just running ye old HTPC otherwise.


How does Kodi bypass ALSA?


I can attest that KODI has options to bypass the default audio format that was automatically negotiated during the handshake, usually over HDMI.

The use case is the following. Consider one that has:

  * 5.1 capable audio system
  * A TVBox (KODI)
  * A TV
  * A projector
Now

  * The TV apps can play 5.1 content
  * The TVBox (KODI) can play 5.1 content when connected to the TV through HDMI (5.1 was agreed in the handshake between TV and TVBox)
  * The TVBox __cannot play__ 5.1 content when connected to the projector through HDMI, because the projector is stereo-only and there is no option to accept 5.1 content.
KODI gives you the capability to __force__ the audio output to be 5.1 if the playing video has it. It can transcode on the fly, i.e. plain 5.1 in the file to DTS.

It does not bypass ALSA. It uses ALSA. It bypasses the original HDMI handshake and follows the user's wishes.

Of course it does not work in my case because a) that's the universe we live in and b) it hits some Android capability / security problem and switches back to stereo output.


He's talking about the embedded Kodi derivatives (CoreELEC, OpenELEC, LibreELEC, etc.).

They run their own audio stack.


I would really like to know more about this, and I can't seem to find anything about it. Are these projects implementing their own drivers for all the various audio hardware they can use instead of using ALSA? That certainly wasn't the case some years back when I used OpenElec.


"ALSA" has two meanings - there is only one set of Linux kernel audio drivers, and they're all ALSA no matter what distro you use. I think you're talking about this meaning.

The other meaning of "ALSA" is libasound / asound.conf / dmix, the old userspace audio stack, which is deprecated since Pulseaudio except for a few holdout distros. ALSA had difficulty mixing multiple playback streams, let alone bitstreaming DTS. I think the GP is talking about this meaning.


I sort of disagree. ALSA didn't really have difficulty mixing multiple playback streams unless you go back really far. dmix works quite well. It has a lot of problems, particularly dealing with routing, hot plugging, etc., but the dmix stuff always worked more or less fine for me, as long as I had it configured. otoh, pulse took quite a while to mature in my opinion, and had plenty of issues with this basic functionality over time, though I will at least be fair and say that it got blamed for a lot of ALSA driver issues and other nonsense too. (I think up until recently it was defaulting to doing resampling with a Speex resampler that suffered from fairly noticeable aliasing in many cases, as a result of the project's paranoia about being seen as a resource hog...)

That and bitbanging audio through GPIO or otherwise bypassing the ALSA stack doesn't seem all that unreasonable. I can't find any specific information about it with LibreElec/etc. but it would not surprise me.

Oh well. I think Pipewire has a lot of potential today. It brings much improvement to routing and session management for audio, and the hybrid scheduler design and considerations for pro audio use cases make it very promising. I hope the need to bypass the Linux audio stack can disappear.


>let alone bitstreaming DTS

On the contrary, that's super easy with ALSA. Literally all you need to do to "bitstream" is "play" the data as if it were audio, and not have any software in between trying to adjust volume or otherwise modify it. You could do it with nothing more complicated than "aplay", making sure to specify the correct card and not dmix.
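
A minimal sketch of that, assuming the DTS stream has already been wrapped in IEC 61937 framing and that hw:0,3 is your HDMI PCM (both assumptions - check `aplay -l` for your card/device numbers):

  # "play" the wrapped bitstream as if it were 2ch 48 kHz PCM,
  # going straight to the hardware device so nothing touches the samples
  aplay -D hw:0,3 -t raw -f S16_LE -c 2 -r 48000 movie.dts.spdif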


AFAIK pulseaudio, PipeWire and JACK are all built on top of libasound's snd_ APIs; it's the only sane way to use ALSA - just like you aren't going to make manual syscalls for read or write, but are going to use a libc.


> ALSA had difficulty mixing multiple playback streams

At some point long before PulseAudio was ever a thing maybe but dmix with the default config has been working just fine all this time.


Doesn't pulseaudio solve that though?


No they don't. They run ALSA.


It does no such thing. It either plays out via ALSA or Pulseaudio.


There's no reason a "Plasma skin" can't do anything Kodi can do. They're both just a bunch of software.


If they expect ordinary people to use this, the Install page could use some love:

https://plasma-bigscreen.org/get/

The very first heading is "Neon based reference rootfs". It gives the impression that you must know exactly what these words mean and what you're doing, or there's no hope of ever using this thing. (The classic 1990s style Linux onboarding experience, in other words.)


Note taken, the developer of Plasma Bigscreen asked for help recently with the content of the website (https://mail.kde.org/pipermail/kde-promo/2022q3/000223.html). So if people have other recommendations or want to help, feel free to reach us in the #kde-promo:kde.org matrix channel or just post a comment to this message.


I'm a programmer with a postgrad education and 10 years in the industry and I don't know what those words mean.


I've been a programmer for >30 years, have studied mathematics, can speak >5 languages, have programmed in more dead technologies than the average programmer knows current ones, and I don't know what those words mean.


You would have to actually be in the KDE user community to understand this terminology. KDE Neon is the distro that KDE created. What they are trying to say is that it's based on their flagship Linux distro. I think they might have written that in the most new-user-unfriendly way they could.


Root file system (rootfs) that's based on the dev-unstable branch of KDE Neon. This doc is obviously not aimed at non-techies, but for Linux-savvy people the first sentence should fully explain the heading.


I think you mean KDE-savvy - I've been using Linux for over 20 years and had no idea what this heading meant.


"reference rootfs" makes it obvious what it is to Linux-savy people, so you don't need to know what Neon is to assume that it's some kind of a distro.


I would say that is very debatable, but it is possible that I am not Linux-savvy enough.


I know what those words mean and I can't figure out if I can install this on an LG TV I'm planning to purchase.


If I understand correctly it uses a RPi or other ARM device attached to the TV.

Is there anything out there that overwrites the TV's "smart" firmware? Something analogous to OpenWRT and its router support?

It would be pretty cool if you could use the controller and buttons "natively".


If Software Freedom Conservancy win their lawsuit against Vizio for GPL violations in their TVs, you will probably be able to install open source Linux distros with KDE Plasma Bigscreen or Kodi on any Vizio TV and soon afterwards lots of other smart TV vendors will be similar. Allowing the vendor operating system to remain on the device after you purchase it basically means spyware these days.

https://sfconservancy.org/copyleft-compliance/vizio.html


Open source != custom code can be run. The bootloader could still be locked with cryptographic signing. There are TVs with distributed source already.



A lawsuit against Vizio for GPL violation cannot compel them to open or release software from their OEMs. For example you are never going to get a functional driver for the display panel made by Panasonic.

It isn't even clear that Vizio would be able to comply with GPL enforcement, since they largely just OEM TVs built by AmTran in Taiwan and don't do much more than slap a logo on it.


That's a pretty strong statement to make.

It would highly depend on the contract between Vizio and their suppliers, no? In particular, if the lawsuit finds that anything from the supplier (like e.g. the display driver) is part of the GPL violation, the contract likely contains clauses that would force the supplier to provide source code. Otherwise Vizio has grounds to sue their supplier.

That's the beauty about going after the big customer facing companies, it offers the strongest leverage.


Suing a rebranding company is actually the dumbest move. If I buy Ford trucks wholesale and put my own logo on the grill and my own software on the entertainment system that uses GPL code in violation of the license, you can sue me until the cows come home but you are never going to get all the other software Ford has on that car.

The idea that a court could compel Vizio to provide enough source code that you could run your own OS on the TV is pure fantasy. You would need to obtain binary blobs for firmware and drivers from an original TV, and distributing them would in turn be software piracy.


It happened with routers, the OpenWRT project is a result of GPL enforcement against Linksys. There is no reason the same can't happen with TVs and other devices. If the company can't comply with the GPL, then at minimum the lawsuit could be transferred to the suppliers, or at worst the company can stop using Linux and other software they violate the licenses of. There are plenty of non-copyleft or proprietary projects they can use too. Whatever software Vizio use that isn't subject to GPL requirements can be rewritten or replaced with other components. The main thing is that the GPL code get released.


You could copy the blobs from your own rooted TV, as with Switch jailbreaks.


> It isn't even clear that Vizio would be able to comply with GPL enforcement, since they largely just OEM TVs built by AmTran in Taiwan and don't do much more than slap a logo on it.

They should've looked better at their contract, then. It's not the consumer's task to make sure these companies and their suppliers stick to their licenses. If their suppliers ship software that's not compliant (i.e. the vendor doesn't follow GPL's requirements) then Vizio is not allowed to ship them either.

If they can't comply with the license, they shouldn't be allowed to be sold in jurisdictions where the GPL is considered legally binding; after all, they are violating copyright. Any damages from their vendors' malpractices can be resolved in a separate lawsuit if Vizio wants to hold them accountable but you can't hide behind your vendors to sell illegal wares.

As GPLv3 states:

> All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met.

If they can't or won't release the GPL source code, they can't legally ship the software as they haven't met the requirements for their license and if they do, authors of the GPL'd code could sue them for copyright violations.

In fact, this paragraph could bring them into even hotter water:

> However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation.

> Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice.

In other words, if copyright holders notify them of their violation and Vizio doesn't act in a timely manner, copyright holders could permanently retract their GPL licensed code from Vizio. Should a particularly important Linux developer choose to do so then that could effectively deny Vizio the right to ship Linux on any device it sells (to areas where GPL is upheld, such as the USA or Germany).


Sure. But someone attempting a lawsuit would be better served naming AmTran/Foxconn as a defendant.


Someone buying a TV doesn't have any relationship with AmTran/Foxconn and doesn't receive code from them. If the lawsuit were about copyright infringement (it is actually about third-party beneficiary rights of users under the GPL), then suing AmTran/Foxconn would be the way to go, but they are in China anyway and suing them would likely be hard, so the best you could get is blocking AmTran/Foxconn-based products at customs.


Yeah, I was super interested until it turned out to be Yet-Another-RPi-MediaPlayer-image.

I mean, it seems cool if the voice commands etc all actually work but it is one of many.

Being able to update the actual TV firmware - like OpenWRT, the Canon Hack Development Kit, or Valetudo (vacuum robots) do for their devices - is what would really have my interest.


>It would be bet cool if you could use the controller and buttons "natively".

This is what LibCEC is for.
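
For example, with the cec-utils package installed you can talk to the CEC bus from a Pi yourself; something like this is a common sanity check (-s runs a single command, -d 1 just quiets the log):

  # list the devices the TV reports on the HDMI-CEC bus
  echo 'scan' | cec-client -s -d 1
  # ask the TV (logical address 0) to power on
  echo 'on 0' | cec-client -s -d 1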


You might be able to use your native remote if it's infrared and LIRC supports it, and you get a little USB IR receiver dongle: https://www.lirc.org/

Or what I like is to get a generic air mouse/keyboard remote that has programmable IR buttons for volume, input select, power on, etc. to control the TV's native settings. There are a ton of inexpensive options: https://www.amazon.com/air-mouse-remote-keyboard/s?k=air+mou...
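
If you go the LIRC route, teaching it the buttons of an existing IR remote is roughly this (the /dev/lirc0 path is an assumption - check dmesg for where your receiver shows up):

  # interactively record button presses into a new LIRC config
  irrecord -d /dev/lirc0 ~/my-tv-remote.lircd.conf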


I use LibreElec on a Raspberry Pi 4 and it already supports receiving remote control messages from the TV. So if you hit the play button on your remote, your TV's IR receiver will pick it up, then pass the signal to the Raspberry Pi over the HDMI cable.


Is there a bit of commonality for remote button frequencies? I have 3 different remotes for TV equipment plus my AppleTV remote. The volume buttons from all 4 remotes work, and I've never attempted to program/train/learn them.


There isn't really. Most manufacturers have multiple sets of patterns.

There are smart remotes with libraries of patterns and capable of switching between those, eg Logitech Harmony (assuming that's still a thing).

There are also "learning" remotes. One of these would be easiest to homebrew. Basically you've got an IR detector. You point your existing remote at it and tell it to start watching for whatever button you want to map. Then you press that button on your existing remote. Voila, key mapped. You just have to keep doing that for all the zillion and a half buttons on your remote and you've got yourself a clone.


Yeah, most use a 38 kHz carrier and then encode ones and zeros on it. There are tons of different brand-specific protocols for stuff like volume, power, etc., but you're right, there is a lot of overlap and commonality. Some universal remotes just fire as many different brand signals per button press as possible.
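
As a small illustration, the ir-ctl tool from v4l-utils can transmit a scancode in a named protocol over exactly such a carrier (the device path and the NEC code below are made-up placeholders):

  # blast one NEC-protocol scancode out of an IR transmitter
  ir-ctl -d /dev/lirc0 -S nec:0x40bf12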


The Samsung TV I got for my parents recently can control the Apple TV and a random Eastern European IPTV box. Really surprised by this.


The Google TV remote has trainable buttons. I can operate the Chromecast but also adjust volume directly on the TV. And turn the TV on and off.

So, rarely have to use the TV remote. It's used only for live TV (rare) and switching input, also rare.


I think remote buttons are passed through CEC. For example, I can use my TV’s remote to open the menu and change settings, but when the menu is closed I can also use it to navigate my Fire Stick UI.


At one point, there was SammyGO firmware to root non-Tizen Samsung Smart TVs. However, it seemed to be for sideloading certain apps only.


> Is there anything out there that overwrites the TV's "smart" firmware?

Or use a monitor instead of a TV?


If a display manufacturer produced a quality display and supported something like this, I wouldn't mind paying a good deal extra for it. I wouldn't even mind if it shipped with a proprietary OS and I had to do the installation myself. I'd love to see Framework or Purism make a Linux powered smart TV.


NEC has commercial displays with builtin Pi CM4s.

https://www.sharpnecdisplays.us/products/displays/ma551-mpi4...


The fact that it actually comes with a Raspberry Pi is incredibly cool to me.

Oh, and of course there's a review of it from Jeff Geerling: https://www.youtube.com/watch?v=-epPf7D8oMk

Actually, this thing has some pretty decent specs... my initial reaction was that it was overpriced, but now I'm thinking this is a decent deal for what you get.


Looking at the product page, quite prominently this is displayed:

> WARNING: This product can expose you to chemicals including Styrene and Formaldehyde (gas), which are known to the State of California to cause cancer, and Lead

I'm not sure the risks are worth the price. I've never seen such a warning on any product, but I don't live in the US. Is that a common warning to see? How likely is that to happen? Is it a thing that happens in case the TV breaks in half, or is it something that slowly seeps out into the air around the TV?

The linked website says "Exposure to these chemicals may take place when products are acquired or used" - am I supposed to believe that just buying this TV and bringing it home can make those chemicals leak out into the air in my home?

The statement raises so many questions but gives so few answers...


Yeah, styrene and formaldehyde are very common everywhere, usually due to styrofoam packaging. Wood and textiles are often treated with formaldehyde, even in Europe. Lead-based solder is less common these days, and pretty much banned in the EU due to the RoHS directive, but there are weird exceptions, like how it's OK in servers or networking equipment.


California mandated this sort of warning a while ago, so manufacturers just started slapping them on all of their products, just in case. Great example of alarm fatigue [1].

[1]: https://en.wikipedia.org/wiki/Alarm_fatigue


I'd be willing to pay that much if image quality was similar to (or slightly worse) consumer TVs of the same price.

I'm sure smart TVs make some money off data tracking, I know it's a part of Roku's business model. But I feel like most of the additional cost in this went to the "commercial grade" features. It can be left on at full brightness for months on end. It can probably withstand a kick from a teenager at the mall. The 3 year commercial warranty also probably has some actual urgency behind the support.

Do any non-smart models exist that aren't targeting commercial usage? It's probably not practical; the market for that is probably 0.01% of the total TV market.


There are loads of non-smart large displays on Alibaba. Just a dumb display with two HDMI and maybe a DP.


There's a ton more to panel quality than size and resolution, and I'm guessing that Alibaba stuff isn't so great there. Or is at best hit or miss.


This looks really great. I know it's expensive for the image quality, but this TV will probably outlive most consumer TVs since it's made for 24/7 operation.

The cost to image quality ratio would probably scare most consumers away, but I might have found my next TV. Thanks for sharing.


A bargain at only $2200USD


Your grandparent comment said:

> If a display manufacturer produced a quality display and supported something like this, I wouldn't mind paying a good deal extra for it.


Relevant: "There's a simple reason your new smart TV was so affordable: It's collecting and selling your data, and serving you ads" (2019) https://www.businessinsider.com/smart-tv-data-collection-adv...


The headline is not true.

> Without that revenue stream, Baxter said, consumers would be paying more up front. "We'd collect a little bit more margin at retail to offset it," he said.

A little bit.

The value of that tracking is worth a lot to a company making near-zero margins, but it's not a very big impact on the full price.


TCL said they're now making more money from tracking than from selling TVs. Just as a comparison.


But still somewhere in the single digit percent, I think.

The march of technology is responsible for almost all of the cheapness of TVs, and niche commercial targeting is responsible for the non-cheapness of other TVs.

Hisense has a $430 TV comparable to that $2200 model but with better color. Tracking, I dunno, might be $50.


For a true commercial grade product with an actual 10bit panel? It kinda is? Plus it won’t spy on you


It's not like it doesn't have cons:

  - 60Hz refresh rate
  - 8,000:1 contrast ratio with local dimming on
  - 500 cd/m2 brightness and "HDR compatibility"
  - 86% DCI coverage


For that price you can get a 77in OLED TV from LG and a Raspberry Pi which you can use for Bigscreen. Might be better if your main use isn't prone to burn-in (e.g. by displaying the same UI elements over months).


What does this get you over connecting up an old laptop to a TV screen and controlling it with a bluetooth keyboard?


To add to this, if you use an over-the-top box (eg Apple TV, game console, laptop), you don't really need anything more than a display. All the smarts just add lag and GUI items that can be buggy. My Samsung takes like 20 seconds between powering on and displaying content simply because it tries to load the smart GUI and times out on a network connection (which doesn't exist).


A much cleaner implementation that doesn't clutter the room with cables and an old laptop. Added bonus that I wouldn't have to explain how it works to each guest that tries to use it (assuming that the UX resembles other TVs people have used).


But people have used computers before. It works exactly the same as a computer. I think it's much easier than using typical smart TV interfaces.

It has the added bonus (over typical smart TV interfaces) that typing text becomes much faster. You can use the web browser as a web browser. You can use uBlock origin. You can have several tabs open at a time and switch between them at a sensible speed. It's so much better than a smart TV.

You don't need the room cluttered with cables, you can put the laptop in a cabinet underneath the television.


Even better, get a tiny computer and use cable ties to secure it to the TV mount (or velcro to the back of the TV). No wires will be seen.


Most new TVs are smart TVs, so even if you do this you are still running their software - Samsung, for example, have been caught screen capturing what your TV shows and phoning home for advertising.


Your TV can't phone home unless you give it internet. Haven't heard of them shipping with mobile network access yet.


But when everyone keeps buying ""smart"" TVs and accepting the spyware, pretty soon they will start shipping with baked in 4G chips, and there will be no other option. Vote with your wallet by never buying a smart TV under any circumstance. They are cheaper than actual TVs because the data collected is so lucrative - buying one is implicitly supporting that slimy business model, even if you don't connect to WiFi.


Don’t even need 4G chips, just a regular ol’ wifi/bluetooth module thanks to stuff like Amazon Sidewalk: https://www.aboutamazon.com/news/devices/introducing-amazon-...


Love how the illustration shows houses directly on the road with no sidewalk.


There are no non-smart/"actual" TVs. Find a 55" OLED that doesn't have "smart" features.

I mean personally I don't care so long as I don't see ads, but the advice you're really giving is "don't buy a TV". Can't say I disagree though.


And that's what the Amazon Sidewalk mesh is for


How would they phone home without your help? If you're going the route suggested above then it would make no sense to bother putting your Samsung TV on the WiFi


Or just simply block their connection with a PiHole.




There is a missing HOWTO in my life - a FOSS based home guide. I would like to take advantage of linux on the home router, or this TV or better ways to cross manage messaging on different platforms.

Basically years ago FOSS made things simpler because it gave you back control.

Now maybe I am just out of the loop, but I am not feeling the control in the everyday.


That would be an awesome HOWTO

I've been using OpenWRT (previously DD-WRT) on my routers for years now. Mostly I just enjoy not having nonsense on my router!

I think how I got started with it was by finding my device in their devices table and reading the page specifically for it. https://openwrt.org/toh/start

The biggest caveats would be: I've never had any issues personally, but be aware of the risks; avoid bricking your router, and be prepared to recover it in case the worst happens (make sure you back up the firmware before flashing it.)

And once it's installed... `opkg upgrade` is much riskier than `dnf upgrade`. There's usually no script doing necessary modifications for you. Use with care.
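
A couple of habits that make that less scary (sysupgrade and opkg are OpenWRT's own tools; the backup filename is arbitrary):

  # snapshot the current config before flashing a new image
  sysupgrade -b /tmp/openwrt-backup.tar.gz
  # refresh package lists and review what would change instead of blind-upgrading
  opkg update
  opkg list-upgradable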


If you want to join forces writing that HOWTO please shout !

I dug out an old router in the garage and might play over the weekend


I don't understand what this is. I read the explanation but didn't fully understand it.

It's an OS one can load onto a smart TV, replacing whatever is there? Like, if I had a Samsung TV that began advertising at me, this could overwrite the UI?

If so, I looked for a hardware compatibility list and saw only Raspberry Pi, which confused me.


My first assumption was it’s just an OS for installing on a little computer like a Raspberry Pi, for plugging into a TV or monitor via HDMI.

But you’re right it’s unclear, this bit of the intro seems to imply you could install it straight to a TV: “Plasma Bigscreen turns your TV or setup-box into a fully hackable device”. So far I haven’t heard of any smart TV that lets you install a new OS, might be possible with hacks though.


I think it is meant as an open-source product OEMs can put on their TVs instead of some in-house Android TV derivative.

Sadly it's a non-starter without widevine certification.


Is "capable of Widevine certification" not tantamount to "Has HW/FW/SW in place to securely lock out the user/owner, and introduce backdoors"?


Most likely, but I mean from the OEM's point of view. No one wants to sell a TV that can't play Netflix.


Does it have Widevine for Netflix et al., and hardware media acceleration?

I tried using an rpi4 as a media player a few months ago, and it was simply not fast enough, although one reason is I could not get HEVC hardware acceleration to work. I reverted back to using an old T430 laptop with a broken screen and keyboard, which is ideal for 1080p.


I use the pi for storage (external 4TB USB) and minidlna to show on network. Playable by VLC, Roku and other DLNA aware devices. The RPi is headless.


I used a rpi2 for video at 1080p. x264 was no problem, x265 was laggy.

With a rpi3 or a rpi4 both should be fine.

LibreELEC (and other ROMs) ship the ffmpeg fork which uses the acceleration.


It does not transform your TV into anything. It needs an RPi in order to do anything.


This. And it just kills me how so many things "linux" look so unpolished. Why is it so difficult for projects like this to find and get designers interested?


This comes up a lot… The reason is that no OSS projects want a product manager. They want contributors.

Most projects, if you say “I think the gui should look like X” they’ll reply with “show me the pull request then”. No one wants a manager when they’re already volunteering, and not enough people who have the skills to make things look aesthetic exist and contribute.

Source: look at other comments on this article.


I think the problem is deeper than that: To do a good UI, you need multiple skills: design, development, user testing, etc. It's rare that a single person can do all of that, so you need multiple people to collaborate, which is hard, because of the lack of managers.

So you only get contributions from people who can contribute on their own, which is developers.


I disagree, from my experience designers are generally not interested in OSS projects like developers are. There is one designer for 50 developers.

Probably because for developers, OSS is a badge of honour and can help your CV / find paid jobs - while for designers it's just a dribble nobody cares about.

When designers participate, they get to contribute and even impose (sometimes questionable, think KDE 4) choices.


Maybe it's time for developers to launch projects that make designers look good when participating in OSS projects.


If only there were more people trying to complain less and contribute more. It's not closed source.


Windows is also pretty unpolished, I think it is impossible to get people to work on polishing unless you have infinity dollars, like Apple. At least Linux has the excuse that it is a community project -- things that the community decides not to work on are sort of definitionally not annoying enough to be worth working on, or they would.


Windows is polished, but their product people are terrible / constrained by someone higher up to do terrible stuff.

Apple is getting there though, probably because of the same managerial culture.


I disagree. On my laptop running Windows 11, the edges of context menus are horribly pixelated because of the round corners they've added. Maybe there's a setting to fix that but I shouldn't need to look for that after I hit the "update to Windows 11" button.

This is something GNOME does better for God's sake and they can't even show thumbnails in a file picker!

I'd say Windows is selectively polished. Some parts are worked out beautifully while others are hacked together. macOS is a lot better in terms of polish (but a lot worse in many other aspects).


The whole "3 different settings UIs, one from each era of Windows, in Windows 10" thing, despite being a bit over-played, seems a bit unpolished. Maybe the plumbing is really polished, I just don't muck around with that in Windows.


It says it has "silky smooth interfaces" and a "polished experience" on the very first page, under the very first example image, where not only is the typeface butt-ugly, the text layout is also all kinds of messed up.


Unfortunately I can't figure out the scope of this project from the content on the site right now. Perhaps if there are any contributors reading this they might like to clarify?

The tagline on the home page is "A privacy-respecting, open source and secure TV ecosystem" so that sounds promising. Is this some sort of alternative firmware to replace the junk installed as standard on most "smart" TVs now - like installing OpenWRT on your network gear or rooting your phone and installing one of the alternative Android distros?


> Is this some sort of alternative firmware to replace the junk installed as standard on most "smart" TVs now…

I believe it's more along the lines of an open-source Apple TV or Roku box.


What could you actually run on something like that? None of the “normal” apps for the various streaming services would be there.

I guess it could be a Plex front end, but outside of that can it really do much of what people usually do with their TVs?


My TV is WebOS. One of my streaming boxes is Android. I stream to my phone from an Android phone sometimes. I have an old Mac Mini with Ubuntu hooked up to a TV. I have two different XBox systems hooked up to two different TVs. These devices all have applications to stream video from things that aren't Plex.


But those are all non-open source solutions that Netflix, Amazon, etc. are willing to make apps for.

I don’t see streaming services making apps for this distro any time soon.


Ubuntu is primarily open source.


I am dissapoint.


It sounds like the answer is no, but if Software Freedom Conservancy win their lawsuit against Vizio for GPL violations in their TVs then Linux distros, including this, could get ported to Vizio TVs.

https://sfconservancy.org/copyleft-compliance/vizio.html


It is not an alternative firmware one can flash; it is something that can run on any existing Linux distribution, either on a single board computer or on your PC connected to a TV screen.


Given the Steam Deck runs KDE, I'm surprised I don't see Valve down on the list of patrons at the bottom of the page.

Seems like this type of functionality/framework could be beneficial to the Steam Deck, since anything that looks good on a 10-foot display should also look good on the Steam Deck's display.


I get the impression it's close to direct competition with their console UI. I wanted to see a gaming UI like that that's fully open source.


SteamDeck already does what this does and more though.


Is it known if Valve is contributing things they've improved/fixed upstream?


Misleading to state that this turns your TV or set-up box into a fully hackable device, while the software just runs on a Raspberry Pi.


In theory this runs just as well on TVs if the right drivers are present. They ship postmarketOS versions of their firmware, for example.

Sadly, pmOS doesn't support any TVs as far as their website is concerned (https://wiki.postmarketos.org/wiki/TVs) but with some work (getting the mainline kernel to boot based on the GPL sources or getting the built-in kernel to boot pmOS) you can flash your TV with this firmware. In practice you won't be able to do much with it as TV SoCs are almost always shipped with underperforming hardware and as little RAM and storage as possible, but you could.


I agree. I thought this was something that you can load onto your smart TV.


I would love something like this; the modern smart TV is kind of a horrible experience and nobody is making consumer dumb TVs anymore.


I was also googling for the supported TV sets and then I found it :D


All this promotional glitz, but no information to indicate this actually currently is usable on a single consumer streaming device or smart tv, let alone the particular one I have. A minute spent on the pages of their linked distros seems to confirm it's vaporware.


Nitpick for anyone involved in this project:

> Plasma Bigscreen turns your TV or setup-box into a fully hackable device

I know what you meant, but given the concerns people have with privacy of their TVs, smart devices, and in general the security of their internet connected home, describing something that turns your TV into being "fully hackable" feels like it's ripe for misunderstanding.


"Customisable" would be a preferable term IMO.

"Owner-customisable" or "user-customisable" perhaps to clarify that you're in control.

"Putting you in control of your TV" might be a good slogan or motto.


Hah. Same line, different nit.

> TV or setup-box

We're talking about a set-top box, right?


I wonder how long it will be before cheap TVs have no input ports (just a WIFI connection). That way the TV gets to decide what you watch to some extent. For example by interrupting your Netflix stream with a few ads from time to time.


Video games are way too big a business. Until game consoles die and it all moves to the cloud (Stadia, etc.) I can’t see ports going away on most TVs.


If I'm not wrong, latest Samsung TVs let you play Xbox games without an Xbox


I suppose they could do this even with ports - they could have small ads pop up in the corners, overlaying whatever is playing. They could charge $5 a month to disable the ads.


Shhh!

But you're right this is quite possible, perhaps you will be able to buy an ad-free model at a premium, like with Kindles.

Probably some ads would be preloaded, so you'll end up giving it wifi just to get new ads instead of enduring the old ones over and over and over again.


It would be hard to achieve unless it was somehow necessary to connect the TV to wifi in order to use it at all (even when something is plugged into a port). That would be pretty confusing for people I think.


Why would I need to install an operating system on the flimsy SoC the manufacturer embedded in my TV with the sole intention of turning it into an advertising terminal? How fast will that hardware-software combo go out of date compared to the useful life of a TV? I own a 10-year-old, non-Android Sony Bravia TV that looks superb; do they support that?

Sorry, but I don't see it as a problem worth solving. Just make sure your TV is airgapped, use it as a dumb terminal, and deliver the signal from a suitable and well-maintained device.


Being able to replace the junk that's on every TV these days would be awesome. But unfortunately this is not that.


I hope they'll get some design help.


Unfortunately this was the first thing I thought too and is often the first undoing of open source projects. It might be an impressive technical feat but it _looks_ a mess. Software developers often make the worst UX designers and all that.


> Software developers often make the worst UX designers and all that.

What bugs me is that most people make bad UI designers and are completely oblivious to it. I see some people's desktops and their editor's colour schemes and I wonder if they have any sense of aesthetics at all. A cursory look at those "desktop theme" websites is proof that 99% of them are just plain terrible.

I swear I saw someone that replaced their default UI font with Times New Roman.

It's like what Steve Jobs said about Microsoft: most people just don't have any taste. It sounds arrogant, but I think there's some truth to it.


Wow, so this is why marketing departments exist. I thought this was related to actual plasma TVs lol.

Looks pretty cool though. I'm keen on open source privacy minded software I can run on my TV. Currently, I have Kodi and it would be nice to have other high quality alternatives.


Name is so strange. Even people here think about old Plasma TVs.

We desperately need an alternative to Chromecast for all sort of web videos. There is none (even Apple Airplay is not as good).

That's one of the reasons why I cannot switch to privacy Android forks or non Chromium browsers.


Looks like this is based on two different distros, KDE Neon for RPi4 and postmarketOS for other devices:

https://plasma-bigscreen.org/faq/


[bad comment about design - removed]


>not to mention the harsh centered shadow, is an immediate turn-off. Unfortunately it’s 2022 and linux design standards are still nowhere to be found.

I honestly can't fathom how people can complain about free software that protects your privacy and gives you full control of your purchased hardware. If some overflowing text, a harsh shadow, or whatever is a show-stopper, preserving your privacy wasn't that high of a priority for you anyway.

It's a free and opensource project. Feel free to open a PR or bug report.


Naw, its more like:

  --protecting privacy - check
  --decent UI - In Progress


> It's a free and opensource project. Feel free to open a PR or bug report.

This isn't and never has been protection from constructive criticism. Nor should it be. Choose to ignore it, or dismiss it, sure. But belittling valid criticism is a bad look.


Complaining about UIs on forums that the people responsible for the issue will likely never see is not and never has been constructive criticism.

It's a gripe, or a whinge, which is fine and open source shouldn't be immune to it. Although I will say that open source projects in my experience tend to be much more responsive to bug reports and actual constructive criticism from people paying them $0.00 than proprietary software projects are.


I bought an LG OLED recently after giving up on my dream of finding a dumb consumer OLED. I assumed I would disconnect it from my network and use my Chromecast for media, but instead I've recently unplugged the Chromecast. The interface is surprisingly good, it has every app that I want and it's fast. I can press the Netflix button when my TV is off and be watching a show within seconds.


I used to write software for smart TVs, so I ended up using a lot of them. LG were easily my favourites (N.B. this was 6 years ago now).

One of the best things is the dev environment was so simple, just a basic Linux setup with a sensible toolchain. (Samsung was easily the worst in terms of dev experience due to their Tizen crap).


Same here, surprisingly good compared to the last LG TV we had in 2012 or something, they've made it a lot snappier. Maybe as time goes on, with more updates, it'll end up in the same situation...

What I really miss though, is being able to write my own apps for the TV without having to jump through 1000 hoops, sign 5 EULAs/NDAs/contracts and be able to develop said apps on my computer and then simply push them to my TV, just for me.


I was hoping there was a new plasma TV available.


Are there any good tv operating systems which use standard desktop/laptop architectures? Was excited by this until I saw it's only released for SoCs. My hacks/scripts on Windows + Flirc are getting tiresome.

Been looking at one of the Android distributions, BlissOS.


Is it possible to replace LG's webOS with this, after rooting it?


I've seen this before when I was setting up postmarketOS. It's an alternative to the crapware on samsung """smart""" TVs.


I love what KDE has been doing with Kirigami on all platforms...except for desktop. This is a great showcase of what it's good for!


If you buy a nice panel, you won't be able to color calibrate it because of the use of Wayland; Wayland doesn't support color management yet.


Good. I am never going to try KDE on a TV ever again. They don't seem to care, so why should I?

It's a sad state of affairs when the best GUI for a TV is Gnome 3


Hey I'm not at my friend's house and it's not 2005, why are we talking about big screen plasma TVs


Anyone know if the various cable companies streaming sites would work on this? (Been a while, sorry.)


Looks amazing, I'll have to see if I can set it up on desktop mode for the docked Steam Deck.


Does it let you watch Netflix, HBO, etc. with the usual DRM stuff?


Nope. 720p and lots of CPU.


Okay, follow up question then: Does it easily support streaming/downloading torrents?


At first I got excited that maybe someone had finally developed a modern TV with no Smart TV features at all, but nope, it’s just Linux crapware I’ll never use because I just plug in an Apple TV box anyway.


Ironically, this will probably destroy plasmas


Is this a good replacement for Kodi?


My name ist fadlan adnrika


How do you flag an account as a bot?


I realize "Plasma" is the name of the KDE presentation layer across all platforms, but it's mildly awkward to see in this context considering plasma TVs have only been a dead technology for six years or so: https://en.wikipedia.org/wiki/Plasma_display


Frankly it's disappointing that most of the comments on HN so far have been focused on the "marketing" and the website. The FAQ states that this project is still in very early stages and not yet intended for use as a daily driver. So if you actually read the website, it's clear why it's not consumer grade. I would much rather they focus on developing than marketing at this stage.

The basic thing going on here is simply that the KDE guys are working on a 10' experience and that's great, there is no good 10' experience today on Linux. Unless you count Steam Big Picture mode I guess? But a desktop environment which knows what to do about 10' would be really exciting.

Edit: for anyone who's interested, this appears to be the repo for the Bigscreen project: https://invent.kde.org/plasma/plasma-bigscreen/activity

The distro images on the website may be out of date.


The splash screen makes it look like this is something you can simply install on a television of your choice.

It is perfectly reasonable to criticise the marketing when it gets in the way of the message. Having to find and read the FAQ to discover that it is basically just a prototype desktop environment for existing distros is a bit silly.


I will admit that at first glance I thought this was a firmware hack for TVs, and I immediately started digging around the site to try to find a list of compatible TVs...

Once I found out what it was, it was still quite cool and I am curious to follow development, but I agree the site design/layout/copy is confusing at best, misleading at worst.


> I would much rather they focus on developing than marketing at this stage.

Totally reasonable to feel this way, but personally I think naming is so important that it transcends any technical merit a project might have.

For my own projects, I never write a single line of code until I come up with a name I truly love. Some times it takes weeks, but then the project has an identity which keeps me motivated over the long term and actually inspires my technical choices of what to build.

Compare to the “Yet Another” naming scheme which (to me) feels like a preemptive apology for a project’s mere existence: https://en.wikipedia.org/wiki/Yet_another


> Frankly it's disappointing that most of the comments on HN so far have been focused on the "marketing" and the website.

Or it's a strong signal toward the creators that their clever idea for naming was only clever in their heads and nowhere else?

I too got confused and was wondering "is this a project to revive old plasma TVs somehow?".


The marketing screenshots aren't too attractive honestly. It seriously seems to be lacking in consistent margins and padding. Look at the menu bar for example-- those icons are far too large for the space they fill. They look super cramped.

This is, somehow, typical in most Linux software I've used. They just don't care. :/


> there is no good 10' experience today on Linux.

What about XBMC/Kodi? Works great on a Raspberry Pi hooked up to my TV... Has been doing that since the early 2000s when it ran on my brother's Xbox (original), but that was pre-Linux.

Edit: spelling


100% CPU redrawing the screen continuously. Or was, last I checked.


Here's a link with a lot more information. The big news today is that the UI component for Bigscreen has just gone live as a part of the KDE 5.26 beta, which is relatively straightforward to install on any Linux distro (or image to a USB stick): https://kde.org/announcements/plasma/5/5.25.90/

Maybe the post's main link should be changed to this one? That Bigscreen website seems to be confusing and out of date.


It makes me happy Big Buck Bunny is still being used for demonstrations.


What is a "10' experience"?


https://en.wikipedia.org/wiki/10-foot_user_interface

There are 10' experiences for Linux (Kodi, Plex, and Steam Big Picture are what I know of), but these aren't integrated into the desktop, they are focused on TV/movies and gaming respectively, so KDE focusing on this problem is something new.


I've never heard this term. Thanks for the great explanation.


It’s the common term for a user interface that’s large and legible to users sitting across a room from the display: https://en.wikipedia.org/wiki/10-foot_user_interface


Sitting three metres from the screen I presume.


No way. that would be a 9'9" experience, I believe..

Although I think your point is correct.


I'd say 10' is within the margin of error considering how many significant figures they gave in their estimate.


Is this essentially Lineage for smart TVs?

Can the vendor OS be wiped from these devices?

If so, is there a list?


I think that's close to, or part of, their vision: https://plasma-bigscreen.org/vision/

What they have in the Downloads section now is basically a variety of distro images which you can run on a Pi, or a device that supports pmOS, etc. There's a long way to go before you could buy any random smart TV and flash this onto it. Though maybe less long if you have a smart TV with an embedded Pi which might be a thing these days?


Yup. I wandered around page for about 20 seconds before realizing it wasn't some sort of open source hardware plasma TV.


I only realized it after reading this thread. Seems like a very confusing choice of name.


In combination with "Bigscreen" for sure.


Plasma (display) TVs are mostly dead - or at least no longer mainstream.

On topic: I'd love to have my TV run an open source OS that can run all the streaming apps to avoid OEMs snooping on what I'm watching


I'm pretty sure you'll only ever be able to have either an open source TV or all the streaming apps. No reason for Amazon, Netflix, YouTube, or the others to support this platform.


The biggest hurdle is the DRM. Plasma Bigscreen ships with its own browser, but you still need the proprietary blob for the DRM, running on an ARM device instead of the usual one for intel/amd. Making things worse, even on intel/amd the video quality is often limited to 1080p on Linux.


Funny, since DRM is never an issue if you play pirated content. Arrgh maties!

But ironically, DRM is only a huge annoyance for paying law abiding customers.

Did you pay Ubisoft $50 10 years ago for Assassin's Creed 2, an amazing game? Great, because now you can go f*ck yourself instead of playing the game since Ubisoft took the server running the DRM for that game offline. Did you pirate the game? Great, because now you can play it for free indefinitely.

And there are countless horror stories of paying customer bases getting shafted on the products they (used to) own via DRM.


I purchased DUNE on YouTube. Turns out even after paying full price, the image quality is limited to 480P unless you watch via a smart TV.

To prevent piracy, or something.


Even worse is the fact that they sacrifice quality for bandwidth. Even if you manage to meet all their silly DRM requirements, you get poor quality video that's high definition in name only. There are titles in Netflix that have compression artifacts in 90% black frames.

Meanwhile pirates enjoy Blu-Ray rips encoded by people known for taking pride in providing the highest possible quality.


Are you watching YouTube on Linux by any chance? AFAIK modern versions of chrome and FF should not have this issue even on Linux but I could be wrong as I'm out of the loop on modern DRM.


On Windows, actually.


Fortunately since 2015 it has been legal to circumvent DRM for games that have had their single-player mode rendered inoperable by the decommissioning of an activation server:

https://copyright.gov/1201/2015/fedreg-publicinspectionFR.pd...

Yes, it still sucks when publishers make their customers jump through hoops like this, potentially exposing them to malware if the necessary DRM circumvention software comes from a dubious source.


You have an HDMI port, and there are RPis and other boxes; I don't think TVs are running image recognition on HDMI input (yet).


They do, which is why you should never connect them to the internet.


Seriously? Well, not connecting them to the internet works less and less: Amazon is rolling out their 900 MHz mesh with Echo devices, and there's LTE-M, NB-IoT and such.

The idea of not letting a device connect to the Internet is slowly becoming a thing of the past unless you live in a Faraday cage (which is becoming more and more tempting). Oh, correction: thanks to mesh networks like Amazon's, living in a Faraday cage is not actually enough if at least one device is connected. Yay.


I do just fine not connecting proprietary IoT shit to the internet. Sometimes I have to bust out a soldering iron, but if it is hardware you own it is always a choice.


If all my neighbors dive into Amazon 'Sidewalk' at 900 MHz and trash my DECT handsets and my Z-Wave IoT, I am going to be really pissed.


If you build your own house, maybe you can. And then run an access point and a repeater for 4G, like LTT did.


Even with a mesh, devices still have to be authorised to get access. Just don't do that.

The more difficult option is if they come with cellular modems built in, thus bypassing any of your infrastructure (which needs authorisation). That is technically possible, though probably commercially unacceptable (modems cost ~15 to 20 dollars in bulk, which is significant at smart-TV scales, not counting the data cost).


More expensive IoT items like modern cars ship with factory-activated, cellular-modem-based tracking. Your options today are to buy older base-model vehicles or learn to use a screwdriver and a soldering iron.


> Even with a mesh, devices still have to be authorised to get access. Just don't do that.

Nope, the whole point of Amazon Sidewalk is that it "just works." I don't think there's even a way to know what devices are connected, let alone any kind of authorization.


LG webOS TVs are capable of image + sound fingerprinting at the OS level! Not only can they identify what you're watching, it's accurate enough to tell which scene you're on and provide "handy" information overlays.


I'm pretty sure they do perceptual hashing of all displayed output.
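
For the curious, the simplest version of that idea is an "average hash". Here's a rough sketch in Python, assuming Pillow is installed and the path points to a captured frame - just an illustration of the general technique, not what LG actually ships:

    from PIL import Image

    def average_hash(path, size=8):
        # Shrink to a tiny grayscale thumbnail: detail is lost, structure survives.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        # One bit per pixel: is it brighter than the average?
        bits = "".join("1" if p > avg else "0" for p in pixels)
        return int(bits, 2)

    def hamming(a, b):
        # Similar frames produce hashes that differ in only a few bits.
        return bin(a ^ b).count("1")

Comparing a 64-bit hash like that against a database of known content is cheap enough to do on a TV SoC every few seconds.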


The name is generally really unfortunate, and I wish they would switch back to KDE for the Desktop Environment (and call the framework Klibs or KDElibs or something).

Many people have positive associations with KDE, but associate Plasma with bloaty QML-based desktop widgets from the KDE 4 era. If they even associate it with KDE, and not with plasma TVs or blood plasma.


Literally what I thought the thread was about, and actually why I avoided clicking into it so long.


People who know KDE hardly need marketing - they already know what's what and what they want. So they could name it whatever they want; I only care about how well the name integrates into the whole namespace of Linux apps/libs/packages. At the same time, for non-techies or people coming from other OSes, Plasma automatically sounds cool, doesn't it?

Nevertheless, as for what I personally feel, I agree - I like KDE better.


Admittedly I thought this as well. Was hoping for a new plasma tech! Not that I think it's better, just nostalgia - it was the first 'new gen' TV I owned back in 2007. Heavy but well worth it at the time.


Still running my Panasonic 50" Plasma from that era. So heavy. And yes they are power hogs, but in reality, ours really isn't on that often. Three, maybe four hours a week, on average.

But oh, the picture quality back then, compared to LCDs and rear projection. No comparison, and even today, it holds up well.


I'm still rocking my 12 year old plasma TV, the thing won't die!


I connected my old (inherited with a house purchase, so I don't know its exact age) 720p non-smart plasma TV to a simple energy meter just to get an idea of its power consumption. It was pulling between 350-400 W! For comparison, my 10-year-old 1080p LED smart TV pulls about 90 W.
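
Back-of-envelope on what that difference costs, assuming roughly 3 hours of viewing a day and $0.15/kWh (both of those numbers are assumptions - plug in your own):

    watts_saved = 375 - 90                       # midpoint of 350-400 W, minus ~90 W
    kwh_per_year = watts_saved / 1000 * 3 * 365  # ~312 kWh
    print(round(kwh_per_year * 0.15))            # ~$47 per year at $0.15/kWh

Real money, but at a few hours a day it still takes years to pay for a new TV.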


That's a benefit during winters in Canada: when it's on, I don't need to turn up the heat...


This trope is getting tired; heat pumps exist.


Maybe give it up; the energy savings associated with replacing it with an LCD/LED TV will quickly add up, environmentally and economically. Depends on how much TV you're watching, though.


I still have a 50" 720p Toshiba plasma, from 2006.

Won't die... Been wanting to upgrade to a newer, larger TV for years, but this thing keeps running along without issue...


I don't understand why people think electronics are supposed to die. Even if they do, it's often just some capacitors that died in the power supply. An easy fix!


This is about justification. The current thing works fine, but I would like the new expensive thing. I cannot justify the expense of replacing a working thing, but if it dies, then the expense is justified.

It isn't that the purchase is more economical once it dies; I've even seen people break stuff just so they "had to" buy the new thing they want.


Perhaps you’ve noticed this already, but having bought my last TV in 2008 I had no idea how cheap they’ve gotten! Checking Amazon right now I see a 50” 4K Samsung for $450.


Expense doesn’t always mean purely financial. There is an environmental cost to buying something new as well, and a social cost in supporting the manufacture of electronics that may have less than perfect supply chain ethical standards.

I’m not saying this to be snarky or judgemental - your comment genuinely made me think about the issue and this was my response.


Yes absolutely.


An inevitable consequence of consumerism: https://en.m.wikipedia.org/wiki/Throw-away_society


Back in the day, there was a strong public perception/rumor that plasma screens would fail much sooner than other kinds. Maybe it was just a FUD campaign, after all.


Plasma did have an issue with 'burn-in' on static images. If you did a lot of gaming on them, you could see it. Same with banners on news programming.

If you watch a lot of movies, it isn't an issue. Plus, the TVs came with a 'burn-in' reduction program that you could run a couple of times a year, but it operated by sweeping an intense white bar across the screen, so you were effectively 'wearing' the pixels down to a similar level to reduce the obvious burn-in. If you ran that cycle too often, it would kill the maximum brightness.


Could have been FUD from manufacturers looking to push their LCD TVs over the competing plasmas.


Depending on how much it's on in a day and the price of energy, the cost of replacing it with an LCD might pay for itself.


Wouldn't LCD be a downgrade? I never owned a plasma TV, but AFAIK the biggest advantage is perfect blacks, so I guess OLED would be the more natural upgrade path?


OLEDs have very good blacks, but as soon as a pixel needs to emit any light at all, it will be significantly brighter than what a regular IPS pixel can emit at its lowest brightness.


LCD with full-array local dimming can get really close. It also gets much brighter than OLED. It's a much tougher decision than it used to be.


Also depends on the TV you upgrade to: replacing a 50" plasma with an 84" LCD will do nothing to decrease your energy usage.

While plasma uses a bit more energy than LCD, it's not that big of a deal (especially if you've got one of the last-gen plasma panels) if you don't use it as some sort of moving wallpaper but just turn it off when not in use.
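
Quick sanity check on the size point, assuming power scales roughly with screen area (an assumption for illustration, not a measured figure):

    area_ratio = (84 / 50) ** 2  # same aspect ratio, so area scales with diagonal squared
    print(area_ratio)            # ~2.8x, so an LCD ~3x more efficient per unit area roughly breaks even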


Probably still would, if you interpolate this graph: https://www.rtings.com/images/power-consumption.png

Still, wouldn't recommend buying a huge TV. Jevons paradox and all.


Depends on the generation. I've got one of the last plasmas made (a 2013 model); it's a 42-inch panel and has a max consumption of 180 W (only when at full brightness viewing a white screen).


Also, calling a home device "hackable" is not exactly going to make it attractive to most people. I know what they mean, but most people will read that very differently.


I highly doubt that "most people" would ever change their stock TV or TV dongle OS. So it targets a geek and privacy-aware audience, and a geek audience understands what that means.


Not sure about other countries, but in my country (Korea) they were called PDPs and rarely plasma TVs, even by people who knew the technology behind them. It's a bit funny thinking about "Plasma", but it should be fine.

BTW, it's been dead for 10+ years... time flies so fast.


In Russia there was definitely a period when any large flat panel TV was called "plasma".


[flagged]


On the contrary - "Plasma" is one of the best branding efforts in the open source ecosystem so far.

It's not really confusing or hard to search for if you know that "Plasma" is a KDE product - for instance, Plasma Bigscreen is the first Google result for "kde plasma tv" and "kde tv". It's 8 years old, it covers several different products with a similar look and feel, and it sounds as cool as it looks.

Big brands don't have an issue with using common words for product lines (like "surface" and "pixel"), and I don't see why open source should cede that ground.


I agree with what you’re saying, but you could’ve said it nicer.


Do they have any information on resolution, chip, or anything else?


From what I can see, it's a software ecosystem and not a specific hardware device.



