The smallest and worst HDMI display (mitxela.com)
969 points by todsacerdoti on March 31, 2022 | 162 comments



The article makes fun of how slow i2c is compared to the super high speed pairs next to it on the HDMI plug. Well, please note that there is a bus on that same connector that is literally a thousand times slower than i2c: CEC. Contrary to i2c it only occupies one wire rather than two, though. CEC, which is used mainly to power the TV on/off or switch sources, has a whopping bandwidth of 400 bps. No missing multiplier.

The maximum device name is 16 bytes. If you have a TV, a TV box and a game console, simply asking the names of those devices can take more than a second. I'm in the process of shortening my product's CEC name, in order to reduce congestion on the bus.
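
Back-of-envelope, assuming the nominal 2.4 ms CEC bit period and a full 16-byte name (rough numbers, from memory of the spec):

  1 block = 8 data bits + EOM + ACK = 10 bits ~= 24 ms
  <Give OSD Name> request = header + opcode      =  2 blocks ~=  48 ms
  <Set OSD Name>  reply   = header + opcode + 16 = 18 blocks ~= 432 ms

That's roughly half a second per device before you even count start bits, arbitration and retries, so three devices easily blow past a second.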

The mainline Linux CEC developer jokes that Voyager 1, which is on the other side of the solar system and was built 40 years before CEC, has a faster link than CEC.

Edit: Sorry for the atrocious original formatting


Voyager is so far away that its estimated data rate is only a hundred or so bps.[0] So CEC is actually faster now than Voyager! New Horizons still has the edge with an estimated kilobit/sec.

EDIT: I was partially correct. It seems Voyager can transmit at up to 1.4 kbps for certain things.[1]

[0]: https://space.stackexchange.com/questions/24338/how-to-calcu...

[1]: https://voyager.jpl.nasa.gov/mission/spacecraft/


Voyager still actually gets up to a whopping 1.4 kbps downlink! (but only 16 bps uplink)

https://voyager.jpl.nasa.gov/mission/spacecraft/


That's a bit confusing; shouldn't it be the other way around? Because they're using the same antennas in both directions, but the power budget on the Earth-transmitting side is much higher.

(Asked in the spirit of curiosity; I'm ignorant of radio physics).


I'm assuming that our ability to computationally recover a weak signal on Earth is far better than Voyager's.


That's a factor, but I think it's more that there was never a need for large data uplinks to Voyager, while they needed to downlink large amounts of science data.

The uplink and downlink are in different frequency bands, S-band (2-4 GHz) for uplink and X-band (8-12 GHz) for downlink, and they have different amounts of radio frequency bandwidth allocated.

Additionally, Voyager's High-gain antenna has a much narrower beamwidth in X-band at 0.5°, compared to 2.3° for S-band, though I have no idea if that's a factor.


Plus receiver size.


It’s mainly this, and it’s normal for spacecraft (especially in deep space) to have orders of magnitude more data rate on downlink than uplink.


There are a couple of factors at play here.

The first is that the downlink is at a higher frequency (X-band, 8.4 GHz), so there is more available bandwidth, and the beamwidth of the antenna is tighter at a higher frequency (i.e. the power is more focused). The uplink is at 2.3 GHz, and part of the advantage of its larger beamwidth is that you are still able to command the spacecraft if there is an anomaly and the high-gain antenna is off-pointed. There is also a low-gain omnidirectional S-band antenna in case it is really off-pointed.

Although the signals both go through the high-gain antenna, there are two different feeds for X-band and S-band. If you take a look at the picture of the antenna on Wikipedia [0] you can see the X-band feed is at the prime focus behind the subreflector. The S-band feed is mounted on the subreflector, so it's not seeing the optimally focused signal (the lower frequencies are less sensitive to this overall loss of focus, for the reasons above).

Generally a lot of spacecraft are designed with a much higher downlink to get science data down, but you only need a pretty small link to get commands up to the spacecraft.

[0] https://en.wikipedia.org/wiki/Voyager_1#Communication_system


well, problem solved then! no need to update cec anymore


Realtime status of the Deep Space Network:

https://eyes.nasa.gov/dsn/dsn.html


I guess this would be one of those cases where you'd want to use the most aggressive compression algorithms in the history of mankind, almost regardless of compression/decompression time, because you'd definitely make overall transmission times faster.

Of course, this wouldn't help Voyager, but it would help similar modern projects, since we can throw billions more computing power at these kinds of projects, these days.


> CEC, which is used mainly to power TV on/off or switch sources

CEC is much more than that: it allows the use of a single remote instead of two or more, which is super handy. The TV remote sends its commands through the HDMI cable to the active device, so if you switch from the TV to an external source like a PVR or media center, and that device supports CEC (Kodi on a Raspberry Pi does), you can control it seamlessly through the TV remote. I recently switched my media center from a Raspberry Pi 4 to an older (but much faster) unlocked Chromebox whose video chipset doesn't support CEC, so I had to connect an external interface that does USB-to-CEC conversion. Unfortunately there is only one manufacturer of such an interface, which means it's not that cheap. https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter
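
If you want to poke at CEC yourself, libCEC's cec-client makes the basics a one-liner; a minimal sketch, assuming a supported adapter like the pulse-eight one above:

  echo "on 0" | cec-client -s -d 1        # power on logical address 0 (the TV)
  echo "standby 0" | cec-client -s -d 1   # and put it back into standby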


Some surprising companies have not completely figured out the correct way to use CEC either.

- Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.

- Nintendo: The Switch's switch-input command when waking from sleep is unreliable at best.

- Microsoft: Surprisingly the worst offender. The only switch input command it ever sends is when it wakes from user-induced sleep. If you let the console go to sleep on its own, and wake up the controller and press the big Xbox button, it never sends a CEC command.


Sony seem to get this right: My PS4 can turn on my TV and switch the input source when it wakes up, but is also clever enough not to do so if you wake the PS4 via remote play (where you are streaming the output to some other device)

However, I think that the issue is that CEC support has never been 100% reliable and interoperable. My TV is also made by Sony, which likely helps with compatibility. Probably all these devices work well with some TVs and not others, despite everything claiming to support CEC.


My PS5 will randomly steal the input from my LG TV's internal smart apps. My Nintendo Switch does whatever it feels like doing at random.

Frankly I find the technology to be a total mess, and there's no way to debug it.


The PS5<>LG interaction is fucked. The TV will wake up all its devices when one of them does, and the PS5 will grab the input even if it wasn't the initiator.

You can disable this in config, but it's all-or-nothing: now the PS5 can't wake the TV at all.

Attempting to report the misbehavior to LG and Sony is like shouting into the void. They don't care, or it's the other entity's problem, or both.


Exactly the same on PS5 <> Philips TV. Needed to turn it off completely.


Wow, sounds like a dream. I had it happen ONCE that my chromecast-with-googleTV did NOT turn on the TV screen when streaming spotify.

Also my xbox turns on the TV, but that never turns on the speaker bar.


Sony TV as well, I guess.


> Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.

I think it depends on the equipment combination, and its state.

I'm able to send an Airplay stream to my AppleTV and it both turns on my LG television, and switches the input to the AppleTV.

However, if the TV is already on, and I send an Airplay stream to the TV it will not switch the input. Which I think is an OK measure if you have kids or roommates or the TV is in a public setting so that someone might hijack your viewing as a joke or to be annoying. My memory is that this used to be possible with an earlier model AppleTV, or an earlier version of iOS.

That said, ever since I attached a $23 no-name Chinese DVR to another HDMI input on the same LG TV, the AppleTV no longer has the ability to wake or change inputs. Very strange.


At least for my TV, which is some cheap Element Amazon TV from 2017, my Apple TV has full CEC. It will change the input to the Apple TV whenever I interact with it, and can turn the TV on directly. So it has to be about how the TVs implement things. I've wiped my TV firmware back to its 2017 version, so it's possible newer TVs intentionally break their CEC features to encourage you to use their own (bad) smart features?


> it's possible newer TVs intentionally break their CEC features to encourage you to use their own (bad) smart features?

In my case, the TV is from 2015.

I know because my wife and I just happened to be in an electronics store when the Pope was visiting the United States in 2015, and it was on all of the televisions.

At the time, we had a 32-inch TV from 2003ish, and when my wife saw the Pope on the 65-inch screen in the store, she immediately grabbed a salesperson and said, "I want to see the Pope like that."

So, in case you've ever wondered if the programming on a TV in a store matters, the answer is "sometimes."


> - Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.

I have almost the opposite problem. Any time I turn on my TV and receiver to use another device (often my Nintendo Switch), the Apple TV somehow wakes up and then switches my receiver's input to itself. (My receiver is my HDMI switcher and has a single HDMI output to my TV.) This is a very annoying problem, and is the only reason I still need to have the receiver's remote nearby.

Another unrelated but incredibly annoying problem I have with this whole setup is that it's essentially impossible to play music through my receiver with my AppleTV without the TV being on. Turning off the TV will, of course, turn off the receiver too.


I have the same issue with my XBox: It gets turned on by the TV (without me wanting this), but fails to turn on or off the TV (which would actually be useful).


You can get a special HDMI cable with the CEC cables disconnected to stop that.

CEC won't work at all then of course, but it will fix that problem.


I've never seen a device that supported CEC where use of it wasn't toggled inside a menu. Both sides of the wire need to accept it.


I've seen millions of comments on AV forums from Apple TV users having problems with CEC.

I don't have an Apple TV. Does it have a setting to disable CEC?


TVs usually have an option to disable (i.e. ignore) it though, at least every one I've owned/helped someone with has.


I can disable CEC on the TV, but of course I want the TV to turn on when I wake up a device. I just want to be able to turn off the TV and have it not turn off the receiver. Picky, I know, but I gathered that we’re deep in “first world problems” here.


Sounds like there is a market for a CEC firewall box. Something based on a $1 ESP8266 that you could configure over the air with a phone/laptop in a visual way (a web page) to pick and choose what can pass in which direction, maybe even with the ability to add simple scripts triggered by particular messages. ~$5 BOM, ~$15 retail.


I'd buy one.


> When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.

Oh? My Apple TV 4K (2021) turns on the TV and receiver when I start AirPlay, and the receiver switches the input to the AppleTV if it wasn't already on it. This works whether the TV+receiver were on or off. My TV is a Sharp from ~2013, and the receiver is a Yamaha also from around the same time.


I haven’t tested this enough to know if it’s a real hidden feature, or just me placebo-effecting myself, but I swear it takes way less time to wake up my TV (Samsung) if I press the little TV button on the ATV remote 5x really fast.

Like if I press it once, the ATV wakes up and usually the TV does, but often showing the wrong source. 5x fast consistently wakes the TV to the right input.


Google got it surprisingly well with the Chromecast with Google TV, it can send a power on and off, switch input and even control the actual TV volume through CEC.


I just wish it supported auto-off. It is nice that I can ask it to turn off the TV but I really don't need the screensaver playing forever.


Are you sure you didn't get bluffed out by their IR emitter?

The Chromecast with Google TV contains an IR emitter to handle that, and IR is the default, not CEC.


I can change the volume from another floor (I tried out of curiosity), and it also controls the TV volume through the Android TV Remote. My phone doesn't have an IR emitter.


I really like how starting a cast can wake up my whole A/V system.


People reading this: make sure to enable it in settings. I thought my TV didn't have HDMI-CEC, but it was deep in the settings and unchecked by default.


Some years ago I set out to control my smart TV from my laptop. Unfortunately, the graphics card (integrated or discrete) must also support the CEC spec in order to send commands to the TV; having an HDMI cable is not enough. My laptop didn't ship with that support.


Also, I don’t know if it has changed since I no longer have a TV, but on my last TV, HDMI CEC was both disabled by default and renamed with one of those stupid marketing feature names TV manufacturers are so capable of, like SmartLink or some shit like that. BTW, it was an LG TV from before WebOS.


Yes, some manufacturers give CEC a different name for added confusion; on LG sets it is called SimpLink. The Wikipedia page lists them.

https://en.wikipedia.org/wiki/Consumer_Electronics_Control


Yes! Thank you! It was SimpLink.

On the same topic, I once bought a vacuum cleaner whose box said « ExtraRangePlus (R) TM: Up to 12m power cable length ». Just why!?


My Switch's CEC is as flaky as yours, but my Xbox One X does wake up the TV from sleep no worries even if I let it go to sleep, last I checked. Though now I really want to double check this, and see whether it's the Xbox that has messed it up, or the Xbox + TV manufacturer combo?

My Apple TV 4K wakes my TV and to the right input with Airplay though!


I solved the switch input issue by using an automatic HDMI switch from Aliexpress. This thing is remarkably reliable at switching to any source that starts emitting a signal.

I don’t know if there are switches capable of turning the TV on and off based on the presence or absence of a signal, but that could be the ultimate solution to this issue.


I hadn't noticed that with my Xbox, but then again I may actually have the sleep timeout turned off...

I'm surprised that they'd get it wrong though given that they've properly supported the HDMI VRR and ALLM features since the Xbox One X. (AFAIK Sony doesn't have either of those, even on the PS5. And with Nintendo it's surprising they have any CEC support at all)


What is the purpose anyway? Why would a device send a signal to turn on the TV through an HDMI connection without also switching to that HDMI input?

That seems to be the main use case, but instead the design forces every source to be a universal remote control.


I use my chromecast to turn my TV on or off with voice commands, even when I just want to watch FTA channels.


So you use your device as a universal remote control. That is not at all the common use case.


I mostly use it so that a 'goodnight' voice command will turn off the TV, no matter what source is selected or what is playing.


So it doesn’t matter at all if the tv switches to a channel or not.


When Apple TV sends my Samsung TV to "sleep", the TV doesn't fully turn off the display.

In a dark room, you can still see a dim glow coming from the LCD panel that only goes away if I press the power button on the Samsung remote.


:/ works great on my Samsung Serif as long as it's plugged into the port labeled 'hdmi-cec'


Having worked on a large product that plugs into TVs and uses CEC, I wouldn't be so quick to blame the devices as I would be to blame your TV...


The Nintendo switch input command seems to work absolutely fine.


Yes, CEC can do much more, but IMO the basic requirement is to power the TV on/off properly and control volume, so that I can use whichever HDMI device's remote alone. Everything else should be extra.

But yes, it can do much more: you can program recordings over CEC! You can directly select a broadcast service (i.e. not just through key presses, but actually say which channel you want)! You can display a message on the TV's OSD! You can discover another device's language! You can make audio 0.1% or 0.01% slower (or 1% and 0.1%? I don't remember)!

I don't think I've come across any device that supports any of the commands I just mentioned, however.

Though displaying some message on the TV's OSD from the CEC bus would be perfectly on point, based on TFA.
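
You can actually try that one by hand with cec-client's raw tx mode; an untested sketch, with the opcode and addressing from memory, and in my experience TVs mostly just ignore <Set OSD String>:

  # 4 -> 0 (Playback 1 to TV), opcode 0x64 = <Set OSD String>,
  # 0x00 = "display for default time", then ASCII "HI"
  echo "tx 40:64:00:48:49" | cec-client -s -d 1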


I even programmed my own overlay (app selector) for the RPi that listens to CEC commands, so I can use the remote's arrows. Convenient, so that I don't need to reach for a mouse/keyboard/ssh/phone to start something on the RPi.
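
If anyone wants to do the same without writing their own stack, the kernel CEC framework plus cec-ctl (from v4l-utils) gets you most of the way; a sketch assuming your hardware exposes /dev/cec0:

  cec-ctl -d /dev/cec0 --playback   # claim a Playback logical address
  cec-ctl -d /dev/cec0 --monitor    # dump bus traffic, incl. remote key presses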


Oh neat, I assumed my PVR remote just had an infrared emitter that controlled the TV at the same time as the PVR.


That USB-to-CEC passthrough box is $44.92, for reference. I do wonder what chipset it's using under the hood to do its thing, but given the combination of HDMI licensing/royalty fees, manufacturing tooling, the little custom enclosure (looks like molded/formed plastic, if I'm using the right term), then factoring in the low production volume, and considering that this seems to be a smaller operation... I think it's quite plausible the margin on the thing is quite small. Very plausible.

While on the page I was curious whether the firmware was open source. While writing this and properly articulating the bit about HDMI royalties (and also just now realizing it might need to deal with HDCP... ah) I now realize the question is moot, but I didn't think of any of that while looking at the webpage, so I wondered if it would be interesting to have a cursory poke at the firmware download.

7-Zip (also available as a command-line program, p7zip-full on Debian et al.) is honestly grossly understated in its capabilities. I didn't feel like remembering how to invoke binwalk, so I tried `7z x firmware-v7.exe` almost as a bit of a challenge, and it actually worked :)

  $ 7z l ~/Downloads/firmware-v7.exe 
  Path = /home/i336/Downloads/firmware-v7.exe
  Name = WEXTRACT.EXE            .MUI
  OS Version = 6.1
  Image Version = 6.1
  Subsystem Version = 5.0
  Subsystem = Windows GUI
  DLL Characteristics = Relocated NX-Compatible TerminalServerAware
  Comment = FileVersion: 9.0.8112.16421
  FileVersion: 9.00.8112.16421 (WIN7_IE9_RTM.110308-0330)
  CompanyName: Microsoft Corporation
  FileDescription: Win32 Cabinet Self-Extractor                                           
  ----
  Path = .rsrc/RCDATA/CABINET
  Type = Cab
  Physical Size = 1348694
  ID = 7086

    Date      Time    Attr         Size   Compressed  Name
  ------------------- ----- ------------ ------------  ------------------------
  2016-04-13 13:05:12 ....A       184800               flash.exe
  2012-08-29 16:19:56 ....A       619536               driver1.exe
  2012-02-21 04:24:52 ....A       816224               driver2.exe
  2012-07-31 15:35:30 ....A          400               flash.cmd
  2012-03-10 11:27:28 ....A        67008               libusb0.dll
  ...
It output even more lines of metadata than I included; I removed probably 50% of the output so as not to spam the subthread.

Awesome.

Hmm... what if I do the same to `flash.exe`?

  $ 7z l flash.exe 
  ...
  Linker Version = 10.0
  OS Version = 5.1
  Image Version = 0.0
  Subsystem Version = 5.1
  Subsystem = Windows CUI
  DLL Characteristics = Relocated NX-Compatible TerminalServerAware

     Date      Time    Attr         Size   Compressed  Name
  ------------------- ----- ------------ ------------  ------------------------
  ...
                      .....        67392        67392  .rsrc/RCDATA/101
                      .....         2496         2496  .rsrc/RCDATA/102
  ...
  ------------------- ----- ------------ ------------  ------------------------
  2016-04-13 22:05:07             183568       183568  9 files
HMMmmmm, RCDATA you say?

Extracting (`7z x ../flash.exe` from an empty subdir) produced newline-less ASCII hex output ("EF19541A19D258B099B877545DE0B65BBBC5602..."), so after a bit of `printf "$(cat 101 | sed 's/../\\x&/g')" > 101.bin` I came up with... nothing.

Binwalk has no idea what it is. Perhaps the start of the files ring bells for some humans?

  $ ls -l
  -rw-r--r-- 1 i336 i336 67392 Apr 13  2016 101
  -rw-r--r-- 1 i336 i336 33696 Apr  1 05:46 101.bin
  -rw-r--r-- 1 i336 i336  2496 Apr 13  2016 102
  -rw-r--r-- 1 i336 i336  1248 Apr  1 05:48 102.bin

  $ xxd 101.bin | head -n 5
  00000000: ef19 541a 19d2 58b0 99b8 7754 5de0 b65b  ..T...X...wT]..[
  00000010: bbc5 6020 54e7 ca9a 1fb7 beca 0a39 95c4  ..` T........9..
  00000020: c876 de8a 4705 19a5 9f03 4c56 5d83 9550  .v..G.....LV]..P
  00000030: f5ad 2930 5f07 9b46 21b1 91a0 d091 4685  ..)0_..F!.....F.
  00000040: 3ada 0ce3 305b 49e1 939e 7384 3c5a 5794  :...0[I...s.<ZW.

  $ xxd 102.bin | head -n 5
  00000000: f3f1 7b55 3f33 cbed 426e 1ab7 5792 4425  ..{U?3..Bn..W.D%
  00000010: 8e75 1566 5597 8183 c1ce 9265 6acf 3b73  .u.fU......ej.;s
  00000020: 1370 47b4 5431 399b f73f 0f6a 6323 3329  .pG.T19..?.jc#3)
  00000030: 0e52 25ad 0530 5d03 9393 bc8c 40c8 f0fb  .R%..0].....@...
  00000040: c63c 7ddb 4d3a cd89 ab4a e6c7 cb2f ab4f  .<}.M:...J.../.O
For what it's worth, having realized this might be touching HDCP or for all I know might be an FPGA bitstream, I emphasize my interest is purely coming from a decidedly non-exhaustive sense of "ooh that's a cute product, I wonder what firmware it runs and how it broadly works internally" entirely made up of idle curiosity :)


HDCP is unrelated to the CEC bus. CEC is completely separate and unencrypted. The pulse-eight adapter, AFAIK, should just be a female-female connector with two pins plugged into a microcontroller.

I've toyed with the idea of replicating its functionality on an Arduino, but ended up just buying the adapter :)

In the end, I'm not sure how doable it is, but it shouldn't be too complex. Licensing is another matter. I'm at a loss as to why GPU makers don't wire it up though.


I see (TIL a lot about CEC, added some info to a sibling comment).

I incidentally found https://hackaday.io/project/168696-cec2usb while poking around, last updated a couple years ago so probably not available for sale anymore, but open source at least.


Unpacking the files also produces a certificate, and running strings on the flash.exe executable also reveals references to various related links, so it is possible that the firmware is also encrypted and signed.

  ---
  http://s2.symcb.com0
  http://www.symauth.com/cps0(
  http://www.symauth.com/rpa00
  http://s1.symcb.com/pca3-g5.crl0
  SymantecPKI-1-5670
  Symantec Corporation1
  Symantec Trust Network100.
  Symantec Class 3 SHA256 Code Signing CA
  ---
... etc.


The Intel NUC exposes the CEC pins on the motherboard, and the same company makes an internal USB/CEC adapter for $15 less[1]. That PCB is almost entirely an MCU and a crystal, so the plastic box and HDMI passthrough add $15.

1: https://www.pulse-eight.com/p/154/intel-nuc-hdmi-cec-adapter


Ah, I see.

The listed photos are just at the wrong angle to be able to read the chip markings (grr)... but I had a bit of a further poke around, and found a listing for a discontinued internal board for HTPC (!) setups: https://www.pulse-eight.com/p/117/internal-hdmi-cec-adapter

A reasonable bit of eye-strain later I at last identified that I was staring at an AT90USB162-15AU. Here's a reference for the -16AU: https://www.mouser.com/ProductDetail/Microchip-Technology-At...

Huh. 16MHz. That answers that then.

(Now I'm a tad more idly curious why the firmware was unreadable.)


If you end up buying the product, Wireshark can capture USB data and then you'll be able to see exactly what data is sent to it when you use their flash util. On Windows you'll also need to install USBPcap to capture USB traffic, but if I remember correctly it is bundled with Wireshark and just unchecked by default.

Otherwise you might try binwalk with the --disasm option (you'll need capstone installed for it to work); it will then attempt to search files for assembly for any of a wide range of processors (obviously this would fail if the firmware is encrypted, but I doubt a cheap product like this would bother). I'll also just briefly say that even though RCDATA is Microsoft's recommended way to embed a file into an executable, I've seen plenty of software that embeds files in other creative ways, so I wouldn't rule out the possibility that the firmware is somewhere else in the executable; binwalk's entropy analysis mode can sometimes find firmware that is otherwise hard to locate.
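
Concretely, something like this (entropy mode is stock binwalk; --disasm needs capstone as noted):

  binwalk -E flash.exe       # entropy scan: flat high entropy suggests
                             # encryption or compression
  binwalk --disasm 101.bin   # opcode search across a range of architectures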


You sound like you know a lot about the HDMI CEC spec. Do you have any insight as to why CEC implementations are often buggy on so many devices? E.g. "one touch play" turning on unintended devices in addition to your display and receiver.


First, I have to plead guilty: my own product's CEC still has many "simple" flaws.

That being said, I indeed have the feeling that CEC is specifically badly implemented by most people.

I feel like the standard is rather under-specified (or there are things I don't understand): it says how some kinds of devices are supposed to behave on certain commands, but gives no explanation as to how other kinds of devices are supposed to behave on those same commands.

For instance, it is specified that a Player (there are various types of devices in CEC, more on that later) can send volume commands to an AVR (which is a different kind of CEC device; there can only be one in the whole network). More specifically, a Player can send an AVR VOL+ and VOL- key presses to change the volume. What happens if you send VOL+/VOL- to a TV? That wasn't specified until CEC 2.0 in 2016. So you can't control volume on most TVs if you don't have an AVR.

The number of "slots" available per device kind is fixed: exactly one TV, exactly one AVR, three Players, four Tuners, three Recorders. You have a TV box and three game consoles (all of which should be Players)? Well, someone will have to lie and become a Tuner or a Recorder, or they won't be allowed on the bus.
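
For reference, the fixed logical-address map looks roughly like this (from memory of the spec, so treat it as indicative):

   0 TV             5 Audio System    10 Tuner 4
   1 Recording 1    6 Tuner 2         11 Playback 3
   2 Recording 2    7 Tuner 3         12-13 Reserved
   3 Tuner 1        8 Playback 2      14 Free Use
   4 Playback 1     9 Recording 3     15 Unregistered/Broadcast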

Another thing that makes this messy is that CEC needs to fit in a small power budget during suspend. For instance, in Europe, the legal power budget in sleep is 0.5W. This means that your main application processor can't handle CEC in suspend. Usually, this leads to multiple "concurrent" CEC stacks, running on different CPUs, switching from an always-on Cortex-M to a full-blown Cortex-A (and then you add Android TV on top of that with its own CEC stack, and you get three CEC stacks co-working). Often the communication pipeline between those stacks is pretty light, and going through all those layers you might end up losing the information of whether the wakeup instruction came from your remote (so you legitimately want the TV to switch to you) or from CEC (so you want to let the TV decide on the output).

I believe one gigantic factor is that CEC got off to a very poor start (no matter the reason), and since then, interoperability problems have been treated by most QA teams as "yeah well, this is life".

Back specifically to your question about "one touch play" turning on unintended devices in addition to your display and receiver: this is very, very usual. I don't know what the spec says precisely on the matter, but here's what I witnessed on many TVs (enough of them that I expect this to be the standard behaviour, but it's possible it's related to what I said earlier about dual stacks):

- Say you have devices connected on HDMI1 and HDMI3, you suspended the TV while it was on HDMI1, and you wake it up from the HDMI3 device.

1) On wakeup, the TV sends Set Stream Path to the previously selected device. Since you suspended your TV on HDMI1 and you're waking up from HDMI3, the TV starts with a Set Stream Path saying "hey, the screen I'm currently displaying is HDMI1".

2) The HDMI1 device listens, and wakes up. In the process of waking up, it needs to tell the TV it is ready with the ACTIVE_SOURCE command, so it does.

3) If the HDMI3 device is in the "clever" range, it will repeat "please TV, switch to HDMI3" with ACTIVE_SOURCE.

4) Even if HDMI3 isn't in the "clever" range, the TV will later send Set Stream Path to HDMI3, because it remembered it was HDMI3's command that woke it up, or thanks to -3-.

5) Everyone's happy; the TV's on HDMI3.

6) The message sent in -2- finally manages to reach the bus more than a full second later, because HDMI1 is less aggressive on the CEC bus than the others.

... And there you go, the TV switches back to HDMI1.

Fixing this is possible: HDMI1 "just" needs to cancel -2- when seeing -3- or -4-, but most CEC implementations' send_pkt doesn't include cancellation signals. So it's possible to make better CEC implementations (though it requires mechanisms that are pretty complicated for the embedded world), but I don't think it's possible to make a perfect implementation that will never miss.


Thank you for your detailed answer!

At times, I regret DisplayPort not having something comparable (it has data channels, but nothing "as well specified" as CEC for controlling other devices). I think you explained part of the reason. If it were specified, it would require careful consideration, and probably a conformance test suite. I still fully expect manufacturers to try to re-brand it, extend it themselves, and generally botch their software like they end up doing most of the time.


Also in play is that some parties have patents on certain kinds of control systems using CEC, so just implementing something that seems useful can get a device manufacturer in legal trouble :(


Wow this is pretty detailed, and makes a lot of sense. It's a shame the standard does not seem more well thought out.

One solution I've always felt would be handy would be for devices to have an option to ignore any CEC commands that would normally cause them to wake. As it is right now, everything is so buggy between my LG TV, Yamaha receiver, and "players" that I have to disable the feature entirely on either my PS5 or Apple TV in order for things to not go haywire.


Awesome answer, gold star for you!

This is the kind of post that I come to Hacker News for.


TIL what the hell is happening when I turn on my TV and the receiver usually switches to the wrong device.

I'm just glad I'm not crazy.


mine works decently, except something seems to often be sending an errant ‘on’ signal to my a/v receiver: i hit the power button (on any remote) and everything turns off (as i want), only for my receiver to wake up 1-5 minutes later (my dvd/blu-ray player also comes on for a second, but then turns off automatically). super annoying.


> CEC, which is used mainly to power TV on/off

In my experience it doesn't even do that very well. I tried getting a Raspberry Pi to power off an attached display, and support for on/off seems to be hit or miss and device-specific.

I suspect it isn't well supported because it would commoditize the device, when manufacturers really want their own protocol that connects only to their own products.

Much better sales of Acme products if an Acme TV can only connect to an Acme soundbar and an Acme DVD player using Acme protocols.


I wish they'd made CEC a little more powerful. I like that I can control my AV receiver's volume with the TV remote, but I don't like that I can't see what the volume is actually set to on the TV. Even if I had a receiver that could overlay onto 4K video, that wouldn't help for the cases where I'm feeding audio back from the TV to the receiver.


It's "fun", because there is definitely a CEC command for the TV to know the AVR's current volume.

But I can't blame TV manufacturers for not implementing it: when sending vol+/vol- commands you're already stretching the CEC bus pretty thin; you can't afford to spam the bus with more commands...


I played SlashEM and edited files with ed(1) over 600 bps. Not bad, but things start to get usable enough at 9600 bps.


Uh, oh. The work project I'm on uses an OLED display with an I2C interface. And I happen to have a spare one sitting on my desk. And I'm no electrical engineer, so let's see how fast I smoke this thing. :-)

Should I succeed, TFA's author will no longer have the worst HDMI display ever, as the one we're using is a two-line display looking something like this (it can do rudimentary graphics, though; we display a company logo at boot):

https://www.mouser.com/ProductDetail/Vishay-Dale/O020O002ALP...


OT: I'm idly curious why that display (photo: https://www.mouser.com/images/vishay/hd/O020N002ALPP5N000A_t...) has what appears to be a detached single vertical column of presumably-unusable pixels on each side of the active area. Obviously something to do with manufacturing, but...?

(I'm also curious about the "Pricing: Request a quote" bit. Doesn't seem to be "not for new designs", but does have a bit about "Factory special order", so perhaps this is just an indication of unusually limited supply or something.)


Here's my reckless guess: these matrices are manufactured in a continuous strip and cropped to the appropriate length, then the interface is attached to the cells and the whole thing is sealed in its case. That would allow the factory to produce varying sizes of display.


Why not a simple LCD for this? What benefit of OLED attracted your interest?


If you put an OLED display behind dark plexiglass it looks really nice -- the border of the display becomes invisible and you see just the letters (or whatever else you are displaying).

With LCD there's always backlight leaking.


I'm not part of the team that makes such decisions, being a lowly software engineer who never got any good at EE. However, as one who was at least in the room when such decisions were discussed, I believe the decision-making process consisted of "OLED would be way cooler, and they're not that expensive these days". I'm pretty sure that is the sum total of that "engineering" decision. :-)

But I will say that the OLED looks a lot better than most LCDs of that type that I've seen.

EDIT: oh, yeah; sibling comment says something about backlighting an LCD, which might have had something to do with it, as the OLED replaces a seven-segment LED display from the previous generation (which, duh, also doesn't need a backlight).


At those tiny sizes (and in small quantities), the price difference is minimal, but commonly available OLEDs and LCDs often come in different sizes. So the choice may have been more about dimensions than display type.


Viewing angles could be a thing.

I recall that back in the early 2000s we had a wave of people wiring up little character LCD displays for hardware monitoring. They inevitably looked awful at angles, with washed-out backlights, etc.

The super-lucky people could get a VFD display, which was a million times more legible, but they tended to be spendy, fragile, and warm-running.

OLED gives you everything: cheap, pretty cool running, and easy to read.

I've been experimenting with one of the Digole OLED modules to recreate what I wanted 20 years ago. :) (super-easy to interface to USB and program-- we've come so far!)


Not the OP, but small OLED displays are pretty low cost, and they don't need backlight management because the display is emissive. Viewing angles are usually nicer too.



I can imagine it's to look less DIY or "from the 90s". Vanitas.


I love this. What is it about pointless technical projects that is sometimes so alluring? I wonder if it's the removal of secondhand stress, since there are no 'meaningful' success criteria.


Anything that involves fooling with physical wires is fundamentally more satisfying than anything that doesn't.


Kindred spirit! The one caveat to this rule for me requires that as long as there is wood and tools, it is okay to not have wires.


<looks up from hand-riveting a wooden boat and grunts approvingly>


LED-turn-on > pixel-change-color


plot twist: pixels are LEDs


But, no wires.


Very small wires.


> I wonder if it's the removal of secondhand stress

Yup, a key component of the sensation of "play" is low stakes.


In the context of observing animals, often vague behavior that expends energy for no apparent reason is called "play." This fits the bill!


Yes, I think removing the "must do a thing" component confounds macro-scale sensible classification and measurement of the discrete work that is being done. It's possible for manglement to see that effort is being invested, but the significance of the result cannot be perceived due to lack of resonance with the engineering mindset. This makes it possible to personally take ownership of engineering agency and may be the core reason engineers survive at all because they are able to take and own personal responsibility for their own learning.

Knowing it is not possible for others who don't "get" what we're doing to usefully measure or judge our work in turn disengages the "do thing in anger" stress associated with captive/acute focus, and (in ideal, spherical-cow-like situations providing infinite time) enables unbounded, open-ended introspection into reinforcing the mental solution-finding capacity within the domain in question. (In practice, infinite time would be quite harmful, as it would provide more space than our attention spans could fill; the practical ideal may be to find the right balance between work (acute focus) and zoning out, which might be trackable by identifying the precise points where our ego lags slightly behind, but is cognizant of, our as-yet unused physical capacity.)

Being able to engage in this introspection is critically important for learning: it's almost like dreaming, in the absence of any singular focus on finding an optimal solution to a given problem within a limited time frame. This makes it possible to pay attention to the problem-solving network as a whole, cross-reference and merge fragmented ideas that have developed independently, and drift toward the blurrier edges of understanding to help reinforce them (ever noticed how the things you instinctively find super interesting that you really want to dive into - and often the itches you want to scratch - all depend on skills that happen to be right at the point of establishing minimum-viable cohesion and fundamentally clicking into place? We wander aimlessly... but we don't!).


> no 'meaningful' success criteria

Other than personal enjoyment.


Well, another "pointless" project this guy made eventually became a product, the Flash Synth. It's a tiny synthesizer that fits in a MIDI DIN5 plug. I own one because I love small and weird synths. It actually sounds pretty nice!


Same here, I thought this was great. Even though I have been in the "computer field" for a loong time, there is still so much to learn! I love articles like this.


This design is devilishly clever, but a tad misleading. It's not really an HDMI display notwithstanding that it is a display that plugs into an HDMI port. HDMI has an embedded i2c bus, and this project uses that to send images to the display. That's why you need a layer of software (xrandr) to translate images that the system produces into i2c.


(a) you're a buzz-kill; and (b) that i2c bus is part of HDMI so it is an HDMI display.


If you say so. The fact of the matter is that if you plugged that thing into, say, a DVD player with an HDMI output, it wouldn't work.

Sorry if that kills your buzz.


If you took the time to build this, then travelled back to the 1800s when people had standalone DVD players, and plugged this thing in expecting it to work, it would not kill my buzz: I'd find it quite entertaining to watch, really.


This isn't so much an HDMI display as an I2C display utilizing the DDC channel in the HDMI interface as the I2C bus. I've seen folks utilizing this on the Raspberry Pi as a generic I2C bus as well.


Eh, that's like saying phones don't charge over USB, just DC. We're referring to the physical connector more than to the typical bus.


Yeah, I dunno; when I started reading I thought he had found a display that was coincidentally compatible with actual HDMI video signals.

It's at least misleading.


It's not misleading to anyone casually familiar with those little OLED boards. They are cheap.


It's not quite the same. I don't need to download a special host script to charge my phone on a friend's computer.


I needed to download a special host script to charge a friend's iPad from my desktop.


The SSD1306 can definitely do better than 6-10 fps; that's close to "software mode" (bit-banged over GPIOs), not an actual hardware I2C interface.

Either the DDC is running at very low speed, or... the I2C in HDMI in this laptop is actually a software thing?
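
Rough math backs the low-speed theory (ignoring addressing and control-byte overhead):

  frame    = 128 x 64 / 8 = 1024 bytes
  i2c byte = 8 bits + ACK = 9 clock cycles
  100 kHz -> 100000 / 9 / 1024 ~= 10.8 fps   (standard-mode i2c)
  400 kHz -> ~43 fps                         (the SSD1306's rated fast-mode max)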


What great timing. I've rearranged my desk and have trouble fitting a second monitor next to my ultrawide + big speakers, and I've been looking for a smaller monitor.

This might be a tad too small, but worth considering!


Link to the video demo at the end of the article:

https://youtu.be/8UbVgUFfN8U


Side note, "Look around you" is an allusion to this video series:

https://www.youtube.com/watch?v=t4CRCJUmWsM&list=PLsMUa0l1PW...


Write that down in your copy book now.


Wow this triggered some memories I cannot place but I love it!


20 years ago, we came across this tiny CRT display. I think it was 4 x 5 inches. We pranked a coworker with it, replacing her large display with this tiny thing.

And left a note explaining how the black and white display was an upgrade. :-)


I think we claimed it was an upgrade because it saved wear and tear on her rods and cones.


A colleague of mine pranked a (much liked) co-worker: police and an anti-terrorism squad were called. Hilarity ensued....


To me the cool part is that you get a free I2C output over an HDMI port.

I don't know of any laptops that have native I2C output w/o a USB converter.
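
For anyone wanting to try: once i2c-dev is loaded, i2c-tools sees the DDC wires as an ordinary bus. The bus number varies per GPU and port, so list first; the addresses below are the usual suspects, not guarantees:

  sudo modprobe i2c-dev
  sudo i2cdetect -l      # find the bus labelled DDC / your HDMI port
  sudo i2cdetect -y 5    # 0x50 = EDID EEPROM, 0x3c = a typical SSD1306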


I recently learned about evdi, an out-of-tree kernel module that can add as many virtual outputs as you want. Ideal for things like this, or for where you'd normally use a dummy plug.


That's exactly where my mind went when I read this post too.

The link for anyone interested: https://github.com/DisplayLink/evdi

It's unfortunate it wasn't accepted in-tree because while it was initially developed for a specific commercial product, I do think it has the potential to be generically useful.

The ability to conjure up a display "in software" & then just have the rest of the system automatically treat it as any other physical display is pretty powerful.

There's a couple of projects on my long "projects to do" list that would make use of it...
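
Getting a virtual output up is roughly this (a sketch; the build steps and module parameter name are from memory of the project docs, so double-check them):

  git clone https://github.com/DisplayLink/evdi
  cd evdi && make && sudo make install
  sudo modprobe evdi initial_device_count=1   # one virtual connector appears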


I'm sure that those with a lot more hardware experience than me have the intuition to make guesses like this but I was surprised at:

> You have to register to download the HDMI spec which is more effort than I have for this, but the Hot Plug Detect pin has a pretty descriptive name. I guessed that this either has to be pulled up or pulled down to signal that a cable is connected. Sticking a 20K resistor to the 5V pin seemed to do the trick. With the oscilloscope, we can now see activity on the SCL/SDA lines when it's plugged into the laptop.

Is it really not that concerning to just guess what amperage will/won't fry something? I mean, you could test from very high resistances downwards, but you might overshoot, right? Or is there an assumption that the hardware on either end will provide the appropriate resistance, in this case?
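
For scale, my own back-of-envelope on the quoted guess:

  I = V / R = 5 V / 20 kΩ = 0.25 mA

which seems far too small to fry anything even if the pin were grounded... but maybe that intuition is exactly what I'm asking about.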


The slot machine company IGT (now Atronic?) had buttons on its AVP-platform slots that were small two-colour OLED screens using HDMI as the connector.


Unrelated but wanted to ask here since this forum may have the knowledge.

Recently I bought a couple of those USB-C hubs (a 4-in-1, and a 7-in-1 from Anker) to attach an external monitor over HDMI. When one is connected to a laptop, is there a way to retrieve details of all its capabilities and ports?

Also, what might be an easy, recommended way to start understanding how something like this works? To start: just sending a high/low signal to a data pin, either as a straight connection into an HDMI port or a USB port (and what about when going through a multi-port hub)?

Thanks!


> Here is some trivia: did you know that the mouse cursor is rendered by hardware?

Ah! That perhaps explains why often the cursor moves around just fine when other elements of the stack have completely given up.


As far as I know, it isn't really rendered by HW, but composited in HW.

GPUs usually have multiple "Hardware planes" that are composited together in hardware before sending the signal. The OS is free to put whatever it wants there. Most HW include a designated "cursor plane", but it could be used to display whatever.

IIRC a minimum of 3 HW planes is specified somewhere; it could be in the Wayland protocol, or the Linux Direct Rendering Manager API. Some devices can have fewer, though. That would be the case for this display, which could use an actual framebuffer driver (although planes could be emulated).

You can thus have a layer for the background, one for the windows, and one for the cursor, and avoid re-painting too often, even though it should be pretty cheap for GPU-accelerated surfaces (like with compositors that rely on OpenGL).
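
You can see the plane list yourself with modetest, which ships with libdrm (packaged as libdrm-tests on Debian/Ubuntu):

  modetest -p   # lists CRTCs and planes; each plane has a "type"
                # property: Primary, Overlay or Cursor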


omg, i just bought and installed this display for my rpi4 and you tell me I can use it as a second monitor!? this is why I am a tech geek. :heart:


You don't need to go through all the steps involving HDMI. Linux can create framebuffers on I2C/SPI displays like this. Many years ago, I connected two to a BeagleBone and it ran Emacs just fine. Much device tree hacking is involved, though, unfortunately.
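
On a Raspberry Pi the device tree part has since become a one-liner in /boot/config.txt (overlay name from memory of /boot/overlays/README, so verify against your image):

  dtoverlay=ssd1306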


yes, yes, this is exactly what I did. But connecting it to an HDMI port is infinitely cooler!


HDMI is a pretty crazy protocol. CEC, DDC, and then on top of that a dedicated hot-plug detect? Is all this really needed? Can't I2C do everything?

They could use a 1M pull-down on SDA, pull it up at the display side instead of the host side, and have that be the hot-plug detect, and then just use i2c for control as well.


It doesn't stop there, you have an audio return channel and 100Mbps Ethernet too.


Ethernet is reasonable, but why not just have audio over Ethernet?

Or for that matter, USB 2.0 instead of Ethernet; then you could have a hub on the monitor, a webcam, Ethernet over USB if you want, and audio return over USB.


My pet theory for why 100Mbit-over-HDMI never saw traction is that, apart from Linux/BSD, no other mainstream OS had reasonable (if any) support for meshed Ethernet.

I always hoped for a future where all devices could, out of the box, intelligently decide which of the available interfaces (Ethernet, WiFi, HDMI) to send data over, per peer device. It would substantially reduce WiFi congestion in lots of (media center) installations.

Same with CEC: "Windows doesn't support it, so why add it to GFX-card hardware?" ARC: same deal.

"Not supported by Windows" has led to the death or non-implementation of lots of useful tech, not only related to HDMI. E.g. the Per-Port Power Control part of the USB spec is only implemented by some rare USB hub(-chip) vendors, and never tested for in any review. On supported hardware it works great with `uhubctl`. Ubiquitous 802.11s WiFi mesh support is in the same boat.

As I say, my "pet theory". Happy to learn specifics.


For how amazingly useful it is... networking is kind of hacktastic at the protocol level.

It's a bunch of layers that don't know anything about each other, and any even slightly unusual setup is a nightmare of manual configuration.

It happens to do exactly what people do with it very well, central web services and some limited LAN stuff on the same subnet.

I almost think things would be better off without the OSI model, if we were to start over. Just one CJDNS-like mesh, with native support for encryption, firewalling, pairing and discovery, all in one place, under one version number.

Everyone talks about the flexibility, but I'm super not convinced things like 6LoWPAN are a good idea.

Just being able to say "This isn't IP and has no way to get on your network" is a big advantage of things like ZigBee, in a world where nobody really trusts this stuff.


I saw the author looking for the hdmi pinout. For anyone who wants a quick reference you can check this website: https://pinouts.org/


Very cool hack indeed! One thing I learned from this is that having an HDMI port gives you a free I2C port on top as well, as long as you have an old HDMI cable lying around to cut open.

Small note on using these OLEDs with 5V: typically they expect 3.3V Vcc and logic levels, although almost all of them seem to work just fine with 5V. In my experiments with a regular 5V Arduino, some OLED modules made weird coil-whining noises; I presume this is from the charge pump circuitry. Driving them with 3.3V as specified removed the coil whine completely.


Oh cool, I already follow mitxela for his crazy MIDI contraptions.


Hackaday has some similar DDC projects, which include interfacing with a microcontroller:

https://hackaday.com/2014/06/18/i2c-from-your-vga-port/

https://hackaday.com/tag/i2c/


"I have a proclivity to stupid and/or pointless projects. "

<scrolls through rest of project list>

This is hilariously accurate, yet still somehow great.


This kind of fantastic hackery is the reason I consistently come back to HN. This is an absolutely fascinating little piece of kit.

It raises the question of whether there would be any actual utility to a display this small, given that it has to be directly attached to the laptop.

Smartwatches come to mind, but tethered smartwatch-size displays...hmm...maybe add a wi-fi module and we've got something?


Quite cool for an RPi.

I have a project where only a BPM (beats per minute) readout is needed, and it would be handy that only the HDMI socket is occupied.


What a lovely little collection of small projects.


Kinda unrelated, but I wish there were an HDMI standard that also included power.

I have two AOC 15" USB-only screens that I use with my laptop, and they all fit in my backpack...

So I have a 3-screen setup in my backpack with one power supply (the laptop)... it's such a sweet setup.

But HDMI+power would be great (because then I could also have audio, and free up my USB ports).


I guess the answer for that nowadays is "USB-C", with all the complexity that comes with it...


So, I tested this...

I am on a flagship HP Omen gaming laptop (I arguably got the literal first laptop shipped from the factory; I have provenance, long story).

And this stupid machine ($3K) has a laughable casing (super chintzy, bends easily, not good). It has (3) USB ports, with (1) being SuperSpeed and the other two standard, and (1) USB-C port.

ZERO power out of the USB-C port.

I have purchased multiple USB-C "hubs" -- none have power; all require external USB power.

So imagine this: you buy a USB-C monitor, but your USB-C port provides no power. So in order to plug a USB monitor into your USB-C hub connected to your laptop via USB-C, you need a secondary cable going through a regular USB port on your machine to provide power to the USB ports on your USB-C hub, thus consuming BOTH the USB port AND the USB-C port on your machine....

What a fucking design flaw this machine is.

The guts are all hyped around the 165Hz screen and the RTX... but the physicality of this machine sucks.

HP Omen 15" 5800 RTX 165Hz machine... "gaming laptop"


Oh wow, I remember this guy from a Zelda forum I was on nearly 20 years ago. Glad to see he's still around.


I feel like cable companies that have found a market with overpriced cabling for high-end entertainment systems, like Monster and Denon, could make a killing on this.


I'm looking for a UART monitor with a slightly bigger screen and baud rate autodetection, to be used for debugging. It should be cheap.

Any ideas where to look?


This is not cheap, nor does it have a screen, but one of its many (many) uses is interfacing USB to UART with auto-detection of baud rate, voltage, and whether or not the signal is inverted: https://www.crowdsupply.com/1bitsquared/glasgow



> You have to register to download the HDMI spec which is more effort than I have for this

Wait, what? Register where? Is the HDMI spec behind IP bars?



I've got it. You could make a mini display like this to clip over the notch on a new MacBook Pro.


But can you play Doom on it?


This is pretty useful for NUCs, Qotoms, and x86 routers for showing stats.


The author should move the fun video at the end to the top!


What is this, a display for ants?


Lower DPI, and about the same size as an Apple Watch, which sells about 30M units a year :-)



