The article makes fun of how slow i2c is compared to the super-high-speed pairs next to it on the HDMI plug.
Well, please note that there is a bus on that same connector that is literally a thousand times slower than i2c: CEC. Unlike i2c, it only occupies one wire rather than two, though.
CEC, which is used mainly to power the TV on/off or switch sources, has a whopping bandwidth of 400 bps. No missing multiplier.
The maximum device name is 16 bytes. If you have a TV, a tvbox and a game console, simply asking the names of those devices can take more than a second.
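A quick back-of-the-envelope sketch of that claim (assumed figures: ~400 bits/s effective rate, and 10 bus bits per transmitted byte once the per-block EOM and ACK bits are counted, as on the real bus):

```python
# Rough CEC timing model; figures are assumptions, not measurements.
BUS_RATE_BPS = 400          # approximate effective CEC bit rate
BITS_PER_BYTE_ON_BUS = 10   # 8 data bits + EOM + ACK per block

def name_query_seconds(name_bytes: int, devices: int) -> float:
    """Rough time to read one OSD name per device (payload only, no retries)."""
    bits = name_bytes * BITS_PER_BYTE_ON_BUS * devices
    return bits / BUS_RATE_BPS

# TV + tvbox + game console, 16-byte names each:
print(f"{name_query_seconds(16, 3):.2f} s")  # 1.20 s
```

So even ignoring arbitration, retries and the query messages themselves, three full-length names already blow past the one-second mark.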
I'm in the process of shortening my product's CEC name, in order to reduce congestion on the bus.
A mainline Linux CEC developer jokes that Voyager 1, which is on the other side of the solar system and was built decades before CEC, is faster than CEC.
Voyager is so far away that its estimated data rate is only a hundred or so bps.[0] So CEC is actually faster now than Voyager! New Horizons still has the edge with an estimated kilobit per second.
EDIT: I was partially correct. It seems Voyager can transmit at up to 1.4 kbps for certain things.[1]
That's a bit confusing; shouldn't it be the other way around? Because they're using the same antennas in both directions, but the power budget on the Earth-transmitting side is much higher.
(Asked in the spirit of curiosity; I'm ignorant of radio physics).
That's a factor, but I think it's more that there was never a need for large data uplinks to Voyager, while they needed to downlink large amounts of science data.
The uplink and downlink are in different frequency bands, S-band (2-4 GHz) for uplink and X-band (8-12 GHz) for downlink, and have different amounts of radio-frequency bandwidth allocated.
Additionally, Voyager's high-gain antenna has a much narrower beamwidth in X-band at 0.5°, compared to 2.3° for S-band, though I have no idea if that's a factor.
The first is that the downlink is at a higher frequency (X-band, 8.4 GHz), so there is more available bandwidth, and the beamwidth of the antenna is tighter at a higher frequency (i.e. the power is more focused). The uplink is at 2.3 GHz, but part of the advantage of the larger beamwidth is that you are able to command the spacecraft if there is an anomaly and the high-gain antenna is more off-pointed. There is also a low-gain omnidirectional S-band antenna in case it is really off-pointed.
Although the signals both come through the high-gain antenna, there are two different feeds for X-band and S-band. If you take a look at the picture of the antenna on Wikipedia [0] you can see the X-band feed is at the prime focus behind the subreflector. The S-band feed is mounted on the subreflector, so it's not seeing the optimally focused signal (the lower frequencies are less sensitive to this loss of focus, for the reasons above).
Generally a lot of spacecraft are designed with a much higher downlink to get science data down, but you only need a pretty small link to get commands up to the spacecraft.
I guess this would be one of those cases where you'd want to use the most aggressive compression algorithms in the history of mankind, almost regardless of compression/decompression time, because you'd definitely make overall transmission times faster.
Of course, this wouldn't help Voyager, but it would help similar modern projects, since we can throw vastly more computing power at these kinds of projects these days.
> CEC, which is used mainly to power TV on/off or switch sources
CEC is much more than that: it allows the use of a single remote instead of two or more, which is super handy. The TV remote sends its commands through the HDMI cable to the active device, so if you switch from the TV to an external source like a PVR or media center using the TV remote, and that device supports CEC (Kodi on a Raspberry Pi does), you can control it seamlessly with the TV remote. I recently switched my media center from a Raspberry Pi 4 to an older (but much faster) unlocked Chromebox whose video chipset doesn't support CEC, so I had to connect an external USB-to-CEC adapter. Unfortunately there is only one manufacturer of such an interface, which means it's not that cheap.
https://www.pulse-eight.com/p/104/usb-hdmi-cec-adapter
Some surprising companies have not completely figured out the correct way to use CEC either.
- Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.
- Nintendo: The Switch's switch-input command when waking from sleep is unreliable at best.
- Microsoft: Surprisingly the worst offender. The only switch-input command it ever sends is when it wakes from user-induced sleep. If you let the console go to sleep on its own, then wake up the controller and press the big Xbox button, it never sends a CEC command.
Sony seems to get this right: my PS4 can turn on my TV and switch the input source when it wakes up, but it is also clever enough not to do so if you wake the PS4 via Remote Play (where you are streaming the output to some other device).
However, I think that the issue is that CEC support has never been 100% reliable and interoperable. My TV is also made by Sony, which likely helps with compatibility. Probably all these devices work well with some TVs and not others, despite everything claiming to support CEC.
The PS5<>LG interaction is fucked. The TV will wake up all its devices when one of them does, and the PS5 will grab the input even if it wasn't the initiator.
You can disable this in config, but it's all-or-nothing: now the PS5 can't wake the TV at all.
Attempting to report the misbehavior to LG and Sony is like shouting into the void. They don't care, or it's the other entity's problem, or both.
> Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.
I think it depends on the equipment combination, and its state.
I'm able to send an Airplay stream to my AppleTV and it both turns on my LG television, and switches the input to the AppleTV.
However, if the TV is already on, and I send an Airplay stream to the TV it will not switch the input. Which I think is an OK measure if you have kids or roommates or the TV is in a public setting so that someone might hijack your viewing as a joke or to be annoying. My memory is that this used to be possible with an earlier model AppleTV, or an earlier version of iOS.
That said, ever since I attached a $23 no-name Chinese DVR to another HDMI input on the same LG TV, the AppleTV no longer has the ability to wake or change inputs. Very strange.
At least for my TV, which is some cheap Element Amazon TV from 2017, my Apple TV has full CEC. It will change input to Apple TV whenever I interact with it, and can turn the TV on directly. So it has to be how the TVs implement things. I've wiped my TV firmware back to 2017, so it's possible newer TVs intentionally break their CEC features to encourage you use their own (bad) smart features?
> it's possible newer TVs intentionally break their CEC features to encourage you use their own (bad) smart features?
In my case, the TV is from 2015.
I know because my wife and I just happened to be in an electronics store when the Pope was visiting the United States in 2015, and it was on all of the televisions.
At the time, we had a 32-inch TV from 2003ish, and when my wife saw the Pope on the 65-inch screen in the store, she immediately grabbed a salesperson and said, "I want to see the Pope like that."
So, in case you've ever wondered if the programming on a TV in a store matters, the answer is "sometimes."
> - Apple: When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.
I have almost the opposite problem. Any time I turn on my TV and receiver to use another device (often my Nintendo Switch), the Apple TV somehow wakes up and then switches my receiver's input to itself. (My receiver is my HDMI switcher and has a single HDMI output to my TV.) This is a very annoying problem, and is the only reason I still need to have the receiver's remote nearby.
Another unrelated but incredibly annoying problem I have with this whole setup is that it's essentially impossible to play music through my receiver with my AppleTV without the TV being on. Turning off the TV will, of course, turn off the receiver too.
I have the same issue with my XBox: It gets turned on by the TV (without me wanting this), but fails to turn on or off the TV (which would actually be useful).
I can disable CEC on the TV, but of course I want the TV to turn on when I wake up a device. I just want to be able to turn off the TV and have it not turn off the receiver. Picky, I know, but I gathered that we’re deep in “first world problems” here.
Sounds like there is a market for a CEC firewall box. Something $1-ESP8266-based that you could configure with a phone/laptop over the air in a visual way (webpage) to pick and choose what can pass in which direction, maybe even with the ability to add simple scripts triggered by particular messages. ~$5 BOM, ~$15 retail.
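A sketch of what the rule table for such a firewall could look like (the opcodes are real CEC opcodes; the direction names and rule format are made up for this illustration):

```python
# Real CEC opcodes (from the CEC spec); everything else here is invented.
IMAGE_VIEW_ON = 0x04   # wakes a display
STANDBY       = 0x36   # puts devices to sleep

def make_filter(rules):
    """rules: {(direction, opcode): allow?}; anything unlisted passes through."""
    def allowed(direction: str, opcode: int) -> bool:
        return rules.get((direction, opcode), True)
    return allowed

# Example policy: stop the TV from waking downstream devices,
# but still let those devices wake the TV.
allowed = make_filter({("tv_to_device", IMAGE_VIEW_ON): False})
print(allowed("tv_to_device", IMAGE_VIEW_ON))  # False (blocked)
print(allowed("device_to_tv", IMAGE_VIEW_ON))  # True  (passes)
```

The all-or-nothing CEC toggles people complain about upthread are exactly what a per-opcode, per-direction table like this would fix.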
> When you start an Airplay stream on your phone directed at an Apple TV, the Apple TV doesn't send a switch input command.
Oh? My Apple TV 4K (2021) turns on the TV and receiver when I start AirPlay, and the receiver switches the input to the AppleTV if it wasn't already on it. This works whether the TV+receiver were on or off. My TV is a Sharp from ~2013, and the receiver is a Yamaha also from around the same time.
I haven’t tested this enough to know if it’s a real hidden feature, or just me placebo-effecting myself, but I swear it takes way less time to wake up my TV (Samsung) if I press the little TV button on the ATV remote 5x really fast.
Like if I press it once, the ATV wakes up and usually the TV does, but often showing the wrong source. 5x fast consistently wakes the TV to the right input.
Google got it surprisingly well with the Chromecast with Google TV, it can send a power on and off, switch input and even control the actual TV volume through CEC.
I can change the volume from another floor (I tried out of curiosity), and it also controls the TV volume through the Android TV Remote. My phone doesn't have an IR emitter.
Some years ago I set out to control my smart TV from my laptop. Unfortunately, the graphics card (integrated or discrete) must also support the CEC spec in order to be able to send commands to the TV; having an HDMI cable is not enough. My laptop did not meet that requirement.
Also, I don't know if it has changed since (I no longer have a TV), but on my last TV, HDMI-CEC was disabled by default and also renamed with one of those stupid marketing feature names TV manufacturers are so capable of, like SmartLink or some shit like that. BTW, it was an LG TV from before WebOS.
My Switch's CEC is as flaky as yours, but my Xbox One X does wake up the TV from sleep no worries even if I let it go to sleep, last I checked. Though now I really want to double check this, and see whether it's the Xbox that has messed it up, or the Xbox + TV manufacturer combo?
My Apple TV 4K wakes my TV and to the right input with Airplay though!
I solved the switch input issue by using an automatic HDMI switch from Aliexpress. This thing is remarkably reliable at switching to any source that starts emitting a signal.
I don't know if there are switches capable of turning the TV on and off based on the presence or absence of a signal, but that could be the ultimate solution to this issue.
I hadn't noticed that with my Xbox, but then again I may actually have the sleep timeout turned off...
I'm surprised that they'd get it wrong though given that they've properly supported the HDMI VRR and ALLM features since the Xbox One X. (AFAIK Sony doesn't have either of those, even on the PS5. And with Nintendo it's surprising they have any CEC support at all)
Yes, CEC can do much more, but IMO the basic requirement is to power the TV on/off properly and control volume, so that I can use whichever HDMI device's remote alone. Everything else should be extra.
But yes, it can do much more: you can program recordings over CEC! You can directly select a broadcast service (i.e. not just through key presses, but actually saying which channel you want)! You can display a message on the TV's OSD! You can discover another device's language! You can make audio 0.1% or 0.01% slower (or 1% and 0.1%? I don't remember).
I don't think I've ever come across a device that supports any of the commands I just mentioned, however.
Though "display some message on the TV's OSD" from the CEC bus would be perfectly on point given TFA.
I even programmed my own overlay (app selector) for rpi that listens to CEC commands, so I can use the remote's arrows. Convenient so that I don't need to reach for mouse/keyboard/ssh/phone to start something on the rpi.
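For the curious, the dispatch such an overlay needs is tiny. A sketch using the real CEC UI command codes for the navigation keys, but with made-up handler names:

```python
# CEC <User Control Pressed> UI command codes for navigation (real values
# from the CEC spec); the handler names below are invented for this sketch.
UI_CODES = {0x00: "select", 0x01: "up", 0x02: "down", 0x03: "left", 0x04: "right"}

def on_key_pressed(ui_code: int, handlers: dict):
    """Map an incoming UI code to a registered handler, if any."""
    action = UI_CODES.get(ui_code)
    if action and action in handlers:
        return handlers[action]()
    return None  # unmapped or unhandled key: ignore

handlers = {"up": lambda: "move_highlight_up", "select": lambda: "launch_app"}
print(on_key_pressed(0x01, handlers))  # move_highlight_up
```

On a real Pi the codes would arrive via libcec's keypress callback rather than being fed in by hand.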
That USB-to-CEC passthrough box is $44.92, for reference. I do wonder what chipset it's using under the hood, but given the combination of HDMI licensing/royalty fees, manufacturing tooling, the little custom enclosure (looks like molded plastic, if I'm using the right term), the low production volume, and the fact that this seems to be a smaller operation... I think it's quite plausible the margin on the thing is quite small. Very plausible.
While on the page I was curious whether the firmware was open source. While writing this and properly articulating the bit about HDMI royalties (and also just now realizing it might need to deal with HDCP... ah), I realized the question is moot, but I didn't think of any of that while looking at the webpage, so I wondered if it would be interesting to have a cursory poke at the firmware download.
7-Zip (also available as a command-line program, p7zip-full on Debian et al.) is honestly grossly underrated in its capabilities. I didn't feel like remembering how to invoke binwalk, so I tried `7z x firmware-v7.exe` almost as a bit of a challenge, and it actually worked :)
$ 7z l ~/Downloads/firmware-v7.exe
Path = /home/i336/Downloads/firmware-v7.exe
Name = WEXTRACT.EXE .MUI
OS Version = 6.1
Image Version = 6.1
Subsystem Version = 5.0
Subsystem = Windows GUI
DLL Characteristics = Relocated NX-Compatible TerminalServerAware
Comment = FileVersion: 9.0.8112.16421
FileVersion: 9.00.8112.16421 (WIN7_IE9_RTM.110308-0330)
CompanyName: Microsoft Corporation
FileDescription: Win32 Cabinet Self-Extractor
----
Path = .rsrc/RCDATA/CABINET
Type = Cab
Physical Size = 1348694
ID = 7086
Date Time Attr Size Compressed Name
------------------- ----- ------------ ------------ ------------------------
2016-04-13 13:05:12 ....A 184800 flash.exe
2012-08-29 16:19:56 ....A 619536 driver1.exe
2012-02-21 04:24:52 ....A 816224 driver2.exe
2012-07-31 15:35:30 ....A 400 flash.cmd
2012-03-10 11:27:28 ....A 67008 libusb0.dll
...
It output even more lines of metadata than I included. I removed probably 50% of output so as not to spam the subthread.
Awesome.
Hmm... what if I do the same to `flash.exe`?
$ 7z l flash.exe
...
Linker Version = 10.0
OS Version = 5.1
Image Version = 0.0
Subsystem Version = 5.1
Subsystem = Windows CUI
DLL Characteristics = Relocated NX-Compatible TerminalServerAware
Date Time Attr Size Compressed Name
------------------- ----- ------------ ------------ ------------------------
...
..... 67392 67392 .rsrc/RCDATA/101
..... 2496 2496 .rsrc/RCDATA/102
...
------------------- ----- ------------ ------------ ------------------------
2016-04-13 22:05:07 183568 183568 9 files
HMMmmmm, RCDATA you say?
Extracting (`7z x ../flash.exe` from an empty subdir) produced newline-less ASCII hex output ("EF19541A19D258B099B877545DE0B65BBBC5602..."), so after a bit of `printf "$(cat 101 | sed 's/../\\x&/g')" > 101.bin` I came up with... nothing.
Binwalk has no idea what it is. Perhaps the start of the files ring bells for some humans?
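For what it's worth, the hex-to-binary step can be done in a single call with Python's `bytes.fromhex`, which avoids the printf/sed dance (the string here is a stand-in for the contents of the extracted `101` file):

```python
# Stand-in for open("101").read().strip(); the real file is ~67 KB of ASCII hex.
hex_text = "EF19541A"

# bytes.fromhex converts ASCII hex straight to raw bytes.
blob = bytes.fromhex(hex_text)
print(blob.hex(" "))  # ef 19 54 1a
```

Same result, no shell quoting hazards, and it raises cleanly on malformed input instead of silently mangling it.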
For what it's worth, having realized this might be touching HDCP or for all I know might be an FPGA bitstream, I emphasize my interest is purely coming from a decidedly non-exhaustive sense of "ooh that's a cute product, I wonder what firmware it runs and how it broadly works internally" entirely made up of idle curiosity :)
HDCP is unrelated to the CEC bus. CEC is completely separate and unencrypted. The pulse-eight adapter, AFAIK, should just be a female-female connector with two pins plugged into a microcontroller.
I've toyed with the idea of replicating its functionality on an Arduino, but ended up just buying the adapter :)
In the end, I'm not sure how doable it is, but it shouldn't be too complex. Licensing is another matter.
I'm at a loss as to why GPU makers don't wire it up though.
I see (TIL a lot about CEC, added some info to a sibling comment).
I incidentally found https://hackaday.io/project/168696-cec2usb while poking around, last updated a couple years ago so probably not available for sale anymore, but open source at least.
Unpacking the files also produces a certificate, and running the strings command on the flash.exe executable reveals references to various related links, so it is possible that the firmware is also encrypted and signed.
---
http://s2.symcb.com0
http://www.symauth.com/cps0(
http://www.symauth.com/rpa00
http://s1.symcb.com/pca3-g5.crl0
SymantecPKI-1-5670
Symantec Corporation1
Symantec Trust Network100.
Symantec Class 3 SHA256 Code Signing CA
---
The Intel NUC exposes the CEC pins on the motherboard, and the same company makes an internal USB/CEC adapter for $15 less[1]. That PCB is almost entirely an MCU and a crystal, so the plastic box and HDMI passthrough add $15.
The listed photos are just at the wrong angle to be able to read the chip markings (grr)... but I had a bit of a further poke around, and found a listing for a discontinued internal board for HTPC (!) setups: https://www.pulse-eight.com/p/117/internal-hdmi-cec-adapter
If you end up buying the product, Wireshark can capture USB data, and then you'll be able to see exactly what is sent to it when you use their flash util. On Windows you'll also need to install USBPcap to capture USB traffic, but if I remember correctly it is bundled with Wireshark and just unchecked by default.
Otherwise you might try binwalk with the --disasm option (you'll need capstone installed for it to work); it will then attempt to search the files for assembly for any of a wide range of processors (this would obviously fail if the firmware is encrypted, but I doubt a cheap product like this would bother). I'll also just briefly say that even though RCDATA is Microsoft's recommended way to embed a file into an executable, I've seen plenty of software that embeds files in other creative ways, so I wouldn't rule out the possibility that the firmware is somewhere else in the executable; binwalk's entropy-analysis mode can sometimes locate firmware that is otherwise hard to find.
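The entropy heuristic behind that binwalk mode is simple enough to sketch by hand: compressed or encrypted data sits near 8 bits/byte, while padding, text and most code sit well below it.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; ~8.0 suggests compressed/encrypted data."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

print(shannon_entropy(bytes(range(256))))  # 8.0 (every byte value once: maximal)
print(shannon_entropy(b"\x00" * 256))      # 0.0 (constant padding: minimal)
```

Slide that over a file in fixed-size windows and a plain firmware blob shows up as a plateau somewhere in the 5-7 bits/byte range, distinct from both zero padding and high-entropy ciphertext.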
You sound like you know a lot about the HDMI CEC spec. Do you have any insight as to why CEC implementations are often buggy on so many devices? E.g. "one touch play" turning on unintended devices in addition to your display and receiver.
First, I have to plead guilty: my own product's CEC still has many "simple" flaws.
That being said, I indeed have the feeling that CEC is specifically badly implemented by most people.
I feel like the standard is rather under-specified (or there are things I don't understand): it says how some kinds of devices are supposed to behave in response to some commands, but gives no explanation as to how other kinds of devices are supposed to behave on those same commands.
For instance, it is specified that a Player (there are various types of devices in CEC, more on that later) can send volume commands to an AVR (which is a different kind of CEC device; there can be only one in the whole network). More specifically, a Player can send an AVR VOL+ and VOL- key presses to change the volume. What happens if you send VOL+/VOL- to a TV? That wasn't specified until CEC 2.0 in 2016. So you can't control the volume on most TVs if you don't have an AVR.
The number of "slots" available per device kind is fixed: exactly one TV, one AVR, three Players, four Tuners, three Recorders. You have a tvbox and three game consoles (all of which should be Players)? Well, someone will have to lie and become a Tuner or a Recorder, or they won't be allowed on the bus.
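For illustration, the claiming logic roughly works like this (the candidate addresses per device kind are the real CEC logical addresses; address 15 is "Unregistered"):

```python
# Real CEC logical addresses per device kind; allocation logic is a sketch.
CANDIDATES = {
    "tv":       [0],
    "recorder": [1, 2, 9],
    "tuner":    [3, 6, 7, 10],
    "playback": [4, 8, 11],
    "audio":    [5],
}
UNREGISTERED = 15

def claim(kind: str, taken: set) -> int:
    """Take the first free logical address for this kind, else fall back to 15."""
    for addr in CANDIDATES[kind]:
        if addr not in taken:
            taken.add(addr)
            return addr
    return UNREGISTERED  # no slot left for this kind

taken = set()
print([claim("playback", taken) for _ in range(4)])  # [4, 8, 11, 15]
```

The fourth Player gets 15, which is exactly the "lie or stay off the bus" situation described above.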
Another thing that makes this messy is that CEC needs to fit in a small power budget during suspend. For instance, in Europe, the legal power budget in sleep is 0.5 W. This means that your main application processor can't handle CEC in suspend. Usually this leads to multiple "concurrent" CEC stacks running on different CPUs, switching from an always-on Cortex-M to a full-blown Cortex-A (and then you add Android TV on top of that with its own CEC stack, and you get three CEC stacks co-working). Often the communication pipeline between those stacks is pretty light, and going through all those layers, you might end up losing the information of whether the wakeup instruction came from your remote (so you legitimately want the TV to wake up to you) or from CEC (so you want to let the TV decide the output).
I believe one gigantic factor is that CEC started out very poorly (no matter the reason), and since then, interoperability problems have been considered by most QA teams as "yeah well, this is life".
Back specifically to your question about "one touch play" turning on unintended devices in addition to your display and receiver: this is very, very usual. I don't know what the spec says precisely on the matter, but here's what I witnessed on many TVs (many enough that I expect this to be the standard behavior, but it's possible it's related to what I said earlier about dual stacks):
- Say you have devices connected on HDMI1 and HDMI3, you suspended the TV on HDMI1, and you wake up from HDMI3.
1) On wakeup, the TV sends Set Stream Path to the previously selected device. If you suspended your TV on HDMI1 and you're waking up from HDMI3, the TV will start with a Set Stream Path saying "hey, the screen I'm currently displaying is HDMI1".
2) The HDMI1 device listens, and wakes up. In the process of waking up, it needs to tell the TV it is ready with an ACTIVE_SOURCE command, so it does.
3) If HDMI3 is in the "clever" range, HDMI3 will send again "Please TV switch to HDMI 3" with ACTIVE_SOURCE
4) Even if HDMI3 isn't in the "clever" range, the TV will later send Set Stream Path to HDMI3, because it remembered HDMI3's command to wake up to it, or thanks to -3-
5) Everyone's happy, TV's on HDMI 3
6) The message sent in -2- finally manages to reach the bus more than one full second later, because HDMI1 is less aggressive on the CEC bus than the others.
... And there you go, TV switches back to HDMI 1.
Fixing this is possible; HDMI1 "just" needs to cancel -2- when seeing -3- or -4-, but most CEC implementations' send_pkt doesn't include cancellation signals. So it's possible to make better CEC implementations (though that requires mechanisms that are pretty complicated for the embedded world), but I don't think it's possible to make a perfect implementation that will never miss.
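The race above can be replayed with a toy model of a TV that simply honors the most recent ACTIVE_SOURCE it sees (the message names are real; the single-list "bus" and timing are obviously simplified):

```python
def replay(messages):
    """messages: list of (sender, opcode) in bus arrival order.
    Returns the input the TV ends up on: last ACTIVE_SOURCE wins."""
    active = None
    for sender, opcode in messages:
        if opcode == "ACTIVE_SOURCE":
            active = sender
    return active

bus = [
    ("TV",    "SET_STREAM_PATH"),  # 1) TV restores its last input (HDMI1)
    ("HDMI3", "ACTIVE_SOURCE"),    # 3) the device you actually woke up from
    ("TV",    "SET_STREAM_PATH"),  # 4) TV switches to HDMI3
    ("HDMI1", "ACTIVE_SOURCE"),    # 6) HDMI1's stale wake-up reply lands last
]
print(replay(bus))  # HDMI1, i.e. the TV ends up back on the wrong input
```

The fix described above amounts to HDMI1 dropping its queued ACTIVE_SOURCE once it sees someone else claim the screen, which is exactly the cancellation hook most send_pkt APIs lack.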
At times, I regret that DisplayPort doesn't have something comparable (it has data channels, but nothing "as well specified" as CEC for controlling other devices). I think you explained part of the reason: if it were specified, it would require careful consideration, and probably a conformance test suite. I'd still fully expect manufacturers to try to re-brand it, extend it themselves, and generally botch their software like they do most of the time.
Also in play is that some parties have patents on certain kinds of control systems using CEC, so just implementing something that seems useful can get a device manufacturer in legal trouble :(
Wow this is pretty detailed, and makes a lot of sense. It's a shame the standard does not seem more well thought out.
One solution I've always felt would be handy would be an option for devices to ignore any CEC commands that would normally cause them to wake. As it is right now, everything is so buggy between my LG TV, Yamaha receiver, and "players" that I have to disable the feature entirely on either my PS5 or Apple TV in order for things not to go haywire.
mine works decently, except something seems to often be sending an errant ‘on’ signal to my a/v receiver: i hit the power button (on any remote) and everything turns off (as i want), only for my receiver to wake up 1-5 minutes later (my dvd/blu-ray player also comes on for a second, but then turns off automatically). super annoying.
In my experience it doesn't even do that very well. I tried getting a Raspberry Pi to power off an attached display, and support for on/off seems to be hit-or-miss and device-specific.
I suspect it isn't supported because it would commoditize the device, when manufacturers really want their own protocol that connects only to their own products.
Much better sales of Acme products if an Acme TV can only connect to an Acme soundbar and an Acme DVD player using Acme protocols.
I wish they'd made CEC a little more powerful. I like that I can control my AV receiver's volume with the TV remote, but I don't like that I can't see what the volume is actually set to on the TV. Even if I had a receiver that could overlay onto 4K video, that wouldn't help for the cases where I'm feeding audio back from the TV to the receiver.
It's ""fun"", because there is definitely a CEC command for the TV to learn the AVR's current volume.
But I can't blame TV manufacturers for not implementing it: when sending VOL+/VOL- commands you're already stretching the CEC bus pretty thin; you can't afford to spam the bus with more commands...
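For reference, the command in question is <Report Audio Status> (opcode 0x7A, if I recall the spec correctly), which packs mute and a 0-100 volume into a single byte:

```python
def parse_audio_status(status_byte: int) -> dict:
    """Decode a CEC <Report Audio Status> operand:
    bit 7 = mute flag, bits 0-6 = volume (0-100, 0x7F = unknown)."""
    return {"muted": bool(status_byte & 0x80),
            "volume": status_byte & 0x7F}

print(parse_audio_status(0xB2))  # {'muted': True, 'volume': 50}
```

So the TV only needs one extra byte on the wire per update, but as noted above, even one extra message per volume step is a lot on a 400 bps bus.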
Uh, oh. The work project I'm working on uses an OLED display that uses an I2C interface. And I happen to have a spare one sitting on my desk. And I'm no electrical engineer, so let's see how fast I smoke this thing. :-)
Should I succeed, the TFA author will no longer have the worst HDMI display ever, as the one we're using is a two-line display looking something like this (it can do rudimentary graphics, though; we display a company logo at boot):
OT: I'm idly curious why that display (photo: https://www.mouser.com/images/vishay/hd/O020N002ALPP5N000A_t...) has what appears to be a detached single vertical column of presumably-unusable pixels on each side of the active area. Obviously something to do with manufacturing, but what?
(I'm also curious about the "Pricing: Request a quote" bit. Doesn't seem to be "not for new designs", but does have a bit about "Factory special order", so perhaps this is just an indication of unusually limited supply or something.)
Here's my reckless guess: these matrices are manufactured in a continuous strip and cropped at the appropriate length, then the interface is attached to the cells and it's sealed in its case. That would allow the factory to produce varying sizes of display.
If you put an OLED display behind dark plexiglass it looks really nice -- the border of the display becomes invisible and you see just the letters (or whatever else you are displaying).
I'm not part of the team that makes such decisions, being a lowly software engineer that never got any good at EE. However, as one who was at least in the room when such decisions were discussed, I believe the decision-making process consisted of "OLED would be way cooler, and they're not that expensive these days". I'm pretty sure that is the sum total of that "engineering" decision. :-)
But I will say that the OLED looks a lot better than most LCDs of that type that I've seen.
EDIT: oh, yeah; sibling comment says something about backlighting an LCD, which might have had something to do with it, as the OLED replaces a seven-segment LED display from the previous generation (which, duh, also doesn't need a backlight).
At those tiny sizes (and in small quantities), the price difference is minimal, but commonly available OLEDs and LCDs often come in different sizes. So the choice may have been more about dimensions than display type.
I can recall back in the early 2000s, we had a wave of people wiring up little character LCD displays for hardware monitoring. They inevitably looked awful at angles, washed out backlight, etc.
The super-lucky people could get a VFD display, which was a million times more legible, but they tended to be spendy, fragile, and warm-running.
OLED gives you everything: cheap, pretty cool running, and easy to read.
I've been experimenting with one of the Digole OLED modules to recreate what I wanted 20 years ago. :) (super-easy to interface to USB and program-- we've come so far!)
Not the OP, but small OLED displays are pretty low cost, and they don't need backlight management because the display is emissive. Viewing angles are usually nicer too.
I love this. What is it about pointless technical projects that is sometimes so alluring? I wonder if it's the removal of secondhand stress, since there are no 'meaningful' success criteria.
Yes, I think removing the "must do a thing" component confounds macro-scale sensible classification and measurement of the discrete work that is being done. It's possible for manglement to see that effort is being invested, but the significance of the result cannot be perceived due to lack of resonance with the engineering mindset. This makes it possible to personally take ownership of engineering agency and may be the core reason engineers survive at all because they are able to take and own personal responsibility for their own learning.
Knowing it is not possible for others who don't "get" what we're doing to usefully measure or judge our work in turn disengages the "do thing in anger" stress associated with captive/acute focus, and (in ideal, spherical-cow-like situations providing infinite time) enables unbounded, open-ended introspection into reinforcing the mental solution-finding capacity within the domain in question. (In practice, infinite time would be quite harmful, as it would provide more space than our attention spans could fill; the practical ideal may be to find the right balance between work (acute focus) and zoning out, which might be trackable by identifying the precise points where our ego lags slightly behind, but is cognizant of, our as-yet unused mental capacity.)
Being able to engage in this introspection is critically important for learning: it's almost like dreaming, in the absence of any singular focus on finding an optimal solution to a given problem within a limited time frame. This makes it possible to pay attention to the problem-solving network as a whole, cross-reference and merge fragmented ideas that have developed independently, and drift toward the blurrier edges of understanding to help reinforce them (ever noticed how the things you instinctively find super interesting and really want to dive into, and often the itches you want to scratch, all depend on skills that happen to be right at the point of establishing minimum-viable cohesion and fundamentally clicking into place? We wander aimlessly... but we don't!).
Well, another "pointless" project this guy made eventually became a product: the flash synth. It's a tiny synthesizer that fits in a MIDI DIN-5 plug. I own one because I love small and weird synths. It actually sounds pretty nice!
Same here, I thought this was great. Even though I have been in the "computer field" for a loong time, there is still so much to learn! I love articles like this.
This design is devilishly clever, but a tad misleading. It's not really an HDMI display, notwithstanding that it is a display that plugs into an HDMI port. HDMI has an embedded i2c bus, and this project uses that to send images to the display. That's why you need a layer of software (xrandr) to translate the images the system produces into i2c.
If you took the time to build this, then travelled back in time to when people had standalone DVD players, and plugged this thing in expecting it to work, it would not kill my buzz: I'd find it quite entertaining to watch, really.
This isn't so much an HDMI display as an I2C display using the DDC channel in the HDMI interface as the I2C bus. I've seen folks use this on the Raspberry Pi as a generic I2C bus as well.
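For a concrete taste of what normally travels over that DDC channel: EDID blocks read from i2c address 0x50 are 128 bytes whose sum must be 0 mod 256. A checksum validator (fed a synthetic block here, since there's no real bus to read):

```python
def edid_checksum_ok(block: bytes) -> bool:
    """An EDID block is valid only if it's 128 bytes summing to 0 mod 256."""
    return len(block) == 128 and sum(block) % 256 == 0

# Synthetic block: 127 payload bytes plus a checksum byte chosen to balance
# the sum (a real block would start with the 00 FF .. FF 00 EDID header).
payload = bytes(127)
block = payload + bytes([(256 - sum(payload)) % 256])
print(edid_checksum_ok(block))  # True
```

On real hardware the same 128 bytes would come from an `i2c read` at 0x50 over those HDMI pins, which is exactly the bus this project hijacks for pixels.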
What great timing. I've rearranged my desk and have trouble fitting a second monitor next to my ultrawide + big speakers, and I've been looking for a smaller monitor.
This might be a tad too small, but worth considering!
20 years ago, we came across this tiny CRT display. I think it was 4 x 5 inches. We pranked a coworker with it, replacing her large display with this tiny thing.
And left a note explaining how the black and white display was an upgrade. :-)
I recently learned about evdi, an out-of-tree kernel module that can add as many virtual outputs as you want. Ideal for things like this, or where you'd normally use a dummy plug.
It's unfortunate it wasn't accepted in-tree because while it was initially developed for a specific commercial product, I do think it has the potential to be generically useful.
The ability to conjure up a display "in software" & then just have the rest of the system automatically treat it as any other physical display is pretty powerful.
There's a couple of projects on my long "projects to do" list that would make use of it...
I'm sure that those with a lot more hardware experience than me have the intuition to make guesses like this but I was surprised at:
>You have to register to download the HDMI spec which is more effort than I have for this, but the Hot Plug Detect pin has a pretty descriptive name. I guessed that this either has to be pulled up or pulled down to signal that a cable is connected. Sticking a 20K resistor to the 5V pin seemed to do the trick. With the oscilloscope, we can now see activity on the SCL/SDA lines when it's plugged into the laptop.
Is it really not that concerning to just guess what amperage will/won't fry something? I mean, you could test from very high resistances downwards, but you might overshoot, right? Or is there an assumption that the hardware on either end will provide the appropriate resistance, in this case?
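For what it's worth, the guess is less risky than it looks, because the resistor itself bounds the worst-case current regardless of what's on the far side. A quick back-of-the-envelope check with the article's numbers:

```python
# Back-of-the-envelope check for the 20K pull-up guess: with a resistor in
# series, the resistor caps the worst-case current, so "guessing" is bounded.
# Numbers below are the article's 5V rail and 20K resistor.

V = 5.0          # HDMI +5V pin
R = 20_000.0     # the 20K resistor used for Hot Plug Detect

I = V / R                    # worst case: the far side is held at 0V
print(f"{I * 1e6:.0f} uA")   # → 250 uA, harmless for almost any input pin
```

So the high-to-low resistance sweep isn't really needed: any sensible pull-up value keeps the current in the microamp range.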
Unrelated but wanted to ask here since this forum may have the knowledge.
Recently I bought a couple of those USB-C hubs: 1) a USB-C 4-in-1, 2) a USB-C 7-in-1 (Anker), to attach an external monitor with HDMI. When connecting one to a laptop, is there a way to retrieve details of all its capabilities and ports?
Also, what might be an easy, recommended way to start understanding how something like this works? To start, just sending a high/low signal to a data pin, either as a straight connection into an HDMI port or a USB port (and how about when going through a multi-port hub)?
As far as I know, it isn't really rendered by HW, but composited in HW.
GPUs usually have multiple "Hardware planes" that are composited together in hardware before sending the signal. The OS is free to put whatever it wants there. Most HW include a designated "cursor plane", but it could be used to display whatever.
IIRC a minimum of 3 HW planes is specified somewhere, it could be the Wayland protocol, or the Linux Direct Rendering Manager API. Some devices can have fewer, though. That would be the case for this display, which could use an actual framebuffer driver (although planes could be emulated).
You can thus have a layer for the background, one for the windows, and one for the cursor, and avoid re-painting too often, even though it should be pretty cheap for GPU-accelerated surfaces (like with compositors that rely on OpenGL).
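A toy model of what plane compositing does, in pure Python; nothing like real scanout hardware, just to show why a dedicated cursor plane means moving the cursor never repaints the layers beneath it:

```python
# Toy model of hardware plane compositing: the scanout engine overlays a few
# independent buffers (background, windows, cursor) per pixel. Planes here
# are dicts with a position and a 2D pixel array; None marks transparency.

def composite(width, height, planes):
    """Blend planes in order; later planes sit on top (like a cursor plane)."""
    out = [[0] * width for _ in range(height)]
    for plane in planes:                      # bottom to top
        px, py, buf = plane["x"], plane["y"], plane["pixels"]
        for row, line in enumerate(buf):
            for col, pixel in enumerate(line):
                if pixel is not None and 0 <= py + row < height and 0 <= px + col < width:
                    out[py + row][px + col] = pixel
    return out

background = {"x": 0, "y": 0, "pixels": [[1] * 4 for _ in range(4)]}
cursor = {"x": 2, "y": 2, "pixels": [[9, None], [None, 9]]}
frame = composite(4, 4, [background, cursor])
print(frame[2][2], frame[2][3])  # → 9 1  (cursor pixel over background)
```

Moving the cursor just changes the cursor plane's `x`/`y` for the next scanout; neither underlying buffer is touched.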
You don't need to go through all the steps involving HDMI. Linux can create framebuffers on I2C/SPI displays like this. Many years ago, I connected two to a Beaglebone and it ran Emacs just fine. Much device tree hacking is involved, though, unfortunately.
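For the curious, the device tree hacking is roughly of this shape. A hypothetical overlay fragment for a 128x64 SSD1306 on an I2C bus, following the mainline ssd1307fb framebuffer binding; check Documentation/devicetree/bindings for the exact property names on your kernel version:

```dts
/* Hypothetical fragment attaching a 128x64 SSD1306 OLED to an I2C bus so
 * the kernel's ssd1307fb driver exposes it as /dev/fbN. Bus label and
 * values are illustrative. */
&i2c1 {
        status = "okay";

        oled@3c {
                compatible = "solomon,ssd1306fb-i2c";
                reg = <0x3c>;
                solomon,width = <128>;
                solomon,height = <64>;
                solomon,page-offset = <0>;
        };
};
```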
HDMI is a pretty crazy protocol. CEC, Ethernet, and then on top of that a dedicated hot plug detect pin? Is all this really needed? Can't I2C do everything?
They could use a 1M pull down on SDA, and pull it up at the display side instead of the host side, and have that be hot plug detect, and then just use i2c for control as well.
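To make the suggestion concrete, here is a sketch of the electrical idea behind detecting the display from the line's idle level; all resistor values are illustrative, not from any spec:

```python
# Electrical sketch of the suggested hot-plug detect: the host holds SDA low
# through a weak 1M pull-down, and only a connected display adds a strong
# pull-up. The host reads presence from the idle line voltage.

def idle_voltage(v_pullup, r_pullup, r_pulldown):
    """Resistor-divider voltage on the line when both resistors are fitted."""
    return v_pullup * r_pulldown / (r_pullup + r_pulldown)

R_DOWN = 1_000_000.0                               # host-side 1M pull-down
no_display = 0.0                                   # nothing pulls the line up
with_display = idle_voltage(3.3, 4_700.0, R_DOWN)  # typical 4.7K I2C pull-up

print(f"{with_display:.2f} V")  # → 3.28 V, well above any logic-high threshold
```

Because the pull-down is ~200x weaker than the pull-up, it barely loads the bus once I2C traffic starts.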
Ethernet is reasonable, but why not just have audio over Ethernet?
Or for that matter, USB2.0 instead of ethernet, and then you can have a hub on the monitor, a webcam, Ethernet over usb if you want, and audio return over usb?
My pet theory for why 100Mbit-over-HDMI never saw traction is that, apart from Linux/BSD, no mainstream OS had reasonable (if any) support for meshed Ethernet.
I always hoped for a future where all devices could, out of the box, intelligently decide which of the available interfaces (Ethernet, Wi-Fi, HDMI) to send data over, per peer device. It would substantially reduce Wi-Fi congestion in lots of (media center) installations.
Same with CEC: "Windows doesn't support it, so why add it to GFX-card hardware?" ARC: Same deal
"Not-supported-by-windows" has led to the death/non_implementation of lots of useful things/tech not only related to HDMI. E.g. the Per-Port-Power-Control spec part of USB is only implemented by some rare USB HUB(-Chip) vendors, and never tested for in any review. On supported hardware it works great with `uhubctl`. Ubiquitous 802.11s Wifi mesh support is in the same boat.
As I say, my "pet theory". Happy to learn specifics.
For how amazingly useful it is... networking is kind of hacktastic at the protocol level.
It's a bunch of layers that don't know anything about each other, and any even slightly unusual setup is a nightmare of manual configuration.
It happens to do exactly what people do with it very well, central web services and some limited LAN stuff on the same subnet.
I almost think things would be better off without the OSI model, if we were to start over. Just one CJDNS-like mesh, with native support for encryption, firewalling, pairing and discovery, all in one place, under one version number.
Everyone talks about the flexibility but I'm super not convinced things like 6LowPan are a good idea.
Just being able to say "This isn't IP and has no way to get on your network" is a big advantage of things like ZigBee, in a world where nobody really trusts this stuff.
Very cool hack indeed!
One thing I learned from this is that having an HDMI port gives you a free I2C port on top as well, as long as you have an old HDMI cable lying around to cut open.
Small note on using these OLEDs with 5V:
Typically they expect 3.3V Vcc and logic levels, although almost all of them seem to work just fine with 5V. In my experiments with a regular 5V Arduino some OLED modules made weird coil whining noises, I presume this is from the charge pump circuitry. Driving them with 3.3V as specified removed the coil whine completely.
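If you'd rather keep a 5V microcontroller but still meet the 3.3V logic spec, a resistor divider per unidirectional signal line (SPI-style; open-drain I2C lines need a different approach) is the usual quick fix. A sketch with common values, not from any datasheet:

```python
# Resistor divider to knock a 5V logic output down to ~3.3V for an OLED
# input pin. Fine for slow, unidirectional lines; the values are common
# choices, not from a datasheet.

def divided(v_in, r_top, r_bottom):
    """Output voltage of a two-resistor divider driven by v_in."""
    return v_in * r_bottom / (r_top + r_bottom)

v_out = divided(5.0, 1_000.0, 2_000.0)   # 1K on top, 2K to ground
print(f"{v_out:.2f} V")  # → 3.33 V
```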
I am on a flagship HP Omen gaming laptop (I arguably got the very first literal laptop shipped from the factory; have provenance, long story).
And this stupid machine ($3K) has a laughable casing (super chintzy, bends easily, not good) -- it has (3) USB ports, with (1) being SuperSpeed, the other two standard, and (1) USB-C port.
ZERO power out of the USB-C port.
I have purchased multiple USB-C "hubs" -- none have power, all require external USB power.
So imagine this: you buy a USB-C monitor, but your USB-C port provides no power. So in order to plug a USB monitor into the USB-C hub connected to your laptop via USB-C, you need a secondary cable through a regular USB port on your machine to power the hub's USB ports, thus consuming BOTH a USB port AND the USB-C port on your machine...
What a fucking design flaw this machine is.
The guts are all hyped around the 165Hz screen and the RTX... but the physicality of this machine sucks.
HP OMEN 15" 5800 RTX 165Hz machine... "gaming laptop"
I feel like cable companies that have found a market with overpriced cabling for high-end entertainment systems, like Monster and Denon, could make a killing on this.
This is not cheap, nor does it have a screen, but one of its many (many) uses is interfacing USB to UART with auto-detection of baud rate, voltage, and whether or not it's inverted: https://www.crowdsupply.com/1bitsquared/glasgow
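The baud auto-detection is a neat trick in itself. A sketch of the classic approach (almost certainly not the Glasgow's actual algorithm): time the edges on the line and treat the shortest edge-to-edge interval as one bit period:

```python
# Classic autobaud sketch: the shortest interval between edges on an idle-ish
# UART line approximates one bit time, so 1/interval approximates the baud
# rate, which is then snapped to the nearest standard rate.

STANDARD_BAUDS = [9600, 19200, 38400, 57600, 115200]

def guess_baud(edge_times_s):
    """Guess baud from a list of edge timestamps (seconds)."""
    shortest = min(b - a for a, b in zip(edge_times_s, edge_times_s[1:]))
    estimate = 1.0 / shortest
    return min(STANDARD_BAUDS, key=lambda b: abs(b - estimate))

# Edges spaced at multiples of ~8.68 us, i.e. a 115200-baud signal:
bit = 1 / 115200
edges = [0.0, 1 * bit, 3 * bit, 4 * bit, 7 * bit]
print(guess_baud(edges))  # → 115200
```

The catch is that the sampled data must contain at least one isolated single-bit pulse, which is why real tools watch the line for a while before committing.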