When HDMI 2.1 isn't HDMI 2.1 – the confusing world of the standard (tftcentral.co.uk)
232 points by cbg0 on Dec 13, 2021 | 181 comments



Quick summary: HDMI 2.0 no longer "exists", and HDMI 2.1-only features are now optional according to the certifying body. Manufacturers are supposed to indicate which features they support.

Whelp, I guess we should just stick to DisplayPort 1.4


I see they went to the USB school of standardisation


USB 2 Full Speed = USB 1 speed

This has been going on for 20 years


That was already called full speed in the USB 1 days though. There's a mode called low speed too, which was 1.5 Mbit and meant for low-throughput devices. Full speed was 12 Mbit. This way stuff like keyboards and mice didn't need to include the highest-end controller chips.

Calling it Full Speed was more a lack of vision than a deliberate attempt to confuse consumers as this HDMI thing seems to me.

So then when 2 and 3 came out they were forced to find superlatives. Hi-Speed, Superspeed.

The same happened with radio. High Frequency was up to 30 MHz. Then they found they could go higher. Very High Frequency, VHF. Then came more advances and Ultra-High Frequency.

Eventually they gave up after SHF and EHF and started using band letters :)

Using relative terms for constantly changing technologies is just a bad idea :)


The same people are the ones that decide to put _final/_latest/_etc in file names


The newest version of USB seems to have solved this by labeling ports with the supported speed (5, 10 or 20). Assuming they start doing the same with the cables, at least that part of the problem becomes tractable.


There is a reserved seat in hell for whoever allowed USB-C cables to have a max speed of USB 2 (480 Mbit/s).


USB SuperSpeed requires extra wires and much better shielding, which means a thicker, more expensive, less durable cable.

You also can't make SuperSpeed cables that actually meet the spec once they get past a couple meters.

If I just want to charge my phone or laptop, a High Speed cable is actually better.


USB-C itself has nothing to do with data rate at all. It's really the combination of USB-C and the USB 3.x standards that made things hell. It was supposed to be one single cable that does everything, but in fact I have far more cables that look almost exactly the same yet have dramatically different capabilities, and I have no idea which can do what.

But nonetheless, USB-C indeed introduced one issue that I never imagined: sometimes my phone decided that it should charge the charger instead of being charged...


I don't know, sometimes I just want a very thin cable that is flexible - for charging mostly. I bought a proper USB-C Anker cable for my phone early on, and replaced it literally within a few days as it was a proper big shielded cable that had zero flexibility, horrible for charging. I didn't care that technically it could do 10 Gbit/s, it just wasn't necessary.


In theory, the choice of USB-C or -{,mini,micro}A/B is orthogonal to the choice of version: there are USB-A and even USB-B connectors (with additional pins) that support USB 3.0 as well. There is a logic to this.

In practice, it is confusing.


micro-USB requires a different extended connector for 3.0 [1], and I don't think there is a mini-USB 3.0. USB-A is generally coded with a different color and the extra pins are clearly visible [2], especially when you hold a 2.0 cable side-by-side. So it is all quite clear except for USB-C.

[1] https://www.startech.com/en-us/cables/usb3aub15cms

[2] https://www.howtogeek.com/wp-content/uploads/2014/01/usb-2.0...


There was logic to it: you could have a completely passive cable with a type c connector that would interoperate.

Agree or disagree, it wasn’t completely absurd.


There are "charge only" micro usb cables too; I hate them.


I'd be fine with charge-only cables if there was a distinctive, mandatory, universally-honored way of indicating that.

I deliberately carry charge-only cables when I anticipate encountering untrusted chargers, and I've designated all mine with a band of red heatshrink on both ends.


These are useful for charging in situations where you can't trust the port. You can connect your device to an untrusted port in a wall with a charge-only cable.


If they have an A connector at the other end, they are easily identified by just looking at how many contacts it has.

These were made to save money but I found them handy for security reasons (no need for a USB condom).


USB condom. And now I know. Where can I buy a trustable one of these?


Pretty easy to validate: you plug your device into a trusted computer first and it doesn't recognize it.


For Type A chargers they are readily available online and can be tested using the approach mikepurvis wrote in a comment parallel to this. Or you can make yourself a clunky one, as it's just two wires.

I have seen them called “data blocker” or “secure charging cable” but what you want is one that is a female-to-male device so you can attach it to charging cables not just bricks.

Type C is harder. I have one from the early days of Type C that doesn’t do PD so in the end it’s only useful for phones. I haven’t seen one that does Power Delivery.


"full" was bad planning but I've never seen anyone advertise USB "full speed".

And I've never seen a USB 1 port or device labeled as USB 2. Is that even allowed?

So the problem hasn't been going on for 20 years.


I really wonder what is up with that. These standards become increasingly frustratingly complex even for people who deal with them daily.


It’s just design by committee and long-standing efforts for backwards compatibility. Also the people writing the standards are far too familiar with them and thus a bit lost when it comes to making practical decisions.

Whenever you make changes there will be compromises and someone will have reason to be unhappy.


There is no way that eliminating 2.0, in such a way that everything that used to rate as 2.0 now qualifies as conforming to some subset of 2.1, can be justified as backwards compatibility.


Cable manufacturers probably realized that the secret to profits lay in resisting commoditization, beat a path to the table, and made it happen.


I don't think it's particularly odd that the specifications are supersets of old versions; indeed that feels pretty common in the standards world. IETF specs are maybe the odd ones out, where you typically have to read like ten different RFCs to get a good picture of some standard.


A free market of fraud is going on; there should be a cost to lying.


LOL. Why don't you understand superduperspeed USB 3.4.6 2x2x6? It's so eeeasy.

And, coming soon to an Amazon app near you:

(Pack of 3) USB4 Thunderbolt 4, 98 ft / 30m, 240W charging, gold plated! $19.99! Free Shipping!

It's not like anyone is checking products are what they say they are.


You can't legally call your food a hamburger if it's 90% meat glue and 10% cow.

Can we not do the same with cables?

Whoever is responsible for setting standards must be getting some good kickbacks from all this...


The people writing the standards are also the ones implementing them. That's the kickback. That's why every USB 3.whatever device suddenly became a USB4 one.


The same is done with the Bluetooth standard. Manufacturers proudly claim their product uses Bluetooth 5.0, reviewers assume that means higher speed and lower power, but the products don't actually have to support all sections of the standard and can pretty much just be Bluetooth Classic with 5.0 slapped on the label.


Seems kind of reasonable. DisplayPort is exactly the same - just because your display supports DisplayPort 1.4 doesn't mean it is required to support 10 bit colour, VRR, etc.


I replaced my old Dell XPS laptop with a newer Dell laptop. The old one had been happily driving an external monitor through a thunderbolt cable (USB-C on the laptop side; DisplayPort on the monitor side.)

The new laptop still has a thunderbolt port proudly advertised in the specs. But I'm not allowed to use it. It won't send video data out that way. And when I called tech support, the most they would offer me for my newly-purchased laptop was "Don't use the thunderbolt port. Use the HDMI port."


I thought that the primary feature of Thunderbolt was DisplayPort signaling. What makes it Thunderbolt vs USB-C if it doesn't have DP support?


So Thunderbolt 2 included a DisplayPort signaling layer but (AFAIK; may be wrong) Thunderbolt 3 does not -- instead, the hardware must implement the USB-C Alternate Mode for DisplayPort (and in turn is also implementing the USB-C Alternate Mode for Thunderbolt; it's not USB, at least not in USB3).

Thunderbolt's main draw is being "PCIe on a wire".


My port is advertised as "Thunderbolt 3/USB 3.2 Gen 1 Type-C port with DisplayPort with alt mode". I guess the problem is that it can't speak whatever version the cable understands? The cable says "thunderbolt 3 compatible". And like I said, the same cable worked fine with the same monitor and a previous Dell laptop. :-/


That's crazy. I just got a new laptop from work, a Lenovo instead of a Dell, but the dock is still an older Dell Thunderbolt dock, and it refuses to work correctly with the laptop. My (3) monitors show up, but Ethernet and the USB ports all cycle in and out. I kinda expected it because the dock had a lot of problems even with Dell laptops. I eventually got it to work with mine, but I know it got so bad Dell stopped selling it pretty quickly. Now I need to shell out $400 to Lenovo for a new dock that will (hopefully) support 3 monitors at 1080p60Hz.

I feel like thunderbolt has gotten to the point that you could legitimately make a great side hustle on affiliate links from in-depth usb-c and thunderbolt accessory reviews.


That sucks - but I can attest that the Lenovo Thunderbolt dock is very solid (I have one of each generation).


That sounds like it should work, yeah. USB-C alternate modes should work over any TB3 cable (as far as I know). That's weird.


They won’t work over active thunderbolt 3 cables


Is that true? Hunh, I was unaware. So a TB3 dock with DisplayPort is handling the display protocol? Is there a standard driver on the OS side?


Not true, Thunderbolt 3 can carry up to 8 DisplayPort lanes, something USB-C alternate mode cannot do. Both can coexist.


Interesting - I must have misunderstood. Is there a good protocol breakdown of the USB-C to Thunderbolt changes and switch-over? (I assume there has to be, as USB4 brings them closer together?)


What model is it? So I know what to avoid. Thanks.


G15.


Well that makes sense. Your 1080p display shouldn't be forced to accept an 8k signal just because the interface supports it.


The first thing: this isn't the first time Xiaomi has done something like this. Earlier this year they launched a WiFi 6E router, and every single press outlet ran their PR without fact-checking or looking through the data. The router doesn't even support 6 GHz, along with a few other optional features. After hours of searching and reading through the spec, it seemed no one on the internet gave a damn, so I wrote to the Wi-Fi Alliance for clarification, and Xiaomi finally retracted their PR and relabeled the router as WiFi 6 only.

The second baffling thing is how HN reacts to USB-C and HDMI so differently. With USB-C / USB 4, more than half of HN expects consumers to know which cables and ports offer which features. It is the consumer's fault for not choosing the right cable or knowing what their port does. I don't see how that is different from HDMI here. All it takes is one bad actor in the market. And it will be a race to the bottom.

That is partly why Apple brought back MagSafe. You don't tell the user to go and find a USB-C 120W (or 240W) capable cable that may or may not be compliant to do quick charging.

You tell them to use MagSafe cable.


Weird, I haven’t seen HNers blaming the user for USB confusion. In fact I am sure I’ve seen articles here getting shared and upvoted which discuss the USB confusion and express sympathy for users while trying to disambiguate the situation


Yes, you get these article submissions, but you still see many HN comments defending USB-C as a non-issue or a user issue. It is the user's fault for buying the wrong cable. Just get the Thunderbolt cable.

For most of 2014 to 2019, USB-C was hailed as the silver bullet. The increasing backlash against USB-C is a very, very recent thing.


https://news.ycombinator.com/item?id=29445844 this was the last discussion I saw on HN about this, very much not hiding the complexity or laying blame.


You linked to one specific comment. If you zoom out and view the entire thread, starting with the first comment on that story, you’ll find plenty of comments defending the current state of USB.


Yes. And imagine: from 2015 to ~2019 I was one of the few who voiced those concerns, even before the Google engineers raised awareness of non-standard-conforming USB-C. I got downvoted every time, while 95% of the comments were in strong support of USB-C.

The only way to solve a problem is to first accept we have a problem. And we aren't quite there yet.


Here's the top few replies to the submission that comment is replying to:

- person saying they've only had a smooth experience with USB-C cables

- person acknowledging that this is complex and is grateful someone took the time to make the article

- someone lamenting that it's hard to know cable capabilities

- person asking for advice on some BIOS modes

- someone saying USB-C solved some problems but also created others

- someone annoyed about the amount of cables they have to have now

- someone wishing it was simpler to find out capabilities of cables

- someone saying they were probably lucky but they only ever had one problem with USB-C

I've seen a couple of "works for me, maybe read the packaging on the cables?" replies but honestly my impression is not that HN commenters are dismissive of how USB cables are confusing, but that many are themselves irritated with the state of affairs. If you were personally treated badly after expressing this opinion here then that sucks and I'm sorry, but you can definitely count yourself among the majority of people (myself included) on this site who wish it was a little clearer.


This, 100% :-). USB was always a prime example of how NOT to name things. The problem is that HDMI makes it even worse.


> The second baffling thing is how HN reacts to USB-C and HDMI so differently. With USB-C / USB 4, more than half of HN expects consumers to know which cables and ports offer which features. It is the consumer's fault for not choosing the right cable or knowing what their port does. I don't see how that is different from HDMI here. All it takes is one bad actor in the market. And it will be a race to the bottom.

With USB-C I get something out of the confusion - a really versatile cable standard. With HDMI there is no downside (to the consumer) in just calling HDMI 2.0 so.


LTT did some manual testing of HDMI cables [0] in hopes of answering the last question of this article, "how do consumers know if a cable supports v2.1 features?"

Does anyone know of other tests or more comprehensive data sets?

[0] https://linustechtips.com/topic/1387053-i-spent-a-thousand-d...


Has anyone ever seen a device that actually uses Ethernet over HDMI? The thought of being able to plug a single network cable into the back of your display and then anything plugged into that has a wired connection is lovely, but as far as I can tell absolutely nothing actually supports it, despite the ever growing set of internet connected devices sitting underneath people's TVs.


Ethernet Over HDMI is used by newer AV receivers to support eARC (enhanced audio return channel). The older ARC spec would work with any HDMI cable, but bandwidth limitations only allowed compressed 5.1 surround sound. eARC uses the higher bandwidth from Ethernet Over HDMI, allowing uncompressed 7.1 surround and Dolby Atmos streams.

(If you're not familiar with ARC/eARC, this lets the TV send audio from its native inputs back to the AV receiver over an HDMI cable. Without ARC, you need to plug everything directly into the AV receiver.)


eARC is neat in theory, but my experience with it has been that it's too unreliable and unstable to actually use in practice.

I even bought new cables to make sure there wouldn't be issues, but eARC audio regularly drops out in ways other sources (including regular ARC) don't. And when it fails there are literally zero tools for diagnosing it either.

Maybe around the time of eARC2 we’ll have something working as well as Bluetooth does today. (Yes, that’s me being snarky)


I've had good luck with eARC without any intermittent issues. Of course, I'm only using it because my TV has two EDIDs, one that allows 4K video for my streaming box, and one that allows audio for my Blu-ray player. So I have the streaming player connected directly to the TV, with audio going over eARC to the receiver instead of everything going through the receiver. I was pleasantly surprised that this works fine, but not surprised that it doesn't work for everyone.


I’ve had no problems with eARC when used together with my Apple TV and LG C1 TV - which I’m happy about because the Apple TV does not offer any other wired audio ports.


That's unfortunate. I've been hoping to simplify my HT setup and eARC was something I wanted to target in an upgrade


> The older ARC spec would work with any HDMI cable, but bandwidth limitations only allowed compressed 5.1 surround sound.

My 2012 Denon AVR-1913 won't do ARC with just any HDMI cable. According to the manual it must be an HDMI with ethernet cable and experiment has shown that the manual is correct.


Actually, if I understand correctly, eARC doesn't use HEC. It just repurposes the HEC wiring for something useful.


It uses the same pins, but it doesn't actually use Ethernet. I'm pretty sure it's IEC 60958/61937, like SPDIF.


I went down this rabbit hole the other night and found a German Blu-ray receiver, the T+A K8 [0] from 2012, that supports the HDMI Ethernet Channel. I have not found, however, the other piece of equipment, which I can only suspect may be some sort of HDMI IP injector.

[0](https://www.homecinemachoice.com/content/ta-k8-blu-ray-recei...)

> Ethernet switch: distribution of an Ethernet uplink connection to Blu-ray player, streaming client, TV monitor and up to 3 source devices (via HEC), up to 2 more external devices via LAN cable (e.g. playing console)

from the manual


My understanding is that Ethernet over HDMI is still used by consumer devices, just no longer for the original dream of switching wired internet, given the modern ubiquity of WiFi. More recent standards such as ARC [Audio Return Channel; used for a number of surround sound setups] and CEC [Consumer Electronics Control; used for passing remote/controller data between devices] both piggyback on the Ethernet pins, and I believe they entirely interfere with using the Ethernet pins as Ethernet (though maybe only in the available bandwidth/speed?).


I tried to use this once in a theatre to connect a camera watching the stage to a greenroom backstage. It worked sometimes, but was super unreliable. Latency was often several hundred milliseconds, and sometimes the image would just straight up disappear. It may be that we had bad HDMI<->Ethernet devices, but that’s the thing: It’s not a “works or doesn’t” kind of thing, it’s a “varies with the quality of all the devices in the chain” kind of thing.


Those HDMI-over-Ethernet devices have very little to do with Ethernet-over-HDMI.

Unless you really boosted the gain, they could really only do about 30m reliably. What you wanted to buy is HDMI-to-HDSDI and HDSDI-to-HDMI converters, which could do much longer over RG59 (and much longer over fiber).


This is a stellar example of how catering to everyone results in the destruction of a brand. “HDMI 2.1” will be with us for years and it’s essentially irrelevant now, and they aren’t willing to admit they were wrong, so their only hope is to release an HDMI 2.2 that is just “all of the optional features of HDMI 2.1 are now required”, which will cause howls of outrage and confusion. I’m guessing they are too captive to manufacturers to have the courage to do that. Oh well.


It wasn't an oversight to make the features optional. They're deliberately optional so device manufacturers aren't forced into a ridiculous all or nothing situation.


It's fine that they're optional but the specs for a device that doesn't support everything really should read:

HDMI 2.0+{the specific newer things supported}


At a certain point I wonder if you just declare both 2.1a and 2.1b with different required subsets of the "full" 2.1 functionality. Or, like, 2.1 requires only the basics and you simultaneously declare 2.2 which has the extra stuff. Remember 802.11 b/g/n routers, and how they're up to, like, ax now? Slightly different situation, but I'd call it a reasonable demonstration that consumers can handle there being four or five different versions of a single standard, as long as those versions are narrow, clearly defined, and well communicated.


I think they would have been better off forcing manufacturers into that situation, and that the feature list has grown so large that it’s no longer a sensible specification for purchasing decisions, which will erode consumer trust.


I see it as a problem with the abstractions being on the wrong layer.


It's kind of going in that direction with HDMI and DisplayPort over USB 3. But.. not exactly because they're not actually encoding video data into USB packets. It's too difficult to have abstractions like that when you're dealing with such insanely high data rates.


The parts that are necessary to make a screen run at full capacity very much shouldn't be optional. Like taking input at native resolution, native bit depth, max native framerate, no chroma subsampling.


AND manufacturers not wanting to be stuck with old branding on devices.

The standard could have made some things optional. But they made everything optional.


> They're deliberately optional so device manufacturers aren't forced into a ridiculous all or nothing situation.

That's so anti-consumer.


HDMI has always been a shitshow. I spent years of my life playing whack-a-mole for a major set top box manufacturer because of incomplete or braindead implementations. CEC added a whole 'nother layer of shit to go wrong. There is a reason CE mfgs go to "plug fests"(https://www.cta.tech/Events/CTA-861-PlugFest) instead of being able to trust their implementation will work with other devices as long as they follow the rules.


I'm still living this dream today. It's amazing how terrible brand spankin' new "hospitality" (hotel) TVs are when it comes to poor implementations. It is still whack-a-mole and often has to be compensated for on the set-top side because the TV manufacturers are even worse at firmware.


Whoa I had never heard of a plug fest before, the concept seems pretty interesting. Silly of me to assume the mega corps just buy their competitor’s products internally.


Buying each other's products only gives you what's actually made it to market. Plug-fests are full of unreleased stuff. (Buying each other's stuff, to tear it down for competitive analysis, is exceedingly common. But you don't do that at a plugfest.)

Trivia: Plug-fests used to be called Bake-offs, but Pillsbury sued to stop it: https://www.cs.columbia.edu/sip/sipit/pillsbury.html

Trivia 2: Bluetooth calls theirs an "unplugfest". https://www.bluetooth.com/bluetooth-resources/unplugfest-eve...


When the new MacBook Pro came out this year, everyone was puzzled as to why the newly included HDMI port was only 2.0.

Well it turns out they lied. It’s 2.1 after all! \s

Jokes aside, it’s actually only 2.0 because internally they transmit a DisplayPort video signal and convert it to HDMI using an MCDP2900 chip[0], which is the same chip usually seen inside USB-C hubs.

So the new MacBook basically got rid of the HDMI dongle by integrating it inside the laptop.

This also breaks DDC/CI on that port and now I get a ton of support emails for Lunar (https://lunar.fyi) because people can’t control their monitor brightness/volume and think that the app is broken.

[0] https://www.kinet-ic.com/mcdp2900/


"So the new MacBook basically got rid of the HDMI dongle by integrating it inside the laptop"

Damn, that sounds like something a no-name 'made in China' brand would do. What is the possible justification for this on a premium device?


AIUI, Sony did the same with the PS4:

> If you query the GPU information, you actually get that it has an HDMI port and a DisplayPort port. [...] If you ask the GPU, it tells you that HDMI is not connected, DisplayPort is connected. OK. Yeah, they have an external HDMI encoder from DisplayPort to HDMI because just putting a wire from point A to B is too difficult

https://youtu.be/VpB49dhk2uQ?t=1665


How can I control screen brightness from my m1 max? Willing to pay for a solution.


Lunar can do that: https://lunar.fyi

It’s free for manual brightness adjustments.

Just make sure to use one of the Thunderbolt ports of the MacBook.

And if it still doesn’t work, check if there’s any monitor setting that could block DDC by going through this FAQ: https://lunar.fyi/faq#brightness-not-changing


Neat. I've been quite fond of Monitor Control: https://github.com/MonitorControl/MonitorControl


How does Lunar and Monitor Control get around the Tbolt -> HDMI dongle baked into the logic board problem?


There is no way to get DDC working on that port. It’s probably locked in the chip firmware.

All we can do is to provide software dimming using Gamma table alteration, or tell people to use the Thunderbolt ports instead.
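For a rough idea of what Gamma-table dimming amounts to (just an illustrative sketch, not Lunar's actual code): the per-channel lookup table is scaled toward zero, so pixel values are darkened while the panel's physical backlight stays untouched.

    # Illustrative only: scale a gamma lookup table to fake lower brightness.
    # The backlight never changes, which is why blacks stay just as bright.
    def dimmed_gamma_ramp(brightness, gamma=2.2, size=256):
        """Map input levels 0..size-1 to dimmed output levels in 0..1."""
        return [brightness * ((i / (size - 1)) ** (1.0 / gamma)) for i in range(size)]

    ramp = dimmed_gamma_ramp(0.4)  # roughly "40% brightness", done in software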


Now I understand. There are two ways of dimming a screen: one is dimming the backlight in the display, while the other is faking it via Gamma table alteration.


They said use one of the Thunderbolt ports.


What problem? Is DDC not available?


Hey, I was wondering if there was a way to lower the minimum brightness of my M1pro built-in display. The minimum brightness at 1 notch is way too bright for working at night. I have to resort to apps like quickshade which just put a grey overlay on the display which kills contrast and colors.

I had found something on stackexchange to do this for previous gen macs, but was unable to replicate for current gen. This is something I would be willing to pay for. https://apple.stackexchange.com/a/179204

I figure with the HDR display there must be a way to control the actual backlight brightness.


Lunar can control the real backlight of the MacBook display by tapping into the private DisplayServices system framework.

The app only supports integer values between 0 and 100 but the framework supports floating point values between 0 and 1.

You can try installing Lunar’s CLI (https://lunar.fyi/#cli) and then running something like the following in a terminal:

    lunar display-services SetBrightness builtin 0.005
That will try to set the backlight of the built-in display to 0.5%. Then you can try lower and lower values to see if they make any difference.

If you don’t see much difference between 1% and sub-1% values then there’s no way to dim the backlight further, and the only remaining method would be Gamma dimming which Lunar can do if you disable the Apple Native protocol from the Controls menu.


Thanks for the reply. I tried this, but it seems like anything below around 6% (.06) doesn't actually change the brightness anymore. This matches the behavior of pressing function+shift+option+F1, where if you set the brightness to anything below the first bar (1/16th = .0625), there is no difference.


2021 HDMI is a disaster. I'm using a Sony flagship TV, a PC, and a popular Sony 7.1 receiver.

I had to update my graphics card to get 4k120 444 10bit and eARC.

Except eARC is totally broken - audio often doesn't work at all without restarting the PC/TV/receiver a few times. And then once it "works" it will randomly cut out for 5-10 seconds at a time.

HDR on windows is also totally broken. It was a nightmare to get something that correctly rendered full 10bit HDR video (I ended up having to use MPC-HC with madvr and a ton of tweaking). You also have to turn off windows HDR support and use D3D exclusive mode. After updating my TV to get DRR, the audio for this setup stopped working.

Linux also has zero HDR support. Didn't have luck getting 5.1 or 7.1 working either.

MacOS can at least handle HDR on Apple displays - not sure if it works on normal displays. Haven't tried surround sound.


> HDR on windows is also totally broken. It was a nightmare to get something that correctly rendered full 10bit HDR video (I ended up having to use MPC-HC with madvr and a ton of tweaking). You also have to turn off windows HDR support and use D3D exclusive mode. After updating my TV to get DRR, the audio for this setup stopped working.

Odd, I've got 3 different HDR monitors across 2 Windows computers (one Nvidia, one AMD) and also 2 different HDR TVs occasionally connected to each and I've never had to turn HDR off or do tweaking in madvr to get it to look right. Having HDR disabled in Windows globally makes me wonder if the adjustments you're talking about are referring to tonemapping HDR content to an exclusive SDR output.

One thing I will say is TVs default to some god awful HDR settings. Well, I guess you could say that about picture settings on TVs in general, but it's even worse for HDR. It took me a solid 40 minutes to figure out how to get the picture settings for a Samsung Q900R to display HDR properly instead of "world's darkest picture mode" (turns out the same values take different actual effect depending on the type of source you identify it as and "PC" is not what you want even though you specify HDR picture mode elsewhere and it detects an HDR signal...). Also for SDR content you'll need to adjust the SDR brightness slider depending on the actual brightness of your display.


My suspicion is you're not actually getting HDR video, maybe unless you're using a built-in windows video player. You can look on home theater forums for the work people have to do to get actual 10bit video working on windows and I've never once seen it be plug-and-play. By default it will either take 10bit content and truncate/dither to 8bit, or totally fuck up the transfer function so the video looks like ass. Honestly if you're happy with what you have now I encourage you not to validate that you're getting proper 10b HDR because it's very frustrating. The only devices that might be plug-and-play are current-gen game consoles.


I don't see why I wouldn't be:

- Monitor says it is receiving a 10 bit HDR-ST2084 signal https://i.imgur.com/IFa5Y1e.jpg

- Windows says it is sending a 10 bit HDR signal to a monitor reporting a peak brightness of 1566 (which is correct, HDR1400 certified with an advertised peak of ~1600) https://i.imgur.com/WatJ6gP.jpg

- Windowed applications that support HDR (e.g. Krita or Chrome or MPC-BE w/ madvr) are detecting the display as linear HDR and using FP16 backing. Note on this display the SDR brightness was manually adjusted via the slider in Windows to be near 500 https://i.imgur.com/dD8HvfC.jpeg

- YouTube detects and defaults to serving the HDR versions of content, in this test case smpte2084/rec.2020 https://i.imgur.com/HI2fPKr.png

- The peak brightness I actually see in the HDR brightness test video is bang on the ~1500 nit mark I'd expect for a small brightness window (this image was taken with parameters to match almost exactly what I see in real life in terms of visibility on the brighter test cells https://i.imgur.com/z4nW5on.jpeg)

- P3 images with content in the extended range stand out perfectly clear https://raw.githubusercontent.com/codelogic/wide-gamut-tests...

When HDR first landed with Windows in 2018 I remember it was a crapshoot of driver/Windows build as to whether the transfer function was correct, and there was a longstanding bug where if a video was particularly bright for too long, all HDR content would look like a god awful tie-dye until you bounced the display. Since ~late 2019 I haven't seen either of those and I've added a few more HDR monitors; quite literally just plug it in, enable HDR, make sure the monitor went into HDR mode, adjust SDR brightness, and off to the races. Ironically, when I got an Xbox Series X about this time in 2020, it took a few TV firmware updates to a Q900R before HDR started working properly, with very similar kinds of issues as when Windows first got HDR.

I just got a 16" M1 Max with an HDR screen. There it seems to work out of the box (with Safari at least) but I had to tell YouTube a fake Windows Chrome user agent to get it to serve HDR content. I've heard there have been some kernel crashing issues with HDR on it but other than loading up a quick YouTube test to see how the screen was (great!) I haven't really messed with HDR on it. I do recommend disabling TrueTone and auto brightness, in particular TrueTone's effects are really amplified on HDR content and can ruin it completely depending on the room you are in.

If you ever want a cool demo to show off your setup Krita in HDR mode allows you to paint the absolute extremes of brightness and color gamut in large swatches. It can make for quite the effect when comparing it to a good calibrated sRGB SDR monitor.


> turns out the same values take different actual effect depending on the type of source you identify it as and "PC" is not what you want

That is a subtle pitfall indeed. LG also has this, where you can configure the icon (a small image) of your HDMI inputs in the UI. Turns out changing the icon to a PC will also change the signal processing algorithms.


Were you able to get 4k120Hz 444 working on Linux? What GPU do you have? I can only do 4k60 444 or 4k120 420 on my LG C1 connected to my Radeon RX 6900xt.


I don't remember, but I had to sidegrade from a 2070Ti to a 3060Ti to get HDMI2.1 to get 4k120 444.


Not to be offensive, but -- first world problems: where did you find a new graphics card, for starters?

Now, a bit more on a serious tone: this is all bleeding edge. And combining multiple recent development together is a recipe for corner cases and untested combinations.

That said, did you try Variable Refresh Rate with that? Blur reduction technologies (backlight strobing) are also interesting, but thankfully they require little software interaction (for now).


My design would remove all ambiguity, by having different rules for describing port vs. device capabilities.

1. HDMI ports should be described as "HDMI X.XX Compatible". This indicates that the ports themselves (and the device that contains them) will work when connected to any other HDMI X.XX device, and that the description is of a port.

This is the low bar on 2.1 devices. Their ports will sensibly negotiate optional features with other HDMI 2.1 devices.

2. HDMI devices should be described as "Full HDMI X.XX" or "Limited HDMI X.XX up to [limitations]" to distinguish devices that support all features of X.XX that apply to the device, or have applicable limitations. (Audio devices would not be considered "Limited" by non-applicable features such as image resolution.)

The ""Limited HDMI X.XX up to ..." disclosure would need to be prominently displayed in any description. One place where all limitations can be found.

Limitation phrases (like "up to 4k resolution") would have standard wording supplied by the HDMI standards body.

Additional references to HDMI versions, such as product listing titles, can be shortened to "Limited HDMI X.XX" without any limitations listed. So all limitations listed, or none, to avoid any confusion.

Done! Next problem, Internet? ...


My design would be HDMI 3.Y where Y is a variable-length Base32 encoding of the value of a bitmask representing the presence of the underlying features of an HDMI port/cable/device.


This is terrible. You expect an average consumer to start converting base32 to binary to check what features their cable supports? I mean, I'm all for educating the public, but this is just completely unreasonable.


I'm pretty sure it's a joke, maybe your comment is also one but it's harder to tell.


> You expect an average consumer to start converting base32 to binary to check what features their cable supports?

Fret not, consumer! Just install the HDMI™ Companion App™* for easy parsing of the version numbers.

*HDMI™ Companion App™ and facial recognition are required for decoding of 4k HDR content on your media devices.


This article doesn't mention the most infuriating aspect of HDMI - it's not an open standard! It's a closed, proprietary standard that requires licensing fees and prevents any open source drivers from existing. This is why the Linux AMD open source drivers don't support HDMI 2.1 - so you can't display 120hz@4K on Linux with AMD.


Can DisplayPort do it?


You can’t do 4:4:4 chroma with DisplayPort 1.4. Not enough bandwidth. As a result, fonts look pretty bad. Next-gen DisplayPort should be able to do it but no cards support it yet afaik.


Of course. Using 4k / 144 / 10bit right now on a RX 5700xt.


This HDMI bandwidth calculator helped me understand HDMI 1.4/2.0/2.1 far better than anything else. It is worth noting that many resolution/depth/rate configurations easily fit within TMDS limitations, especially if DSC is enabled (which it should be). FRL isn't required unless you want to be certain you can support all HDMI 2.1 situations (mostly over 8K or 240Hz+).

https://www.murideo.com/cody.html
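
For a rough sense of the numbers such a calculator spits out, the naive math is just pixels per second times bits per pixel (this ignores blanking intervals and link encoding overhead, so real timings come out somewhat higher):

    # Back-of-the-envelope uncompressed video bandwidth, ignoring blanking
    # intervals and link encoding overhead (real figures are higher).
    def approx_gbps(width, height, hz, bits_per_channel=8):
        return width * height * hz * bits_per_channel * 3 / 1e9

    print(approx_gbps(3840, 2160, 60))       # ~11.9 Gbps: fits HDMI 2.0 TMDS
    print(approx_gbps(3840, 2160, 120, 10))  # ~29.9 Gbps: needs HDMI 2.1 FRL (or DSC)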


If you want to compare video bandwidth requirements not only for HDMI but also for DisplayPort and some other video transports, you can also use my video timings calculator: https://tomverbeure.github.io/video_timings_calculator.


Looks useful, thank you. Could you add DisplayPort with DSC?


The DSC compression ratio is selected by the source as a function of the available and desired bandwidth.

One thing I could add is the theoretical maximum bandwidth if DSC is dialed up to the maximum compression setting of just 8 bits per pixel.


It sounds like the marketing people who kept renaming USB 3.x Gen FU got hired to mess up HDMI.


As someone who works on a lot of standards, I can say there are two common misconceptions:

1. Typically, standards and certification are entirely different.

2. Standards may not follow semantic versioning - e.g. a major version may not indicate a loss of either backward or forward compatibility.

The protocols defined in USB 1 are still allowed in USB4, as are the cables and connectors. HDMI is the same way. Saying something is USB4 or HDMI 2.2 compliant is not a stronger statement than saying they are USB 1.0 or HDMI 1.0 compliant.

Likewise, statements like "USB 3.2 Gen 2x2" are garbage. The correct terminology is "SuperSpeed USB 20Gbps". Why do so many products use the incorrect, more confusing terminology while omitting the official marketing name? Often because they are not certified products.


Was "USB 3.2 Gen 2x2" not the name of SuperSpeed 20Gbps before USB 4 got released?


No, "SuperSpeed 20 Gbps" has been the official nomenclature since day 1. From USB 3.2 language use guidelines

• USB 3.2 Gen 1

o Product capability: product signals at 5Gbps

o Marketing name: SuperSpeed USB

• USB 3.2 Gen 2

o Product capability: product signals at 10Gbps

o Marketing name: SuperSpeed USB 10Gbps

• USB 3.2 Gen 2x2

o Product capability: product signals at 20Gbps

o Marketing name: SuperSpeed USB 20Gbps

[...]

To avoid consumer confusion, USB-IF’s recommended nomenclature for consumers is “SuperSpeed USB” for 5Gbps products, “SuperSpeed USB 10Gbps” for 10Gbps products and “SuperSpeed USB 20Gbps” for 20Gbps products.

https://www.usb.org/sites/default/files/usb_3_2_language_pro...

The SuperSpeed name was originally introduced with USB 3.0


I remember when USB2 came out and similar mischief ensued. All the hardware manufacturers got together and pushed the standards body to re-brand USB 1.1 hardware as USB 2.0 (full-speed vs. high-speed). It allowed hardware retailers to empty their shelves, while consumers thought they were getting the latest technology.

https://arstechnica.com/uncategorized/2003/10/2927-2/


Same thing exists for USB3. Every time a new version is released, all cables and products suddenly support that revision. They just don't have any new features.

Not to mention that I've _never_ had a cable identify what it is capable of. Thus USB is a shitshow of crappiness.


I recently upgraded the NVMe SSD in my machine. The motherboard only has a single NVMe-compatible M.2 port, so I bought a USB 3.1 enclosure [0] to put the old drive in while I cloned it to the new drive. The enclosure has a USB Type-C connector, so I also had to use a USB 3.1 A-to-C adapter [1] to connect it to my motherboard's [2] USB 3.1 Type-A port. Anyway, something somewhere went wrong and it took over 5 hours to copy 750 GB instead of the expected 10 minutes. Absolute shitshow.

[0] https://www.newegg.com/rosewill-rhub-20001-enclosure/p/N82E1...

[1] https://www.amazon.com/dp/B07L92KPBB

[2] https://www.asus.com/us/Motherboards/Z97AUSB_31/


From the review section of that enclosure:

> Cons: - Included USB-C to USB-C cable is only capable of USB 2.0 speeds (40 MB/s as measured with crystaldiskmark)

Yeah, that would explain it.
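
The numbers line up roughly with a USB 2.0 link (assuming ~30% protocol overhead, which is only a ballpark figure):

    # Rough check: 750 GB over a USB 2.0 link.
    link_MBps = 480 / 8 * 0.70            # 480 Mbit/s minus ~30% overhead ~= 42 MB/s
    hours = 750_000 / link_MBps / 3600
    print(f"{link_MBps:.0f} MB/s -> {hours:.1f} hours")  # 42 MB/s -> 5.0 hours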


I recently had nearly the opposite experience. I was upgrading a Linux server to a new motherboard with an NVMe SSD from an old one with a SATA3 SSD attached. To see how things would go, I imaged the old SATA3 SSD onto a USB3/NVMe adapter (https://www.amazon.com/gp/product/B07Z8Y85GL) and tried booting the new system from USB. It actually came up working, so next I figured I would need to remove the NVMe SSD from the USB3 adapter and install it in the motherboard slot, boot the system from a different USB drive, and then once again image the NVMe SSD from the old SATA3 drive. (I had read that the USB3/NVMe adapter accessed the SSD in a way that was incompatible with the motherboard.) So I installed the NVMe SSD in the new motherboard and powered it up just for giggles. To my great surprise, it booted normally and everything was fine! (Oh, and my SSD access speeds went from 500MB/s on the old system to 2GB/s on the new one.)


Why wouldn't it work? Bulk storage is bulk storage. As for booting from that... Linux has all the required drivers in the kernel, at worst (booting from radically different hardware) select a fallback initramfs with more drivers. If you did a bit-by-bit copy of your drive, partitions should have come out unmodified at the other end, including GUIDs and the EFI label on the EFI partition (if using EFI), or the bootloader in the MBR if using that.

Parent is talking about speed. There are different things that go in M.2 ports (as this is the form factor): SATA, NVMe, and PCIe AHCI [1]. There was probably a slight incompatibility and a fallback to some other mode there.

[1] https://en.wikipedia.org/wiki/M.2#Storage_interfaces


I've definitely had problems with external storage on Linux machines

> Why wouldn't it work? Bulk storage is bulk storage

You would think so, but anything using UAS is a complete mess and you can't be sure it'll work. I can only assume devices implemented the convenient parts of the spec and fail randomly.

Happened often enough the kernel flag for the USB quirk to disable UAS was stickied on the Raspberry Pi forums when USB boot was new.

https://forums.raspberrypi.com/viewtopic.php?t=245931


Is there some device that can/could do this? I did a cursory look through Amazon and there's a lot of "signal testers"; is that sufficient?



Apparently he uses [1] an "Advanced Cable Tester v2" from Totalphase for his tests, starting at $15,000. Probably depends on what you need to test.

[1] https://www.reddit.com/r/UsbCHardware/comments/ny4y6z/commen...


Linus Tech Tips did a couple videos on such a device: https://youtu.be/u6lx1ntNoxE It is only $15K


I still had a USB 1.0 motherboard lying around not too long ago...


Why does it feel like it is inevitable that standardization/licensing organizations in tech will always eventually turn into a user-hostile mess?

USB, HDMI, what can we screw up next?

Is it incompetence? Malice? I'd really like to see an in-depth investigation of this phenomenon


HDMI was never a pro-user protocol, it was made to encumber a digital display signal with DRM.


This. HDMI was cooked up as a proprietary connector with DRM by the big manufacturers in the DVD/Blu-ray, TV, and home-entertainment business and the big movie studios to enforce stronger content protections to protect their IP, at which it miserably failed, as I can still torrent every Hollywood blockbuster and every Netflix series.

IIRC, every manufacturer must pay a fee to the HDMI consortium for every device with HDMI they sell.

DisplayPort, by contrast, is a more open standard, only requiring a flat fee for the standard documentation and membership instead of a fee per unit sold, IIRC.


DisplayPort and DVI both support HDCP. This wasn't the purpose behind HDMI, though support for it was no doubt a requirement for adoption. It was designed to be a single cable for carrying video and audio between playback devices, receivers, and displays.

For this purpose, it succeeded and did a much better job at it than alternatives. HDMI still makes far more sense for use in a home theater environment than DisplayPort thanks to features like ARC.


I think the better question is why SDI video connections aren't available on any consumer devices.

While HDMI is nice for consumers because it carries audio/data, SDI cables are cheap (just a single coax cable!) and easy to route (T-splitters are a thing!).

SDI does not support HDCP, however.


I think cost might be the main factor there. SDI is serial instead of parallel like all other consumer digital video cables. Hardware to serialize and deserialize bits this fast on a single conductor is expensive. HDMI 1.0 had a max transmission rate of 4.95 Gbit/s in 2002. Today, HDMI 2.1 goes up to 48 Gbit/s.


On that note, are there TVs with DisplayPort?

I'm using my LG TV as a monitor for a PC and am forced to use HDMI.


Gigabyte has been selling a version of LG CX48 slightly changed to be a monitor. It has HDMI and DP.

Model name is AORUS FO48U.


IIRC, most panels interface via DisplayPort internally these days.


HDMI is great for a home theatre set up where there's an obvious central unit, but the ecosystem has gotten worse if your speakers don't take in HDMI, at least at the very cheap end of the spectrum I buy on.

My current TV will only put out an inferior "headphone" mix over the 3.5mm connection, and the SPDIF connection is co-axial on the tv, but optical on the speaker. Having to run a powered converter box just to get audio from my tv to a speaker feels like such a step backwards.


It sounds like your speaker system predates HDMI's adoption, or was never intended for use in a home theater system. Even the lowest-budget soundbars will include an HDMI port and support ARC. I am surprised your TV has coax SPDIF but not toslink, as it was the gold standard for home theater audio before HDMI came around.

It sounds like you just got bad luck with your TV having a bad 3.5mm jack and not supporting toslink, as most I've seen will at the very least support the latter, and often have a separate "line out" port for the former.


Is there a big difference between 3.5 and something digital?

I know 3.5 is technically worse, but I've never been able to actually notice the difference.


3.5mm is an analogue signal that can only output stereo. Its quality will be limited by the device it comes out of.

SPDIF (both optical and coaxial) support sending multiple audio channels in their original encoding, to support surround sound. The receiving device needs to support decoding of the audio, which is why there is often an option to force the SPDIF output to use PCM stereo instead of surround sound.

If you have a cheap TV and a good hifi, you'll want to use SPDIF or HDMI so that your audio isn't ruined by poor quality of the audio chipset in the TV.


My assumption was just that it was something about how the EQ is mixed for that jack, because it is labelled specifically as headphones. The other replies about surround sound are true as well, but I don't think that should apply to my TV -> stereo soundbar setup.

But the difference is definitely noticeable, even to my relatively forgiving ear.


The biggest difference is digital interconnects can carry extra data for surround sound.

(Also, unique for optical connections, it's easier to avoid ground loop hums.)


Not quite true. The "DRM" mechanism you're most likely referring to is HDCP which was designed separately by Intel to provide copy protection over multiple device types including DVI (the first implementation), DisplayPort and of course HDMI.

It's not the HDMI interface that enforces copy protection it's the software, firmware and additional hardware on the devices that do this. You can use HDMI perfectly fine without the additional DRM crap.


HDCP can run on DVI or DisplayPort too. HDMI is a smaller, lower pin count connector than DVI, however.


HDMI's initial version is electrically and pin-compatible (passive adapter only) with DVI-D single link; assuming the DVI port supports HDCP.

The parent post is correct in that the mandatory HDCP was a major feature (for the involved cabal of organizations).


> The parent post is correct in that the mandatory HDCP was a major feature

This is wrong. HDCP isn't mandatory to implement HDMI, they are two separate technologies. I'm not defending HDCP or DRM encumbered content but I wish folks would get their facts straight.


I almost always err on the side of "never attribute to malice that which can be adequately explained by incompetence". However, the "standards" bodies' ability to repeatedly make a complete pig's ear of every single interconnect system makes me assume the opposite.


I'm leaning towards malice (through not caring so much for users), caused by big tech using this arena as a battleground.

I wish all cables were equal too, but c'est la vie


Greed, of course.

If it becomes too big of a problem, each cable and device will be required to have a challenge-response Obscure Brand Inc. proprietary U9 chip burned with a valid secret key and serial number at the factory that must return a valid response for the link to be enabled.


>Is it incompetence? Malice? I'd really like to see an in-depth investigation of this phenomenon

"Word of advice. Agents are bad, but whatever you do, stay the hell away from Marketing."

- Thomas A. Anderson


Don't forget MPEG.


When I had a very old Samsung TV, my Nvidia GeForce video card produced a nice image to the TV and Dolby AC3 sound to my even older surround set via a nice HDMI-to-optical-out converter in between.

Now I have a not-so-old Philips TV and suddenly I can't get Dolby AC3 sound anymore. Why? Because the GeForce communicates with the TV and the TV responds that it only has stereo. The surround set has no HDMI input or output, so it cannot communicate with the GeForce.

I have tried everything from hacking drivers to changing the EDID with some weird devices. Nothing works. Stereo only. Very frustrating.

I was recommended to replace my surround set or my TV. Both pretty expensive solutions for some weird HDMI communication bug/standard.

So I bought a $20 USB sound device to get Dolby AC3 sound to my surround set. All because I replaced my old TV, which couldn't communicate with the GeForce about its speaker setup.


I'm a "tech guy" and I'll be in the market soon for a screen.

Just the thought that I'll have to learn about this whole HDMI disaster in order to not get burned gives me anxiety.

Also, I never quite liked HDMI when it came out, but from what I'm reading they've really outdone themselves over the years.


If you're getting a high performance monitor, DP is overwhelmingly preferred and available anyways; e.g. this monitor can be operated at full speed via DP 1.2 (though it requires reduced timings, at least you don't have to drop chroma quality).

If you're getting a high performance TV then be prepared to dig into which input port set to which mode supports which actual data rate and features on which firmware because literally none of it is consistent. Also be sure to check technical reviews to see that it actually works, e.g. 4k120 HDR with freesync from my Xbox Series X didn't work with my TV for quite a while even though all of those things worked individually.


I have been avoiding HDMI like the plague due to various issues with devices and cables. The only way it has reliably worked has been where the use has been WAY under what devices and cables have been specced for. Anything remotely close to max speeds/features has been riddled with problems.

DisplayPort has been less of a hassle, but probably mostly because I settled for 4k60 which HDMI 2.0 supposedly handles with ease.

Devices with DP 2.0 will appear any time now. On paper DP 2.0 should handle 10k60 displays without compression in 8 bit, or 8k60 (10 bit) also without compression.


The fundamental problem is a lack of supply chain integrity. Customers can buy a million cables or laptop batteries directly from (country that shall not be named), but they have no idea if they're getting fakes or not.

The fix isn't "authorized" suppliers only, but requiring a reputable someone in the supply chain to maintain evidence of continually testing products advertising trademarked standards for compliance. If it's too much work, then boohoo, sad day for them, they don't get to be traded or sold in country X.

In all honesty, flooding a market with cheap, substandard products claiming standards they don't comply with is dumping.

https://en.wikipedia.org/wiki/Dumping_(pricing_policy)


>In all honesty, flooding a market with cheap, substandard products claiming standards they don't comply with is dumping.

I think you may have misread the article. The problem isn't that manufacturers are lying about HDMI 2.1 compliance. They are complying with the HDMI 2.1 requirements; the standards body chose a weird set of requirements, that's all.


Linus Tech Tips used a $expensive dedicated device to test a ton of HDMI cables. Most of them were shit: https://youtu.be/XFbJD6RE4EY

And what was most interesting is that price and quality didn't always correlate at all.


I blame Sony. They pushed expensive HDMI 2.1-compatible TVs and receivers alongside the PS5, strongly implying that without a 2.1-compatible living room then you might as well be playing your PS5 through a black-and-white TV.

Of course the PS5 can't really exploit any of the features of HDMI 2.1, and then it turned out that most/all 2.1-compatible receivers and TVs have glaring errors and incompatibilities that render them effectively useless.


LG pushed HDMI 2.1 on their TVs a full year before Sony did.


HDMI is contributing to consumer confusion at least, and fraud at most.

In order to get the benefits of this protocol, you need the source, the monitor, and the cables to meet the spec. That’s a lot of opportunities for confusion, waste, and loss of consumer confidence.

The protocol specs may be great, but the confusion will eventually lead consumers to think that HDMI 2.1 is just industry jargon, like “hi-def”.


As with any tech, you can't trust marketing if you plan on pushing it to the limit. You need to either test it yourself, or in some rare cases there are reviewers who have already tested it. Most "reviews" for tech are extremely superficial though and certainly won't be testing HDMI inner workings.

For HDMI 2.1, there are a bunch of monitors being sold under that banner that don't have the full 48 Gbps bandwidth. For example, the Gigabyte M28U is limited to half of that at 24 Gbps. [1] Gigabyte certainly doesn't want you to know this. On their specification page they just list it as HDMI 2.1. [2]

Similar nonsense was going on during the transition from HDMI 1.4 to 2.0. I really wanted HDMI 2.0 output on my laptop and held off on a purchase until that was possible. I finally bought an ASUS Zephyrus GX501 laptop. It has an Nvidia GTX 1080, which does support HDMI 2.0 output. The marketing for this laptop also seems to suggest that they're utilizing this potential, with claims like "You can also run a large ROG gaming monitor with NVIDIA G-SYNC™ via DisplayPort™ over Type-C™ USB or use HDMI 2.0 to connect 4K TVs at 60Hz." [3] The specification page mentions the HDMI 2.0 port. [4] However, in reality I found out that this HDMI 2.0 port is limited to HDMI 1.4 bandwidth. It supports HDMI 2.0 features like HDR, but not the bandwidth. 4K @ 60Hz is possible only with 4:2:0 chroma subsampling, and you're limited to 8 bits, so no HDR.
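
Back-of-the-envelope numbers show why a port stuck at roughly HDMI 1.4-level bandwidth has to fall back to 4:2:0 at 4K60 (simplified; the real limit is the ~8.16 Gbps TMDS data rate after 8b/10b encoding, plus blanking intervals):

    # Rough active-pixel bandwidth at 4K60 (ignores blanking intervals).
    def gbps(w, h, hz, bits_per_pixel):
        return w * h * hz * bits_per_pixel / 1e9

    hdmi_1_4_data_rate = 8.16  # approx. Gbps usable for video data
    print(gbps(3840, 2160, 60, 24) < hdmi_1_4_data_rate)  # False: 4:4:4 8-bit (~11.9 Gbps) doesn't fit
    print(gbps(3840, 2160, 60, 12) < hdmi_1_4_data_rate)  # True:  4:2:0 8-bit (~6.0 Gbps) fits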

I'm not the only one who found this out either. There are plenty of others on the ASUS forums. [5] Hard to say whether this was intentional misleading by ASUS marketing, or whether engineering messed up, or whether the feature ended up being cut due to time constraints. In any case, they still haven't adjusted the old marketing pages for this laptop that never shipped with HDMI 2.0 bandwidth.

Reviewers don't tend to check things like this either. For example, The Verge reviewed this laptop [6] and wrote: "Asus has a nice array of ports on the Zephyrus — four traditional USB 3.0 ports, a Thunderbolt 3 USB Type-C port, HDMI, and a headphone jack." They're just regurgitating the marketing material. There's no depth to it; the claims aren't verified. So people end up buying the product and then get confused about why it isn't working.

--

[1] https://www.rtings.com/monitor/discussions/q-D1CBeE2EiGMYgn/...

[2] https://www.gigabyte.com/Monitor/M28U/sp#sp

[3] https://rog.asus.com/laptops/rog-zephyrus/rog-zephyrus-gx501...

[4] https://rog.asus.com/laptops/rog-zephyrus/rog-zephyrus-gx501...

[5] https://rog.asus.com/forum/showthread.php?96916-GX501-Zephyr...

[6] https://www.theverge.com/2017/8/25/16201656/asus-rog-zephyru...


I have had similar concerns with reviewers skipping USB compatibility on laptops. After I left comments on relevant reviews with a quick rundown of the problem they started including it.

I might get at it again because I'd want my next laptop to drive my 4k 120Hz display over HDMI.


As the saying goes, a camel is a horse designed by a committee


Wait, according to that chart...does that mean that HDMI 2.0 cannot support Dolby Vision, as it is a form of Dynamic HDR?

I am very confused, as several devices I own are only supporting 2.0, but claim to support Dolby Vision.


Topic aside - why can’t cable companies write the spec of the cable on the cable? Like, it’s all the same connector; I can’t tell the difference between 2.0 and 2.1. Seems like it should be standard.


It is HF. A proper tester for that stuff costs at least $10k, and that would test one cable at a time. Throwing cables on the market and hoping they work is much easier.

Getting cheap HDMI cables is a gamble for that reason and I wish it would be different.


I bought an HDMI 2.1 8K display, only to find that the Nvidia RTX 3090 wouldn't run it at 8k 60 Hz on Linux, only 30 Hz. On Windows, however, 8k 60 Hz works perfectly. It was a bummer.


I'm still waiting for DisplayPort to replace HDMI, but some profit a lot from the HDMI patents; that's why you don't see DP in TVs and such.


This reminds me so much of the USB standard story. I don't know why it is so hard to have some kind of normal naming convention.


Maybe an analogue VGA cable connection wasn't the worst idea


Shouldn’t consumer AV be adopting USB C by now?


They should (USB-C with the DisplayPort protocol). But the HDMI group makes a lot of money on patents; that's why DP adoption is so slow in that segment.

https://en.wikipedia.org/wiki/HDMI#History

It's simply anti-progress stance at this point.


Is it really confusing? The people who need the specs of HDMI2.1 (like gamers) will do their research.



