Upcoming HDMI 2.1 Features (hdmiforum.org)
70 points by mfiguiere on Dec 3, 2017 | 70 comments



Finally we'll be able to switch between inputs without a 3-second delay! This is one of my dream technology scenarios.


QMS is unlikely to help with that. It appears to be targeted at allowing a single device to vary its refresh rate or resolution without requiring a complete disconnect/reconnect cycle.

The delay you're seeing is more likely caused by HDCP handshaking and key negotiation. Depending on your situation, there are external switchers that handle key caching so you can switch cleanly - complete overkill for a personal user though, as they are not cheap.


Wait, so slow HDMI switch times are caused by DRM?


As a rule, anything your modern TV setup does that seems frail or way too slow is about DRM :(


> Supporting the 48Gbps bandwidth is the new Ultra High Speed HDMI Cable. The cable ensures high-bandwidth dependent features are delivered including uncompressed 8K video with HDR.

This is interesting, because I've seen several companies promoting "visually lossless" compression (which is apparently a euphemism for lossy compression that the vendor doesn't think you'll notice) to reduce bandwidth requirements for cabled 8K video transmission. Does this represent some kind of defeat for them, or were they targeting applications where Ultra High Speed HDMI wasn't going to be realistic in the first place (e.g. the various systems that transmit video over UTP cable)?
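
For a rough sense of scale, here's a back-of-envelope sketch (raw active-pixel rates only; a real link also carries blanking intervals and line-coding overhead, which this ignores):

    # Rough, illustrative numbers only: raw active-pixel rates, ignoring
    # blanking intervals and line-coding overhead on the link.
    def raw_gbps(width, height, fps, bits_per_pixel):
        return width * height * fps * bits_per_pixel / 1e9

    print(raw_gbps(7680, 4320, 60, 24))  # 8K60, 8-bit RGB   -> ~47.8 Gbps
    print(raw_gbps(7680, 4320, 60, 30))  # 8K60, 10-bit HDR  -> ~59.7 Gbps

Even before overhead, 10-bit 8K60 lands above 48Gbps, which is presumably where the compression vendors see their opening.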

Also, does anyone know how USB 3.1 SuperSpeed+ cables stack up? Obviously they're specified for 10Gbps for USB devices, but that's in the context of USB's requirements for the attached transceivers, so I'm not sure what it implies about the performance of the cable itself.


It also supports DSC[1]. "Visually lossless" is defined as no distinguishable difference in A/B comparisons of predetermined challenging images. You can check the images they used and run the test yourself if you so desire.

[1]https://www.vesa.org/news/vesa-updates-display-stream-compre...


Comparing static images is not really an appropriate test for a video compression algorithm, for obvious reasons. You can write lots of stuff that works on a single frame but looks like garbage on a video stream.

Can you give a link to video source files that could be used to compare the two? The link I turned up looks like it needs specialized testing hardware...

https://www.youtube.com/watch?v=dFbpcBuQg9s


It doesn't involve any inter-frame coding or prediction, so evaluating stills is fair. It compresses a single line at a time.[1]

[1]: http://www.vesa.org/wp-content/uploads/2014/04/VESA_DSC-ETP2...
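
To illustrate the point, here's a toy sketch of line-independent, intra-frame-only coding (this is not actual DSC, just the general shape): each scanline is coded using nothing but pixels from that same line of the current frame.

    # Toy sketch only -- not DSC. Each scanline is processed independently,
    # using no data from other lines or other frames.
    def encode_line(line, q=4):
        prev, out = 0, []
        for px in line:
            residual = px - prev               # predict from the left neighbour
            out.append(residual // q)          # coarse quantization (lossy)
            prev += (residual // q) * q        # track what the decoder will see
        return out

    def decode_line(coded, q=4):
        prev, out = 0, []
        for r in coded:
            prev += r * q
            out.append(prev)
        return out

    print(decode_line(encode_line([10, 12, 15, 200, 205, 203])))

The coder's state resets at the start of every line and never references another frame, which is the sense in which evaluating stills exercises the same path as video.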


It might introduce temporal artifacts, which would not be seen in stills.


At these kinds of speeds, why would we use it for just video? We could make server interconnects extremely fast.


Isn't the high bandwidth path for HDMI uni-directional? Not half-duplex, but actually one way only, as in the receiver doesn't have the hardware to transmit back (on those lines). I'm not terribly knowledgeable about HDMI but I think that's basically the case, someone please correct me or elaborate if they know more.

There are some other bi-directional communications channels for control and other features like DRM (HDCP) or even Fast Ethernet but they don't get nearly that much bandwidth.


You can get 40 Gbps InfiniBand cards for $200 nowadays.

The switches are a bit expensive (you can probably find them for 3-4K if you look hard enough), but that's not that expensive for a data center.


Modern HPC interconnects are already faster than 48Gb/s


Those are usually quite expensive. If HDMI requires 48Gb/s capable transceiver chips to be built into every TV, they will get really cheap very fast. It might not be a viable solution in the end, but it could work for some applications.


Over typical HDMI distances, you don't really need an expensive transceiver. 100GbE can be done over direct attach cables that cost less than many current "premium" HDMI cables. Much of the cost beyond that is related to the MAC and host interface being able to do something interesting with the packets.


40 Gbps is only InfiniBand QDR speed; you can pick up an adapter for sub-$50 right now.


And plug it into what?


Your servers?

If you mean switching, you don't need a switch for simply connecting two computers. Presumably that was the use-case supported by "HDMI-as-network-layer".

If you shop around, you can get Mellanox Voltaire rack switches for $200 or less. QDR gear is "obsolete" and often surplused at very attractive prices. Everyone else has moved on to FDR or EDR speeds.


You can get e.g. a QLogic QDR InfiniBand switch for a few hundred dollars.


Could anyone provide a guess as to how "quick frame transport" works?

As someone who likes games, the variable refresh rate seems like one of the best things in here. I wonder if it would be possible to implement in the current consoles.

I hope the new products announced at CES support this.


I read somewhere that the adaptive refresh rate will work like Nvidia G-Sync. A 4K, HDR, 75-inch gaming monitor without frame rate hiccups is quite an exciting prospect!


I expect that eventually this will allow variable framerate to be used in systems like DirecTV to improve compression and squeeze in more channels.

Right now the TV support is not there, but consoles will be pursuing this tech (XB1X supports it) and the TVs are going to follow.


Right. My understanding is it's basically the same as G-Sync or whatever the AMD name for it is.

I’m kind of curious to see what a game running at say, 50 frames per second, looks like if the frame pacing is uneven. I imagine it looks worse than 50 frames per second with even frame pacing but better than 50 frames per second with glitches because you’re supposed to be outputting 60.


G-Sync and AMD FreeSync do the same thing, but not in the same way. FreeSync is just a brand name for DisplayPort's Adaptive-Sync specification. G-Sync is proprietary, and requires the display manufacturer to add special hardware to the unit, which increases costs a bit.
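
Here's a minimal toy model (not any particular vendor's implementation) of why uneven pacing hurts more on a fixed-refresh display than on a variable-refresh one:

    # Toy model: frames finish at slightly uneven times (~50 fps average).
    # On a fixed 60 Hz display a frame waits for the next vsync; with VRR
    # the display refreshes as soon as the frame is ready.
    import math

    render_done = [0.0]
    for frame_ms in [18, 22, 19, 24, 17, 20]:   # uneven per-frame render times
        render_done.append(render_done[-1] + frame_ms)

    vsync = 1000 / 60                            # 16.67 ms refresh period

    fixed = [math.ceil(t / vsync) * vsync for t in render_done]
    vrr = render_done

    print([round(b - a, 1) for a, b in zip(fixed, fixed[1:])])  # on-screen durations, fixed 60 Hz
    print([round(b - a, 1) for a, b in zip(vrr, vrr[1:])])      # on-screen durations, VRR

The fixed-refresh durations bounce between ~16.7ms and ~33.3ms even though the game is pacing around 20ms, which is the stutter being described; the VRR durations simply follow the render cadence.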


Hasn't the world just agreed on migrating to USB-C for everything?


USB 3.2 (and currently USB-C with it) doesn't support the 48GBps that HDMI 2.1 and the 32GBps that DisplayPort 1.4 call for. USB currently supports a max of 20 GBps. It's possible you can get more bandwidth using an alternate mode in the connector, but the spec says you can only guarantee up to the 20GBps over those modes currently. This means the higher resolutions and higher bit depths (10-bit, HDR, etc.) may not be possible to push over the connector.

This is why Thunderbolt 3 requires special active cables to get the additional bandwidth by amplifying the signal to help get around the bandwidth limitation of the connector in a passive configuration.


> It's possible you can get more bandwidth using an alternate mode in the connector, but the spec says you can only guarantee up to the 20GBps over those modes currently.

20Gbps per what? One lane? Two lanes? I haven't heard of this, where can I find more info?

(If it's actually 20GBps with a capital B then we're nowhere close.)

> This is why Thunderbolt 3 requires special active cables

> the 48GBps that HDMI 2.1 and 32GBps that display port 1.4 call for

This part isn't right.

Thunderbolt 3 can run 40Gbps over short passive cables. And that's bidirectional. It's the equivalent of 80Gbps for a display cable.

DisplayPort 1.4, at 8.1Gbps per lane, is actually slower than USB 3.1 Gen 2.

HDMI 2.1, at 12Gbps per lane, is slightly faster but still well short of Thunderbolt.
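
Putting those per-lane figures side by side (raw aggregate signaling rates; usable payload is lower once line coding and protocol overhead are subtracted):

    # Raw aggregate link rates from the per-lane figures above (no overhead).
    links = {
        "DisplayPort 1.4 (HBR3)": 4 * 8.1,   # 4 lanes x 8.1 Gbps
        "HDMI 2.1 (FRL)":         4 * 12.0,  # 4 lanes x 12 Gbps
        "Thunderbolt 3":          40.0,      # per direction
    }
    for name, gbps in links.items():
        print(f"{name}: {gbps:.1f} Gbps")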


That's why USB-C is failing. You simply can't know what that port can do.

Is it power, video, data? Do I need a special cable for it? Nobody knows.


Here's a good discussion about how that has turned out: https://news.ycombinator.com/item?id=15473777


I would favor it if monitors, displays, and projectors had both USB-C and Ethernet. At the moment all types of displays still have DVI, DisplayPort, HDMI, or even legacy connectors like VGA and SCART.

USB-C and Ethernet are capable of 10+ Gbit speeds, and the cables are cheap, not the premium one pays for HDMI 2.0a/b cables. Also, Ethernet cabling of CAT6 or better is found in most buildings. If you want to connect a big flat-screen TV dozens of meters away, or a projector 50 meters away, you need repeater devices and fragile cable arrangements, while with Ethernet cables it would be no problem.


Why would anyone use HDMI instead of DisplayPort? Is it cheaper to make?


The real reason why both continue to be developed in parallel is industry politics. HDMI is the product of consumer electronics companies while DisplayPort is from PC hardware manufacturers. For various reasons these two groups rarely cooperate. One big point of contention is royalties. The consumer electronics companies like standards they can charge royalties for, while PC companies like royalty free standards.


Since DP is royalty free, why wouldn't most consumer electronics companies push for it as well? Only a minority would demand royalties, while the majority would be forced to pay them. So why not ditch the patent-encumbered standard if the majority would benefit from it?


The consumer electronics industry is dominated by a handful of companies. Those companies also happen to be the ones who own the patents on HDMI. They control enough of the market that the smaller players are forced to pay up to be able to interoperate with the dominant companies' products.


HDMI is longer distance while DP is higher bandwidth. So based on typical cable length you get HDMI for TVs and projectors, and DP for monitors.


HDMI (2002) is older and was already the standard for consumer A/V when DisplayPort was created (2006).


That's not a reason to continue using it in new devices, is it? COM/LPT ports were there before USB too, yet they were replaced.


The hassle of changing the connector is not worth the benefit of consolidating the protocols.


HDMI is more expensive: you must pay royalties to use HDMI tech, whereas DP is free.


That's a reason not to use HDMI.


Right -- sorry I should have been more explicit :)


Does anybody know why DP cables are so incredibly expensive? It's like $20 + shipping.


Monoprice is selling 10' DP cables for $5 and 10' HDMI cables for $6. Maybe you need to find a new place to buy cables. Amazon Basics has a 10' DP cable for $12.


Well, for one thing, it supports higher bandwidth right now.


Only this new one does. DP 1.4 supports 32.4 Gbit/s. HDMI 2.0 supports less.


You're incorrect. DisplayPort 1.3 has supported 32.4 Gbit/s transfer rates for several years now.

https://en.wikipedia.org/wiki/DisplayPort#1.3

The only thing that DP1.4 adds (in terms of bandwidth) is compression over the top. The native bitrate is actually the same.


DisplayPort doesn't carry audio.


https://en.wikipedia.org/wiki/DisplayPort#Technical_specific...

"Optional 8-channel audio with sampling rates up to 24 bit 192 kHz, encapsulation of audio compression formats (including Dolby TrueHD and DTS-HD Master Audio from v1.2)"


It does.


DisplayPort might be cheap, as it's royalty free, but one has to buy a cable or dongle to convert it, as displays rarely feature DisplayPort.

None of my business monitors, TVs, or projectors have DisplayPort, but every notebook and discrete graphics card came with DisplayPort - so the first thing to do is buy another dongle to convert DisplayPort to HDMI or DVI.


I have two monitors and a TV with DisplayPort; personal anecdotes / small samples don't count for much, usually.


Except you only need a passive cable, because DisplayPort supports both HDMI 1.4 and single-link DVI.


Because native plain DisplayPort is only available on HP and some other Windows laptops (not sure about the state of the desktop GPU market), and it's rare as an input.

The rest of the market - consumer Windows laptops, Apple pre-USB-C-crap-series laptops, gaming consoles, cable TV boxes, and other home theater stuff on the source side, as well as TVs and projectors on the sink side - speaks HDMI only.


DisplayPort is widely available on business-class laptops, gaming laptops, all modern discrete graphics cards, all modern integrated video chipsets (although not all motherboards/devices will actually provide a port), etc.

You've actually got it diametrically opposite of reality here: the only devices which do not widely support DisplayPort are cheap crap intended for the low-end consumer market, and stuff exclusively targeted at the living-room market like consoles and media-stick PCs.

Even something as pedestrian as my old Thinkpad from 2010 supports it.


Every modern desktop graphics card has DisplayPort connectors. A common configuration is 3 DisplayPort + 1 HDMI, making HDMI the legacy connector. My five-year-old laptop has DisplayPort. It's not as rare as you think.


At least on desktop GPUs, you tend to get a lot more DisplayPorts than HDMI. You're lucky to get more than a single HDMI port, but three or more DPs are common.


The question is why all these devices would prefer HDMI.

Anyway, Thunderbolt ports route DisplayPort.

> not sure about the state of the desktop GPU market

High-end cards have supported DP for a long time already.


> Apple pre-USB-C-crap-series laptops

Totally incorrect. Before USB-C, there was Thunderbolt 1 & 2, which ran over a Mini DisplayPort connector.


DisplayPort connector, but only PCIe signaling; no other protocols implemented.


Except for DisplayPort. Seeing as, you know, half of HN has MBPs they plug into DP desktop monitors, including me.


No, the ports on the MacBook are PCIe + DisplayPort. The Apple displays are PCIe only. The parent was specifically talking about Thunderbolt. You aren't using anything Thunderbolt when you plug into a DP monitor.


I've plugged non-Thunderbolt laptops into Apple Cinema displays as well.


Here's a link to the specification info page and Q&A rather than the press release: https://www.hdmi.org/manufacturer/hdmi_2_1/

A copy of the presentation from the announcement can also be found here: https://www.hdmi.org/download/hdmi_2_1/HDMIForum2.1NovReleas...


Can this do 2x 4K @ 90Hz? E.g. for VR headsets.
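
Back of the envelope, counting raw active pixels only (no blanking or link-coding overhead, so a lower bound on what's actually needed):

    # Raw pixel rate for two 4K panels at 90 Hz, ignoring blanking and
    # link-coding overhead -- a lower bound on the bandwidth required.
    def raw_gbps(w, h, fps, bpp, panels=1):
        return panels * w * h * fps * bpp / 1e9

    print(raw_gbps(3840, 2160, 90, 24, panels=2))  # 8-bit RGB   -> ~35.8 Gbps
    print(raw_gbps(3840, 2160, 90, 30, panels=2))  # 10-bit HDR  -> ~44.8 Gbps

The raw numbers fit under 48Gbps, though real overhead eats into the margin; whether any GPU and headset combination can actually drive it is a separate question.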


What VR headsets are you using, and with what video card? Methinks we are a ways away from that.


Apple, please bring back the HDMI port.


No thanks. DP over usb-c is way better for all laptops.


Man, I'm so over dongles.

For the graphics cards that these laptops have... they aren't like super powerful. I can't, for example, game at 60 FPS on a 4K monitor on a MacBook Pro. So the use case for needing more than 60 FPS, or more than a 4K display, seems more like an edge case to me.

Mostly what I want is to be able to plug into projectors when I'm with a client so I can share a pitch deck. Instead... I've got this massive USB-C to VGA / USB dongle, and another for USB-C to HDMI, and yet another for USB-C to Thunderbolt for my work monitor. Anywhere I go I've gotta lug around a laptop bag now, and it's full of dongles.

The OLD 15" MBPs were perfect. Good mix of ubiquitous ports, and Apple's proprietary Thunderbolt ports. I have yet to find anything other than dongles that use USB-C. Just like I have yet to find anything other than Apple Thunderbolt Displays that actually use Thunderbolt.


I was in the same boat when I was on my rMBP with its thunderbolt ports but now with usb-c, I just carry this one: https://www.amazon.com/dp/B01KV5332A/ref=cm_sw_r_cp_api_onqj...


I've found the mini-DP ports tend to get worn down quickly. I've seen that on a desktop and an Intel NUC so far. I don't know if I've just gotten unlucky since, at least on the NUC, I don't disconnect it often.



