DisplayPort: Taming the Altmode (hackaday.com)
97 points by rcarmo on Aug 5, 2023 | 62 comments



macOS broke Display Stream Compression (DSC) in Big Sur and has not fixed it since. I can get 4K@144Hz in Catalina but only 4K@60Hz in Big Sur through Ventura. Same hardware; this applies to all Intel-based MacBooks.

Tried support, geniuses, no luck.
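
For context on why DSC matters at that refresh rate, a rough back-of-the-envelope sketch in Python (active pixels only, ignoring blanking and FEC overhead; the HBR3 link figures are the commonly quoted ones):

    # Why 4K@144Hz needs DSC on DP 1.4 (HBR3), roughly.
    link_payload_gbps = 4 * 8.1 * (8 / 10)     # 4 lanes x 8.1 Gbit/s, minus 8b/10b coding ~= 25.92

    def video_gbps(w, h, hz, bits_per_pixel):
        return w * h * hz * bits_per_pixel / 1e9

    for hz in (60, 144):
        need = video_gbps(3840, 2160, hz, 24)  # 8 bit per channel RGB, active pixels only
        verdict = "fits uncompressed" if need < link_payload_gbps else "needs DSC"
        print(f"4K@{hz}Hz: ~{need:.1f} Gbit/s of video vs ~{link_payload_gbps:.1f} Gbit/s of link -> {verdict}")

So 60Hz fits without compression but 144Hz doesn't, which is presumably why losing DSC drops these machines straight back to 4K@60.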


I have an LG 5K monitor that stopped reliably working at the correct resolution with Catalina and then, I think, got worse with Big Sur. macOS just refused to even let me choose the proper resolution.

Forcing the DisplayPort version from 1.4 to 1.2 in the monitor's settings (which has the side effect of disabling DSC) allowed it to work correctly over Thunderbolt.

I eventually found a firmware update for the monitor that made it work with Macs in DisplayPort 1.4 mode again.

It worked fine with Windows all along, except that DisplayPort 1.2 meant I couldn't get full resolution at 60Hz over the actual DisplayPort connection, only over TB3.

The whole saga is documented on https://apple.stackexchange.com/questions/388951/macbook-pro...


That's unfortunate; DSC works fine on Apple Silicon. I did notice DisplayPort-related regressions on my old 2015 Mac well before Big Sur, though. Apple's track record is not great.


Yeah, the Apple Engineering Department asked me to run a ton of diagnostics, both on Catalina and Ventura, then came back with “Would you ask Dell to confirm their monitor is compatible with macOS Ventura specifically?” Lol


I believe they broke it for the Pro Display XDR. I had the same. 4K 144Hz HDR10.

When it came out people were wondering how they were driving it at 6K HDR10.

If you can dial your monitor back to DSC 1.2, things get slightly better... I could do 4K 120 SDR, or 75 HDR.

Also applies to Mac Pro - my 2019 cheesegrater was affected, too.


> they broke it for the Pro Display XDR

What do you mean, they had to kill DSC for Pro Display XDR to work?


They had to kill it for Pro Display XDR to be a compelling upgrade.


At the time of its release, people were wondering how there was sufficient bandwidth to drive the Pro Display. Big Sur came out at the same time.

Everyone using monitors with DSC 1.4 to the full extent of their monitors' capabilities saw those capabilities crippled, and they have remained crippled ever since.

Those devices continued to work fine under Catalina.

Downgrading DSC from 1.4 to 1.2 on those monitors gave "better" performance (i.e. higher than without DSC, and higher than with DSC 1.4 under Big Sur+).

Hundreds or more bug reports were filed. This affected users of different Macs, different GPUs, different monitors.

The only common thread was DSC 1.4 and Big Sur (Ventura, Monterey... this still hasn't been 'fixed').

Given that the Pro Display and Big Sur came out 'together', it's very hard not to draw the conclusion that Apple found "sufficient bandwidth to drive the Pro Display XDR" by making changes to their "DSC 1.4" implementation that only the Pro Display understands.


FWIW, I ran `/S/L/E/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose -a` on Catalina and Ventura and diffed it.

macOS Ventura seems to properly sense that the monitor supports DSC ("DSC Support: 1, DSC Algorithm revision: 33"), but then simply disables it anyway ("DSC Enable: 0x0").
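
If anyone wants to repeat the check, a minimal Python sketch of what I diffed (assuming the binary still lives under /System/Library/Extensions and still wants sudo; it has moved around between macOS releases):

    # Dump AGDC diagnostics and pull out the DSC-related lines.
    import subprocess

    AGDC = "/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/AGDCDiagnose"

    out = subprocess.run(["sudo", AGDC, "-a"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "DSC" in line:              # e.g. "DSC Support: 1" / "DSC Enable: 0x0"
            print(line.strip())

Run it once on Catalina and once on Ventura and diff the output.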


Interesting. And, with maybe a little sense of shame: while I still have those monitors (27GN950-Bs), I actually have a Pro Display XDR on my M2 Studio (and on the 2019 Mac Pro before it).

I'm torn between not wanting to support such behavior and being one of the minority that could actually justify the Pro Display (I'm a photographer who works in at times challenging light environments).


> apparently, the protocol allows for building bidirectional DP devices, which is certainly idea-provoking

https://dancharblog.wordpress.com/2020/05/10/bi-directional-...


So, does that mean that all the sketchy USB cable applies to DP ports too?


My English sentence parser gave up. Could you please reword your question?


Yeah, I dropped a word by rewriting it, but you can only edit it for so long

So, does that mean that all the sketchy USB cable VULNERABILITIES apply to DP ports too?


You can't transmit data through DisplayPort, so no.

The blog post I linked has a list of cables which you can plug into a DisplayPort host and a USB C monitor and it'll transmit the video data. Some might add power and even data from other USB source(s). But that doesn't make DP carry data, no.


DP is lauded as an open specification.

I am sad to hear altmode is "semi-proprietary". A stain on this reputation.


I think that it’s mostly in comparison to HDMI. The latest versions of HDMI don’t really even allow open source implementations.

It’s a shame that all these protocols are gradually closing themselves up. We’re evolving backwards.


In the spirit of the times: does anybody know what room-temperature superconductors would do for transmission lines like DP/HDMI?

Would zero resistance have a positive influence on signal degradation/coherence (i.e. increasing maximum cable lengths or their reliability), or is it already limited by various forms of capacitance/inductance or other problems in cables?


This is a really great question. If you had a perfect conductor in air, then yes you would have no attenuation (or "insertion loss", one common term used in Signal Integrity). Real transmission lines (cables, PCB traces) use dielectric insulating layers adjacent to the conductors. These dielectrics have a property called permittivity which affects what we call "dielectric loss". Essentially, transmission line loss can be broken into conductor loss (resistance, skin effect, etc.) and dielectric loss (energy lost by causing physical rotation in the dipoles of the dielectric material).

A perfect conductor would have zero conductor loss but would still have some of this dielectric loss. I covered the concept a little bit in a webinar recently [0] but SI guru Eric Bogatin really boils it down best. In this [1] equation for loss estimation in a PCB trace, you can see the conductor and dielectric portion of loss on either side of the "plus sign". The left part is the conductor loss and is proportional to the square root of the frequency. The right part is the dielectric loss and is proportional to the frequency of the signal, as well as proportional to parts of that permittivity (Dk and Df in that equation). To answer your question though, it absolutely would allow for much longer cables since you eliminate one part of the loss.
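
To make that two-term split concrete, here's a toy Python version of such a rule of thumb. The conductor coefficient is purely illustrative (not the number from the EDN article); the dielectric term uses the commonly quoted ~2.3 * f[GHz] * Df * sqrt(Dk) dB/inch approximation:

    # Toy transmission-line loss model: conductor term + dielectric term, in dB/inch.
    import math

    def loss_db_per_inch(f_ghz, dk=4.0, df=0.02, k_conductor=0.1):
        conductor = k_conductor * math.sqrt(f_ghz)     # skin-effect loss grows as sqrt(f); coefficient is illustrative
        dielectric = 2.3 * f_ghz * df * math.sqrt(dk)  # dipole loss grows linearly with f
        return conductor, dielectric

    for f in (1, 5, 10, 20):   # GHz; DP HBR3 at 8.1 Gbit/s per lane has a Nyquist around 4 GHz
        c, d = loss_db_per_inch(f)
        print(f"{f:>2} GHz: conductor ~{c:.2f} dB/in, dielectric ~{d:.2f} dB/in")

With those example numbers the dielectric term dominates above a few GHz, which is why a perfect (super)conductor alone wouldn't make the loss go away.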

Sorry for all the ridiculous detail; if you're still interested in the topic, please feel free to reach out for more info! Contact info in my profile and at [0].

[0] https://wiki.shielddigitaldesign.com/High_Speed_Design_Wiki/...

[1] https://www.edn.com/loss-in-a-channel-rule-of-thumb-9/


That man is really good at teaching/explaining SI concepts. He was my professor at university and taught the high-speed PCB design class. Lots of lessons like the one you described!


The details - I feel like I learned everything I would have from a whole 3-credit university course.

Thank you.


I wonder if a coaxial cable with the outer tube made superconducting would make a difference.


Yes it definitely would :) There's conductor loss in both the signal and return path.


Line length would not be limited any more in theory, but it's not like you can do ridiculous lengths even with an HTSC wire. The most obvious factor is protocols: you can't go faster than light, which means protocols with a tight response timeout won't work out.

The second factor is capacitance: most transmission is something very closely resembling AC, so IIRC even at zero resistance you'd still have capacitance between the lines and a resulting reactive power.


You don't have to go that fancy; you could just use fiber optics right now.

You can get insane bandwidths out of quite cheap and manageable cable. E.g., you can put 100 Gbps down a single strand. Not enough? Try an MPO cable with 12 of those. Maximum length is effectively unlimited.


Superconductors are lossy at AC (less so than regular conductors, but significantly above negligible).

A very large bulk of the loss is through dielectric loss, too; that one you can't get rid of with superconductors.


No such thing as a super insulator?


vacuum :)


Vacuum is actually a horrible insulator, like, in practice.


Assuming you could produce viable consumer cables that are superconductors, my understanding is that there would be no signal loss, so your hypothetical cable could have infinite length.

Digital is digital so this would be true for any data use case.

Maybe someone smarter can chime in about how electromagnetic interference affects or doesn’t affect a hypothetical “superconductor cable.”


The longer the cable, the longer loopback signals take. This can affect the timing of any synchronous specification. If the delay is too long, it could be out of scope of the spec and thus stop working. Maybe you could solve this with a transceiver MITM device.
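
Rough numbers (Python; the ~0.7c propagation speed and the timeout budget are illustrative assumptions, not values from any DP spec):

    # How round-trip delay grows with cable length.
    C = 299_792_458          # m/s
    V = 0.7 * C              # assumed propagation velocity in the cable
    TIMEOUT_US = 100         # hypothetical round-trip budget in microseconds

    for length_m in (2, 100, 1_000, 100_000):
        round_trip_us = 2 * length_m / V * 1e6
        verdict = "within" if round_trip_us < TIMEOUT_US else "exceeds"
        print(f"{length_m:>6} m: round trip ~{round_trip_us:9.2f} us ({verdict} a {TIMEOUT_US} us budget)")

A couple of metres is negligible, but somewhere around the tens-of-kilometres range a fixed reply timeout becomes the limit long before the cable itself does.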


Oh right I forgot about that! So yeah… probably longer than you’d need but not infinitely long. Unless the protocol is fine with it.


Superconductors only have zero resistance when measured at DC.

AC still faces impedance; high-frequency AC like DisplayPort especially so.


I'm sure a plethora of high-end audio vendors will come out with whole lines of superconducting digital audio/video cables with higher sound clarity and wider soundstage precision.


Note also that superconductors only have zero resistance to direct current.

AC, including and especially any form of high-frequency signaling, still faces impedance.


In the "old times" a video mode descriptor included things like "front porch", "sync width" and similar unpleasantries. I believe they're still used with DP/HDMI, as it's possible to play with "modelines" in Linux and adjust those params.

My naive understanding is that it's a legacy thing from CRT times, where there was a need to allow for the electron gun to adjust itself for a new line/frame.

Do those terms still have meaning in the HDMI/DP/LCD era, or are they preserved but serve no purpose now?
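
For concreteness, here's roughly how those parameters fit together (Python; the timing numbers are close to what `cvt 3840 2160 60` prints, quoted from memory, so treat them as approximate):

    # The pieces of a classic modeline and what they add up to.
    pixel_clock_mhz = 712.75
    hdisp, hsync_start, hsync_end, htotal = 3840, 4160, 4576, 5312
    vdisp, vsync_start, vsync_end, vtotal = 2160, 2163, 2168, 2237

    h_front_porch = hsync_start - hdisp      # idle pixels after the active line
    h_sync_width  = hsync_end - hsync_start  # the sync pulse itself
    h_back_porch  = htotal - hsync_end       # idle pixels before the next line
    v_blank_lines = vtotal - vdisp           # total vertical blanking

    refresh_hz = pixel_clock_mhz * 1e6 / (htotal * vtotal)
    print(f"h: front porch {h_front_porch}, sync {h_sync_width}, back porch {h_back_porch}")
    print(f"v: {v_blank_lines} blanking lines; refresh ~{refresh_hz:.2f} Hz")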


The only things that still matter are: the vertical back porch (it matters for variable refresh rate; too high and it will increase latency too much), the total vertical blank (it must be long enough to give the source time to reconfigure itself and go into power-saving mode) and the horizontal blank (it must be long enough to insert HDCP information and audio samples).

Sync polarities, sync width, horizontal front and back porch are usually ignored by monitors.


Overscan/underscan still has a meaning even in the HDMI age, as do different black/white level limits, which all originate from the CRT era and its need to account for mis-aligned beams [1]. You may notice it when you plug a PC into a TV's HDMI and don't see the title or task bar, or blacks look weirdly "off". If you ever encounter that, enable a setting commonly known as "game mode", and also check for color profiles, as some TVs don't tie color space / temperature to game vs TV mode.

Bonus points: add SDI or scalers to the mix; now that's real fun to troubleshoot. More bonus points: try to troubleshoot this with just a Blackmagic VideoAssist... awesome things in general, but it's completely beyond me why they don't have the ability to show basic information about the incoming signal. JFC.

[1] https://en.wikipedia.org/wiki/Safe_area_(television)


I've seen TV overscan settings be in menus like "Size", "Aspect Ratio", and "Scan", but never "Game Mode". Which manufacturers call it that?

In my experience game modes are usually focused on lowering display latency at the expense of processing quality.


CRT TVs with DVI-D and HDMI exist, where I'd imagine all those things in the modeline are still useful. I gather DisplayPort signaling is significantly different, however.


I had a WEGA driven by HDMI and overscan was important. The TV didn't fake it, the pixels were drawn outside the bezel, and they could be brought in by adjusting the picture. However, they were very distorted. Everything was fine when compensated for overscan, but some things like Amazon video's menu used every pixel and were cropped.

I had never thought about overscan until I used that TV.


The innovation of DP is that it's the first video standard to virtualize the physical interface of video. All standards before it directly represented the video timing on the physical layer (e.g. DVI/HDMI bit-clock = bit-depth × pixel clock). DP takes the same video signal (including blanking intervals) and chops it up into packets with some stuffing so as to not change average timing, then transmits them over fixed-rate physical links.
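
A hedged numeric illustration of that difference (the 1080p60 pixel clock and the DP lane rates are the standard figures; everything else is simplified):

    # DVI/HDMI (TMDS): the physical bit clock is tied directly to the pixel clock.
    pixel_clock_hz = 148.5e6                # 1080p60 CEA timing
    tmds_bit_rate = pixel_clock_hz * 10     # TMDS sends 10 bits per channel per pixel clock
    print(f"HDMI/DVI: {tmds_bit_rate / 1e9:.3f} Gbit/s per TMDS channel, locked to the video timing")

    # DisplayPort: the link always runs at one of a few fixed rates; the video stream
    # (blanking included) is chopped into packets and padded with stuffing to fill the link.
    dp_lane_rates_gbps = [1.62, 2.7, 5.4, 8.1]      # RBR, HBR, HBR2, HBR3
    video_payload_gbps = pixel_clock_hz * 24 / 1e9  # 1080p60 at 8 bpc RGB
    print(f"DP: lane rate is fixed at one of {dp_lane_rates_gbps} Gbit/s; "
          f"~{video_payload_gbps:.2f} Gbit/s of video gets packetized onto it, the rest is stuffing")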


I recently managed to overclock a pair of xreal air AR glasses (USB-C DP altmode based) to 90hz using custom modelines, so it's still relevant.


I'm always impressed by how well designed DisplayPort is, as a connector and as a protocol.


I really want to feel that way too (particularly compared to HDMI), but I seem to have so many weird issues that just never existed with DVI. On my work issued M1 Studio for example my displays (bog standard Dell 4K) commonly come up at 30Hz or don't get detected at all until an unplug and replug. I've tried all manner of cables and adapters, and it's just not reliable.


This is a macOS issue. In my experience macOS handles DP much worse than Windows. Windows will keep asking the device for its information, while macOS waits for the device to inform the OS. This can cause issues with monitor detection, as you're experiencing.


Does macOS still not support DP-MST properly?


It doesn't support it on purpose, though on Intel Macs alternative OSes could use it.

DDC is also fraught with peril when you want to change settings from the OS on a non-Apple display.


Properly? I thought not at all.


Apparently, if you connect two identical monitors to a Mac via DP-MST, it will display the same image on both monitors.

I'm not sure if it's intentional, but it is at least giving an image.


I had a similar issue with HP 4K displays and docks with M1 MacBooks. There was a macOS update at some point that broke how these devices work through USB-C. One day they worked, the next they didn't unless you connected directly to the monitor via HDMI.

It’s really irritating, as we bought these displays specifically because the previous Dell displays didn’t work well with Macs.

I never isolated it because I hadn’t been to that office for 4-5 months during the pandemic. It’s also miserable as the logging for this stuff sucks and nobody really understands how this stuff works. One of my colleagues had success with a thunderbolt dock.


I'm also constantly having issues with DisplayPort that I don't recall ever having with HDMI. The current one is that the external monitor just randomly loses the connection, and it's a coin flip whether I can restore it without rebooting. It usually happens 2-3 times per day.


Just works for me using open graphics drivers on Linux, across lots of different cards/displays/DisplayPort revisions. HDMI tends to be weirder, at least on Linux, with things like HDCP (EDONTCARE) and color space problems.


> On my work issued M1 Studio for example my displays

Do they behave any better if you only have one plugged in at a time? MacOS seems to have problems with multiple identical displays.


They do. But that’s insane for a $5000 desktop computer that claims to support five displays (I’m using two), and honestly shouldn’t be possible. How complicated do we have to make a protocol for moving pixels to a screen?


The protocol is great and works just fine. The problem is with the OS you're using. Install Windows on the same machine and it'll work fine.


Maybe it will, maybe it won't.

My HP laptop with integrated Intel graphics and an HP USB-C dock would have me do a dance of unplugging and replugging the DP monitor at the right time to get 4K@60Hz under Windows. No, plugging it in while the PC was running didn't work; for some reason, it was the unplug/replug action that did the trick.

Worked perfectly fine under Linux. Then a Windows or Intel driver update mostly fixed it. Now I "only" get weird "out of range" or a garbled screen when waking from sleep sometimes [0], which usually goes away by switching inputs back and forth on the monitor. Still works fine under Linux.

On the other hand, HDMI via the same dock gives weird-looking colors under Windows, but it always worked at 4k@60. Because of the colors, I never actually used it, so I don't know if it survived a sleep/wake cycle. Never bothered to try HDMI under Linux, though.

---

[0] It seems related to how long the PC has been sleeping. If it's only a few minutes, it's OK (say, going to the bathroom or similar - default Windows settings are quite aggressive about putting the PC to sleep). But if it's longer, like an hour or so, I'd say it's 50-50 whether the screen works properly.


I was going to say “install Windows or Linux” but decided against including the latter bit because I didn’t have sufficient firsthand knowledge of the hardware compatibility story there. I should have trusted my gut ;)


DP sometimes has issues with plug detect on pin 20; using cables with only 19 pins helps.


I don't know how many pins my cable has, and never knew there was any such difference. It's the one that came with the monitor.

I chalked it up to a driver issue, since the exact same hardware works great on Linux.

I also have a separate, random Chinese dock off Amazon, which I couldn't get to work under Windows with that particular laptop. It works fine under Linux on that same laptop, and also works fine under Windows with a different, AMD-based laptop.

I haven't tried it in a while, but right after it started working with the HP dock, this particular dock was still not working at all.


It is possible the GPU just does not sleep on Linux.


>how well designed DisplayPort is, as a connector

Whoever thought of the locking mechanism on them can, sincerely, die in a fire.

Bloody things cause me more grief than the weeds in my garden.



