M1 Macs cannot support dual Extended displays through their Thunderbolt 3 ports (caldigit.com)
219 points by juliangamble on Jan 7, 2021 | 260 comments



Meanwhile, Intel Macs can't drive external displays without firing up the discrete GPU, and they thermally throttle under moderate workloads.

They are completely unsuited for multi-monitor workloads without an eGPU.

More specifically, they draw 30W of additional power that has to go somewhere. You can turn down the refresh rate, to say 10Hz or something, but it still has to use the discrete GPU. The cooling solution simply isn't equipped to deal with it, as they have crammed as much processing power as they can into the ultra-light chassis.

Meanwhile you can play World of Warcraft whilst drawing just 25W of power TOTAL on the M1 (estimate, I don't play WoW anymore). This is while the Intel Mac is drawing 70-100 Watts from the socket for just rendering a window to the secondary display.
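
If you want to measure this yourself, macOS ships a command-line power sampler (sampler names can vary by model and OS version; see `man powermetrics`):

  sudo powermetrics --samplers cpu_power,gpu_power -i 1000 -n 5   # five 1-second samples of CPU/GPU power draw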

It's got to the stage that I prefer to use my 13 inch M1 over a top of the line 16 inch provided by work. When the M1 finally supports Docker, I might even consider not using the 16 inch at all. It crashes due to thermals about twice a week thanks to the constant high temperature, and it eats through batteries!

If anyone has a solution to this, I'm all ears, but it feels like every nay-sayer of the M1 is dismissing just how many compromises Intel based laptops (not just Macs) have. I had just as many issues with the Dell XPS, where even basic video games would become unplayable after 15 minutes due to the CPU throttling down to 0.8 GHz.

The M1 has been the first laptop I have been truly happy with as a desktop user.


For what it’s worth, this does not seem to present itself in all current 15/16” chassis models. I use my 2018 2.6GHz i7 15” MBP with two 4K monitors plus the laptop monitor. It’s fine for me. (But I do believe you!)

If it’s useful: It’s fine here including reasonable loads on the video card, like a lot of GPU-accelerated terminals and pretty heavy use of a tiling window manager and Exposé window swishing. Or, say, Sketchup in a browser? Plus a pretty serious IntelliJ setup with the JVM’s stops pulled out, running ReasonML and C++ source compilation, and use of Docker.

It is sensitive to setup, and I can get it to spin its fans up. Docker is especially prone to wasted cycles. And I did seem to have to do some SMC resetting at one point because the fans were a little eager to get going. I also think it makes a bit of difference if I charge through the right-side ports rather than the left.

But the main point I’m attempting to make: I do think your case is solvable.

That said, I have felt that Apple are losing their way in some respects. The fact that there are fan issues with some machines, and that it's not clear how to solve them, is a problem.


> this does not seem to present itself in all current 15/16” chassis models

I think they're referring to the thermal throttling, and not the MBP forcing the discrete card when an external monitor is connected. As far as I know, there is no way around using the discrete card for external monitors (on my 2015 MBP at least; I'm pretty sure the HDMI port is wired directly to the M370X).

I've solved the thermal issues by using an eGPU (like GP suggests)


I have a 2019 16-inch MacBook Pro connected to two displays via USB-C and DisplayPort through a Thunderbolt dock and it seems like it also forces it to discrete graphics. I'm not sure if I've ever seen it use integrated graphics when I'm docked.


16" rMBP Pro with beefed up graphics running Dual LG 5K monitors

When it works, it's great, which is about 50% of the time. When it doesn't, and for no apparent reason, it will often reboot a couple of times (just from plugging in the displays).

Sometimes the banding issue that the M1 Mac owners have described appears on one or both of the displays.

A pretty frustrating experience.


Do you have Power Nap enabled? I had this issue with random reboots when plugging/unplugging displays, and disabling Power Nap fixed it for me.
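
For reference, you can check and toggle it from the terminal too (settings are per power source; see `man pmset`):

  pmset -g custom          # show settings for each power source, including powernap
  sudo pmset -a powernap 0 # disable Power Nap on all power sources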


I do. Just disabled it, thanks for the suggestion.

Already running discrete graphics all the time so no auto-switching.


I own an MBP 16 (i9, 64GB RAM) and an M1 Mac Mini (16GB), and I am still perplexed why so many people ignore or don't mention the sluggishness of macOS on the M1.

Switching apps is often plagued with lag, ranging from noticeable to utterly annoying. Starting even optimized apps sometimes feels surprisingly slow. If the CPU is under even light load (20% reported by Activity Monitor), these issues happen more frequently and overall responsiveness gets much worse. It is not due to low memory, which I try to spare on the Mini.

Yeah, Safari is fast, even faster than on the iPad Pro. But I am a developer, often use multiple browsers and many Docker containers, might spin up an IDE, and I am certainly not the one they made it for.


On my M1 MBP everything feels exquisitely responsive, with the exception of the first launch of apps that need to be translated by Rosetta. So it appears something might be off with your install.


That seems strange to me -- are you sure there's nothing wrong with your machine?

For reference, I just opened every application that's installed on my M1 MBP (16GB) and was able to switch between them with basically no lag.


I am a developer too, but I haven’t noticed any of these issues on my 16GB M1 MacBook Pro. macOS has been nothing short of zippy.

I wonder why you believe that everyone’s ignoring the issues you experience? Could it be some local phenomenon, instead?


My M1 Mac Mini destroys my 2019 16" MacBook Pro (i7). Both machines have a 16 GB memory loadout and 500 GB of SSD. I haven't seen any of the performance issues you're talking about. I do Python and Angular development.


From all the reviews and comments I've seen nobody else with an M1 is seeing that kind of sluggishness. Are you maxing out memory?


Are you sure there isn't something wrong with your setup? Switching apps is near instant on my 8GB MBA M1; app startup is sometimes a bit laggy, but overall responsiveness is much better than on my 16GB MBP 2019.


> I am still perplexed why do so many people ignore or not mention the sluggishness of the macOS on M1.

Conspiracy? Or...something up with your setup.


You are swapping.
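
Easy to verify from a terminal (both tools ship with macOS):

  sysctl vm.swapusage   # total / used / free swap
  memory_pressure       # prints a summary of current memory pressure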


Did you perform a system migration or a fresh install?

If you have an external SSD, you could potentially try a fresh install on it and see if the problem persists, then try to drill down to the offending driver/application.

I for one, have had no issues, but that's not to say they don't exist. Could be a process that is blocking or anything.


The only sluggishness I noticed, which was unexpected, was generating an SSH key.

  ssh-keygen -t rsa -b 4096

This was considerably slower on the M1 Mac Mini than my 2017 iMac.


Interesting; possibly running under Rosetta? I could see crypto operations being very poorly emulated, and if it had to be translated first (if you only ran it once), that would obviously be even worse.
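
One quick way to check, for what it's worth (sysctl.proc_translated is the flag macOS exposes for Rosetta detection):

  file $(which ssh-keygen)          # lists the architectures the binary contains (arm64 = can run natively)
  sysctl -n sysctl.proc_translated  # 1 = the querying process itself is running translated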


Never heard of that before, are you running any background apps?


But the M1 ALSO can't support an eGPU, which is kind of the problem. I'm more than happy to pay for an eGPU dock (have one sitting in the corner from when my main workstation was a macbook). I know not everyone wants to HAVE to pay for an eGPU, but that's a better solution than "You can't have two monitors".

My solution was to dump Apple laptops entirely (which I know a lot of people won't want to do). This Asus G14 has no issues powering 2x 4K displays without even breathing hard. No throttling in games or videos, and it supports 40GB of memory on top of a replaceable NVMe SSD.


>My solution was to dump Apple laptops entirely (which I know a lot of people won't want to do)

I think several will trade in their M1 laptops as the memory and architecture limitations creep up after long-term usage and they realize they're compromising their previous workflow's productivity. The excellent performance (CPU/power) and the surprisingly well-built Rosetta 2 have overshadowed potential limitations with memory and IO in the reviews.

Lightning-fast opening of apps is great, but it's not much use to people like me who don't close their apps at all! I keep at least 3 browsers (for different projects) with several dozen tabs (with hibernation), note-taking apps, an IDE, and a VM with a browser open in it. So what's important to me here is the memory; at least 32GB is the bare minimum.


> I think several will trade their M1 laptops due to the memory

They'll trade them for new M1 Macs that don't have those limitations, which are estimated to be released this year. This obviously isn't a "forever" limitation. And Apple will release Apple Silicon Macs with more memory long before any other Laptop maker releases a laptop that can compete with all the advantages of the M1.


Last I read, there is potential for the M1 to get eGPU support. Other USB-C to PCIe cards have worked so it seems to be a driver issue and not a hardware limitation. Now whether Apple can work nicely with Nvidia & AMD to get drivers, that is a big question.


GPU over USB, if 3rd-party drivers can be had, is certainly a technical possibility (hell, PCIe over Ethernet has been a thing for a while). I just don't know if I'd call it a realistic one, or one worth using for this use case if it did arrive. Especially since it seems extremely likely the scaled-up iterations will be able to do this properly in short order.

I am curious about the other "USB to PCIe cards" though. I know there are NVMe to USB (non-TB) adapters these days, but I think that's largely due to the NVMe protocol being transport agnostic, not PCIe actually being carried over the USB connection.


I was under the impression that Thunderbolt and the latest USB protocols actually are PCIe under the hood. This is why you had to have Thunderbolt security, because essentially you can tap deep into the hardware via a simple external port.


Ah, yep, you're right, it supports Thunderbolt 3/USB 4; for some reason I had it in my head that the ARM version just had plain USB 3.x Type-C.


Drivers are an untenable problem. eGPU on regular Macs is helped by there being existing x86 drivers needed for first-party Apple hardware. There’s not much incentive for Apple or the third parties to write drivers targeting Apple silicon for external solutions that aren’t widely used. The cost would far outweigh the income.


> They are completely unsuited for multi-monitor workloads without an eGPU.

I use an Intel MBP13 inch with two external monitors and it works pretty well. What am I missing?


It might be related to the connection used. I've heard that the issue doesn't present when using DisplayPort as opposed to HDMI, but I have tried different dongles.

And it's not terribly bad, just that I pretty much draw the full 92W the power adaptor supplies and get the mentioned thermal throttling much sooner than without.

Running a bunch of Docker containers, Slack, and a general everyday workflow. Doing all the good stuff like an ad-blocker and checking the process monitor.

Our solution isn't the most lightweight, but still, the end result is that after plugging in an external monitor (1080p), my productivity goes down dramatically, which is the opposite of how multiple monitor setups are supposed to work. :-)
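
For what it's worth, you can watch macOS report the throttling as it happens:

  pmset -g thermlog   # streams thermal notifications; CPU_Speed_Limit below 100 means the CPU is being slowed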


I have a MBP13 from late 2013 connected to two 1080p monitors, one using a Thunderbolt-HDMI cable and the other a Thunderbolt-DVI cable. No thermal throttling or any other issues at all.


I have a few-year-old MBP 15 with a discrete GPU and a more recent MBP 13 without, and neither of them breaks a sweat with two 4K monitors @ 60Hz.

So whatever a “multi-monitor workload” is (video editing or something?) — it’s certainly not universal.


I have one 4k TB3 display, and my old Thunderbolt Display chained off of that one. The second monitor plus the laptop monitor add up to about the same number of pixels as the 4k.

When I'm on a video call, the CPU fans are consistently on at what I believe is the lowest RPM.

I would classify that as 'not breaking a sweat'. I suspect you might as well. I'm not sure if OP expects to be able to use all of GPU memory using only passive cooling, or if they have something else going on.


The 13 inch MBPs don't have a dedicated GPU and the Intel integrated graphics draws significantly less power. Does anyone know why the 16 inch MBPs power on the AMD GPU for external screens?


The AMD GPU is wired directly to all the video outputs


Same; I've worked with two 4K displays on both a 2018 MBP15 and a 2020 MBP16 for the last year, and I've never noticed any of these problems. One connected with DP, the other with HDMI.


Are they 4K? My 13 inch MBP sometimes sounds like a jet engine when connected to a single 4K display, dropping frames like crazy, even when I'm doing nothing on the system.


I have a 2017 nTB 13" Pro, connected to a 4K LG (the USB-C one released with the USB-C Macs), and it runs just fine. It doesn't get warm until I fire up a Zoom meeting (and even then not hot, just pretty warm).


That's... fascinating. I'm using the exact same setup as you and the fans randomly spin up for extended periods of time, even through several reformats and OS reinstallations over the years. I haven't been able to pinpoint any logical reason for this behaviour.


> When M1 finally supports docker

I've been using the preview beta of Docker on M1 for some time now. It works fine for me; some people have reported issues in the corresponding Docker Slack channel, but they usually stem from the particular images in use rather than Docker itself.
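
If you want to sanity-check which architecture your containers actually run under, something like this works (the image names are just examples):

  docker run --rm arm64v8/alpine uname -m                  # prints aarch64 (native)
  docker run --rm --platform linux/amd64 alpine uname -m   # prints x86_64 (emulated via QEMU)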

> It's got to the stage that I prefer to use my 13 inch M1 over a top of the line 16 inch provided by work.

I agree. It's really quite difficult to go back to x86 laptops. If they manage to fix the external monitor limitations with the next generation--and I'm sure they will--it will be really hard for me to justify spending money on anything else.


I’ve tried _everything_, 5 different docks, refresh rates, different displays using different interfaces.

The problem is the VRAM running at full power.

The only solution I’ve found under OSX is using an eGPU. I have both the Core X and the Core X Chroma, and the Chroma works fantastically both as a fully fledged docking station and keeping the output wattage / fans at bay. I’m almost exclusively using my 16” stationary with an external display.

Another solution, albeit expensive, is buying the 16" MBP with the 5600M graphics. That card uses HBM2 memory, which uses around a fourth to a fifth of the wattage of the 5500M's GDDR6 memory: around 4W instead of 19W.


You can buy a 27" iMac together with an M1 Air for the same price as a 16" MacBook Pro and an eGPU.

I don't understand why people insist on trying to convert laptops into desktops.


> I don't understand why people insist on trying to convert laptops into desktops.

I finally bought a desktop again over Thanksgiving after a decade of having a laptop that functioned as a desktop. It's been a great experience.

I don't do anything taxing, so I just got a little Intel NUC i5 with a 24" monitor. Having an actual desktop setup is just head and shoulders better than the "laptop as desktop" I'd been living with.


I remote into a desktop from a laptop, so the power is always there and I can be portable if I need to. It also has the benefit that I can work on any device that has an RDP client (Android, iOS, RPi, etc.).

The laptop only needs to drive the display(s), so anything made in the last few years should suffice. Currently using a Surface Go.


> I don't understand why people insist on trying to convert laptops into desktops.

They want to have just one computer, without worrying about syncing or other coordination.


I'm running the 5300M, that might explain why.

I can imagine that when it came to configuring the machine the decision was `Put more RAM in it, he's a web-dev, doesn't need a fancy graphics card!`

Which I don't, as I'm not going to game on my work PC; but if that 15W is pushing it above the thermal limits of the chassis, then that would have been a decent upgrade.

I'd prefer it without a heavy-lifting graphics card in that case, but that doesn't seem to be an option.


The 5600m is a BTO option that arrived in the second half of 2020, so it wasn’t available when I bought mine.


> The M1 has been the first laptop I have been truly happy with as a desktop user.

I also would be, if there were not some annoying bugs. Like external display flickering after waking up from sleep (interestingly, not when cold booted). Sometimes reconnecting the cable fixes it, but it is still an annoyance.


I've experienced something like that as well. Depending on what you can do, try a different monitor and/or different cable and/or different dongle. I've managed to find a combination that doesn't do that anymore.


Could you share your combination?

I have my problem with OWC TB3 dock and Dell P2715Q. I will try an older dock that I have in the office (Kanex TB2 Express; as the name says, it is TB2, thus extra dongle needed); but playing around with different TB docks can get expensive...


I use a Dell XPS 13 with 3 external monitors via a Dell TB16 dock (which uses external power). I wonder if that setting solves these M1 issues (via the extra power) or not. I understand CalDigit docks also have external power but don't know if there can be a subtle difference.


The OWC TB3 dock I'm using is also powered externally. CalDigit docks are different from OWC's, but I think they all use the same chipset underneath.

It also does not happen with Intel Macs, which use Intel's TB implementation. With the M1, the TB is brand new, so some issues are to be expected. I suspect that with sleep and wakeup there is some state mixup with the previous link training, so it doesn't talk to the other side the way the other side expects. This might or might not be fixed with firmware updates.


Is the flickering display an Apple Thunderbolt Display by any chance? They put out a firmware patch ages ago to fix that (in the monitor itself).

This might be the one: https://support.apple.com/en-us/HT204450


> Is the flickering display an Apple Thunderbolt Display by any chance?

No, an OWC Thunderbolt 3 dock and a Dell P2715Q.

I also have an older, 2015 MacBook Pro with Thunderbolt 2 and the Apple TB2->TB3 adapter, connected to the same setup. This laptop does not flicker.


Is your external display on the Intel Mac in scaled mode? macOS has to do a lot less work if it is at its default display resolution.

FWIW I use Catalina on a 2009 Core 2 Duo Mac mini hooked up to a 1440p monitor in native mode, and the fan remains at idle when just clicking around the desktop.
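
If you're not sure whether a display is running scaled, system_profiler will show it (output format differs between macOS versions):

  system_profiler SPDisplaysDataType | grep -i -A3 resolution

Look for the UI resolution versus the panel's native one.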


Why not just slap the thing on a cooling stand? As a developer with Xcode maxing my CPU 90% of the day, I consider any Apple laptop unusable without additional cooling. Whenever I'm plugged into a monitor, I also have two fans going by the vents. It runs better if you download a program like iStat Menus and force the fans to always be high, too.

Does it suck? Yes. Are fans only $10 to save a $2000 machine? Yep. Apple simply doesn't design their kit to run hard and run often. It's like those internet services that sell higher speed than they can actually give every customer. They depend on you not using it all.


I just ditched my maxed out 16" for a "low end" M1 13" because, in my actual benchmarks, my day-to-day Node compiling is seconds faster on the M1.

For Docker I plan on using a Pi 4 or another PC to just run stuff over the network.
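
For anyone wanting to do the same: the Docker CLI can talk to a remote daemon over SSH (hostname and user here are placeholders):

  docker context create pi --docker "host=ssh://pi@raspberrypi.local"
  docker context use pi
  docker ps   # now lists containers running on the Pi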


What's the point of using a Pi 4 when it is also an ARM CPU? You can use the Docker beta on your M1 Mac and it's going to be a better experience.


1) The problem isn't ARM, it's M1 support. The developer preview still has some kinks.

2) It frees up memory on my Mac. I'm working at home, so using a networked computer isn't a big deal.


> I had just as many issues with the Dell XPS, where even basic video games would become unplayable after 15 minutes due to CPU throttling down to 0.8 Ghz.

Serious Q: Have laptops ever been decent for gaming?

I don't game much, but I think the form factor would seriously crush thermal capacity.

I have a 15" MBP and it's fine for most dev work save Node and Typescript occasionally cause the fans to spin up for extended periods of time. Getting rid of VSCode in favor of more memory/ CPU friendly editors helped a bunch.

Does your work involve a lot of 3D or graphics?


They are good for gaming when manufacturers use a good cooling system. Which is almost never.

HP, Dell/Alienware, Lenovo, all of them have had overheating CPUs for over a decade. GPUs are fine half the time, surprisingly.

Their testing is garbage, what they call normal use seems to be completely different from what power users call normal use. And I'm talking about "workstations", not cheap consumer laptops.

Oh, but the design constraints, you'll say. OK, that's forgivable for Ultrabooks/thin-and-light laptops. Not workstations. And before the thinness fad, they were still overheating. And not because of design constraints.

As an example, I'll bring up the EliteBook 8760w/8770w/ZBook 17 G1/ZBook 17 G2 series, all of which have the same internal design and layout.

All were overheating after a while, due to thermal paste going bad (and being shit from the factory, but I digress).

With the ZBook G2 it got very, very bad. Those Haswell chips with FIVR were mini nuclear reactors. HP realized they weren't "just overheating", but "extremely overheating".

Their solution? The same heatsink with an extra heatpipe. Just like that, poof, problem solved. One extra heatpipe!

That can't cost more than $2 in the factory. I don't know any buyer who wouldn't have paid an extra $5-10 to have a proper working $4000 workstation.

Whoever is saving money on copper and screws in these companies needs to be fired and banned from ever working in companies making hardware.

I'm not even going to talk about Apple, everyone knows their stuff has a ton of problems. But they're selling people on design and software, not quality engineering.

And they actually solved the heat problems by going in the opposite direction with the M1 (ultra low power usage while maintaining performance instead of a good cooling system).


It's a better solution IMHO. A cooling system does nothing to increase battery life. Otherwise you have a creeping race that leads you ever closer to 'fake-desktop' laptops.

I do watch Louis Rossmann's videos about Macs and how terribly they are designed, but this is universal across many devices.

I had so many issues getting low-latency audio to work on my 2 year old XPS, I just plain replaced it with a 5 year old Mac and had better performance and stability out of the box.

No company is perfect, but Apple does get consistently close. I'm glad they don't focus on the specs and focus on experience. Who really needs a 4k screen on a laptop?


> They are good for gaming when manufacturers use a good cooling system. Which is, almost never.

Ok, this is more or less what I thought.

This does seem solvable, but also seems like the sort of thing the bean counters would cut early.

From what I recall, pre-Dell Alienware did a decent job at it.


Nah, I just wanted to play some basic 2D games like Cogmind, Factorio, Slay the Spire and Pyre whilst abroad. It was one of the main selling points of the XPS, that it was an ultra-light that you could play a few games on.

I certainly didn't fancy a gaming laptop that's all compromises and no upsides just to play mindless AAA titles.

But the problem with the XPS is they jammed in as much processing power as they could. It's great in theory having an Nvidia card as well as an i7, but it just can't handle both running above half capacity.


> When M1 finally supports docker, I might even consider not using the 16 inch at all. It crashes due to thermals about twice a week due to the constant high temperature and eats through batteries!

I guess it would be better to wait for M2 or M3 rather than jump on a new system that has a work-in-progress developer ecosystem or un-optimised software (Even if it is faster on M1 than Intel Macs).

By the time that Apple Silicon software support is optimised, the M1 Mac would already be out of date and superseded by M2 or M3 Macs.


While there is definitely a risk that M1 Macs will share the fate of the first-gen Intel Macs (those supported only 32-bit mode, thus their support was cut sooner), the usability and usefulness are surprisingly high.

Docker and VirtualBox (this one is not coming) are the only two apps that do not work for me and that worked on an Intel Mac. Sure, there are many apps that run in Intel mode, but the user won't notice, unless interested in that detail.


That was only because Intel purposefully delayed the 64-bit Core Duo.


Why? (I’m too young to know)


I would not say that it was purposeful.

Intel at the time was shipping P4 Netburst CPUs. They had all the features, like multiple cores and 64-bit support, but also one huge disadvantage: they were power hungry and dissipated all that heat. Definitely not something you could put into laptops.

So the next generation, Core Solo/Duo, was not a continuation of Netburst but of Pentium M, their mobile line, which itself forked off P6 (Pentium Pro). They were released in January 2006 and were indeed 32-bit. The reason was time to market: they were basically throwing away their current arch and forking from an older branch. For the majority it was not a problem; most people were running 32-bit XP anyway, with PAE if you happened to have more than 3.5 GB of RAM (pre-SP2 XP supported that). Then the 64-bit Core 2 Duo was released in July 2006.

The unfortunate part is that Apple announced the transition to Intel in mid-2005 and shipped products across all lines from January to August 2006, i.e. hitting the window when only the 32-bit version was available. That means they had to define and maintain an i386 ABI for their OS just because they ever shipped Core Solo or Duo machines. If they had been able to ship the C2D from the beginning, they could have skipped i386 and just used the amd64 ABI from the start.


Thanks! I appreciate the history lesson


Same issue with the XPS and throttling; you can bypass the throttling manually through an app - and disable SpeedStep and one other thing...

I can't recall, but I never could finish Persona 4 on my XPS. :( 9350.

Agree 100% w/ 16" MBP.


If you're still having this throttling problem (I have also, for about 2 years now), I finally found a solution. Open up ThrottleStop, click FIVR and check "Disable and Lock Turbo Power Limits". Finally an end to the constant 0.8 GHz throttling!

I think Dell must hijack those TurboBoost power limits as a way to massively underclock the processor and try for better advertised battery life.


I had problems with an older (Intel) Mac Pro and using monitors on both HDMI and DisplayPort at once. That wasn't an Intel thing, and that sort of thing could happen with any hardware that isn't tested with every combination of other hardware.

I'm excited that the M1 is good, also- that it seems fast with everything, even the old Intel-specific code via Rosetta. I don't subscribe to Apple NIH[1] philosophy, but what they have made historically, barring the problem I mentioned above and a few macOS and iOS issues over the years, has been awesome. The M1 seems to follow Jobs' philosophy of making the best product.

[1]- https://en.wikipedia.org/wiki/Not_invented_here


Um. There are more options beyond PowerPC/Intel/ARM Macs. There's a whole world of PCs out there that aren't built out of corner cutting proprietary bits.


This ain't my first rodeo.

I've had the following in my life

* Pentium 4

* 21.5" iMac

* 27" iMac

* Another 27" iMac

* Self built 2700K Intel

* Dell XPS 9560

* Self built 2700X Ryzen

* 2015 15" Macbook Pro

* M1 Macbook Air

For my work PCs, I had a Dell desktop, an Asus laptop and 2 MacBook Pros.

If selecting a PC for work, I would never go for a Windows PC nowadays. You'll get the cheapest option available, filled with all the corporate crap you don't want, and locked down to oblivion.

Macs have a minimum quality bar, and don't tend to get hamstrung quite as easily by company policy. It's just not worth it unless you know for a fact you'll get to choose the machine and software yourself.

As for at home, I'm flexible, but I wouldn't go for Mac if I wanted to game, and now in 2020, I wouldn't go x86 if I needed battery life.


>If selecting a PC for work

Well there's your problem. Don't define your choices by what you're coerced into using at work where you only go because they pay you. Define yourself by what you choose freely as a person.

It sounds like you've self built before, so why not again? It'll be far more compatible and with fewer proprietary bits than using something from Apple.


FWIW, I am currently using a CalDigit TS3+ with a (Intel) MBP, with dual displays (one over DP, one over USB-C (DP)). They work as separate displays, not mirrored.

I also tested an HP G2 Dock [1], because its dual-DP ports gave me more options, only to find that macOS can only use those two ports as Mirrored to each other.

Apparently [2], this is because the HP G2 implements its two DP ports as DP1.4 MST (Multi-Stream) on a single link, and in that mode macOS will only support a Mirrored configuration. It cannot (as of Big Sur 11.1) use them as separate displays.

I'm told you can also use the HP G2 with separate displays by plugging one monitor into one of its USB-C outputs, kind of like how I'm using the CalDigit TS3. I tried the one labeled USB-C (DP), and that stayed mirrored, but apparently I should've tried the Thunderbolt downstream port instead. I returned the dock before I knew or could try this.

All this to say, I'm disappointed by the mess of an ecosystem that Apple has built around Thunderbolt 3. But I can't say I'm surprised.

[1] https://store.hp.com/us/en/pdp/hp-thunderbolt-dock-120w-g2

[2] https://medium.com/@sebvance/everything-you-need-to-know-abo...


Intel's TB3 controllers can't both do TB3 daisy-chaining and provide two DisplayPort endpoints. That's why most TB3 docks with two DisplayPort ports don't have a second TB3 port, and why the HP had to use MST to offer both. It's also why 5K+ displays may or may not support daisy chaining, depending on whether they use two DisplayPort links.

MST isn't necessarily great because over an HBR2 link it doesn't have the bandwidth for dual 4k60, and there's still a lot of stuff that doesn't support HBR3 links. And MST + hotplug is still a mess. So you're just trading for a different set of confusion.


Well the fact that Apple doesn't want to support MST on macOS doesn't help either. Funny enough, installing Windows on the same Mac makes MST work just fine.


Their push has always been towards Thunderbolt. I was disappointed in this decision, but not really shocked.


Apple advertised MST support for Sierra or High Sierra and never delivered. Like you mentioned, daisy chaining does work on the same hardware in Linux and Windows.


I'm happy with the CalDigit TS3+. I just got one and it's able to drive two external 4K 60Hz displays (one USB-C, one DisplayPort) from both my Intel MBP and my Dell XPS 17, hot-swappable. In the latter case, that's three 4K-or-better displays including the laptop. (I didn't realize the XPS actually had >4K resolution until opening the display settings tool.)

I just started with this configuration two days ago, so we'll see. It's not completely glitch-free yet: I noticed the mouse (also plugged into the TS3+) glitched a bit yesterday, and once one of the monitors couldn't be detected from the XPS and required a reboot. The XPS also shows a low-power charger warning in the tray when plugged into it, which I wish I could disable, but oh well. Still, it's worked way better than I'd expected so far with no hassle. Completely replaces my desktop setup, and hot swaps are generally seamless.

I haven't noticed any major CPU or power spikes. Fan isn't doing anything weird. But I'll look out for it.


Just in case: the OWC Thunderbolt 3 to Dual DisplayPort adapter [1] works fine with two separate 4K@60Hz displays (with a 2019 MBP15), and it seems similar Thunderbolt 3 -> dual DP adapters should also work well. But I agree there is a big mess with that stuff; I bought a lot of USB-C -> HDMI adapters before finding a solution that works with 2 displays (some of those adapters only support 30Hz, and some can slow down MBP performance dramatically with 2 displays).

[1] https://www.amazon.com/OWC-Thunderbolt-3-Dual-DisplayPort-Ad...


The M1-based macs support USB4, which is conceptually an upgrade over TB3 (supporting multiplexed streams of PCIe, DP, and USB). However Apple conspicuously does not prominently advertise that support, which I'm guessing is due to a lack of ecosystem.


USB4 is prominently displayed on the Marketing and tech-spec pages for the M1 Macs:

https://www.apple.com/macbook-pro-13/specs/

https://www.apple.com/macbook-pro-13/


I would have expected to be able to daisy-chain 2 Dell monitors (WQHD, nothing 4K-fancy), but I was unable to. Also, finding a dock with reliable 2-monitor output has not been easy, and in the end I plugged both monitors into the laptop and called it a day. That the laptop has to power up the discrete graphics to run external displays, and goes crazy with the fans while you are just browsing, is something that exasperates me. But while I'm not really a fan of Apple, I'm not blind, and I see that what they have accomplished with the M1 is an exceptional product; sooner or later I will ditch my 16 inch for one of those. The only thing is that almost nobody will want to buy an Intel MacBook, so those aren't going to maintain their value like before.


I'd like to add to this my disappointment in using a G2 dock. I have had two different HP G2 docks (using DP and USB-C) for dual external displays, and they both broke after two years of sitting on my desk. I don't know what's happening to them, but eventually both external displays start flashing and getting artifacts, and USB and Ethernet devices start randomly disconnecting.

Just my experience. Hope others haven't had those issues.


I recently switched to a single 6k monitor after using dual monitors for 10+ years. A single high res monitor is so much better ergonomically.


Personally I switched to a 32:9 ultrawide. I use the Samsung CRG9, which I originally bought for gaming. Interestingly, I found it to be subpar for gaming (game HUDs are typically glued to the left and right of the screen, which is so far into your peripheral vision it's effectively useless), but I found it amazing for productivity. Having 4 windows side by side has made development so easy. I can have Figma / VS Code / Chrome / terminals all open side by side. It means I almost never have to do window management, whereas with a lot of the Apple displays I still find myself only having 2 windows up at once. The DPI leaves some things to be desired, but for real estate it's been unmatched for me.


I also recently bought a Samsung CRG9. It replaced 2 27" Apple Thunderbolt displays. I bought it because I needed something that had Mac and PC support.

Having a curved ultra wide display is much better than 2 smaller displays. I can put my main windows right in the middle of my field of view, and put my reference windows off to the side.


Back before I got the CRG9, my dual monitor configuration was to have one directly in front of me, and the other off to the right and tilted towards me.


My experience with the CRG9 mirrors yours exactly. My only complaint productivity-wise is similar to the issue with gaming: because I'm using Windows, all of my taskbar icons are shoved over to the left.

For the games with HUDs stuck to the sides, I just turn down the horizontal resolution and have black bars on the left and right. But for games like FS2020, using the full display makes the game that much better.


I'm not a Windows person, but in places where people post screenshots of their dev setups I often see the taskbar icons in the middle, so it seems there is a hack or third-party tool to achieve that.

Now if only macOS notifications for important meetings weren't in the top-right corner...


Single 43" 4K monitor here and I'm not going back to anything smaller in the future. It was a bit of a compromise for me in terms of game performance (I'd have preferred 144hz over 4K really but less than 4K at this size would be poor). For the first time ever I feel like I always have enough screen space.

The big advantage over an ultrawide for me is that I have a lot of vertical space, which is great for code. It's basically 4x 1080p displays in a 2x2 grid.


What monitor do you have exactly? Looking to potentially get one myself.


I have the LG 43UN700, and I paid about 700GBP at the time, I think. Brief review:

I love it. Particularly excellent for photo editing (it's so huge I have to zoom out sometimes) and programming. There is a slight visual oddity on the very outside 1/2px caused by the viewing angle of the pixels out there, which I think they'd only solve by curving it - personally I only even noticed it because a reviewer mentioned it. I also wish all the HDMI ports were 60Hz (2 are, 2 are 30Hz) but that's just a nitpick.

I play games on it too and it's generally very enjoyable. It has un-advertised FreeSync support, interestingly, which I find works well with my AMD card under Linux, but it does have a very limited range of like 60-70 I think. There's a different setting on my Linux drivers whose name I can't remember that basically means as long as you're over 60FPS you'll never see any screen tearing, which I can attest works for me with it.

The stand is very solid, and the speakers are decent too. I love that it comes with a remote for operating it. It's so much better than using the nubbin under the display.

The input splitting modes are really quite good, surprisingly. Being able to display 4 inputs in a 2x2 grid is awesome, although I use a tiling window manager so I don't really use it much. If you were feeling particularly adventurous you could attach 4 inputs from a single machine and it'll recognise it as 4 separate displays :D


Check the reviews on YouTube for the Dell UltraSharp U4320Q.


Yep, I took a punt on a 32" 4K one recently to see if it'd do better than getting a second 27" 2K one (3 screens including the laptop) and I've actually stopped bothering with the laptop screen too. Saves me a big chunk of mental space just knowing everything is (usually) on the one screen.

Not defending Apple, it's a pretty clear hardware failing.


Do you use yours in native or scaled resolution?


4K at anything below 43 inch is unusable for text without scaling.

27 inch 2K or 43 inch 4K are the sweet spots.

Here's how you can check if your monitor needs scaling:

- Find out the DPI of your existing monitor and your desired monitor. For example, 24" 1080p is 91.79, and 27" 4K is 163.18

- Calculate the ratio: 91.79/163.18 = 0.5625 (this means that on the 27" 4K monitor everything is 56% of the size of the 1080p)

- Now take a screenshot of your monitor (at 100% scaling) and resize it to 1920x1080*0.5625 = 1080x607. View the resized image at 100%. That text will be the same physical size as on a 4K 27" monitor set to 100%.

Without scaling, most people would find text too small on both 27" and 32". You would need a display as big as 40" to get the same density as 1440p on 27" (~110 DPI). For most people, 27" requires 150% scaling and 32" requires 125%.
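
If you'd rather compute the PPI directly than resize screenshots, it's just the diagonal in pixels divided by the diagonal in inches (bc ships with macOS and most Linux distros):

  echo 'scale=2; sqrt(3840^2 + 2160^2) / 27' | bc -l   # 27" 4K    -> ~163.2
  echo 'scale=2; sqrt(1920^2 + 1080^2) / 24' | bc -l   # 24" 1080p -> ~91.8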


> 27 inch 2K or 43 inch 4K are the sweet spots.

27" 2K is gross. Compared to 4K, it looks so pixelated that I feel like I'm back in the 90s. 27" 4K at native resolution is far too small for me to read, but it looks great at 150% scaling, or even 200% for some things.


> 27" 4K at native resolution is far too small for me to read, but it looks great at 150% scaling, or even 200% for some things.

The problem is once you start scaling up you're losing real estate. 4K at 200% scaling gives you the same real estate as 1080p. Sure, the picture will look better, but you're not gaining any improvements in being able to open and see more things at once. That, and some apps can't handle scaling properly.

Personally I find 25" 1440p at native scaling to be a good balance of real estate, sharpness and ease of reading anything comfortably (even low contrast status bars).

With that setup you can open 4x 80-char side-by-side code windows in Vim, VSCode, etc. You can also grab a really good 25" 1440p IPS panel monitor for around $300. I wrote a huge guide on picking monitors for developers at https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo....


Nice coincidence running into you here. I did come across your blog while researching! Thanks for that.

I agree 100% about the real estate issue. Physically larger monitors with equivalent real-estate don't make much sense for me personally.

I wish there were more 27" screens available since 25" looks too small to me and 32" requires some neck movement which I don't like.


> The problem is once you start scaling up you're losing real estate. Sure the picture will look better but you're not gaining any improvements in being able to open and see more things at once.

Depending on what you're doing, that might be exactly what you want. I use my 27" 4k display to read PDFs zoomed in to fit two pages side-by-side onscreen. That way, the clarity of the rendering is good enough to be an alternative to print.


>27" 2K is gross. Compared to 4K, it looks so pixelated that I feel like I'm back in the 90s.

I don't know, that sounds a little embellished at first read. Either you sit way closer than ergonomic norms or you have irregular vision.

Should you be able to notice a difference? Yeah that's plausible. To see pixelation at 1440p doesn't sound right however. Claiming it is like the 90s borders on absurd.


27” 2k is acceptable, 27” 5k at 200% scaling is great. 27” 4k with 150% scaling ok.


Sadly, fractional scaling is imperfect and looks blurry to me on both macOS and Windows (and GNOME doesn't support fractional scaling, last I checked).

I would love a 5K 27" screen at a reasonable price. :(


> GNOME doesn't support fractional scaling (last I checked)

It does, but it is hidden behind a dconf key, except maybe on Ubuntu 20.10. It works nicely for Wayland applications, but X11 ones are blurred (as in upscaled from @1X). X11 includes Chrome and all the Electron apps, so it would not be a popular choice... hence it is hidden by default.

If you want to try it, it is `gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"`
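
To check or undo it later:

  gsettings get org.gnome.mutter experimental-features     # show the current value
  gsettings reset org.gnome.mutter experimental-features   # revert to the default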


I use macOS. 2880p on 4k 27” is a poor man’s 5k :(. It’s slightly blurry but I think it looks better than 1440p on a 1440p 27”


I use a 31 inch 4K display at native resolution and it's perfectly usable and text is perfectly readable, although when all my other displays are HiDPI/Retina, everything looks pretty chunky in comparison. My ideal monitor would probably be a 30 inch 8K display running at 2x scaled mode, but we're many years away from that being affordable.


The point is to use scaling; fonts at @2X are much easier to look at than at @1X.

Also, your DPI calculations are Windows and Linux specific - they assume 96 dpi at @1X scale. Mac started with 72 dpi for @1X, so @2X is 144 dpi. 27" 4K is slightly above that, thus perfectly usable at @2X scale. For Windows and Linux, it is better to use 150-175%.


Sorry, I should have said PPI (pixels per inch) - as calculated by dividing the diagonal resolution in pixels by the diagonal screen size in inches.

Thanks for catching that. The rest of the comment still remains.


I assure you that 27" 4K at @2X (i.e. 200%) is perfectly fine... with macOS ;).


Native, but with text sizes bumped up on the things I'm staring at a LOT (e.g. VS Code). Scaled is a major resource hog in my experience; it might be better if your MacBook has a dedicated GPU.

My next monitor probably will be 40" 4K but sitting a bit further away.


I've got the same thing; I'm running a 32" 4K at 1x resolution, so the desktop thinks it's 3840px wide.

I've got a hacked version of rectangle.app giving me 3x2 tiled windows. Some stuff will be in 6ths, some will be in full height 3rds. And occasionally, I'll get a spreadsheet that needs full screen.

Web browsers tend to run at normal res, terminal and emacs are using Anonymous Pro 14pt. At that size, terminal is 157x122 when running in a full height third. Works pretty well for my aging eyes.


The first time I ever really felt like 2 monitors was an improvement over one huge monitor was after wide screen monitors were ubiquitous and monitor rotation support was rock solid.

These wide monitors are really not very good for watching logs, or other status displays. Rotating a second monitor and dedicating it to showing lots of logs or several rectangular monitoring screens tiled vertically hugely reduced the amount of window management I had to do.

In fact I kind of wonder why all my monitors are horizontal again. Probably because horizontal works better for personal computing, or I have my monitors arranged wrong, or both.


I have a 21:9 ultrawide, not 32:9 or whatever, and it's _also_ the highest screen I've ever had at 3840x1600. Going from 1440px on dual 27" screens to 1600px was really nice for log reading.


I am fairly certain the only CRT I ever owned as an adult had 1600 vertical pixels. Basically I've been dealing with shorter screens for fifteen years. It's nuts.

Even now, my laptop defaults my 4k display to 1080p Retina mode. So I get 5 more lines of terminal on my old display, but the text is not as easy to read. I get the same number of lines on the new display if I go down one font point, and it's still more legible. But I'd really love to be able to scroll logs in an 180x160 terminal window, instead of 200x75.


Agreed, I've been using an MSI Prestige PS341WU, which is a so-called 5K2K, so a 4K but wider, and I have no complaints.

I use Moom to create three screen regions (there's an undocumented command-line switch to allow a 36-based zone, which I broke up 11-13-11), and it resets everything when I plug in or out. I do have to juke the text magnification for that, which I have on macro keys on my Ergodox.

Now if only I could get focus-follows-mouse, everything would be perfect... I've only been complaining about that for fifteen years though, so I'm sure it'll happen one of these days.


Are we talking about this one: https://www.apple.com/no/shop/buy-mac/pro-display-xdr ?

You can buy 20 full-HD gaming screens for the same amount of money. Or 12 4K screens. And while they may not look as nice, they have a much better frequency. 60Hz feels laggy.


One thing that's really surprised me is that I bolted my old 60Hz monitor to my current 144Hz and the difference is so bad my eyes almost hurt to look at it. My eyes actually physically struggle with the 60Hz now if I have both in my FoV


There are flicker free 60Hz monitors.


That doesn't even make sense. You may be less susceptible to the refresh rate--many people are--but for someone who notices it, the only solution is a faster refresh.


LCDs don't refresh like CRTs do. The frame is basically persistent until the next frame comes.

What _does_ "refresh" is the backlight in the sense that the brightness is controlled by PWM. However, there are many monitors that use PWM frequencies in the several hundred Hz range, or don't use PWM at all.


It makes sense. The rate is the rate at which new imagery can appear on the screen. It says nothing about how much of that 1/60 seconds it's visible, and it could be all of it. By comparison, on a CRT not even all the pixels are lit up at once, so 60Hz on a CRT is unbearable to me (I could bear 75Hz, but 85 was nicer), while on an LCD I'm okay with 60Hz.


Gaming screens typically have terrible viewing angles and poor picture quality, optimizing for low input latency and high refresh rate. While I probably wouldn't buy a very expensive monitor at this point that didn't have a high refresh rate, I'd certainly pay for picture quality, viewing angles, resolution, and maybe just appearance before refresh rate. As an external display, you'd need GPU power to drive the display anyway, and I'd take my 30" Dell 2560x1600 IPS over some PoS 1080p 120Hz TN panel.


There are gaming screens that strike a very good balance of features. The Dell S2721DFG, for example.

27" 1440p, IPS panel with wide viewing angles, 165 hz and FreeSync 2, DCI-P3 color gamut. RTINGS describes it as "excellent response time" and "outstanding low input lag."

https://www.rtings.com/monitor/reviews/dell/s2721dgf

You get the typical IPS tradeoffs: it doesn't win any prizes for contrast, HDR, or black uniformity. I'm very happy with mine.


Modern high refresh rate displays (100/120/144 Hz) come in VA and IPS flavors and good ones are factory calibrated to UltraSharp levels as well. 2560x1440 monitors are the sweet spot in terms of price to performance.

Your information is 2-3 years out of date.


Can you please recommend a VA panel that is good for text sharpness, too? My eyes don't react well to IPS, for some reason: I used to have a BenQ GW2760HS (1080p, VA panel) and then upgraded to a Dell UP3017 (2K, IPS) and find it harsher on my eyes, even if I enjoy the text sharpness!


> My eyes don't react well to IPS, for some reason

In what way? If it's just discomfort using it for long periods of time, there's a 95% chance you need to turn on the lights in your room or turn down your monitor brightness.

> Can you please recommend a VA panel that is good for text sharpness, too?

Some subpixel arrangements might require you to change the font smoothing (Windows ClearType), but any monitor with high pixel density will do, really. I'm not an expert on specific models so I won't recommend one.


> In what way?

I do not face the same eye fatigue after hours of sitting in front of a VA panel compared to sitting in front of an IPS one.


If there were other 6k monitors I'd consider them.


For 1/30th of the cost you can get a 1440p monitor (based on UK prices).

I do agree that a single monitor is more comfortable.


1440p?? Someone who is buying a 6k monitor is not going to be comfortable with a monitor of such a small resolution.

1440p monitor has like a fifth of the pixels a 6k monitor has.


- The parent comment brought up full-hd monitors (1080p), which have 2/3 the pixels of 1440p.

- The parent-parent comment suggests that a single high resolution monitor is a good solution for people in general. I am suggesting that a 1440p monitor at 27-inch is a better choice than a 6k monitor, for people in general, taking cost into account.

- You could get five 1440p monitors for 1/6 the price of this 6k.

- As resolution increases you get diminishing returns.

- 1440p, 27-inch is a sweet spot, imo.


Without Windows PowerToys it's so much worse workflow-wise. Currently running a 49 inch ultrawide at 5K and 2 smaller screens as ancillary. Always losing windows and such.

Also, I installed an app to allow for more resolution flexibility, and 120Hz "feels" so nice, but my setup only supports 120Hz at max 4K IIRC.


I find a single small size monitor to be superior.

I switched from a larger monitor to a smaller one with the same resolution and not having to turn my neck any more was rather pleasant.


Especially on macOS with its crappy window manager.


Spectacle isn't being actively maintained, according to their webpage. I use BetterTouchTool (https://folivora.ai/) for this.


Rectangle (https://rectangleapp.com/) is Spectacle's successor, works great, and its source is available as well (https://github.com/rxhanson/Rectangle).


+1 for BTT, particularly in combination with Karabiner Elements and goku to handle Karabiner configs in a good way.

If you like tiling window managers, you can get a very similar experience (in terms of using the whole screen and being able to flexibly throw things all over the place) with this kind of setup.


Use spectacle app. It's awesome!


Rectangle is an alternative that's actively supported: https://github.com/rxhanson/Rectangle



Love it but important to note "Spectacle is no longer being actively maintained" and it has been like that for over a year


Yup, hence Rectangle, mentioned above: https://github.com/rxhanson/Rectangle


Spectacle is deprecated, Rectangle is a fork that is still maintained.


spectacle is no longer maintained, but the open source alternative rectangle.app works just as well. https://github.com/rxhanson/Rectangle


Since everyone is recommending Spectacle/Rectangle, I just want to chime in that I tried that, and Yabai, and ended up settling on Moom.

What you can't have is focus follows mouse, however. Moom gets me everything else I want.


It's worth noting that there are workarounds, although none are particularly great:

1. The Mac mini form factor officially supports two external displays. One must be DP/TB3, the other must be HDMI. However, I would argue that the Mac mini is the least enticing form factor in the new line unless you have very specific requirements, and the need for two different protocols is both annoying and confusing.

2. Sidecar works just fine. This is my current solution of choice.

3. You can configure a Raspberry Pi to act as an AirPlay display. The software I attempted had a hardcoded resolution, and any attempt to change it failed.[0] You can see from that issue debugging is still a work in progress, but it's fairly tedious, and I haven't made much progress since my last reply--all of my attempts have failed to produce meaningful info. However, if you have a screen with a very common 16:9 resolution officially supported by lots of Apple devices, this will likely work quite well.

4. You can use an Apple TV as an AirPlay display target, provided your monitor is one of the supported 16:9 resolutions. 1920x1080 is a safe choice.

5. You can get an AirPlay-capable display. I haven't really looked into this, as it seems like a fairly limiting criterion. I suspect my options would be limited to projectors and overpriced TVs. I could be wrong--again, I haven't checked.

6. Some people have reported being able to get DisplayLink working.[1] I plan to try this, but I haven't gotten around to figuring out which of my adapters support DisplayLink. I'm hoping I'll be able to get this to work with the adapter plugged into my CalDigit TS3+ dock so I only need one cable.

Side note: despite the limitation, I got a CalDigit TS3+ just to dock my M1 MBA. My Intel-based MBP had four TB3 ports, which meant I could connect it to my dual-monitor KVM with three cables (2x DisplayPort + 1x USB) and still have one for power. However, even with only one DisplayPort cable, there still aren't enough ports on either the M1 MBP or M1 MBA. It works great, aside from the fact that I'm wasting a monitor.

[0]: https://github.com/FD-/RPiPlay/issues/138#issuecomment-74523...


I wrote a comment on DisplayLink below: https://news.ycombinator.com/item?id=25675013

It works, and it's the only option, but you won't like it.


Re #5, pretty much every 2020 TV from Sony, LG, Vizio, Samsung, et al. supports AirPlay 2. This was largely a result of Apple's push with AppleTV+.


Incidentally, I was just looking at M1 Mac mini display support. I have an LG UltraFine 4K USB-C monitor and a 5K DisplayPort one (not the LG). The M1 can support 5K only through USB-C. That leaves HDMI to USB-C so the LG can be connected too. It seems the LG speaks DisplayPort over USB-C, so there might be a way. Adapters would be some $60-100 just to test this, though.

Other than that I'm all set for the M1. My 2018 MacBook Pro needs power 2-3 times a day no matter how much I try to weed out apps. If something remotely strenuous happens, the fans take off.


The M1 seems to be pretty heavily I/O limited - the number of supported Thunderbolt ports is way down (2 on every M1 device and no more), and screen support is down as well.

Hopefully they fix it in the next iteration.


It might be worth noting that M1 Macs actually have two Thunderbolt controllers for 2 ports (a dedicated controller for each port), whereas previous Macs shared two Thunderbolt controllers across 4 ports. The 2-port limitation feels to me like a design decision rather than a hardware limitation (to differentiate from an upcoming higher-end model?).


The entry level MacBook Pro has always had only 2 ports, so this fits with the existing lineup. Most likely the next models to switch to ARM will get 4, like they do now.


The distinction is moot since hubs don't exist, so I'm still facing the problem of a lack of physical ports.

At best, the hubs on the market give you one extra Type-C port, and only for power.


There's a recently announced OWC Thunderbolt Hub[1]. I would still prefer to have 4 ports built-in even with two controllers rather than having to deal with hubs, though.

[1]: https://eshop.macsales.com/shop/owc-thunderbolt-hub


I did see this, and it's a bit sad that it's been something like 4 years since USB-C arrived and hubs still aren't generally available (even if just for USB and not Thunderbolt). I mean, I didn't fully adopt USB Type-C just to convert the ports back to Type-A.


This is mostly because Thunderbolt hub support was optional in the Thunderbolt 3 spec, but is mandatory in USB 4. Apple just added support for it in Big Sur.



Before USB 4, the USB-C spec itself said "see also USB 3.1" for USB hubs and disallowed passing through Alternate Modes (e.g. Thunderbolt/DisplayPort) or Accessory Modes (e.g. audio dongles), which vastly limited the usability of the USB-C port. USB 4 introduces a new discovery protocol with added support for passing Alternate Modes through (as long as the downstream device is also USB 4).


Which leaves me deeply confused.

Daisy-chaining had to be supported. From the host point of view, a hub is barely any more complicated. Why was it optional?


My understanding is that daisy chaining doesn't require a mixed USB and Thunderbolt signal, only Thunderbolt. Before USB 4, the USB-C spec itself disallowed hubs from passing through an Alternate Mode, and even a Thunderbolt daisy chain has to terminate at any point where USB is plugged in. In this case, I believe you can have a Thunderbolt 3 hub that only carries the Thunderbolt signal, but it will only work with Thunderbolt devices and not USB-C.


But you don't need to mix signals. When you use a thunderbolt dock right now, doesn't it use pure thunderbolt as a backhaul, and have a PCIe-based USB 3 controller inside of it?

If that's right, then even without mixed signals you could have a pure-thunderbolt hub, and you could have a device with several thunderbolt ports and several USB 3 ports. And it should be straightforward to combine them into dual-purpose ports. The rules for passing through alternate modes don't matter because you're not passing through, you're not a USB hub, you have your own USB root.


That indeed could work, and it looks like Belkin's Thunderbolt 3 Express Dock HD[1] actually does this. Though in this case, Thunderbolt is limited to just 2 ports (1 downstream and 1 upstream).

I guess the biggest benefit of a USB 4 hub is that it allows the hub to have more than 2 Thunderbolt ports (the upcoming OWC Thunderbolt hub has 1 upstream and 3 downstream) and you no longer have to worry about unplugging a device breaking the chain (a usability problem).

[1]: https://www.belkin.com/us/p/P-F4U095/


Which is fine, as they were all positioned as entry-level machines in the new ecosystem. The Pro label was a bit odd, but no matter how efficient the new silicon is, it was shown that a fan is still needed to reach full potential, so it provided a means of product separation.

What becomes important is whether this constraint is carried forward into larger machines. I cannot believe the M-series chips are constrained in how many ports they can support, as that is usually all in the external chipset. The real issue I think people are looking at is memory expansion: will it always be "what you buy is all you get", or will there be user expansion available?


What's new here? The non-touchbar 13" MBP has always had only 2 ports, as has the Air.

They added the touchbar to the 'low end' 13" MBP but it just happens to have better performance than the Intel 13" MBP because the Intel one hasn't been replaced yet - it's still the "low end" model.

The only one I'm missing is the Mini -- I'm not sure how many TB ports it had in the past or has now.


The Intel Mac mini had two Thunderbolt ports but 4 USB-A ports. The M1 mini only has two USB-A ports. I had a ton of USB stuff plugged into my old mini so the dearth of USB-A ports is an annoyance.


I was a bit surprised too that my new fastest-processor MacBook Pro only has 2 ports. To make matters worse, Thunderbolt 3 hubs don't actually exist!


That "low end" 13" MBP has always only had 2 ports. It just happens to be that the low-end is faster than the high end during this transition :)


Could you elaborate on the non-existence of TB3 hubs? What is the true nature of a colloquial "TB3" hub then? DP + USB 3 + Ethernet?


Basically I googled for why there aren't any hubs that add more Type-C Thunderbolt ports (or even just Type-C USB ports), and apparently it's because no chipset for it has been released yet.

You'll notice pretty much all the "hubs" on Amazon just take a Type-C port and turn it into a bunch of old USB-A ports and other non-Type-C ports. At best, they replace the Type-C port they take up with a power-delivery-only Type-C port.

I think OWC is about to release one that takes 1 thunderbolt and turns it into 3, but it’s large and seems to need external power.



You pretty much need a $300 TB dock.


I think this was clearly stated; nothing shocking about that, right?


While it's stated in the tech specs, it might still be surprising for current MacBook users. The M1 was loudly marketed as strictly superior to the previous CPUs/GPUs, so regressions in I/O and screen support might easily be missed.


Right, I think the title should change to indicate that the link dates from November, just after the M1 Macs announcement, and contains nothing new.


The MacBook Air, the Mac mini, and the baby Pro are the M1 options right now, so people are trying to bodge them into being something they aren't.

I've heard a lot of stories of people who bought them not expecting to replace a 15" MBP or even iMac with a MacBook Air. Then once the bits hit their shiny new CPU they went whole hog.


Also, USB-C on the M1 is slower than on the 16-inch MacBook Pro.

https://www.google.com/search?q=m1+usb+speed&oq=m1+usb+speed


Wi-Fi is also slower; M1 Macs have only a 2x2 MIMO chip (Broadcom BCM4378). In AC mode (5 GHz, HT80), it maxes out at an 866 Mbps link speed. Intel Macs are faster in this regard.
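
For reference, that 866 Mbps figure falls straight out of the 802.11ac PHY arithmetic under best-case assumptions (VHT MCS 9, an 80 MHz channel, short guard interval) - a rough sketch in Python:

    # 802.11ac per-stream rate = data subcarriers x bits/subcarrier
    #   x coding rate / symbol time (assumes VHT MCS 9, 80 MHz, short GI)
    subcarriers = 234       # data subcarriers in an 80 MHz channel
    bits_per_sc = 8         # 256-QAM
    coding_rate = 5 / 6
    symbol_us = 3.6         # symbol duration with short guard interval
    per_stream = subcarriers * bits_per_sc * coding_rate / symbol_us
    print(f"1 stream:  {per_stream:.1f} Mbps")      # ~433.3
    print(f"2x2 MIMO:  {2 * per_stream:.1f} Mbps")  # ~866.7, the M1 ceiling
    print(f"3x3 MIMO:  {3 * per_stream:.1f} Mbps")  # ~1300, like older MBPs

So a third spatial stream is the whole difference between the 866 and 1300 Mbps link rates.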


The two port 13" always had 2x2, didn't it?

That said, I wouldn't necessarily expect the 16" MBP to have 3x3 either. Last time I checked, laptop oriented wifi6 3x3 chipsets didn't exist (though something may have shown up in the last few months)


Well, the two-port 13" might have had 2x2, but the Touch Bar ones never did... So one doesn't know what to expect, especially since there is no 4-port M1 MBP, just as there is no non-Touch-Bar M1 MBP.

Though you might have a point with 3x3 wifi6 chipsets; I didn't do any research on whether there are any. Given the way the market went with wifi5 ones - with the majority of vendors just using 2x2 - this is something I should have expected.


The four port ones, and my 2013 pre-touchbar MacBook Pro, both have 3x3 MIMO.

I found some interesting Apple documentation on this: https://support.apple.com/en-hk/guide/deployment-reference-m...


Do theoretical speed limits matter though? Real speeds will be much lower anyway. And if you need consistently high speeds, you're likely to use a cabled connection.


I found out because the Wi-Fi was subjectively slower. 5 GHz is nice because the range is lower, so your neighbors aren't causing much noise anymore, and the real speeds did approach the theoretical limits (for me; in my environment; using Ubiquiti APs, mostly nanoHDs).

Apple used to be one of the vendors that bothered to put better Wi-Fi in their products than competitors; now they've fallen back to the mediocre 2x2 that all the PC vendors use.


More MIMO antennas mean better resilience against interference; the higher theoretical value is a non-quantitative proxy for that.


Outside of theoretical limits, I see 500 Mbps on my M1 Air. It's plenty fast in the real world.


One of the first things I did with my new M1 MBP was install the OS update (~2 GB) - it came with 11.0.1 - and Xcode (~12 GB), followed by a nice big `brew install --cask ...` that included microsoft-office (~1.5 GB).

When transferring tens of gigabytes over the network, one definitely notices when the network speed is less than what it is supposed to be.


Can anyone comment on whether the M1 MacBook Air can handle a 4K external display well enough for daily use? I know it handles my ultrawide 1440p display, but I wanted to confirm before I purchase a 4K display.


I'm on a 4K display connected to an M1 mac mini through Thunderbolt here, works great.


My m1 MacBook Air drives my 4K monitor better than my top of the line 2019 MBP.

With my MBP the fans turn on when it does anything other than look at the 4K desktop.


>With my MBP the fans turn on when it does anything other than look at the 4K desktop.

Some of this conversation confuses me. Since the M1 MacBooks came out, all of a sudden people are talking like every MBP version prior is junk. Top-of-the-line MBP that can't run dual monitors? Where were these conversations about poor MBP performance 6 months ago?

FWIW, the last MBP I had was a 2015 i5, and I used a single 4K monitor and had zero issues in regular usage.


> Some of this conversation confuses me. Since the M1 MacBooks came out, all of a sudden people are talking like every MBP version prior is junk. Top-of-the-line MBP that can't run dual monitors? Where were these conversations about poor MBP performance 6 months ago?

The 15/16" MBP is certainly capable of supporting 4K displays without issue (posting from one now). Most likely some people have something screwed up in their setup and didn't realize it until they moved to another system.

With the new MacBook Air, they won't even know their setup is screwed up because there won't be any fan noise. It'll just slowly degrade over time.


The fan noise isn't the only tell. That macbook air chassis can get very hot. It makes my hands sweat when typing and resting my palms on it.


Usually when I'm coding, my laptop is on my desk so I don't notice the heat.


It's not just Macs: doing any significant work on any thin laptop causes thermal throttling or runs the fans up. For the last five or six years, every work laptop I've had at multiple jobs has sounded like it was trying to take off for a fair amount of the time I was using it. This is especially annoying during Skype and Zoom calls, since noise reduction makes people's voices very loud or soft at random times during meetings.

Offhand, I've been blaming Electron and Docker for Mac/Windows, but maybe there's something more fundamentally wrong?


I guess it's because annoying laptop fans were a given until recently; my 2012 MBA, 2015 MBP, and 2019 MBP all sounded annoying. Like a Retina display, once you experience a quiet office, you don't want to go back.

Also, I think the 16" MacBook Pro in particular was way, way, way overrated because people desperately wanted a story arc where Apple comes to its senses and builds great laptops again.


I tend to agree, had a 2017 MBP prior, but the laptop certainly got hot and things had to throttle a little. Had I not tried an M1, I doubt I would have noticed.


I've been using a single 4K display with mine -- works great.


M1 on a 49" Dell U4919DW at 5120x1440, 60 Hz - no issues.


Works great.


Even the new Thinkpads can't support two external 4k monitors through the USB-C docking station. You can plug in one monitor via HDMI and another one via the docking station though, which is not optimal but still ok. Anyway, as others have pointed out, using a single large external monitor is usually better than using several, and most of my colleagues already switched to 42 inch curved widescreen monitors recently. I'm still using two 3k/4k external monitors as they're still in good working condition, next time I upgrade I'll get a single large monitor as well though.


That's due to USB 3 not having enough bandwidth for 2x 4k monitors at 60Hz. Get a Thunderbolt 3 Dock instead if your laptop supports it, and you can have 2x 4k monitors at 60Hz.
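
To put rough numbers on that (a back-of-the-envelope Python sketch that ignores blanking intervals and any compression, so real link requirements run a bit higher):

    # Uncompressed bandwidth of a 4K 60 Hz stream at 8 bits per channel
    width, height, hz, bits_per_pixel = 3840, 2160, 60, 24
    gbps = width * height * hz * bits_per_pixel / 1e9
    print(f"one 4K60 stream:  ~{gbps:.1f} Gbit/s")      # ~11.9
    print(f"two 4K60 streams: ~{2 * gbps:.1f} Gbit/s")  # ~23.9
    # USB 3.2 Gen 2 carries 10 Gbit/s; Thunderbolt 3 carries 40 Gbit/s,
    # which is why the TB3 dock can handle both displays.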


This has been a big one for me in regards to pulling the trigger on an M1 (and I need something new for work very soon).

I found a whole bunch of YouTube videos of people using DisplayLink adapters such as various docks from Targus or smaller single adapters from StarTech (brands likely don't matter, but this happens to be what people are testing in these videos). They all show good results. Comments all say "DisplayLink is faking it and it's gross" but the people making these videos say "Sure, but it's not gross. It's working almost perfectly."

Anyone here have any experience or thoughts on this?

I'm not gaming or needing to do high level graphics or video editing on a 2nd or 3rd display. Just basic dev stuff and window management.


I bought the StarTech dual 4K DisplayLink adapter after watching one of those videos. I use it to power a second 4K monitor with my M1 MacBook Air running lid-closed. I have a 1080p gaming monitor as my primary, connected to an Elgato dock via DisplayPort.

DisplayLink is definitely suboptimal, but it's the only option. There is noticeable input lag (my guess is around 100ms) that makes mouse and keyboard input feel sluggish. Showing a moving image on the screen pegs the DisplayLink Manager process at 100% of one core while it's playing. DisplayLink does not currently support rotation on Big Sur, though it probably will in the future. However, the image itself is pixel-perfect, so it's fine for secondary apps like chat or monitoring.


Thanks for the answer!

> There is noticeable input lag (my guess is around 100ms) that makes mouse and keyboard input feel sluggish.

That's interesting to hear. In some of these videos (and additional Reddit/YouTube comments on them) it seemed like a lot of people would say "yeah but it will have lag" and the person making the video would claim that it doesn't seem to while dragging windows all over the place in their videos. The rotation seems to be a problem across the board though.

These things are difficult to dissect. It could be that there is lag for these other guys, but they aren't the type to notice. Or maybe they didn't get lag because of some detail in their set-up that differed from yours.

Oh well. I don't even have two monitors right now and probably don't really have enough room for a second one. I had hoped to get one this year though.

Maybe I'll get the M1 and just keep working with the one monitor. If I decide it's time for an upgrade maybe I can go the route discussed elsewhere in this thread and get an ultrawide, or a single 40+ inch 4k.

EDIT: Video example of what I'm talking about:

No lag with a displaylink second display on the 2020 M1 MacBook Pro!

https://www.youtube.com/watch?v=k1J3lSZfrko

He's not typing anything though. But for this guy it generally _looks_ like it's close to behaving like a normal second monitor.

But then here's this thread complaining that a particular dock, with 4k displays, is super laggy: https://support.displaylink.com/forums/287786-displaylink-fe...


I have the new MacBook Air and am able to run two 1080p monitors just fine. I use the same adapter for gaming with my HP omen as well. It looks great. (StarTech)


Might work with one wired and one wireless. Otherwise, maybe with an external graphics card if supported, failing that a VNC or other network display protocol server streaming to an external rendering unit across ethernet or wifi. But honestly, I think we reached peak multiscreen ~2010 and it's sort of falling out of favour these days except in a minority of fixed workstation environments. In which case, I'd be waiting for the 16" Pro later this year.


> I think we reached peak multiscreen ~2010 and it's sort of falling out of favour these days

When we upgraded monitors at work, I took several of the old ones to see how many monitors I could connect. I got up to 5, I believe (with one being a new ultrawide). It was impossible to use them all productively. At best they could serve as a live status display for some Jenkins job. Two monitors are useful, but now you can just use one ultrawide (I like to use two cables since that makes window management easier, but you can use software for that too). I like to have one extra in addition to an ultrawide, but it's often unused.


I use triple 4K monitors all the time and it's a very productive setup. Left is video conferencing, far enough away that I can use it without glasses if I want to; front is development work, browser, email; right is the live view of the application that I'm working on. And at $500 each, the price really isn't an obstacle if you are using them professionally.


> It was impossible to use them all

You could always take up trading:

https://library.tradingtechnologies.com/trade/ttd-managing-a...

> ... productively.

Oh well.


I use Spectacle for window management on a 49" ultrawide and it works great for dividing the screen into 3 segments. Having one window centered also helps avoid too much horizontal movement of my neck (which I've been recommended to avoid by a physiotherapist).


Can you go into more detail as to how using two cables makes window management easier on a single ultrawide?


Most environments snap windows to borders of logical screens.

If the OS believes the ultrawide is actually two 4:3 monitors, you'll get one such snapping range right in the middle.

If you connect the monitor with one cable, the OS believes it's one single 21:9, but if you connect them with two cables, the OS believes it's two smaller monitors.


Is this true for just MacOS or do Linux (GNOME specifically) and Windows also work the same way?

I don't know anyone with an ultrawide so I can't try this myself.


All of them work like this, but some Linux environments, and PowerToys FancyZones on Windows, support splitting the screen in different, custom ways.


Windows also works this way, IDK about linux. FWIW Samsung's ultra wide monitors are supposed to support splitting the screen up into multiple areas for the purpose of window management, even when using a single cable.


I guess I work in a very different industry to most here but 3+ screens is still very much the norm for those of us working in financial markets. However Mac desktops/laptops are very uncommon.


> maybe with an external graphics card if supported,

That's a big if. macOS only supports AMD cards, and AMD hasn't provided a Radeon driver for the M1 yet. It's not clear what is going on; we know the amdgpu driver works on arm64 just fine https://news.ycombinator.com/item?id=20404547 and that contains the hairiest bits, namely starting the bastard, so I can't see what the holdup can be.


Over the last 10 years, I've migrated from three screens down to two screens and am currently down to one and a half screens. (I currently run my code editor and a browser on a 4k monitor and airplay to my 11" iPad where I keep a terminal app fullscreened.)

I think it's because resolutions have gotten better. 2010 I was running one screen for editor, one screen for browser, and one screen for everything else.


Nailed it about peak multiscreen. I was on three screens when everyone was on one, then on the highest consumer-DPI monitor in 2009 before Retina or Windows OS support, and now I'm on a single monitor with a laptop. In 10 more years that will be the format: laptop below, 8K 40" atop.


Or just a cellphone that plugs into a monitor.


I would think that would already exist if it was actually compelling.

But having to plug your computing device into a monitor and keyboard to do any work is just a downgrade from a laptop that always has a monitor and keyboard. It's a bizarre portability trade-off that's even less exciting as laptops become thinner, lighter, and more powerful.


Are you saying it doesn't exist, or just that it's not compelling?

I'm typing this comment on a phone connected to a standard monitor, running a desktop-like experience [0] and it's surprisingly usable!

Hardware - Samsung S8 (2017), generic USB-C to HDMI adapter, bluetooth keyboard and mouse.

[0]https://www.samsung.com/au/apps/samsung-dex/


I do this with my Samsung Note 10. Also, when I plug my Samsung tablet into my USB-C monitor I get a nice desktop. Lastly, Purism's phone supports phone-as-desktop as well.


PinePhone as well - the postmarketOS edition even shipped with an HDMI/ethernet/USB dock for this. :)


I use a 7-monitor setup — four landscape in a 2x2, a portrait on either side, and a very small one at the end (just for monitoring my front door camera). It works great — I can dedicate an entire portrait monitor to Slack, for example, and I can just make a tiny head movement or chair swivel much faster than surfacing windows or switching virtual desktops. I tried a single large flat monitor, but when the center was the right distance the corners were too far away to read comfortably, and this way they are all angled directly to me.


That link seems to be broken, at least for me. Hangs establishing secure connection in Chrome and times out.


The link actually works (for me), but calling it "slow" would be the understatement of the year, so far.

It loaded in just over three minutes (yes minutes) for me, with these timings according to Firefox dev tools:

    Request Timing
    Blocked:        31.73 s
    DNS Resolution: 1 ms
    Connecting:     155 ms
    TLS Setup:      31.57 s
    Sending:        0 ms
    Waiting:        2.43 min
    Receiving:      107 ms
This is on 100/100 fiber.



Same in Safari and Firefox, so probably nothing to do with Chrome.


I have used DisplayLink [0] to connect an M1 to two monitors via DP and it worked well. It's not fully plug and play; I had to install the DisplayLink software. However, I noticed that if I connect the other port to a charger it would (sometimes) add some latency to running apps (for example, Safari might not respond immediately, or VS Code would take a few seconds to load).

[0] WAVLINK USB C Laptop Universal Docking Station 5K (up to 5120x2880@60Hz) / Dual 4K Ultra HD video, 2x HDMI and DisplayPort, 6xUSB3.0, Gigabit Ethernet, Audio, Mic, Supports Windows, XP, Mac https://smile.amazon.co.uk/dp/B082SR6GMB/


DisplayLink (despite what its name might suggest) has nothing to do with DisplayPort. DisplayLink runs a compressed video stream over host USB to an external box (which decodes that compressed stream and pipes it to the dual DP ports on the box).


I read bad things about getting Thunderbolt working properly with Linux, so I bought a DisplayLink port extender (Lenovo) and it was an awful experience. Janky, didn't support the later kernel versions, didn't have an RPM or Fedora repo, crashed a few times a week. Awful.

Since then I've also bought a TB one and it works flawlessly. The DL one sits on my desk at work as a glorified charger now.


How does this square with OWC's dock: https://appleinsider.com/articles/21/01/07/owc-debuts-refres...

From the article:

" It connects to a computer via a single Thunderbolt port, and can drive up to two dual 4K displays or a single 5K, 6K, or 8K display. " "


I'm more impressed with TSMC's 5nm process than with the M1 itself. It will be interesting when more processors manufactured on the same process are out.


Since I get asked this about my M1 a lot, you can however:

- Use a quality DisplayLink adapter that will give you an additional display over the USB bus. It's unsuitable for gaming or video, but it's perfectly suitable for coding, web browsing, and office work.

- Use the internal display while using a single external monitor. People often seem not to realize this is possible.


I eagerly await the day all these display standards problems go away. Then I can throw away the large box of miscellaneous USB-C, DisplayPort, HDMI and adapter dongles I have, and might grow some hair back. Just yesterday I was arguing with a USB-C to HDMI adapter which didn't support 60Hz on one monitor but was fine on another. Grr.


What makes you think it will go away? If anything I think it will get worse as there will be even more USB connector types introduced, and screens with even higher resolutions and refresh rates that todays cables don't have the bandwidth to support.


I often dream of a centralized service where everyone can send in all their backup cables and then pay a subscription to get any of someone else's cables in exchange if they need one. Things that aren't used for >5 years get responsibly recycled.

I'm too scared about the warehousing and the complexity to do it myself, but it's probably something I'd pay $5 a month for, just for the peace of mind and the freed-up space.


> I eagerly await the day all these display standards problems go away.

Based on the history of display standards, you will be waiting a while.

Displayport was meant to be the magical works-for-everything standard... then the various versions of Displayport started showing up.


Are you talking about versions of DisplayPort, or something else?

Because people want to use 4k@120Hz+ or 5k+ displays, which require more bandwidth.


Oh, I'm just talking about how it was seen at the time it was introduced. It was never going to be a permanent fix, but it was kind of seen that way, after a lot of weirdness with DVI extensions and proprietary stuff and pushing VGA way beyond its practical limits.

This lasted right up until multiple DP versions started showing up, and then it became transportable over Thunderbolt and then USB-C, but with the version of DP dependent on the chipsets on both sides, and... suddenly everything is terrible again.

USB-4/TB3 _may_ become the next "it fixes everything!" standard... until people want 8k screens.
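
And the 8K worry checks out on the back of an envelope (same caveats as the uncompressed-bandwidth math elsewhere in this thread: 8 bits per channel, no blanking intervals):

    # Uncompressed 8K 60 Hz exceeds even Thunderbolt 3's 40 Gbit/s
    width, height, hz, bits_per_pixel = 7680, 4320, 60, 24
    print(f"~{width * height * hz * bits_per_pixel / 1e9:.1f} Gbit/s")  # ~47.8
    # hence Display Stream Compression (DSC) for 8K over today's links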


And you can bet by the time that day comes we'll have a new USB-D / USB 6 connector to get new adapters for and Apple will bravely drop USB-C from their laptops in favor of this new standard.


This seems laptop-specific, i.e. having the MacBook + 2x external extended displays via Thunderbolt. Surely the M1 Mac mini works with 2x extended monitors though? Has anyone tested this?


The Mac mini M1 can drive two independent screens - one over HDMI, one over USB C. The limitation seems to be number of screen outputs, and a laptop has one already built in, hence only 1 external.


That is what I'm planning to do. Mac Mini with two 4K screens running at 60Hz. Was just making sure that it would work.


I've been running this setup. Mac mini M1 connecting to LG 4K monitor over HDMI at 60 Hz, with DisplayPort to a 4K Wacom Cintiq also at 60 Hz. It works flawlessly.


Out of interest's sake, what kind of resolution are you running? I found 3008x1692 to be perfect for my 4K screen, but it actually doesn't show up in Display preferences unless you option-click the "Scaled" radio button.


I'm using the standard scaled 1920x1080 resolution on both of my screens. I don't see the 3008x1692 option on my 27-inch LG 4K, but I do see it on the 16-inch Wacom Cintiq. It's a little bit too small on my 16-inch, though.


I see. On my screen, 1920x1080 would be far too large, but then the full blown 3840x2160 resolution would be absolutely tiny. I tried to approximate something that is similar to my MacBook Pro's retina screen. I guess it also comes down to what one is comfortable with. The beauty of the 4K screens is that no matter the resolution, the text is always crisp.

Example (screenshot of half my screen):

https://imgur.com/a/qu0Yzc7

I sit about 45 cm away. Quite comfortable. Loving the Dell 4K.
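
For what it's worth, my understanding of why the text stays crisp at any scaled setting: macOS renders "looks like" modes into a 2x backing store and then downsamples to the panel's native pixels. A rough sketch (the 3008x1692 figure is just my setting from above):

    # macOS HiDPI: render at 2x the "looks like" size, then downsample
    panel = (3840, 2160)         # native 4K panel
    looks_like = (3008, 1692)    # scaled mode chosen in Displays prefs
    backing = (looks_like[0] * 2, looks_like[1] * 2)   # 6016x3384
    scale = panel[0] / backing[0]
    print(f"backing store {backing}, downscaled by ~{scale:.2f} to native")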


Ah. I sit about 60~80cm away most of the time from my bigger 4K screen (due to the space needed for my Cintiq when I'm drawing). I tried several resolutions after your previous comment and found 2048x1152 to have the best balance between real estate and readability at my viewing distance. I should give this a try for a few days. Thank you for suggesting to have a look at the resolution!


I have been using an M1 setup with three screens for the last month. For the first two monitors, I use one USB-C port and one HDMI. For the third monitor, I use a StarTech USB-A to HDMI adapter using DisplayLink. It works.


I think they'll go wireless and offer a fancy wireless adapter, using the same tech they use for Sidecar – with it you can hook up iPads as external displays.


This and the smaller screen sizes are the reasons we won’t be rolling out any M1 machines at Flatfile. Excited for them otherwise though!


Why do M1 Macs (which are supposed to be new) even have Thunderbolt 3 ports? I would expect Thunderbolt 4 these days.


They’re the same port. TB4 is TB3 + USB4 + two display support (and maybe some other things?). The M1 isn’t compliant because of the display limit, but it does have USB4.


I see. Thank you. I thought Thunderbolt 3 was Mini DisplayPort.

By the way, can I convert the Mini DisplayPort Thunderbolt port on an old MacBook Air into a USB 3.2 Gen 2 host port to connect an M.2 NVMe enclosure?


meanwhile: Dell Precision 7740 with 5 external monitors - 6 screens https://www.youtube.com/watch?v=qrVjjQpzivQ



