For example, with my 4k monitor running at 3008x1692 with HiDPI ("retina") enabled, I get this:
DELL P2715Q:
Resolution: 6016 x 3384
UI Looks like: 3008 x 1692 @ 60 Hz
Even with those settings my 16" MBP's idle GPU power usage is only ~8-9W. It makes no sense that enabling HiDPI on a 1440p monitor results in better performance, so this definitely sounds like a bug.
I recommend displayplacer [0] for changing and automating macOS display settings. It lets you set all sorts of configs not available in the GUI. It should be easy to have displayplacer enable HiDPI on a 1440p monitor. And it doesn't make persistent system changes or require disabling SIP (I think?) like the script referenced in the article does.
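To make that concrete, here's a hedged sketch of building such an invocation (the display ID and mode values below are placeholders I made up, not anything from this thread; `displayplacer list` prints the real ones for your setup):

```python
# Hypothetical sketch: compose a displayplacer command that forces a
# HiDPI ("scaling:on") mode on an external display. The display id and
# resolution are placeholders -- get real values from `displayplacer list`.
import shlex

def hidpi_command(display_id, width, height, hz=60):
    """Return the displayplacer argv for a scaled (HiDPI) mode."""
    config = (f"id:{display_id} res:{width}x{height} hz:{hz} "
              f"color_depth:8 scaling:on origin:(0,0) degree:0")
    return ["displayplacer", config]

cmd = hidpi_command("PLACEHOLDER-ID", 2560, 1440)
print(shlex.join(cmd))
# On a Mac with displayplacer installed you would actually run it, e.g.
# subprocess.run(cmd, check=True)
```

The whole config goes in one quoted argument, which is why building it as a single string matters.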
It doesn't make sense, but Big Sur is all kinds of broken with external displays.
Display Stream Compression, which worked perfectly in Catalina, is completely nonfunctional in Big Sur. I don't mean that dramatically: it has not worked since public beta 1, and still doesn't.
Apple just doesn't care if you don't have a Pro Display XDR.
Makes sense! Screenshots are in the "real" resolution and not the "Looks Like" resolution. So that helps confirm that doubling the resolution/enabling HiDPI fixed the performance issue. Such a weird bug. Thanks for the write-up.
Hey Jake, does displayplacer have an option to control brightness? I'm running a Hackintosh and ran into displayplacer as the macOS equivalent of xrandr, but I can't seem to find any command to control brightness.
It still works great for me! And the only issue I’ve run into can be entirely avoided just by using mode numbers directly instead of specifying each setting.
Just for posterity, I ran into the same problem as joshgel. My 1440p monitor's highest "scaling:on" mode is "res:1920:1200 hz:60 color_depth:8 scaling:on". I tried that, but iStat Menus still reports the discrete graphics card using 19W, which is as high as before.
Given that the linked script just seems to create some sort of XML file in the right directory, maybe this is doable directly? I made an issue in the displayplacer Github to this effect: https://github.com/jakehilborn/displayplacer/issues/69
For those who come across this thread later, I found a reasonable solution that involves fewer hacks (at least to me). It still doesn't lower power usage below 18W with both displays open, but it's a cleaner version of the above solution. Here: https://github.com/jakehilborn/displayplacer/issues/69#issue...
what if `displayplacer list` doesn't show an option for my current resolution with "scaling:on"? does that mean my 16" MBP w 5500M doesn't support this resolution?
These machines are a complete joke for anyone doing “Pro” work. Endless thermal throttling driving CPU frequencies down to 2.4Ghz, low GPU performance, it’s utter bullshit.
I’ve spent $5k on maxed out machines for my engineers, and it’s been a complete mess. Half of them have had to switch to a Windows machine as their primary so they can work without staring at a compiler spinning its wheels.
Maybe an Mx Pro will live up to the legacy of the 2012 era, but Apple’s obsession with thinness and mediocre cooling doesn’t give me much hope for pro users.
Absolutely this. I have a 16" MBP and it constantly throttles, even when I manually set the fans to full blast and use a cooling pad. There's no way to sustain the 45-watt TDP for more than a few seconds.
Anecdotally, my outdated i5 quad core is much snappier than the mbp i7 8 core because it can go all out at 69 watts/max boost for hours and not throttle for a second.
Disable turbo boost. Throttling is not directly linked to any temperature sensor, and turbo boost almost instantly causes throttling, which then takes some time to return to normal because of a back-off algorithm.
I’ve used the software that lets you modify all the CPU flags and nothing helps. Killing turbo boost sucks, having a frequency in the 3.0Ghz+ instead of 4.4Ghz+ is brutal when you are single thread bound.
This is exactly it, the thermal ceiling leaves almost no room to use the full 45 watt TDP. The fact that this problem isn’t a huge PR nightmare shows just how “pro” the typical buyer’s workload is.
WSL2 is nearly Linux. File IO is a bit slow compared to native Linux and if you want to call a Windows application running on localhost from within WSL2 it's a bit tricky but otherwise it works very well.
I spent years developing in WSL and while it’s decent, it’s still not foolproof. Recently I made the switch to pure Linux workstations, and I have the opposite experience to offer:
I’d rather configure WINE to run a Windows app than configure and fiddle with WSL. It’s a great tactic, but Linux DEs are so polished these days that I think WSL is a poor choice.
WSL2 is quite a bit better than WSL1 imo, speaking as someone who has used both a fair bit. Especially now that GPU, GUI, and audio support are available on Insiders. Some tasks still don't work very well, like connecting to external devices or anything that needs systemd (like installing snaps), but by and large I've found it harder and harder to justify dual booting Linux with WSL2 available.
I thought dual booting would be a pain, but migrating actually turned out to be mostly pain-free these days, and I haven’t booted the Windows install for any reason.
As soon as Visual Studio can run on Linux I will gladly switch, but currently my backend tooling is Windows-only and WSL2 is a godsend for connecting .NET with cloud tooling.
WSL1 was not well designed. If you've only worked with WSL1, you'd be surprised how well WSL has matured.
Throttling occurs when these values go over 100. Throttling is not directly linked to temperature. There is some kind of back-off algorithm, which means that even if you trip this thermal throttling threshold, it will take some time before throttling is disabled, even after your CPU returns to a normal temperature.
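As a toy illustration only (the threshold and hold-off length here are assumptions about the shape of the behaviour, not Apple's actual algorithm):

```python
# Toy model of the back-off throttling described above: crossing the limit
# throttles immediately, but un-throttling requires the reading to stay
# at/below the limit for a full hold-off period.
def simulate(levels, limit=100, holdoff=3):
    throttled, calm, states = False, 0, []
    for level in levels:
        if level > limit:
            throttled, calm = True, 0       # trip immediately, reset hold-off
        elif throttled:
            calm += 1                        # count consecutive calm readings
            if calm >= holdoff:
                throttled = False            # only now does throttling lift
        states.append(throttled)
    return states

# One spike above 100 keeps the machine throttled for several readings after
# it has already cooled down.
print(simulate([95, 105, 99, 99, 99, 99]))
```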
After upgrading my MacBook Pro 13" (2017) to Big Sur in Dec 2020 I had a similar experience. I use a 4k monitor and everything got super slow, fans started spinning, processes got throttled, and I could barely get any work done. I later did a fresh install of Catalina and everything was fast again.
I never did proper benchmarking, but my feeling was that resolution played a role. When using 1080p things were fast, but the resolution is unusable on a 27" 4k display. When using 2160p things were fast, but too small for my eyes. Any resolution in between (this implies things are being scaled?) was sluggish.
Note that this is a 13" model, so the problem does not seem to be restricted to the 16" model that the author is talking about.
> this fix only works when the MacBook lid is closed
Not really solved then. Everyone I know that has a 16” has the same issue.
Since the top of the line model is the one most chosen by professionals, very few will not connect to an external monitor. Apple should be issuing refunds left and right after this fiasco.
I got a 16" MBP last year in order to WFH. I could not get it to work with an old 1920x1200 Dell monitor using DVI adapters. Months later I bought a 1440p monitor that works over HDMI and DisplayPort, but the display is fuzzy. I found some tweaks to improve it, but it's still noticeably worse than a hackintosh running High Sierra.
I have felt like the performance isn't great, but I chalked it up to Docker and the Linux VM it uses. However, I also noticed that the computer was warm to the touch with fans running even when I hadn't used it for hours. I've taken to manually forcing it to sleep at the end of the day.
I'm excited to try this HiDPI hack, because it may solve the power usage problem and the fuzzy display. It would be a real shame to lose the built-in display if this only works in clamshell mode, but it just might be worth it.
Edit: I just realized that I can't use Touch ID in clamshell mode. That makes this a harder call.
If you try it out, I would be interested in knowing if it helped with the fuzziness. I experienced the same on a 38" 3840x1600 display, which only looks bad in macOS. I no longer own a MacBook so I can't do any testing on my own.
Yeah, I can reproduce. Installed iStat Menus, connected, and it's 18W on the Radeon GPU as expected. I was kind of resigned to this always being true, but maybe not! I am somewhat optimistic that Apple or AMD will fix this now.
I was having the exact same problem and did a ton of research on the issue. Ultimately I decided to buy an eGPU, which solved my problem.
It seems that the Radeon draws a lot of power even with an idle screen and low load. This raises the temperature, which in turn creates an environment where the CPU throttles, so not only does the MacBook get hot, it also gets much slower.
After that research I found I could either work in clamshell mode (though I’d grown fond of having the MBP screen + external monitor) or find a way to mitigate it.
I opted for an eGPU, which solved the issue immediately. macOS feels snappier, I don’t remember the last time the fans spun up, and it’s cool to the touch. Not exactly a cheap solution on top of an i9 MBP, but, eh, you do what you have to do.
An eGPU brings its own share of problems on macOS (ejecting it is not a simple thing - I always have to resort to doing it manually from the console - and pulling the cable will kernel panic), but the upside is that I get a quite nice connector hub and photo editing software runs much more smoothly.
I do think it’s a shame that Apple designed the MBP like this. You buy a beefed-up MBP (the 5600M wasn’t even available when I bought mine - and supposedly it fixes this issue) and it can’t handle a task as complex as driving a second monitor without launching the rocket engines.
I have two eGPUs; both of which are collecting dust.
One is a BlackMagic eGPU Pro (since discontinued).
The main reason that I disconnected them, is that I plan to get an M1X/M2 laptop, once they come out, and I know that the new chips don't work with eGPUs, so I might as well get used to it (Makes for simpler under-desk wiring, too).
Also, the eGPUs seemed to introduce random crashes, and I don't like running on discontinued kit.
I installed Fedora Linux recently and have used it for a few days. I did not spot a single bug, despite using it in quite a heavy scenario (Docker, Android Emulator, IDE, big file transfers). Windows, for example, had issues with sound drivers and GPU drivers; Linux is rock stable.
Also it was as easy as a macOS install: I installed it in 5 minutes and it was ready to work. With Windows I had to spend hours installing all the drivers from the manufacturer's website (I tried letting Windows install its own drivers, but it did not find all the necessary ones).
I have a few issues with some Linux UI choices - for example I don't like the GNOME 3 UI. But that's not a bug.
I used Docker; that's what I used before. I'm aware of podman but did not try to use it. Thanks for the suggestion, though - I'll look into it.
PS: I just tried to use it. I got some permission-denied errors from containers that were not present with Docker. I guess podman is not ready as a complete replacement for Docker for me.
It wasn't a suggestion, because podman doesn't have all the features that many people use from the Docker ecosystem, like docker-compose. (Right now, on 22 May 2021.)
There's podman-compose which seems to almost work for me, but not completely, yep.
If I were writing containers myself from scratch, I'd definitely try to ensure that podman works. But I'm using containers from other people, so I have to live with that for now.
I installed docker from the Docker repository; it's definitely not a symlink to podman on my system right now. I did not check whether that symlink existed before the docker installation, but I doubt it.
I have solved the last "bug" by using Xubuntu, which has an astonishingly stable, coherent and well-integrated Xfce desktop experience. Is there something similar for Fedora, perhaps?
Xubuntu really is one of the better-polished distros. When I had a spare laptop, I tried some distro-hopping on it. Little quirks existed in most distros, but Xubuntu just felt the most (for lack of a better word) sane overall. Everything I'd change in Xfce on another distro was essentially what Xubuntu had configured by default.
The only thing I've had to do after installation, aside from installing my apps from repos, is to add the Mainline PPA and update the kernel to newest just to take advantage of newer kernel drivers for some of my hardware. Other than that, everything is there and everything works.
This is one of the reasons why I use openSUSE: it has automatic btrfs snapshots that let you roll back in case anything breaks. Tumbleweed is already one of the most stable rolling-release distros, and I haven't had anything break yet, but it's nice to have regular snapshots as an extra layer of safety.
Fedora uses btrfs by default. I don't think that snapshots are enabled by default, but quick googling reveals some instructions which should enable snapshots for updates. I guess that might help to achieve similar behaviour.
Yeah, just put the interns on it next season. They'll just do the boring work the real programmers don't want to do. <eyeroll>
Whenever a solid Apple bash comes up on HN, people just throw statements around to the effect of "Apple -- as an entire software and hardware conglomerate -- just doesn't 'care' any more," and that nothing in the entire freaking stack should ever exhibit a bug. It's almost like NO ONE around here actually works in SOFTWARE or HARDWARE. I guess no one else here has ever shipped an uncaught bug. I sure have.
From my experience, the number of the kinds of issues being discussed here is EXCEEDINGLY low for the number of moving parts involved. I know, I know. Someone's going to say that since they do the whole stack themselves, they should NEVER ship a bug. Well, good luck with that. I'll believe it's humanly possible when I see it.
And don't even start with how Linux or Windows is any better. I've been at this for 30 years. I know better.
DSC, hardly an uncommon technology, worked perfectly in Catalina. It was entirely non-functional in public beta 1 of Big Sur.
Here we are at 11.3.1 and it is still absolutely, utterly broken. Legions of users with varying monitors and varying GPUs can't access the full feature set of the equipment they used to be able to use. There are Radar reports dating back to the betas.
Instead we get things like making sure you can buy AppleCare+ from the About This Mac option.
Or buy a Pro Display XDR, because it seems that that is all Apple cares about for external monitors.
In my opinion it's more buggy. The biggest difference is that many of those bugs are an active choice on Apple's part but I still see them as bugs. Especially because if people shout loud enough Apple changes the "feature" and says "this was a bug" but only if Twitter blows up enough. Otherwise it is a feature.
Hmm. I've not had any success with Teams on Linux. Oddly, the behavior is also quite client-dependent:
- Chromium: infinite login loop because why not
- Flatpak: login works, but says my account isn't eligible for Teams (which blocks joining as even a guest?)
- Firefox: doesn't even ask for a login and I can join, but no audio or video (either direction); chat works though
Note that I only join existing meetings, never create any, so why I can get to the call without logging in on Firefox, but am blocked from even joining elsewhere is a real mystery to me.
Also using Teams on Linux, but this one happens to all clients.
There's an authentication bug that causes a login loop for anyone who signed up for a personal account using an email address that they also used as part of another MSN/live.com service (e.g. as a secondary address), especially if your original Teams/O365 account was converted to a domain/business account.
If you log into live.com with your Teams email, add a different fallback email and remove your current one, that may help.
Teams is a case study in subtle and not-so-subtle bugs and UX issues:
Just yesterday we lost time in a team meeting because someone's Teams instance was malfunctioning.
Sometimes restarting the application completely works (be aware it idles in the systray); other times restarting the PC or disabling (or enabling) hardware acceleration will work. Other times it seems to help to move it to the built-in monitor.
And sometimes, like yesterday, nothing including
- all the above,
- disabling incoming video
- and reinstalling the application
works.
As for UX issues there's clicking on "join meeting" and continue working for a few minutes until you realize "ah, the first join meeting doesn't join the meeting, only opens the join meeting dialog".
Another winner: when you click on the 7 more people icon to see who else are online and it does nothing.
Or the fact that joining a started meeting is a multi step process (find calendar, find today, find the meeting click join and join).
I'm also pretty sure that yesterday I had an instance of sound getting through before I joined.
Still, some things work flawlessly for me: jumping seamlessly from one device to another during a meeting is fantastic.
Sound is OK for me. Background switching is ok etc etc.
I'm not sure whether that's a Linux specific issue. I've used teams on an older MBP and also on newer PCs (8th gen i5) with Windows, and the lag is always there for some reason. I've found ticking "disable animations" helps a little.
I have a very beefy laptop at work but with Teams I'm feeling like I'm using my old 133 MHz PC again. Everything takes so long to open that I'm growing increasingly mad when using it. Plus the statuses of my contacts are consistently wrong.
Well, Teams has problems using my external USB mic on Linux Mint 19.3, when e.g. Zoom has no problems with it. Otherwise no significant bugs using Mint for last 5 years or so both on desktop and laptop.
I recently had to go through the hell of setting up a Java development environment, plus Android emulation, on my new M1 MacBook .. mid-way through, I was cursing myself that I wasn't just using a Linux machine .. but okay, eventually, SDKMAN came to the rescue. For a while there it really felt pretty regressive ..
Your "average" Linux distro isn't "buggy" as you claim.
You sound like someone who either hasn't used Linux, or extrapolates their experiences from one specific distro, to all distros.
I say this as someone who uses Linux daily. Specifically, Arch Linux, with testing repos enabled. The only bugs I run into are from using Git master versions of software that I'm interested in testing.
I've used Linux since it was first available in the mid 90s.
I use it at work and on machines at home. It is almost certainly more reliable than Windows and MacOS as a server. However, the context here is desktop UI and laptops.
Linux does exceptionally well considering there is almost no money at all going into this use case. But there are many bugs, sometimes due to hardware (e.g ACPI inconsistencies, even Thinkpads have issues) and because of software churn (few fix bugs in their spare time, it's more fun to create something new).
If you wish to pretend that everything is perfect and there are no problems, then you aren't helping Linux either.
> I've used Linux since it was first available in the mid 90s. I use it at work and on machines at home. It is almost certainly more reliable than Windows and MacOS as a server. However, the context here is desktop UI and laptops.
Desktop UI is where Linux tends to shine. Laptops... oh, those are a royal mess, usually.
> Linux does exceptionally well considering there is almost no money at all going into this use case.
No money at all? There are countless developers being paid to work on the Linux kernel. And on other pieces of software in userspace ~ usually by Red Hat.
> But there are many bugs, sometimes due to hardware (e.g ACPI inconsistencies, even Thinkpads have issues)
Hardware-related bugs are always an unpleasant can of fun...
> and because of software churn (few fix bugs in their spare time, it's more fun to create something new).
Bugs caused by software churn are also a thing. I personally haven't noticed anything major from many packages in the main repos.
> If you wish to pretend that everything is perfect and there are no problems, then you aren't helping Linux either.
Oops, I could have worded things slightly better, as in "not as "buggy"' instead of 'not "buggy"'. I sometimes miss words, only to realize later.
> Linux does exceptionally well considering there is almost no money at all going into this use case. But there are many bugs, sometimes due to hardware (e.g ACPI inconsistencies, even Thinkpads have issues) and because of software churn (few fix bugs in their spare time, it's more fun to create something new).
This is me figuring out my Optimus without Mux.
My screen freezes when my laptop (an ASUS N76VB) resumes from sleep. (I don't know which sleep state, S1-S4.)
So I have "fixed" the issue. But was this a problem with the HD Audio device driver in the NVIDIA 740M?
It was difficult to fix and I'm thinking about making a GitHub repository about it.
>You sound like someone who either hasn't used Linux, or extrapolates their experiences from one specific distro, to all distros.
Ironically you've just described the biggest problem with Linux, and the biggest non-solution often proposed "Problem with distro Y? Just use distro X".
But I'm not saying this to blame Linux for the brokenness. They're doing amazingly well for the huge number of combinations of hardware and software they're supporting.
Apple, on the other hand, has decided to seriously limit the amount of hardware they support, which makes supporting it a much easier effort.
macOS hasn't "just worked" for developers in a long time.
They missed the entire container revolution with Docker. I still come across devs using Macs who are afraid of Docker because it's too confusing and black-box. (It's Linux in there, right?)
At work we have a rather overengineered method of proxying to our production services for security reasons. Mac users are currently constantly having to deal with abstraction layers on top of abstraction layers to make things barely reliable. I just use a systemd unit file and haven't looked at it in years.
Homebrew, while great for more obscure things, should really only be a fallback, not the default. It's basically a confusing, black-box version of the AUR.
BSD/Apple ways of doing things are just annoying. They're fine for the average user, but for devs who want to do things the same way they do them on their servers, it's just another hoop to jump through.
The sad thing is that moving every dev in the company over to Linux would probably be worthwhile long-term, but I really don't think they have the willpower to relearn things, even if those things are that much better.
I evaluated it less than a year ago, and it froze after about 30-60 seconds. This happened about five years ago as well, when I evaluated it last time. It turns out to be the global search that somehow just... freezes the whole program while indexing.
To be honest, outlook for mac isn’t particularly great either. It is not much of a step up from outlook web access, which is what I ended up using on linux after trying all the mail clients. Evolution with the ews plugin came close, but required a ton of configuration and the tasks integration was never quite right.
> To be honest, outlook for mac isn’t particularly great either.
I've never used Outlook for Mac, but there's the Apple Mail.app on the mac which works great with Exchange. I'd choose it over Outlook on Windows every time.
This was (and is) such a big issue that I sold my 16” MBP and bought an M1 MacBook Air.
The monitor issue was particularly egregious because it would cause heat and noise even at idle, but the thing gets perilously hot and loud when put under even the slightest load.
I am incredibly surprised that none of the reviews seemed to pick up on it, though I suspect they may have been highly swayed by this being the first MacBook in some years with a usable keyboard.
I would not be surprised if we start to see a bunch of them fail en masse in a few years time.
Oh wow I’m pretty sure I’ve been debugging this issue all week and was on the verge of reformatting my Mac back to Catalina (as I don’t recall seeing this there). What great timing!
I noticed the issue as my laptop feeling laggy overall, but especially that graphics on the laptop screen (eg the minimise animation) ran at like 5fps when my external monitor was connected - they were fine without, and the external monitor was ok.
A friend said theirs was fine with the same setup but a 4K monitor so I did wonder if it was something to do with scaling but wasn’t aware you could do anything about it. Will test this out tomorrow!
Slightly off topic but related: I had a fun day yesterday tracking down the fact that in Big Sur, timers in tasks that are not in the foreground (i.e. the active window) start firing up to 10 secs late. I thought it must be a symptom of whatever issue was making my computer laggy, but it actually seems to be aggressive power saving... so if anyone knows how to opt out of that I’ll be happy - it really messes up what I was doing yesterday (basically a program that forwards IPC messages between two programs; they start arriving late after 10 secs or so).
When levels are above 100, your laptop will throttle. Once they return to 100, throttling will cease. There is a back-off algorithm at play too, so it's not good enough that your CPU returns to a normal temperature; it must stay at a normal temperature for a period of time before things stop lagging.
In case this helps anyone, I spent this morning debugging my issue further and I am pretty certain it's a hardware fault either with, or affecting, the dGPU in my laptop.
If I force the internal GPU with `sudo pmset -b gpuswitch 0`, animations and graphics benchmarks are much faster than with the dGPU enabled! For example, I tried the Unigine Heaven benchmark and I get 14fps with the internal GPU and 4fps with the discrete (5300M) GPU, which I am pretty sure is not right lol. (If you run this command, `sudo pmset -b gpuswitch 2` sets it back to auto.)
Did a clean install of Catalina to rule out OS issues, and got friends with the same laptop to test out the same website I was seeing extreme slowness on (it was fine for them).
The laptop recently had a repair and pretty much every part was replaced as a result of liquid damage, so I suspect either a faulty part or incorrect installation. Off to the Apple Store I go!
I've always taken issue with the positioning of "batterygate" - IMO, they extracted the maximum the battery had to offer along all points of the battery's lifecycle curve. They are consumables after all, and their performance degrades over time. They should be user-serviceable though.
Since I only occasionally read Medium, I just disabled cookies for the entire Medium domain and no longer get bothered with either the paywall or switching to incognito.
Github.io is a surprisingly good option. Gitlab.io if you like gitlab better.
Substack seems pretty popular and not terrible.
Edit: Apparently a bad idea: Dev.to seems to be fairly decent, and maybe more like medium.
Hosting your own static site via Netlify (or similar services) costs only the price of a domain name.
Hosting your own VPS-based site on GCP is also only the price of a domain name, and if you put the (also free) Cloudflare in front of it, it scales well - but this route does mean a lot more maintenance than any of the options above.
Have a scan through the homepage. I often find the same few people posting broad questions over and over again to generate engagement. On top of that I feel most of the article titles tend to be click bait. I have come across one or two really good articles but they are very much the exception.
The other answer is apparently the reason. It’s a shame, too, because it flags people as self-promoting even if they’re posting interesting articles that have no self promotion at all.
If your bandwidth is metered, it's a good idea for peace of mind. Otherwise you risk someone DoSing you and ending up with a large bill. That applies to both static and dynamic content.
Could you explain how that would work? I thought DoS generally worked by overloading a host with mostly invalid incoming packets.
Are DoS attacks that cause lots of outgoing bandwidth common? And wouldn't launching such a DoS attack be somewhat expensive in the first place?
Also, I don't think any of the hosting companies I use have metered bandwidth. I assume that if I used excessive bandwidth I'd just end up saturating the network, but for a blog with mostly text and a few images that shouldn't be an issue.
As an example: the comment above explicitly mentioned using GCP. GCP's "free" tier includes a whopping 1 GB of free traffic each month; everything over that is billed individually. (Arguably the correct answer is "don't use cloud hosts for such things, their bandwidth is stupidly overpriced for this use case", not "use a CDN to mitigate that".)
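For a rough sense of scale (all the prices and sizes below are placeholder assumptions for illustration, not actual GCP rates):

```python
# Back-of-the-envelope: what a single busy day does to a metered egress bill.
# All numbers are made-up assumptions, not real cloud pricing.
free_gb = 1.0                # assumed free egress allowance per month
price_per_gb = 0.12          # assumed $/GB for billed egress
page_mb = 0.5                # a lightweight text-plus-images blog page
visitors = 50_000            # e.g. one front-page-of-HN day

egress_gb = visitors * page_mb / 1024
billed = max(0.0, egress_gb - free_gb) * price_per_gb
print(round(egress_gb, 1), "GB ->", round(billed, 2), "USD")
```

Even a small text blog blows through a 1 GB allowance in minutes of real traffic, which is the point being made about not using cloud egress pricing for this.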
It's perfectly normal for legitimate traffic to accidentally DDoS you if you have low specs (or a huge, unwieldy site) and get a link somewhere popular like HN.
I run a blog on a small cloud VM. If it made front page of HN all those visitors would DDoS my site "by accident" in no time.
Don't forget, accessing a website the normal way is outbound bandwidth.
Incidentally, I like to use my blog to explain topics that aren't widely understood outside of tech people or domain specific experience.
Maybe I should be a bit more active and explain things like this?
I personally like posthaven, if you don’t mind paying $5 / month and you want it to stay up “forever” :) Otherwise github pages is probably the way to go
Seeing as it makes no sense that rendering at 2x resolution causes 1/3 of the power draw, my bet is that another scaling step is causing it; HiDPI mode is "native", so no scaling.
You can see this in the comparison of the two screenshots: the first one looks like an antialiasing filter was applied to all the pixels and halved the resolution (so a 2x2 block of 4 pixels turns into one averaged "huge" pixel); the second one doesn't.
macOS has two rendering modes: Normal (1x) and HiDPI / Retina (2x). For this reason, most apps have two different sets of assets like icons and other images shown in the UI.
After rendering the content either at 1x or 2x, the picture is scaled to the "real" resolution of the display. This allows Apple to support a large number of "scaled" resolutions, but the apps only have to support two modes (1x or 2x).
You can see that this happens if you have two displays attached to your Mac, one with Retina and one without. When you drag a window from the non-Retina display to the Retina display it will be pixelated at first (it was drawn in 1x mode) until it's completely on the new display; then it will redraw in 2x mode and become sharp.
Normally 1x mode is faster. Try opening an app that shows a lot of data (e.g. a large spreadsheet) and scroll quickly on a 1080p display vs. on a 4K display. You should see that the scrolling is much smoother at 1x than at 2x, since the 2x display needs to draw four times as many pixels.
So this is clearly a bug, and not because of some additional scaling from 2x->1x. As this blog post shows, it's faster when downscaling from 2x.
I'm saying that the bug could be that another scaling step is happening where it's not necessary. In 1x mode it could really be rendering at 1x, scaling up to 2x for whatever reason (this is the bug), and then down again.
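For concreteness, the render-then-scale pipeline being discussed works out roughly like this (the panel and "looks like" sizes are example values, and the 2x factor is the documented-behaviour assumption from the comments above):

```python
# Sketch of macOS's two-pass scaling: render at 1x or exactly 2x the
# "looks like" resolution, then scale that backing store to the panel.
# Example: a 3840x2160 (4K) panel showing a "looks like" 2560x1440 mode.
panel = (3840, 2160)

def backing_store(looks_like, hidpi=True):
    """Size macOS renders at before scaling to the panel's real resolution."""
    factor = 2 if hidpi else 1
    w, h = looks_like
    return (w * factor, h * factor)

render = backing_store((2560, 1440))
pixels_rendered = render[0] * render[1]    # ~14.7M pixels drawn
pixels_displayed = panel[0] * panel[1]     # ~8.3M pixels on the panel
print(render, round(pixels_rendered / pixels_displayed, 2))
```

So in a scaled HiDPI mode the GPU genuinely draws more pixels than the panel has, which is why the observed power drop is so counterintuitive.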
I took fullscreen screenshots to get the statistics, then cropped via another screenshot. The first one was 2560x1440. The second one was 5120x2880. So noticeably sharper.
That's not always the case. It can be, depending on the resolution selected (in this case it's "default for display" so no idea).
HiDPI just means 2x rendering. But it still might be scaled to the resolution you've chosen for your display.
On my MacBook Pro 13", there are four scale choices from "larger text" to "more space". The first option upscales (slightly blurry), the second option displays the 2x directly ("natively"), and the third and fourth options downscale.
They're all still HiDPI though. In contrast, when I plug into my 1080p projector, it goes back to 1x ("LoDPI"?) rendering.
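A quick sketch of that pipeline in numbers (the "looks like" values below are illustrative, not an exact list of any particular model's options): each HiDPI choice is rendered into a 2x backing store, which is then resampled to the panel's native pixels.

```shell
# For each HiDPI "looks like" resolution, macOS renders a 2x backing store
# and then scales that image to the panel's native resolution.
for res in "1024x640" "1280x800" "1440x900" "1680x1050"; do
  w=${res%x*}; h=${res#*x}
  echo "looks like ${w}x${h} -> rendered at $(( 2 * w ))x$(( 2 * h ))"
done
```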
MacBook Pros are dual-GPU systems: a low-power, low-performance Intel Iris Pro plus a dedicated NVIDIA/AMD discrete GPU. The discrete GPU kicks in when you run OpenGL/Metal apps or when you plug in an external monitor. My MBP is well behaved when I don't plug in an external monitor, and noisy/hot when I do. This is with both OSX and Linux. Funny enough, Haiku is silent on the same hardware (no 2nd screen though).
Yes, but it is not correct to claim MacBook Pros have Iris Pro graphics. They are plain Intel graphics.
Intel GPUs (including the UHD Graphics 630) can be used with external monitors. HP, Dell, Lenovo, Acer etc all do it.
In fact, in 2013-2015, Macbook Pros were available with ONLY Intel graphics, and supported external monitors.
The reason why you cannot do it with dual-GPU Macbook Pros is that Apple has their own display muxing solution that requires the discrete GPU to be powered up to use external monitors (presumably the ports are attached to the discrete GPU).
For the past year I've been WFH with the MBP (which has a 5500M GPU) driving its own panel (3072x1920), an external ~4K~ "big-ass" ultrawide (3440x1440), and an external 16:9 4K (3840x2160).
You’ve gotten really lucky but responses like that make it harder for those with problems to get them addressed.
Some of the issues include: screens refusing to turn on, random screen shutdowns (then turns back on), incorrect refresh rate on wake from sleep, incorrect resolution on wake from sleep, monitor position gets reset again on wake, screen image offset on monitor, once detected resolutions and capabilities not detected after restart (hdr, 4k, brightness control), refusing to pass audio through, clamshell mode will keep internal display active…
That is so annoying on my machine: a band an inch wide is offset by an eighth of an inch when it comes out of sleep, about one time in five. The only solution is a reboot.
I've had all the other problems as well, on both Intel and M1.
"screens refusing to turn on, random screen shutdowns (then turns back on), incorrect refresh rate on wake from sleep, incorrect resolution on wake from sleep, monitor position gets reset again on wake, screen image offset on monitor, once detected resolutions and capabilities not detected after restart (hdr, 4k, brightness control)"
And this can be the display itself that's causing the issue.
An example that I've found, for a Dell P2720DC:
> If you suspect that you are affected by this issue, see if you find a reference to YCbCr in the menu of the monitor settings. For the Dell, it was in Menu > Color > Input Color Format
That's not quite what I meant to say; it's difficult for me to phrase. But it's what I expected: macOS and display manufacturers each handle the information shared between the display and the device differently.
It’s an Apple problem. It’s not the monitor's fault that Apple can't handle different EDIDs. My Linux and Windows machines have never had these issues at all.
My MBP (2015/2016, I think) even sometimes hard crashes when plugging in my dual monitor setup (daisy-chained DisplayPort). It feels reminiscent of Windows BSODs from circa 2000 :(
This is an infuriating bug that impacts my 16" MBP. Every time I plug into a Thunderbolt dock I have a 50-50 chance of a hard reboot due to a kernel panic. There have been threads about this issue for a couple of years now and nothing has improved.
I originally came to the MBP from the world of Windows, very jaded by BIOS and sleep issues with my Dell XPS 15. Macs just work, they said. Well, the joke's been on me. I literally have to constantly mute myself during video calls because my $3500 MBP's fans are always whirring away so loudly that it ends up fucking up the audio for others during a call. I've had my entire system throttle and grind to a halt during video calls because of thermal throttling when temperatures got higher during the summer.
I'm so over Apple and their laptops. My wife's MBP has a swollen battery that's destroyed the touchbar and touchpad. For all my complaints with Dell, it was nothing in comparison to the pain of owning my MBP. Multimonitor support is a joke. Docks are a joke. Opening apps is a fucking joke due to their sandbox slowing things down to the point that opening any app takes 5-10 seconds.
At this point all I do is use my $3500 MBP as a VNC client to my Linux desktop where I get all my real work done. This is the first and last Mac I will personally own. I can see why they were as popular as they were with the developer crowd. But Apple has lost its way and doesn't give a shit, at least not when it comes to the developer / professional users that they originally targeted with their MBP line.
I'll be going back to Windows/Linux soon and there is very little I will miss when I make the switch.
> I originally came to the MBP from the world of Windows, very jaded by BIOS and sleep issues with my Dell XPS 15. Macs just work, they said. Well, the joke's been on me.
This is pretty much my sentiment too. My MBP is a "2nd" computer for me that I only use when building iOS apps; my daily driver is Windows 10, and I'm very happy with it.
It's not quite 50-50 for kernel panics for me, but I'd guess it happens around 20% of the time if the MBP is switched on when I plug in the monitors, which is hardly much better. The keyboard on my MBP is crap too, and has 2 keys that stop working from time to time. I'm not even a fan of the trackpad that most on HN seem to gush about: it needs too much pressure to click even on the most sensitive setting, and something just feels "weird" about it and the way the cursor moves. And it seems something breaks or goes wrong after every.single.os.update. IMO, there are some serious QA issues with OS updates.
Meanwhile on Windows 10, everything Just Works (TM).
Random thought, when I used the official Apple HDMI adapter, I had no issues. When I used a third party USB-C-DP cable, I had random crashes, especially when turning on or waking from sleep.
Thanks for the reply. Unfortunately, Apple refuses to make laptop docks for their own "Pro" line of laptops, which means you have to resort to 3rd party docks. And I'm not using cheap docks here. I've tried several $200-$350 Thunderbolt docks and I've had this issue on every single one of them. Meanwhile my significantly older Dell laptop has no issues on any of them. It's hard to give Apple an out on this when they won't even make an overpriced dock that I would still purchase just to retain my sanity and have the ability to easily work on a 3 monitor setup without random reboots.
Seriously. Even with one external monitor I get things like windows "disappearing" into some in-between land, wallpaper getting messed up... It's shocking.
It might be awful, but when I use high-PPI displays, pro color spaces, and move between workspaces with different equipment, it sure seems less bad than everything else.
This is not a solution at all. The post says this only works with the lid closed, but most MBP 16s will achieve the reduction from 20W to 8-9W just by closing the lid, with no settings changes needed.
That said, 8-9W with the lid closed is still high. Consumption for an idle second monitor shouldn't be more than 1W.
After seeing the title I got my hopes up that it would get idle into the 1-2W range regardless of whether the lid is closed. That would be a solution; 8-9W definitely is not.
I agree, this doesn't seem like much of a solution. Due to the incredible amount of noise my 2019 16" MBP makes when I have it open and a 4K monitor attached, I have to always use it in clamshell mode, which makes almost no noise at all unless I'm watching a video or doing certain things in VS Code.
I too looked at the title and thought someone had finally figured out how to get around having the laptop open and a monitor plugged in with no noise, but it seems like this basically makes no difference at all since it only works when the lid is shut. I've resigned myself to the fact that it'll never be solved, and hope to get this MBP replaced at some point
> Will switch to Apple Silicon version of 16 when it comes out
Buying another Mac, after a 4-year run of devices with unfixably broken keyboards and GPUs, will not teach Apple to focus on quality or repairability.
>This is not a solution at all. The post says this only works with the lid closed, but most MBP 16s will achieve the reduction from 20W to 8-9W just by closing the lid, with no settings changes needed.
Of course, because you can dissipate enough heat with the lid closed.
>That said, 8-9W with the lid closed is still high. Consumption for an idle second monitor shouldn't be more than 1W.
I'm not sure about that, because laptop displays draw more than 1W most of the time. (I'm aware of the low-power panels that Lenovo uses.) And there are a couple of other factors that I've forgotten.
I'm on a late 2019 MacBook Pro with an external monitor that has:
B326HK:
Resolution: 3840 x 2160 (2160p/4K UHD 1 - Ultra High Definition)
UI Looks like: 1920 x 1080 @ 30.00Hz
In iStats, Radeon High Side shows 5.6-6W of power usage. But still, my CPU can't sustain a frequency above 2 GHz for more than 5 minutes due to thermal throttling.
My mistake, thanks for pointing that out. I have an old 2013 MBP (on which I'm typing this) which only does 30Hz.
On 2019 MBP with DP I am getting 60Hz but the problem remains.
It's amazing how many thermal/throttling issues have plagued MacBooks over the past few years. IIRC there was a bug that would cause throttling if you tried to charge on the left-side ports. Not sure if it was ever fully fixed by a patch.
The M1s are a huge improvement but the bar was pretty low.
The bug is in hardware, so no, it's not fixed. Specifically, it's caused by the internal layout: the power-handling chips, especially with beefier chargers at higher power draw, also get quite hot. The ones on the left side are apparently much closer to the CPU/GPU area on the motherboard than the ones on the right, and thus "contribute" more to the temperature observed at the CPU/GPU.
Wow. Hilarious. I was one of the original commenters on that thread and stopped following when I returned my - fully loaded - $$$$ - 16” MBP. I think I’ll be glad I returned it because M1 is awesome, but god damn. Why can’t Apple fix this? You figure they would get the idea when I return an ungodly expensive machine and tell them this is the reason. Christ.
> You figure they would get the idea when I return an ungodly expensive machine and tell them this is the reason.
You would hope so, but behemoths like Apple just don't care and don't have to. I'm guessing it would take thousands of returns before it even made a blip on their radar.
The problem is that NO monitor works; it's not a bug just with 1440p. Once you connect an external monitor, power draw jumps unless you close the lid. I use two 1080p monitors, but a single one doesn't work either. Closing the lid fixes the problem.
I followed the Macrumors thread for a good while and there was speculation that it might even have been a hardware flaw; something about needing to run the memory clock at full speed to prevent flickering when driving different resolution screens.
Apparently there are some AMD desktop GPUs that exhibit similar behaviour, but obviously an extra 15-20W draw is far less of an issue in a desktop than a thin laptop.
Either way, this machine is coming up on two years old. It’s not gonna be fixed, ever. I already sold mine because of it.
Haven't really pursued a solution to these GPU/heat/noise issues yet.
Though, a few months ago I had contacted Apple about a different matter and mentioned these problems to the agent who then told me that was the first time they and their colleagues had heard of this issue.
Their tone and the very many words they used to convey this told a different story. Going to be an interesting case when I finally have time to get this sorted.
Edit: Not saying it's company policy to deny this issue exists. It could just have been this agent not wanting to deal with it (maybe because it'd end up in a concession which would look bad on his profile?)
Download Intel Power Gadget and watch the Frequency graph. If blue line doesn't match the red line then you are throttling.
Run `sudo watch -d -t -n1 thermal levels`. When any reading goes over 100, it is in throttling mode.
Once it trips over 100 you start throttling. Throttling doesn't end though until this value returns below 100. This number is not directly linked to temperature either. Even though your CPU may be at a safe temperature like 80 degrees which normally shouldn't throttle, your `thermal levels` will not decrease immediately. The best way to get them to decrease is by sleeping your MacBook.
Maybe there is a hard-coded delayed back-off algorithm that is a function of the time the CPU spends above a certain temperature threshold - I'm not sure.
I have also read that its potentially the voltage-regulator module that doesn't have a temp sensor that this may be linked to.
---
You can also do a [cooling mod][1] if you really want to, but it's probably going to void the warranty and it's risky.
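As an alternative to `thermal levels`, macOS also reports its current speed cap via `pmset -g therm` (a `CPU_Speed_Limit` of 100 means full speed). Here's a small sketch of parsing that reading; the sample lines below are made up, not real output from my machine:

```shell
# Returns success (0) if the given `pmset -g therm` line reports throttling.
# CPU_Speed_Limit = 100 means full speed; anything lower means the CPU is capped.
is_throttled() {
  limit=$(printf '%s\n' "$1" | sed -n 's/.*CPU_Speed_Limit[^0-9]*\([0-9][0-9]*\).*/\1/p')
  [ -n "$limit" ] && [ "$limit" -lt 100 ]
}

# Live usage on a Mac would be:  pmset -g therm | grep CPU_Speed_Limit
is_throttled "CPU_Speed_Limit     = 62"  && echo "throttled"
is_throttled "CPU_Speed_Limit     = 100" || echo "full speed"
```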
I've had thermal throttling issues with my 16" since I've gotten it, and it's only gotten worse. I had to do a downgrade to Catalina to even be able to use it. It's so funny that a $4.3k machine can't even run Zoom or Meet properly. I also had to do a SwitchResX trick + clamshell mode to even be able to dock for a few hours (and it'll overheat again).
I play an occasional Dota 2 match, that my 2014 13" handled without any issues, while my 16" just shuts itself down because of overheating (even while not docked, because of dGPU usage). I don't care if it's an AMD issue, then the GPUs shouldn't be in their machine.
MacBooks used to be so good, and just work. Now not so much. I'll think twice before buying anything Apple (computer-wise) ever again.
This doesn't fix the issue; many people have had success with similar methods when running with the built-in display off. There's no getting around it with the built-in display active, or with any two displays of non-matching resolutions.
My specific issue was it was still hot and noisy even when the built in display was closed. I'm sure other people have 1440p monitors and would love this partial solution.
No doubt that's true; I haven't observed this particular problem. However, OSX's logic for external monitors doesn't work a whole lot better than it did for older laptops. In particular, if I connect it to a monitor through a StarTech KVM it remains stuck at XGA (1024x768) resolution, which reflects the EDID the KVM provides to inactive computers. It doesn't update when the laptop is selected, and there seems to be no way to reset it.
Nothing else does that. IMO, it's all indicative of a prototype-level implementation.
Had the same issue on my 16" before I spilled coffee all over it and broke it.
A lot of the time kernel_task would take up something like 400% CPU, because plugging the power cable into certain Thunderbolt ports (on the left side) would overheat the computer.
Even then I was consistently getting low performance, my fans were on all the time, and my computer was burning hot trying to power a 4k monitor and a 1080p monitor.
After my 16" broke I got an M1 Mac Mini and obviously have not been having the same issues. Now my computer feels fast again. My old set up felt like I was using a Macbook Air or something.
I had this and then got an M1 MBP and it flickers wildly on dark backgrounds when connected via USB-C. I think that's going to be my last Apple laptop.
I had the issue when I went from driving my two 4K monitors directly over USB-C to going through a Thunderbolt 4 hub (CalDigit) that did HDMI out.
After some searching, it seems HDMI conversion can trigger it. I switched to a TB4 hub (Kensington) that connects to the monitors over USB-C (which uses DisplayPort instead), and Radeon High Side idle went back down and the fans slowed.
Will be happy to move to Apple Silicon just for the lack of fan noise.
I did a PRAM reset and have had zero issues. On the first day with the machine and a 27” 4K monitor, temps were soaring and fans were going crazy with just Chrome, Sublime and Vagrant running. After the PRAM reset, the fans only turn on when something actually intense is happening. Latest Big Sur, and a pretty much maxed-out machine: 64GB/1TB and, I think, the same graphics as mentioned in the article.
Looks like my 5500M GPU draws about 10W less when I change my external 4K monitor from Scaled (to compensate for every OS UI element being minuscule) to the display's native resolution. It makes the OS barely usable from a UI standpoint, but the fans are now a lot quieter. I'm on a Dell monitor.
I just tested it on my 2017 rMBP (Radeon 555), and I saw an immediate drop from ~15W to 11W. The total power usage is around 30W, so it's an immediate ~15% drop in power usage.
The MBP 16” is a buggy garbage machine. It also routinely kernel panics when disconnecting thunderbolt devices. For the first month it also regularly kernel panicked waking up from sleep like some piece of shit Windows laptop, but that seems to have gotten fixed in Catalina.
I have two 1440p monitors with MBP 16". Using hidpi fixed the high wattage issue when I connect one monitor.
But when I connect two 1440p monitors, I still see GPU wattage at around 20W. Any idea what could fix this?
Note: Mac lid is closed during all of the testing and usage I am doing.
It's difficult to find out, and it's difficult to compare an AMD GPU with an Intel or NVIDIA one at this point, because they do things differently and the laptops carrying them do things differently too (an ASUS Zephyrus and a Legion 7 differ a lot, for example).
2020 16” MBP with 5500M and Big Sur doesn’t have any fan issues in clamshell when using my 1440p display. I don’t use HiDPI, but it is running at 144hz.
Another data point. Seems like putting more load on some part of the system prevents this, but it is very strange.
There are lots of reasons I absolutely despise my 16" MacBook Pro and the heat while plugged into an external monitor is definitely in the top 30 or so. Really don't understand why people are buying these garbage computers.
The number of defects and other miscellaneous issues I've encountered, along with the productivity losses of moving from Windows and Linux, makes no sense. My computer needs to function as a productivity machine; I don't have time for this.
Next laptop(s) are going to be ThinkPads, because I can buy two for the price of a MacBook. We program for Linux servers anyway.
I have this problem. I had no problem working with the laptop open and 2 monitors for a year or so. I upgraded to Big Sur and now it's an issue. I think I can manage, but I hear those fans struggling.
Can the title be edited to remove the exaggerated capitalization? This is an interesting enough article but the title makes it feel like some clickbait video
My comment was just a clarification. Maybe this is just my experience, but my coworkers and I use both the external and MacBook displays. After reading the title I thought it would fix the heating without closing the lid.
I have always run a 4K monitor and never hit this issue. My 15" (2018) and 16" MacBook Pros run silent when connected to the 4K monitor, as it uses 2x scaling by default (so 4K "looks like" 1080p).
However if I connect a 1440p display after half an hour or so the fans kick in.
I don't own a 1440p display anymore to test this solution unfortunately.
TLDR: My Macbook Pro is hot and noisy when connected to a 2560x1440p60 monitor, even when the lid is closed at idle. This article outlines a partial solution that enables silent operation at idle when the lid is closed.
You can just open the link in incognito mode! Also, sorry, I guess I should have posted the friend link instead that doesn't ask to subscribe, but I can't change it after posting! But here's the non paywall link: https://axu2.medium.com/solved-by-1440p-hidpi-macbook-pro-16...
The author can choose to have articles in front of the paywall. I believe it defaults to behind the paywall, and if you want the Medium aggregators to be able to share your piece, it has to be behind the paywall.
Basically the tradeoff is to have better circulation among subscribers or non-subscribers. I always choose the latter.
"You’ve read all your free member-only stories. Become a member to get unlimited access and support the voices you want to hear more from."
Can we ban Medium posts on HN already? Imagine if we login-walled every piece on HN. We should not encourage this behavior, and we have the clout to send Medium a message.
Or of course if the browser stops providing the necessary API, which seems the most likely cause for the demise of uMatrix, at least in the shorter term.
News/media livelihoods depend on that paywall; I'm fine with that.
Medium literally exists to put a wall between you and the author's content. Nobody's making real money blogging on Medium. And HN is an audience of developers, for whom standing up their own blog is trivial.
Ironically, dev.to is shadowbanned on HN (understandable, because of lots of blogspam), when Medium is the one that gives readers a terrible reading experience.
Seriously, does nobody think of moderation? I moderate a large sub on Reddit, and there are local news outlets that allow 4 articles a month, after which you have to use trickery to get around the paywall. As a moderator I need to ensure the articles are relevant and comply with the rules. You know what I do when I reach the 5th article? For the rest of the month, that site gets no posts on our sub.
As another moderator, that doesn't seem like a great approach; you're effectively punishing the readers of your sub for someone else's decision, somewhat arbitrarily. Your subscribers may not have hit that 4-article limit.
The decision to ban partial-paywall sites is fine, but doing it just for a month after the moderator personally runs into a problem seems very subjective.
The article also has a link at the bottom to send the author money if you found the article helpful. If I paid money upfront to see the article, I'm going to be much less likely to send more money afterward.
I just don't click on Medium posts, but I wholeheartedly agree. Medium (as well as Substack, for that matter) is one of the largest SAAS failures in the past decade IMO.
"It's ok to post stories from sites with paywalls that have workarounds. In comments, it's ok to ask how to read an article and to help other users do so. But please don't post complaints about paywalls. Those are off topic. More here."
A more logical solution would be for Medium, the Financial Times, etc. to disable their paywall when the referrer is HN, Reddit, etc., like they do for social media.
Perhaps there's a gap for a 3rd-party service that could work out deals between news websites and submission forums.
You've created dozens of accounts, mostly to go on and on about Apple for some reason. It's not surprising that HN users would flag comments that are so predictable and ranty.
As for the mods: we don't care about $BigCo or "overlords", but definitely draw the line at single-purpose accounts and serial trolls.
[0] https://github.com/jakehilborn/displayplacer
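For reference, a hypothetical displayplacer invocation for forcing a HiDPI 1440p mode might look like the sketch below. The screen id and exact mode values are machine-specific placeholders; take real values from `displayplacer list`, which prints each display's persistent id and supported modes:

```shell
# Sketch: force a HiDPI (scaling:on) 1440p mode with displayplacer.
# The persistent screen id below is a placeholder, not a real id.
SCREEN_ID="<persistent-id-from-displayplacer-list>"
MODE="id:$SCREEN_ID res:2560x1440 hz:60 color_depth:8 scaling:on origin:(0,0) degree:0"
echo "would run: displayplacer \"$MODE\""
# On a real machine, apply it with:
# displayplacer "$MODE"
```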