EDIT: As far as I know, the best long-term answer here is for apps that present visuals full screen to "capture" the external display for exclusive use using an API (https://developer.apple.com/documentation/coregraphics/14562...), but that's not super common right now.
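For the curious, the shape of it in Swift looks roughly like this (an untested sketch; per the discussion below, the truncated link is the CGDisplayCapture family, so that's what's assumed here):

    import CoreGraphics

    // Enumerate active displays and pick one that isn't the main display,
    // i.e. the external projector/screen we want exclusive use of.
    var displays = [CGDirectDisplayID](repeating: 0, count: 16)
    var displayCount: UInt32 = 0
    CGGetActiveDisplayList(UInt32(displays.count), &displays, &displayCount)

    guard let external = displays.prefix(Int(displayCount))
        .first(where: { CGDisplayIsMain($0) == 0 }) else {
        fatalError("no external display attached")
    }

    // Capture the display for exclusive use. While captured, other apps'
    // windows are kept off of it; whether this also suppresses the orange
    // dot is exactly the question in this thread.
    if CGDisplayCapture(external) == .success {
        // Draw frames via the display's drawing context...
        if let ctx = CGDisplayGetDrawingContext(external) {
            ctx.setFillColor(CGColor(red: 0, green: 0, blue: 0, alpha: 1))
            ctx.fill(CGRect(origin: .zero,
                            size: CGSize(width: CGDisplayPixelsWide(external),
                                         height: CGDisplayPixelsHigh(external))))
        }
        // ...and hand the display back when done.
        CGDisplayRelease(external)
    }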
> As far as I know, the best long-term answer here is for apps that present visuals full screen to "capture" the external display for exclusive use using an API...
If I'm understanding you correctly, that means there's already a supported workaround for this if apps just use that API? I don't want to downplay the annoyance of this for apps that aren't using that API, but this suggests there's already an official answer.
I'm mildly surprised the orange dot shows up in full-screen apps; I was going to suggest that might be the easiest "fix" for Apple to make that doesn't require either adding a new security setting or taking away the indicator entirely -- have it only show up in the menu bar, and not when the menu bar isn't present.
I'd like to ask a follow-up also as I don't do any Mac dev, but has this API been around for a while then? Monterey has been out for a bit, and it's a little surprising that this article and the subsequent discussion are only popping up now, especially if there's already an API answer to it.
I ask earnestly: is the change really such a substantial one?
I have mixed feelings after reading the comments as I think that there are fairly valid arguments in both directions (e.g., that the solutions are plentiful, but also that workarounds aren't really a solution), but the arguments feel a bit empty if there's a "right" way to be handling the visuals that just isn't being used.
As a user I like the change in general as I have caught naughty applications that try to use mic input when I really don't want it, and my misclick/absentmindedness is not uncommon, so seeing such things helps a lot as I don't really think it's reasonable to constantly be checking the various app permissions to make sure they're what I want. This is a good reminder for me.
But I totally get not wanting the dot; a site I frequent has even run a prank of adding a tiny red pixel just to annoy people (and it's a prank I've used). So I get the frustration with an unexpected visual. But if there's a way to do the same activity by having the app utilize the correct API, it seems like an issue that gets solved in the next update from these visual production apps, no?
CoreGraphics and the associated screen capture APIs have been around for a very long time. However, they aren't particularly easy to use, and they do require your app to be trusted (reasonably so). You can't have just any old Mac app start capturing the screen and doing things with the pixels.
Why would you think that? The CGDisplayCapture function that the GP linked takes a display ID parameter to target a specific external display, like a projector or a screen. Why wouldn't it work?
Is that really all it takes to disable this? I guess if I'm a malware author looking to do surreptitious recording, I'll have to bundle these extra 10 lines of code lol...
You can, but note that as a user you have to open System Preferences and check a checkbox to allow said malware to do this. (Apple locked down the accessibility APIs that let apps easily manipulate each other a few years ago.)
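(For reference, the app-side check is tiny; a sketch like this triggers the system prompt, but nothing is granted until the user ticks the box in System Preferences themselves:)

    import ApplicationServices

    // Ask whether this process has been granted Accessibility access, and
    // have macOS show the permission prompt if not. The prompt only sends
    // the user to System Preferences; nothing happens until they check
    // the box there themselves.
    let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
    let trusted = AXIsProcessTrustedWithOptions([promptKey: true] as CFDictionary)
    print("accessibility access granted:", trusted)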
The user has to manually open System Preferences and allow this program all the same. One place this workaround would work for malware, though, is when it's embedded in apps that are expected to need these rights.
On my machine, Dropbox, Alfred, BetterTouchTools, and Bartender have this permission. Zoom is in the list of apps that can be given this permission, but the permission is disabled by default and Zoom works fine without it - though the very fact that some may have given this permission to Zoom might be a cause of alarm! And it's possible Apple may patch away the ability of accessibility tools to mess with this, without giving a better system-level way to disable it...
If I had to guess, Zoom is probably using the accessibility API to implement their remote control feature. I don't know enough about the other apps to guess why they need it, but Dropbox needing accessibility permissions does sound strange.
The security agencies would pay good money for a solution that bypasses this requirement. They were paying good money for exploits that disabled the tally lamp on web cams.
I mean, I’d take advantage of that program. If it’s already installed then it probably has the permissions granted. So I’d only have to run it before recording audio.
That you built this in such a short span of time is impressive, and really does a great job undercutting the “security” reasons for the dot to be there.
I don't think this undercuts the security reasons. I think the general idea is that if you leave Zoom / FaceTime / OBS open and recording, the orange dot is there. Same dot, same place, no matter what app you are using, as long as the developer doesn't disable it.
Using the API to disable the dot requires some pretty scary permissions to be enabled on the app disabling the dot.
Yes, exactly. I'm sure there are more elegant answers — plus watching events so that it can hide the window right away instead of running in a loop — but I haven't used the accessibility APIs much lately and this is the first working approach I found.
On vacation and no laptop, but perhaps someone can add better readme directions to the patch.
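The rough shape of it, from memory (untested, and it assumes the dot is an ordinary window owned by a system process whose pid you've hunted down - which may not survive future macOS updates):

    import ApplicationServices
    import Foundation

    // Hypothetical pid of the system process assumed to own the indicator
    // window; finding it (and getting Accessibility access granted) is
    // left to the reader.
    let pid: pid_t = 123
    let app = AXUIElementCreateApplication(pid)

    while true {
        var value: CFTypeRef?
        // Ask the accessibility API for the process's windows...
        if AXUIElementCopyAttributeValue(app, kAXWindowsAttribute as CFString, &value) == .success,
           let windows = value as? [AXUIElement] {
            // ...and shove each one far off-screen. The more elegant version
            // would use an AXObserver to react to window creation instead
            // of polling in a loop like this.
            for window in windows {
                var offscreen = CGPoint(x: -10_000, y: -10_000)
                if let position = AXValueCreate(.cgPoint, &offscreen) {
                    AXUIElementSetAttributeValue(window, kAXPositionAttribute as CFString, position)
                }
            }
        }
        Thread.sleep(forTimeInterval: 0.5)  // poll twice a second
    }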
To us software folks, it makes sense, but I imagine this'll be linked to many individuals outside of software who won't even know what git is or how to get the fix.
In the film editing space workstations use dedicated video I/O hardware which fully circumvents the display/video stack of the OS to be able to input/output exactly the pixels you want in exactly the format you want (something that's essentially impossible using OS facilities, much less in a cross-platform manner). This seems like a very good application for those cards (they're not particularly expensive, around 200 bucks [1]).
I'm 100% certain that most professionals are already using these for performance displays. I don't know a single person that doesn't use these cards for exactly this reason. We even use one for presentations/events at movie theatres just to make sure that nothing is going to put an errant pop up on the display.
You can render on your GPU to a texture, then stream that texture to these devices. No different to rendering to your screen and then streaming and saving to disk, as you would do if you were screen capturing.
No. If they did, that would defeat the purpose. Usually, they have their own setup software or routing software. Something like Blackmagic, for example, is supported directly from certain apps. Others let you route windows to the output. You still need a graphics card for these to work. They only handle the video input and output.
Though a quick search through online forums suggests that module isn't compiled into Ubuntu's version of VLC by default, so you may need to compile your own version.
Not sure about VLC, but ffmpeg has great support for Blackmagic, you just have to download the Blackmagic SDK, compile ffmpeg with Blackmagic support (and the SDK in path) and then you'll have a separate input/output device available in ffmpeg. The other great thing about this approach is that this way audio also takes a dedicated, integrated path, bypassing OS layers and maintaining sync with much less effort.
I'm not terribly familiar with Linux use of hardware I/O devices so my guess would be yes unless Blackmagic (or someone else) makes some kind of Linux kernel extension that would allow you to manually specify that or VLC has built-in support for external I/O devices.
Edit: Just checked because I was curious. The default installs of VLC do not include Blackmagic support, but you can compile versions from source for Linux that include it (if you also download the support files from Blackmagic first).
A good suggestion if you have the budget. NDI is also incredible for this. But small producers rely heavily on built-in I/O on their computers, and I suspect that the market share from small/amateur streamers using Apple products outsizes Apple’s revenue from large shops today.
Big productions can afford Decklinks and media servers like Disguise D3.
NDI is fantastic, having dealt with racks of gear for switching and video distribution before I get excited about the possibilities every time I play with it. Dedicated hardware for bridging NDI to input and output devices is also appearing. It's a lossy solution (the video is compressed, unlike SDI/HDMI) but for live productions I'd definitely be willing to use NDI throughout (cameras -> switching -> recording/projection/streaming).
So long as they keep NDI Tools free. It would be great if they could also eventually build in some of the functionality of Dante (NDI is something like Dante but for video), since the Dante software toolkit is very much not free. I think NewTek has been on the right track.
Birddog has a great selection of dedicated NDI hardware that we found to be rock solid.
Indeed, we use three of the Flex 4K Out boxes feeding three projectors, doing alpha blending to form a single virtual screen via ProPresenter, and it works great!
Especially since someone donated an M1 Mac Mini, not realizing it only supported two traditional external monitors. With NDI we could have an external monitor for the operator display then let ProPresenter generate three virtual monitors via NDI. We can send differently formatted "screens" via NDI to our OBS computer for streaming too - different configurations of lower thirds for texts, full screen mirroring of the main presentation display - whatever we need. NDI really works a treat and I will NEVER go back to analog video again!
We also use the Birddog NDI PTZ cameras for streaming and some crude IMAG too - it's amazing how well NDI works.
Small producers spend lots of money on equipment in that range of cost, even for ancillary things like lighting and cables. I think it's a great recommendation.
There are a lot of guys out there with 2016-era MacBooks (because they still have HDMI) running them into a projector to do visuals at small clubs and things like that.
Maybe some of those guys scraped together enough money to buy one of the new macbooks (since it got HDMI back). Being forced to spend another $200 to get rid of a stupid orange dot would really sting.
Those guys can probably crop at the projector though and not lose too much? Last I played with projectors was around 2010, and back then crop controls already came standard.
2015 was the last MacBook with HDMI. Anyway, I agree with you, but it's true there are a lot of richer professionals too - those who buy the new maxed-out MacBooks.
Yep. Used heavily for editing, colour grading for TV/film, and sound work. For pretty much anything professional, they will discard what the GPU is doing (and the built-in audio) and use these sorts of external devices.
This way, you know the output is EXACTLY what the program is producing, and nothing else.
This thread has exposed a very weird dichotomy between HN users who believe that they (as owners/operators of the device in question) should ultimately be in control of what their machine outputs - down to the pixel - at least if and when they care to do anything about it, and those who accept that it's our role to just take whatever bones the manufacturers are kind enough to throw our way, apparently unless and until it is an egregious violation of what one considers to be "the" line that shouldn't be crossed.
As a hacker without a horse in this particular race (Macs and I parted ways a long time ago), it's definitely interesting to observe the interactions between the two groups in this thread!
IMHO the fundamental difference is that for the former group the matter is dogmatic, so it's immediately clear whether or not your (perceived) rights have been violated (Can you do X? No => violation; Yes => keep chugging along), but for the latter group it becomes not just a question of whether or not X is possible but also whether or not each individual can agree that X should or shouldn't be determined by the vendor/manufacturer (c.f. the recent hullabaloo about on-device scanning).
I see similarities to the notion of "I may not agree with what you are saying, but I will fight to the death for your right to say it," which makes it so much easier to agree on whether or not rights are being infringed, regardless of whether it's something you'd want to engage in yourself.
I wouldn't call this changing a default. If this were Windows and my only options were 1) petition the developer to rewrite their app, 2) buy more hardware, or 3) edit the registry...
I would not consider that a mere change to the default.
There is no guarantee the workarounds will continue to work, and it's asinine to have to rely on a playback card - which doesn't work on the M1 Macs anyway.
That's a bit of a false dichotomy, though, because there also exists the group of people that are the intended audience for this feature that is so dreaded by this small minority. The large majority of users benefit from being made aware when an application is using their microphone as it helps them see whether that's being done without their consent. Apple could, in theory, put in a toggle to allow more seasoned users to choose to hide that function but then that toggle will just be used to compromise the whole point of the feature for the majority of their users.
Case in point, my mom is getting paranoid about this stuff after watching stuff on Netflix and CNN about how Facebook is "listening in" on her. I can easily just tell her to watch for the dot and know, with reasonable certainty, that she didn't install some app on her own that would hide that.
Nothing about this scenario would change if Apple chose to include a setting in the system menu that turns it off. You would just tell your mother not to mess with that setting... which you've presumably already done for numerous other settings.
There are plenty of settings that get turned off by someone falling for a guide ("customer service rep" or YouTube video or written tutorial or infographic or Facebook post or email or...) which tells them to turn that setting off as an important bypass for whatever other issue they have or think they have.
The obvious solution would be some sort of prominent warning that disabling this setting might allow malicious behavior or something, but many people are so inured to frivolous warnings that it might not work. I think the only fix to that is to make it even harder to disable, but I'm no UX expert.
Yes it would. That's how malware gets installed surreptitiously. It gets wrapped in nice instructions for how to enable some functionality that actually does something else. It's literally the entire reason for changes like this.
I still think Apple should walk it back a little but pretending like that doesn't have unintended consequences is kinda foolish.
Like other powerful tools, computers can do a lot of damage if used carelessly. That's the way it has to be. Such tools should be made as safe as possible, but no safer.
Don't nerf my computer because your mother can't be trusted with the digital equivalent of an electric carving knife. Just add a few more warning labels and let Darwin sort it out.
Ignoring which side is right: A critical part of HCI is "never surprise the user."
A feature like this shouldn't come as a surprise in an update. When deployed via the update, it should be optional and disabled by default. It should only be enabled by default after a major OS release, where UI changes are expected. (And, enabling a feature like this should be part of the typical beta/preview process that happens for all major OS releases.)
That being said: I wonder if covert audio recording on Mac is a real problem. I.e., did Apple need to "do something" about people being targeted by malware that records their audio?
The number of people in this thread not understanding the importance of not having any interference in live visuals makes me believe this won’t be fixed by Apple. Absolutely baffling.
I didn't understand the importance and came to this thread to learn. Please don't assume a lack of prior knowledge reflects an ongoing lack of curiosity.
Learning is perfectly fine. But there are people here who have it explicitly explained to them, and continue on with the "so what? It's just an orange dot" line of thinking.
Looking through the comments, I can't imagine how else it could possibly be explained or clarified further. For this use case, it is assumed that the software producing the visuals has full control over the output. An orange dot that cannot be removed means that the software does not have full control over the output, which is unacceptable. It's that simple.
EDIT: As far as I know, the best long-term answer here is for apps that present visuals full screen to "capture" the external display for exclusive use using an API (https://developer.apple.com/documentation/coregraphics/14562...), but that's not super common right now.
Sounds like it's totally possible for the software to have full control over the output if it wants to, this only affects software that runs in a "standard" fullscreen mode without explicitly taking full control over the output.
> Sounds like it's totally possible for the software to have full control
Yes, so you can go to the software vendor's "Ideas" or "Suggestions" page and ask for this, and post in forums asking people to upvote it, and hope the devs will do it. Not a great workaround. And sure, someone developed a script to get rid of this but no one is all that happy when Windows does something user-unfriendly that only a regedit hack will get rid of either.
While other people are struggling with "it's fine when Apple forces us to jump through a bunch of undocumented hacks and workarounds to keep doing what we bought the machine for"
And still others are struggling with, "How do you not understand that Apple is trying to balance the trade-offs between 'allow an app to do anything' and 'protect users from malicious software'?"
I think they could do a better job (e.g., a simple setting in System Preferences that turns off recording notifications), but to pretend that they're doing this for no reason at all or that the orange dot is a show-stopper for a significant fraction of their user-base is just not making an argument in good faith.
> to pretend that they're doing this for no reason at all or that the orange dot is a show-stopper for a significant fraction of their user-base is just not making an argument in good faith.
Well, look at the other person who replied to my comment:
"Every time this happens a small minority that is affected screams "this is interfering with my workflow" and the majority says "get over it"."
That's not technically untrue, but it fails to acknowledge that there is a tradeoff here. Again, I think Apple could be doing a better job of managing the tradeoffs, but this is not simply the majority saying "get over it". This is the majority saying "it sucks that this change interrupted your workflow, but look at the bigger picture"
I don't think a failure to explicitly note that apple had a reason in every comment is at all the same as pretending there was no reason. Same for explicitly noting that there is at least some reason to "get over it".
But is that actually different from trying to get any budget for, e.g., IT security software, redundancy, backups, or any kind of nonfunctional investment in an IT network - especially in smaller shops? Putting on my ignorant penny-pincher hat, I cannot see a difference between a hardware output card and an HDMI cable in a laptop, at least until Apple put an orange dot there. So let's rather buy more flyers.
I'm very much not surprised at people saving the wrong pennies.
If I’m forced to choose between live entertainment and a great privacy feature like this one, the privacy feature is going to win. Artists will get creative and find workarounds; other companies will fill the gap, and other commenters have pointed out cheap hardware that solves this problem.
You're clearly not involved in live entertainment, so you're not making that choice. It's also a false choice, because Apple could easily preserve privacy while also making their hardware usable for live entertainment again. Just a couple ideas off the top of my head would be to allow the user to choose which display(s) have the indicator (defaulting to all displays, of course) or adding an additional permission level for applications that are already approved for audio input to not show the indicator.
It doesn't have to be an all or nothing situation if Apple is even remotely interested in addressing this use case, which they should be.
> if Apple is even remotely interested in addressing this use case, which they should be.
I think the reality is that the set of people who want the feature and who would use it in a live professional environment with expensive hardware and who can't afford a $100 dongle from BlackMagic is so close to the null set that Apple is unlikely to care.
For goodness sake, Apple's own HDMI dongle costs $70. Just spend $30 more and buy the BM one instead.
> You're clearly not involved in live entertainment
I used to be.
> It's also a false choice
Yes, yes it is. It's been like this for decades. Entertainers are expected to buy the most expensive hardware available even when they're just getting started. Conform or be stigmatized for using inferior technology. We're talking about an industry in which AudioQuest has no trouble selling passive 3m HDMI cables for upwards of $1k.
Maybe this is the first time Apple has stepped on your toes, but it won't be the last. Be prepared to fork over the cash for a specialized dongle you wouldn't need on any other platform, lest that orange dot mar your reputation.
Let's be real: this isn't going to harm Apple's sales in the slightest, and they know it. The only way they could possibly harm their reputation in the entertainment industry is by lowering their prices, opening up their ecosystem, and eliminating $1k monitor stands.
They know you'll ultimately cough up the dough for an adapter, no matter how much you complain. They know you don't have a choice. They get money from me, a privacy-conscious end user with a choice, and they get money from you, a disgruntled professional without a choice.
What is the benefit of forcing the stupid dot on ALL displays?!?
How is offering an option for it to ONLY be visible when the menu bar is also visible compromising the notification in any way?
There are software solutions like NDI that a hardware output card doesn't address, and why should I have to buy hardware to fix a problem that should never have existed in the first place?
It's asinine and really not that complicated - don't show UI shit on displays that aren't showing the menu bar. There is always one display that must have a menu bar - show crap like this there and only there.
Apple is absolutely not forced to choose. They can provide a way to override it. Considering how much they love to boast about this market, I'm pretty sure they'll change it.
Many musicians have giant screens in their live sets, which display incredibly detailed visuals. When you're spending hundreds of thousands of dollars (if not more) on a live setup, it looks really amateurish to have an orange dot in the corner of the screen.
Yes, there are workarounds. But artists shouldn't have to deal with that when it worked perfectly fine beforehand. In addition, the more stuff you add to your setup, the higher chance that something will go wrong.
Professionals who spend hundreds of thousands of dollars on a live setup are not going to be using the internal I/O of their Macbook/Mac Pro. They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
The right call here for Apple is to allow users to give permission to specific apps to disable this, but let's not start with the idea that pros are outputting directly from their computers without the right hardware.
>Professionals who spend hundreds of thousands of dollars on a live setup are not going to be using the internal I/O of their Macbook/Mac Pro. They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
Have you ever worked in this industry? Because yes they absolutely are.
The people building the video walls (renting them), and the people actually running the visuals are not the same people.
Of course I have. A VJ with an old Macbook Pro isn't someone for whom this dot makes the display "unusable". If it did, then they wouldn't be using it because they could also have notifications, OS alerts, security prompts, or anything else that shows up on a display come up during their performance.
There's a _massive_ difference between an unremovable and highly visible orange dot and the small chance that "notifications, OS alerts, security prompts etc." could pop up. I think it's undeniable that there's a contingent of people who play live video who will be negatively affected by this change and I'm surprised so many in this thread are implying that if they don't own a playback card, their experience doesn't matter.
No one is saying their experience doesn't matter. Stop arguing straw men. All anyone is saying is that, if having the ability to control the output that's going to a display outside of the OS is a necessity, then you need a hardware controller. That has always been the case. The OS can always interfere with a full-screen app on a secondary display. The only reason there's any issue now is that these people disagree with this specific feature of the OS. It's not "unremovable". Just turn off whatever recording device is active and it'll go away. If you're a bit more tech savvy, turn off SIP and change it yourself, or go to GitHub and build the utility that already exists to get rid of it.
All anyone in this thread is implying is that, if this is important to you, you need to have the hardware to do it. If Microsoft tomorrow decided to put a Windows logo in the corner of the screen just to say "fuck you", you would still be unaffected with a hardware I/O device.
> All anyone is saying is that, if having the ability to control the output that's going to a display outside of the OS is a necessity, then you need a hardware controller. That has always been the case.
And without the orange dot issue, it wasn't necessary to have absolute perfect control. (Which could still go wrong anyway, because it's a computer.) Making it necessary all of a sudden is bad.
Or to put it another way, even without a hardware controller it was possible to have your macbook's use of an external output be one of the strongest links in the chain. Which is enough. Was enough.
>it was possible to have your macbook's use of an external output be one of the strongest links in the chain
Not if you need to be able to decide what is and isn't displayed. The I/O subsystem on Monterey is almost exactly the same as it has been since Mavericks. If it was enough in the past because people were lucky, then that's awesome. There are ways to deal with the dot if people want to continue running these setups off of luck and ignorance. I'm only arguing against the people who say the dot makes their setup "unusable" and somehow ruins their livelihood.
I'm not arguing a straw man; I'm arguing against your proposition that because "notifications, OS alerts, security prompts" can also pop up, it's unreasonable for someone to complain about the recording indicator. They're entirely different animals, and the fact you're comparing the two seems very uncharitable to me.
> Just turn off whatever recording device is active
It seems you haven't considered that the most obvious scenario where someone would be annoyed by this is during a multimedia presentation, when they're running video output and recording audio over top - an extremely common use case that performers I personally know take part in. Suggesting that this clearly non-technical individual turn off SIP and compile and run some arbitrary code from GitHub is laughable to me.
They're not, though. I only used notifications and the like because they were common things that people have probably experienced outside of this specific situation. It doesn't have to be that. It could have been a microphone icon or a speaker icon or literally whatever the OS wants to display. And I'm not even arguing that it's unreasonable to complain about the recording indicator. I'm arguing against people who claim that something being on screen that they didn't want makes their workflow "unusable". If that were the case, there are solutions for that and they've been around for years. Just because their ignorance didn't bite them in the past doesn't excuse it.
>clearly non-technical individual
If this person is that non-technical, then something like this is definitely not a showstopper that makes it "unusable". They are not the people I'm directing these comments at.
> because they could also have notifications, OS alerts, security prompts
Most will be using HDMI out as a second screen, so those won't show up.
> Yes. I currently work in this industry
What do you use for video output? I'm a hobbyist and used iPad & TouchViz with HDMI plug. I just picked up Resolume and planned to use a Mac Mini M1 w/ Monterey and HDMI out. I'm livestreaming, so I'm only doing 1080.
You're not typically cloning your main screen but rather outputting to a second display (that doesn't have the focus). It would be very weird indeed for a notification to wind up on that display.
Weird but not impossible. The whole point here is that professionals who can't take that chance have always had to use a hardware I/O device because there is no other option.
Perhaps professionals with a lot of money riding on it, sure. But for prosumers, the status quo was good enough that breaking it cannot be justified by "oh it wasn't perfect so who cares?"
Airplane mode prevents almost all the triggers that would cause such things to pop up. I thought this was standard VJ advice: turn on airplane mode before a performance.
Doesn't fix the orange dot, but it helps pretty much everything else.
Helps but doesn't solve. There is nothing that you can do to remove the OS's ability to put things on a secondary display outside of your control. The only option is to have a separate I/O controller.
You meet professionals not taking professional precautions in every field. It doesn't mean they're in the right, it means they're playing fast and loose and hoping common stuff doesn't bite them and their clients.
Yup! And that leads to situations just like this. I'm not saying it's right or wrong. I'm just saying that people who depend on not being surprised by stuff like this prepare for stuff like this because it happens, in whatever form it takes.
Yeah, I was seeing some of the tweets claiming that conferences had been cancelled over it and I can't help but think "good, maybe they'll learn to be more paranoid about critical projects". Though I assume it's 99% hyperbolic.
To have this be a business blocker, they clearly not only don't have any other standard tools available to prevent this display issue, but they also upgraded their OS, can't roll back to an earlier backup (probably don't have backups), and did all this on the machine used for the presentations, without any alternatives.
That's a shocking lack of care and they deserve to receive flak for it. You test that kind of thing well before the event, and then you do not touch anything - least of all upgrading the OS. I can't help but assume that these people are just using their personal laptops and hoping for the best on the day of an event. If they are, a small orange dot is the least of their concerns, and it serves as a nice canary for their employer.
> The right call here for Apple is to allow users to give permission to specific apps to disable this, but let's not start with the idea that pros are outputting directly from their computers without the right hardware.
The unfortunate problem with a preference is that, as soon as such a permission exists, folks can force users to enable it in order to use their invasive software. The orange dot exists because applications have been abusing privacy by invasively using audio and visual recording to spy on people. The solution to this problem isn't simple, and while the orange dot is causing headaches, the lack of an orange dot also causes headaches.
As per my comment - the unfortunate truth is that offering users a choice means denying users freedom from being creeped on, as every app under the sun will ask for silent microphone access "for design reasons". We've seen how ineffective app permissions (which can't be selectively restricted by the OS the way they can on Android) have been for iOS devices. Apps boot up and demand access to your contacts, your camera, and your microphone, and if you refuse they quit out.
It can be empowering to users to deny bad choices, since it prevents users from being coerced by malicious software (i.e. TikTok, Facebook, Instagram - not virus-laden software).
That all said there is some legitimate functionality being lost with this decision.
Just my opinion, but these linguistic contortions undermine your point.
Providing users with a decision in which there is an asymmetry of information and/or incentives could be setting them up for manipulation. But I think there are ways to balance the asymmetry vs. just removing the choice. A simple report showing which apps were watching/listening, along with screen time, could be useful, for example.
I hope this isn't nitpicking but I don't consider those linguistic contortions. A minimum wage empowers workers to receive an (ideally) living wage while, on the surface, restricting them from being able to sell their time for ever lower amounts. There are a lot of debates as to the efficacy and justifiability of things like a minimum wage but it's important to remember that any prevailing sense of the linguistic definitions you might assume is a local effect. Comparing American vs. European definitions of empowerment is a pretty clear demonstration of this where in Europe the ability to live a good healthy life is paramount and restrictions that promote that life style are generally considered empowering.
I do think there might be some other solutions but I also think the orange dot is, for almost all users, a perfectly acceptable solution - visually obvious without being obnoxious.
> but I also think the orange dot is, for almost all users, a perfectly acceptable solution - visually obvious without being obnoxious.
That is why it should be the user's choice: ship it enabled as a sensible default in the respective settings. Experienced users who know what they're doing should be empowered to make a different choice if it makes sense for their workflow.
The OP specifically mentioned a preboot setting, which would not be per-app and would be awkward enough that regular users couldn't easily be tricked into doing it. Sounds like a good solution for pro users.
Yea - a preboot setting actually seems like a pretty rational way to allow this fix. I don't think it's easy to get ignorant users to mess around with BIOS or other system-level settings.
I think you underestimate how many places do use the internal I/O. My tiny church has a single iMac doing both recording and running slides. Small concert venues aren't much better.
Some of the conventions I've gone to ran everything in a room off a single laptop. (I've set up such things.)
Concerts aren't much better. Only the largest events and venues have the kinds of "professional" setups you're thinking of.
I really don't understand why you are being so dismissive of many users, just because they don't meet your personal definition of "professional". This change makes some use cases objectively worse, and telling thousands of people to spend hundreds of dollars plus some amount of time to mitigate a change they didn't ask for is not a respectful position, IMO.
I think this change would be fine as a default, but it should be configurable by the end user.
I'm not being dismissive, and this has nothing to do with my personal definition of the word "professional". Your misframing of my position doesn't help this discussion at all.
The point is that users never had control of the output on external monitors. This situation is exactly like the situation that's happened multiple times on every OS where someone discovers a way to do something that relies on some function that they either misunderstand or are misusing (think Y2K). Just because it hasn't bitten people in the ass doesn't mean that it's a good way of doing it. People in this thread have commented surprised that Apple hasn't run into this issue during internal uses by its production teams but the reality is that Apple's production teams don't run off the built-in I/O because that's not how you run a video system you need 100% control of.
I agree that Apple should find a way to make this better (or at least hide it on secondary displays) but that's only a workaround. Today it's an orange dot. Tomorrow, it'll just be some other display indicator. The fact is that you can't (and have never been able to) control what displays from the OS on these secondary monitors. The fact that it's just now biting people in the ass is their fault, not Apple's.
One issue is I don’t think software like PowerPoint or Keynote can output to a DeckLink or similar. A quick Google search indicates you need to use ProPresenter, which is a big change for someone who just wants to display PowerPoint slides.
That's partially true, but only if you're looking exclusively at DeckLink devices (which are mostly for full video productions) that rely on some specific things to function. There are other cards whose software can capture the video from any window and use it as an I/O source. Think OBS, but to an I/O device instead of to a virtual camera.
> They're going to have dedicated video output cards that would not be affected by this. Those cards are specifically for having 100% control over the output.
Um, no - the built-in one worked fine and didn't require me to buy a very expensive additional piece of hardware. This change took away functionality that worked before the upgrade.
> let's not start with the idea that pros are outputting directly from their computers without the right hardware.
If professional means derives income from work, you would be wrong. If pro means works for an organization with unlimited budget, you would be right.
It had the same problem that it still has. The OS can place items on the display that you don't want in the middle of a presentation/performance. The only thing that "worked fine before" is that you were ok with what the OS put there because it was rare for that to happen.
>If professional means
It means that a dot in the upper right hand corner makes the function "unusable". If that's the case, then a professional would not extend/mirror a desktop display. They make sure that they control exactly what is being displayed and you can't (and have never been able to) do that with macOS.
It's not reasonable to say it's the "same" problem when they changed it this much.
> The only thing that "worked fine before" is that you were ok with what the OS put there because it was rare for that to happen.
So rare it may have never happened. So yes, it did work fine. Are you implying that's wrong? There's no way to make a bulletproof setup, after all. Maybe with an external device you get a glitched or blank screen instead of a notification, but any hardware or software could fail. Possible chance of failure is not the same as a constant 100% chance problem, and does not excuse a constant 100% problem.
They didn't change it much. They added an indicator to the existing OS UI when a recording device is active. That's the only change that was made. As I've said elsewhere here, they could have added anything to the UI in the past and it would have had the same effect. People just didn't care because that stuff didn't affect them.
>There's no way to make a bulletproof setup, after all.
No one is saying it needs to be bulletproof. If it did and that's important to your production, you'd have backups to switch over to immediately. You're taking what I'm saying out of context and arguing straw men. If a small dot on the screen makes things "unusable" for you, then your setup is wrong. If anything that you don't want on the screen that you didn't put there is important for you, then you need to create your setup to function like that and allow for that.
All I'm saying is that there are people all over here, professionals or otherwise, who claim that a small dot on the screen is a dealbreaker for their ability to do their jobs. If that's the case, then they've been leaving their livelihoods up to chance because every OS has the ability to display things on an external display on top of a full-screen application. I'm glad some people were lucky enough for that not to happen but people whose livelihoods depend on that don't leave those things to luck.
They changed the percentage of time you have an OS overlay on the screen drastically. That's the metric I was using.
> No one is saying it needs to be bulletproof.
When you accuse people in situations where the dot matters of "leaving their livelihoods up to chance" for using normal output, you basically are saying that the notification-stopping part of the system has to be bulletproof. Swap 'bulletproof' with 'nigh-perfect, far in excess of all the other parts of the system' if you want.
And no, that's not going "by their own logic" or anything. Refusing to accept a constant dot in the corner does not mean a single notification would ruin their career.
>It means that a dot in the upper right hand corner makes the function "unusable".
I'm pretty sure professional has nothing to do with dots in the corner. There's really not much to defend about this change, and this change really reduces the utility of macbooks for many people who derive their income from work using that macbook. Hopefully, apple gets the message from users and fixes it.
>I'm pretty sure professional has nothing to do with dots in the corner.
It absolutely does in terms of this conversation thread since that's what the topic of discussion is.
>really not much to defend about this change
Yes, there is. It makes the vast majority of Mac users aware of when their input device is activated in situations where they may not know.
>reduces the utility of macbooks for many people who derive their income from work using that macbook
It absolutely does not. It only affects the very small portion of users for whom the dot is a dealbreaker, that have to capture audio while presenting, that are not technical enough to follow the steps to remove the dot, and that also have never cared before about the OS being able to render chrome on their work/display output.
Because professionals can't take the chance that an in-app notification (from another app) or a menu bar or something else will end up in their output. We're not just talking about an external monitor here.
Yes it is. Notifications can pop up if someone forgets to disable them. Any OS prompt can pop up on the display. You don't leave those types of things to chance.
Well, this is a "no true Scotsman" fallacy. In practice they do. It's not frequent, but I've seen it a few times and it's always "fun" to watch.
So it shows that there are a lot of professionals out there not following best practices (which isn't surprising, to be honest; it's the case in every industry, including super critical ones…).
Maybe the orange dot will actually help these people start using best practices in the end… (note that I'm not defending Apple's move when saying so; I really hate their tendency to think their customers are wrong and that because they are Apple they know better)
It's not a "No True Scotsman" because I'm using their definition of "Scottsman". If someone wants to be able to have full control of what goes on the display, outside of the OS, then they have to have a hardware I/O controller on a Mac. Their only argument is that they were OK with what the OS was putting on there because it didn't affect their specific use case. It's only an issue because they don't like what the OS is doing now. It's great if people got lucky in the past and never ran into an OS prompt or an alert from an app (looking at you, Steam) but that doesn't change the fact that the situation is currently the same as it was before Monterey. Anyone who's saying that a dot in the corner makes it "unusable" has to admit that anything else would have also made it unusable yet they chose to continue without managing the I/O of the device and didn't care.
You're all over this thread trying to gaslight people into believing a constant dot is somehow the same as a rare chance of an OS notification that you forgot to disable. People plug their Macs directly into shit and that worked fine; now it doesn't. End of story.
For an incredibly small set of people who are both professional enough for this to be an issue but also not professional enough to spend a hundred bucks on a device specifically meant for this purpose.
If you're going to claim gaslighting, then maybe actually include the intricacy that you seem to have missed there.
It continues to work fine for the vast majority of people.
No. You're describing a different setup when you suggest that hardware. Read the post again, "plugging the mac directly into shit" is broken now (for people that are using their microphone).
> for people that are using their microphone and are outputting video that they want undisturbed
Let's call a spade a spade. You're blowing it way out of proportion while in reality it's an issue that impacts an incredibly small fraction of the userbase.
I never said it affected a large fraction of mac users. I'm not blowing anything out of proportion.
Your post above was also only talking about those people with mic input and video output, so it shouldn't be a problem that I did the same.
Also it's not just video, it's basically anyone "plugging their mac into shit" for professional output purposes. Could be video, could be a powerpoint, could be a software demo.
Woah! Someone learned a new word and then used it incorrectly!
It's not about OS notifications that you forgot to disable. It's anything that the OS wants to display. Notifications and alerts were just an easy example because everyone knows what they are.
People can still plug their shit in and it'll "work fine". If they're recording audio, they'll see a little dot. If they want control of what shows up on the display, there are answers for that. Pretending like that hasn't always been the case is misguided.
I really don't believe anyone running Steam on their video computer is worried enough (or even serious) about reliability to use a dedicated video playback card. Sure, it would be nice if everyone used a dedicated card, but it's 100x more important that those people stop running Steam... unless maybe they're pro game streamers or something.
Also, even Steam requires extra permissions on Mac to display the overlays you mention.
I only mentioned Steam because I've had an experience where that happened to someone. They were running some game (similar to Jackbox but it wasn't that) for a conference and were just outputting the display and the Steam update prompt showed up on top of the display. It wasn't a big deal, it was just an example that summed up exactly the types of things I'm talking about.
And no, this wasn't a Steam overlay. It was a prompt to restart the Steam app to complete some update. That does not need additional permissions.
The "prepared" in my post implies that notifications are disabled.
Also, notifications won't really be an issue for anyone but people using the machine for both personal and professional stuff. In the worst case, you can have different user accounts. A professional machine used for VJing or even audio recording will have zero notifications.
Not a problem in practice. On macOS, OS crashes show up on the first monitor, and so do other crash alerts. Also, again, if this is not an amateur thing, the only programs that will be running will be those directly related to the presentation.
Also I wonder if we're talking about different scales here. I'm not talking about the 150 inch monitor, I'm talking about video art, VJing, and small scale stuff. macOS works fine for those things.
But the "current" monitor is the one with the GUI and the mouse cursor. The secondary monitor is the one being used for external video. There are even dedicated APIs for it.
Are you a macOS user? Your other examples talk about Windows Update... the situation on macOS is a bit different, which is why a lot of people doing audio/video flock to it. Not everyone needs external hardware; just a MacBook can do a lot.
Yes. macOS is my daily driver and I'm on Monterey.
All I'm saying is that, if anyone wants to say that this dot makes their use case unusable, then they have to admit that the current OS setup was always unusable for them because the OS was always able to display chrome on their displays. It may not have happened often or even in a way that they thought was "unusable" but it was able to happen. The only difference here is that they're not happy with the type of OS-level things that are displayed.
In my experience, people for whom any kind of errant display item matters use dedicated hardware devices for their I/O. If it didn't matter before, because it was only windows/alerts/notifications/whatever, then that clearly doesn't make it "unusable", just "not preferred". I fully agree that there needs to be some kind of option for this on presentation displays, but the suggestion that SNL wouldn't have dedicated hardware for their displays is asinine.
Sure, in theory you are correct. We should seek the more reliable solution. In practice, this is not really a problem for anyone using macOS for small-time visuals/performances/presentations, as long as you keep your computer well prepared for those situations. It works 99.9% of the time, which is 100% for most people (even pros) doing it sporadically. Maybe your solution covers a few more 9s, and you may need those 9s (I know live broadcasting does), but this is unnecessary for most common folk, and you're dismissing this use case all across this thread, which is why I'm responding to you.
I feel like the notifications issue you mention is a bit of a red herring, because having too many things running in the background will cause problems regardless of whether you use external gear, and regardless of whether they show on the screen or not. You can't rely on external gear alone for stability; the computer itself has to be stable. And the computer alone being stable is enough for 90% of people. And even if there are notifications... so what? These are people doing it for art purposes, at parties. They learn a lesson and never have to care again.
If those people are really using Steam on their computers (like you said on another comment), they surely aren't pros worried about performance, reliability, or anything of the sort that warrants a dedicated playback card, so I don't really see this use case (Steam+Blackmagic) existing at all.
Surely the default I/O is nowhere near enough for SNL or even for local broadcast, but it is good enough for a large contingent of people who don't need the same reliability that you or SNL need. And it does work for them in practice, without notifications and without OS chrome... except for the new orange dot, which is a nuisance.
You can afford a separate laptop for VJing, but don't want to spend $200 to get 100% protection from unexpected notifications, error messages, calls, and the orange dot?
What separate laptop? macOS supports multiple accounts, no need for separate laptop. And people don't want to buy a completely unnecessary $200 dongle (it's actually cheaper) for something that worked 100% perfectly before. Is that really hard to understand?
If I could have audio inputs/outputs that were good enough for my audio work I would also prefer not using an external audio interface. I often compose on earbuds, and that's fine. For mixing I need something else. Some people might not. Who am I to judge?
Also, it's not an "or" option. Pros turn off notifications/internet, and don't leave Steam running when working, like the other poster is saying.
Use any professional audio interface like Dante? Just activating Dante causes the stupid orange dot to show, even if you are only sending audio OUT from your Mac to, say, a digital mixing board.
There is ZERO reason to force the dot on every display. Restrict it to the display(s) also showing the menu bar and this becomes a non-issue. I can't believe anyone involved in this at Apple thought this was a remotely reasonable thing to do!
Wow, this is one of the most misinformed takes I've seen in a while. I know a number of performers who use a Macbook for their visuals, and they absolutely just plug their machines into whatever I/O is available at their venue. I don't know where you're getting this idea that everyone just lugs around a rackmount AV machine, and if they don't they're not truly a "professional".
We're not talking about a rackmount AV machine. We're talking about a tiny device that can output to HDMI.
And, again, we are talking about situations where this orange dot would make the function "unusable". Those situations are not situations where a professional uses the built-in I/O and leaves things to chance.
My friend has VJ'd large clubs and music festivals on her ancient Macbook without ever using an external display driver.
There's not a lot of money in the scene for most people. They use the software/hardware they have. Hiding notifications and colourful dots from the OS shouldn't really be an issue.
There are quite a few people here replying to you, letting you know that there are indeed situations where A/V professionals have used and continue to use built-in I/O. I have seen the same. A properly prepared machine is immune to the issues I've heard you describe in this thread (notifications, etc.).
It sounds like there’s something about all of these responses that isn’t resonating with you because I see a pattern of responding and letting us know our experiences are essentially invalid, for some reason. Are you able to speak to why that’s important to you? Why does this seem so far-fetched / unbelievable to you?
I think you're misreading what I'm saying. I have only been responding to people that are saying that the dot makes this setup "unusable". The machine you're describing is not possible without a dedicated hardware I/O device because the OS always has access to display devices and apps cannot override that.
If the dot makes their setup unusable, then the situation prior to Monterey should also have made their setup unusable, because the OS could have popped up an alert dialog (or any kind of OS chrome) at any time. Using built-in I/O is absolutely fine in professional settings, but not in settings where you need complete control of what's being displayed, and that's precisely what they're complaining about. They never had complete control of what was being displayed. They were just OK with it because it either didn't bother them often or it wasn't a dealbreaker for whatever they were doing. If you need to know that you're only going to see what you want to see, you have to use hardware I/O.
I've never said anyone's experiences are invalid. Stop talking down to me like a child and making things up.
You posted a ton of comments to this thread (over 60 of them!) perpetuating a massive flamewar, and broke the site guidelines in a whole bunch of places, including outright personal attacks. We ban accounts that carry on this way, and we've had to warn you about breaking the site guidelines more than once in the past.
We don't want flamewars here. Please make your substantive points thoughtfully and without swipes in the future—regardless of how wrong other people are or you feel they are.
Btw, when discussion degenerates to the level of people arguing about what each other did or didn't say, that's a sure sign that the conversation has become tedious and uninteresting to those not involved in the spat, and that it's time to step away from the comment box.
You weren't the only person in the thread doing these things, but (from the subset I've seen), your account was behaving the worst, both qualitatively (in terms of how badly you were breaking the site guidelines) and quantitatively (in terms of how many posts you made and how much fuel you added to the flames).
Could you please review https://news.ycombinator.com/newsguidelines.html and take the intended spirit of this site more to heart from now on? I don't want to ban you, but if this keeps happening, we'll end up having to, because it's not what this site is for and it destroys what it is for.
I'm sorry. I didn't mean to start a flamewar and I didn't think that posting comments was frowned upon. It's just something I'm passionate about and felt was being misconstrued by people.
Can you clarify where I broke the site guidelines? I don't believe that I've personally attacked anyone but that feels like a subjective measure so I'd like to be clear on what you consider a personal attack (in this thread specifically). I don't feel like calling someone out for mischaracterizing what I'm saying or flat-out being dishonest about what I've said is a personal attack but, if it is considered that by the admins, then I'll stop doing that. If it's something else, then I'd like to be aware of it so I can check myself in the future.
I'm reading the guidelines now and will try to take the intended spirit. I don't want to get banned but I also don't want discussion to be surface-level and I get that there's a fine line there.
I'm not going to be responding to any more comments on this post to prevent getting banned and I don't know if there's any way for you to respond to my questions outside of here but, if there is and I just don't know about it, please let me know. I'm hardly an expert when it comes to HN and usually just check the threads page to see replies and this is so far down now that I don't want to miss your reply.
> I have only been responding to people that are saying that the dot makes this setup "unusable". The machine you're describing is not possible without a dedicated hardware I/O device because the OS always has access to display devices and apps cannot override that.
This seems to be the crux of the disagreement in this thread. You're equating the effect of two quite different things:
1. A pop-up that's quite intrusive / potentially embarrassing, but has (say) a 1/100 chance of happening any given show, and in any case would only be there for a few seconds; the rest of the show would be unaffected
2. A small but intentionally noticeable orange dot that's there 100% of the show for every show
Yes, if you want to be a top level professional, then you can afford to have neither. But I can certainly imagine people / venues where #1 would be considered a normal cost of doing business, but #2 would not.
That said, if fixing them both is as easy and inexpensive as people in this thread seem to think, then the small "nudge" by #2 to get them to fix #1 is probably beneficial for the ecosystem overall.
You're mis-framing what I'm saying. The effect is the same. There is something on the display that is unwanted. Regardless of what that thing is, the issue at hand is that people do not want things that they didn't choose to be on the display to actually be on the display. It doesn't matter what that is. In this case, it's a small dot but it could have been a little microphone icon or a camera icon or (like Windows) an orange border around the display or literally anything else.
The fact that it didn't affect some people before is great. They were lucky that they didn't have this happen. All I'm saying is that, for years before this change in Monterey, there have been things that pop up like this on external displays, because apps do not have the ability to prevent the OS from using a display as a display (with a couple of exceptions when used with APIs for exclusivity). That's a known problem with using extended displays and it's been known for a while, regardless of the OS you're using. There's also a solution to that problem that's pretty standard in the presentation/production industry.
That being said... there are now multiple solutions: slightly technical workarounds that users can apply without breaking the intention behind this change, and APIs already provided that let app developers themselves fix the problem by taking screen exclusivity.
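To make the developer-side option concrete, here is a minimal sketch of display capture (assuming Swift; CGDisplayCapture, CGDisplayBounds, CGShieldingWindowLevel, and CGDisplayRelease are real CoreGraphics calls, but display enumeration, error handling, and the actual rendering are left out):

    import AppKit
    import CoreGraphics

    // Capture one display for exclusive use and return a borderless window
    // covering it. The displayID would come from CGGetActiveDisplayList.
    func captureForVisuals(_ displayID: CGDirectDisplayID) -> NSWindow? {
        guard CGDisplayCapture(displayID) == .success else { return nil }
        let window = NSWindow(contentRect: CGDisplayBounds(displayID),
                              styleMask: .borderless,
                              backing: .buffered,
                              defer: false)
        // Content must sit at or above the shielding level to be visible
        // on a captured display.
        window.level = NSWindow.Level(rawValue: Int(CGShieldingWindowLevel()))
        window.makeKeyAndOrderFront(nil)
        return window
    }

    // After the show, CGDisplayRelease(displayID) returns the display to the OS.

Whether any given visuals app can adopt this depends on its rendering stack, but the entry point itself is only a handful of lines.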
The point you're not addressing is that in practice, before this change, people usually had close enough to full control, except for maybe a <1% chance of something going wrong. The OS can, in principle, do anything, but in reality it usually doesn't. Whereas with this new orange dot, there is a 100% chance of it being there.
It's easy to imagine a pretty wide range of people for whom a tiny theoretical risk of the OS going crazy and showing some kind of notification even with notifications disabled is acceptable, but an orange dot that's 100% deterministically guaranteed to be there isn't.
No they didn't. That's literally the entire point. Just because people don't know that or weren't affected by it regularly doesn't change that fact. People are focusing on things like notifications because that was the initial example that I used for ease of understanding, but there are literally thousands of things that the OS can display, on any OS, on top of a full-screen window or by kicking the display out of full-screen. Even on Windows, Virtual DJ has a full-screen mode, and errors will kick you out of full-screen and pop up on that display. All of that is avoidable with an I/O box.
The fact is that if you're not OK with the OS choosing what shows up on your screen, there are ways to prevent that, and they've been industry standard for years.
Thanks for explaining, I hear what you are saying about dedicated hardware giving much more control. In the case of this dot and an external monitor / display, and until the OS is updated, that definitely sounds like a great option.
Perhaps I was misreading what you are saying; regarding:
> If the dot makes their setup unusable, then the situation prior to Monterey should also have made their setup unusable because the OS could have popped up an alert dialog at any time
It sounds like we are in disagreement about whether current unusability (because of the dot/notifications/etc.) implies past unusability. I get what you are saying in theory, and agree with you.
However, in my life experience, and that of many others in this thread, I am hearing counterexamples to this statement. The way in which I observed you responding to these statements is what felt invalidating _to me_. My felt experience is not "making things up".
I do not know your age and I am making no assumptions about that. The intention of my original comment was to: respectfully and kindly share the impact of how I have been receiving your comments, gain more understanding about your perspective on this issue, and offer an (obviously subjective) reflection to you about a communication pattern that I noticed in the comments here.
This sort of reflection is always coming from a place of curiosity and so carries an implicit invitation to deepen into greater shared understanding and connection; I have no attachment to you receiving that reflection and I apologize if it did not land well for you. I understand these kinds of things do not always translate so well in text.
I have never said that anyone was making anything up. Please do not put words into my mouth or make statements on my behalf when I have not said them.
>I apologize if it did not land well for you
This sounds like an abusive partner apologizing. "I'm sorry you made me so mad that I had to hit you"
>I am hearing counterexamples to this statement
You're not hearing counter-examples. You're hearing situations where the person, in the past, was lucky enough not to be affected by the situation they were in. That's great. It's like someone who refuses to wear a seatbelt telling you that they've never been injured even though they've driven 10,000,000 miles in that car. If reducing your chances of dying in a car wreck is important to you, you wear a seatbelt. This is like complaining that the car's seatbelt chime wasn't insistent enough after you got into a wreck and are now paralyzed.
I am not carrying any weapons in this conversation.
Nobody is attacking you here.
Yet, I experience your interactions here as carrying defensiveness and un-checked assumptions. That is my experience. That is different from me saying, “you are defensive.”
I am having an experience in this dialogue that feels in opposition to one of the main principles encouraged in this board, which is that of graciousness and good intent. I do not feel you have been reading my comments from that kind of orientation. Perhaps you have and I have misconstrued.
Regardless, I really don’t care enough about this issue to continue engaging with you about it like this.
So you made up something that I said or did and then framed it as my fault (something "not resonating" with me) and then want to claim that you're being gracious and discussing from a place of good intent. What un-checked assumptions am I making? Everyone that has responded to me and disagreed with me has done so on the principle that this small dot is a "showstopper" that makes their setup "unusable" (it's literally in the title of the article in the OP). All my comments and responses are based on those statements, no assumptions. The difference is that these people feel that it is Apple's responsibility to fix the issues they're having without taking any responsibility for their own ignorance.
And you're right... I haven't been reading your comments from that kind of orientation because your very first comment to me claimed that I had somewhere let someone know that their experience was invalid, which I never did, or that I had somehow claimed that people's realized experiences were far-fetched, which I also never did. How can I assume graciousness and good intent from someone who didn't assume the same of me and then went out of their way to mis-frame what I have been saying? The only time I ever even suggested that something was far-fetched was with the people claiming that something like this would affect productions like SNL or Mariah Carey on NYE, because those are far-fetched suggestions.
That's fine not to continue but please don't pretend that I was the one that engaged with you "like this". You initiated the current conversation, including its tone and intent.
Yeah, I'm watching this unfold and I don't see how the other person can't see the problem.
Someone spends a thousand, or two thousand dollars on a high-end device with state-of-the-art ports and graphics and processing, and because "OS notifications sometimes pop up if you don't disable them", real pros buy a piece of middleware hardware that does nothing but filter out a software issue?
Yes, exactly. If you need to be able to control what goes to the display and take that away from the OS, you need a hardware I/O device.
OS notifications and alerts are just examples of any number of things that could be displayed that are unwanted. In situations where something like that makes the setup "unusable", you have to have a hardware I/O device. There's not another option.
It's rendered on the external screen also, even when there is no menu bar.
For me, it's good, not bad. I’m sorry that for some people it means they need to do their job a little bit less carelessly, but the overall importance of this dot is worth it.
Apple also hasn't fixed the Mail.app bug for years (sometimes, when you are in full screen mode, Mail.app suddenly activates and fills half the screen).
Search for it. There are countless threads about the issue online, and last time I checked no one knew of a way to prevent it.
Imagine holding a presentation, or watching a film with someone, and suddenly your private emails fill half the screen...
There still exists the classic solution of offsetting the video output on a projector (or on a monitor using vertical alignment) to place the dot offscreen, right? If we're talking about people dropping tens of thousands of dollars on equipment, that feels like a modestly acceptable short-term solution.
As a macOS user, I highly appreciate it if Apple doesn’t provide an easy way for the app developers to disable that small dot either in normal or presentation mode. Of course this is my personal preference but in my opinion it should make it harder for apps to bypass that visual indicator. I want to know _with more confidence_ if an app is using the mic!
Because Apple didn't do the wrong thing. The people just plugging their Macbook into a cinema projector and expecting their general-purpose desktop OS to work just like a cinema projector controller by default are doing the wrong thing. They simply don't understand what's going on.
As repeatedly said by many people, Apple is making a tradeoff between security and special-purpose requirements. If you have a special-purpose requirement, then make sure your tools are configured for that purpose. Don't whine that the general use-case doesn't fit it!
You'll see similar whining from web devs trying to make Chrome or Firefox "hide" the chrome and go full screen programmatically. That's fundamentally incompatible with end-user security and shouldn't be allowed, ever. Asking for it won't get you anywhere. If you need full screen control, use something like Electron, not a web browser. If you need full screen projector/screen output, use a tool/software appropriate for the job.
This isn't good. Most of the live visuals I've seen are in a dimly lit room with a black background. Often this is so that it's hard to see the edges of the screen and the visualization appears to be floating. Having an orange dot in the corner would break the immersion for the audience. It's also going to be larger when projected, maybe a couple inches in diameter.
This is the first explainer in this thread that makes me see the issue. All this time I'm thinking along the lines of "Okay you're, projecting video in a wedding. Now there's an orange dot on the screen. Boohoo."
So, apparently, I gotta dream bigger. Like museum AVP or concert/live performance/DJing type of live visuals. Yeah now an orange dot is a showstopper.
Live visualization takes a live music performance as input and programmatically generates a video that matches said music performance. The video might pulse along with the beat of the music. Or maybe it changes color or shape depending on the “mood” of the audio. The computer driving the visualization receives audio from the live music via a line in, which Apple treats as an active microphone.
A simple live visualization would be to display a video showing the waveform of the live audio feed.
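For anyone wondering why such apps trip the microphone indicator at all, the input side can be as small as the following sketch (assuming Swift/AVFoundation and the default input device; drawWaveform is a hypothetical rendering hook). Installing the tap counts as "using the microphone" as far as macOS is concerned, even though nothing is ever written to disk:

    import AVFoundation

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // Read live input buffers; a visualizer maps the level to what it draws.
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let samples = buffer.floatChannelData?[0] else { return }
        // Peak level of this buffer drives the size/color/shape per frame.
        var peak: Float = 0
        for i in 0..<Int(buffer.frameLength) {
            peak = max(peak, abs(samples[i]))
        }
        // drawWaveform(level: peak)  // hypothetical rendering hook
    }

    try? engine.start()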
This is a tough intersection between highly-visible privacy controls and the ability for artists to use their Macs in live performance situations where any overlays could be a distraction.
In the Privacy menu in macOS now, you can authorize applications to do things like use location services and record the screen. I think it would be reasonable to include an authorization for audio recording that bypasses the orange indicator. Or they could turn on the webcam's green light when audio input is active?
This is desktops creeping toward an attitude, familiar on Apple's phones, in which the user is essentially untrusted to make security decisions. Because, let's face it, nobody seriously audits the security of software going onto Macs. The loser is the user's freedom to enjoy the computer as a true general-purpose tool.
Maybe this makes more sense if we view this as involving three parties: Apple, users, and app developers.
On OSX at least, some apps are developed by third parties whose code isn't easily scrutinized by Apple or by end users.
I think Apple's policy helps users navigate that situation pretty well. But I also can't see any good reason to prevent users from disabling that feature in a fine-grained way. E.g. per app and/or temporarily.
No, it’s not; this thread is full of solutions that power users can apply to avoid or remove the orange dot.
Apple hasn't taken anything away. What they have done is change a default. Since defaults matter most to the least savvy users, skewing defaults toward security makes sense. Power users can apply extra skill to change the default; that’s what makes them power users.
You're assuming malicious intent where none exists. This feature is a security gain for users — previously there was _no way_ to know if your mic was being used. Now there is. It's giving me extra information I can use to make security decisions, where previously I had none of that information.
It's unfortunate it causes problems for some users, hopefully a fix will be forthcoming, but I believe it's an oversight.
The webcam light won't work if you're using a laptop in clamshell mode, or even using a Mac Mini/Pro. I think the best middle ground for this is to allow users to grant some permission that allows the app to access the Mic without turning the orange dot on.
I think the implication in the post was that the user would be prompted to grant the permission, so they would have to click "Yes" when the spyware asked for permission.
Were this to be a permission, there would be no legitimate use case where "hide microphone use indicator" would be a required function. Anything that requires that should be flagged immediately as hostile.
I think the webcam light is completely controlled by the hardware signal, so the only way to turn that light on/off is by turning on the webcam itself, which I don't think is desired either.
All Apple computers since ~2009 have had the webcam light on the same physical circuit as the webcam, such that one can not be powered up independently of the other.
PCs and third-party webcams are of course another matter...
In these cases, is audio being input into the Mac and that's why it shows up, because they're "recording" into the application generating the visuals? I don't understand why the microphone notification is there at all if it's just outputting audio/video. Are these applications recording audio? Or is it just a result of the APIs/permissions they end up using?
Seems like there should be a compromise where apps that are using line input instead of the physical microphone could be exempt from the dot.
Is there a line input on newer MacBooks? I thought the jack had just mic input!
Also, does anyone know whether the orange dot shows up when you also use an external USB audio interface? If so this is kind of stupid on the part of Apple.
The article also mentions that those apps turn on microphone capture even when unnecessary, so it seems the dot is doing its job, although obviously there was a massive oversight, and there should have been a special permission to disable it, IMO.
I have one sample from a 2011-ish era MacBook Pro that allowed switching between line-level and mic input for the input plug, via a dropdown in sound settings. The single "shared" plug on Retina units no longer supports this, I think.
Thanks, I did read the article but missed this part. I had it in my head that it would be something like playing a locally-sourced audio feed through an app, but I suppose input from more complex equipment makes sense.
Yes. The problem is visuals accompanying a live music performance. Even if that particular computer is not being used for music, many of the visuals are reactive.
Yes, the dot is there to indicate to people that they might be being recorded. There's no way to distinguish if the line input is just another microphone or not so exempting that wouldn't solve the issue Apple is trying to solve.
Got to wonder who made this decision and how many people reviewed it before it was deployed. Then again, most people aren't doing audio recording while using Keynote (or Powerpoint). Something like this could very easily fly under the radar across the entire company.
This looks like one of those problems where the edge cases are damned obvious in hindsight but aren't noticed until it hits production. Happens to everyone with a non-trivial product.
The real problem here is Apple has a history of ignoring these design mistakes for a very long time. Even when they do fix things, they have a tendency to hide any mention of it. It will probably be suddenly and quietly fixed months later in some future point update.
I am all for features which help the user control the user's privacy. Having an LED with the webcam, for example, is a very good thing. Though it doesn't tell you what is accessing it and recording you against your will...
It totally makes sense to display those privacy notifications in the desktop UI. However, when an app goes to full-screen mode, the OS shouldn't interfere with the display by default. There are plenty of scenarios where the program needs to be able to control every single pixel of the screen.
>There are plenty of scenarios where the program needs to be able to control every single pixel of the screen.
In those cases, there are hardware solutions for this. That's not the default in any OS right now. Notifications and menu bars cannot be disabled by apps, for example. They need to be disabled from the OS level.
Of course menu bars are disabled by default in full-screen mode, and notifications can be disabled, on Mojave at least. That is the whole purpose of a "full screen mode": that the whole screen content is controlled by the app.
That is not the purpose of full screen mode. If that were the case, then notifications and menu bars would be able to be disabled by the app. They're not. They need to be disabled or muted by the user at the OS level.
Hmm... What goes or doesn't go into a full screen app should be controlled by the user at the OS level, with no involvement of the app.
That still doesn't change the fact that the point in making an app full screen is to give it control of the entire screen (or an entire virtual one of your choosing).
As a p.s. to my last comment: I just tried a silly Gtk-based Tetris I had written some time ago under MacOS: hit the green maximize button and it goes completely full screen. The only thing you still see is the mouse pointer - I am sure there is an API to make that invisible too, if one wanted to hide the mouse pointer.
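And indeed there is; in AppKit it's essentially a one-liner (a small sketch; note the hide/unhide calls are counted, so they must balance):

    import AppKit

    NSCursor.hide()      // pointer disappears until a matching unhide()
    // ... run the full-screen app ...
    NSCursor.unhide()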
I have never ever owned a computer where an application could not control the whole screen. Yes, there are situations where the OS supersedes the full screen app's output, but usually they are based on user interactions or notifications (which can be disabled).
>I have never ever owned a computer where an application could not control the whole screen.
Yes you have. Every OS has chrome that cannot be disabled in full-screen. You may not experience it regularly, but it does happen. Imagine you're playing a video game and Windows Update throws an error. That will be displayed over your full-screen game window. It doesn't even have to originate from the OS. As long as the OS is driving an extended display, it has the ability to put things on top of whatever your app displays. It may not happen often but it does happen. That's the primary reason why dedicated I/O hardware exists.
As I wrote in the answer to your other comment, I have never claimed that there aren't events that can cause the OS to supersede a fullscreen app. But by default, the OS does not display additional information on top of a full screen app. And perhaps we should discuss the same thing in only one thread.
About which OS are you talking? On MacOS and Linux, a program can go to full screen mode without any user interaction. The last time I saw a PowerPoint presentation on Windows, it also didn't have a menu bar or task bar in presentation mode. So what are you talking about?
Let's say you're giving a PowerPoint and an app crashes in the background (for whatever reason). PowerPoint cannot prevent an OS alert from popping up on the screen. macOS has functions to hide the menu bar in full-screen apps but it does not have functions to stop other OS chrome from appearing.
Yes, there are circumstances where the OS is overriding the full screen mode of an application. Usually only in the case of very significant events or user interaction - like moving your mouse to the screen top will show the menu bar on Mac OS. But all of this isn't shown permanently on top of the full screen app and is only triggered by special events.
That's not the point. The point is that it can happen during a performance unless you're using a dedicated piece of I/O hardware. They give you 100% control over what gets output. Just because people were OK with what the OS displayed before doesn't mean that it couldn't happen.
We are talking about different things. Yes, you cannot 100% prevent a dialog to appear without custom hardware. And of course, even with custom hardware, the OS could decide to disable it to stop output to it.
But this isn't the point. So far, there were full screen modes and the OS would honor those under regular circumstances. When I run a Linux VM on a Mac, I can see the whole Linux desktop full screen. I don't want MacOS to draw colored dots on top of my Linux VM. People probably don't want yellow dots on their PowerPoint presentations.
No, we're not. The only difference in what we're talking about is that you were OK with the exact same thing as now because you liked/agreed with what the OS was putting on the screen. Whether it was rare or uncommon is completely beside the point. This whole thread is about people who are saying that the dot makes the display "unusable". If that were the case, then any other OS chrome would also make it unusable.
No, that is wrong. So far the OS didn't display anything permanently on top of a full screen app. It was only a reaction to user input and exceptional situations. Especially (under MacOS) when the full screen app was running on a secondary screen.
I can only hope/assume, I misunderstood the article or this is just a bug.
No, it's not wrong. That's what I said. It still doesn't do anything permanently. It only does it if you're actively using audio input. That's a reaction to user input: you're enabling a recording device, and that's what triggers this OS chrome. Whether you agree with that is irrelevant since full-screen mode has always allowed OS chrome.
That is something entirely different. When I said active user input, I meant a concrete reaction to a specific user action, like moving the mouse to the menu area. There is no such user action that would trigger a sound recording. This is about applications running in the background. According to the article, that would taint the full screen mode constantly, not transiently like the other actions. It is fully sufficient if that happens on the primary screen.
I understood that. You are either not understanding the problem or pretending not to. Imagine a PowerPoint presentation during a video conference. I don't want the presentation overlaid by a yellow dot. There was no such thing so far. At minimum, there need to be controls to prevent that.
And I'm telling you yes there is. You could still get an alert from Steam logging in on your full-screen app. You could get a Windows Visual C Runtime Error that pops up on top of it. You could get a macOS notification or a kernel panic report and it would pop up right on top of your full screen application. Apps cannot override the I/O system of the OS.
I never claimed they can. But so far you could run an app in full screen mode and, unless a special event happened, you wouldn't get something on top of it. The audio indicator is completely different, because as long as there is audio recording in your system, you don't seem to be able to get rid of it. At no time. If nothing exceptional happens on your system, the OS should allow full-screen apps to run. So far, all OSes did so.
Stop waiting on Apple. Turn off SIP, find the function responsible for this, and swizzle it with a no-op.
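The general mechanics look something like the sketch below. To be clear, this is illustrative only: "SomeIndicatorClass" and "drawIndicator" are made-up placeholders, finding the real class and selector means reverse engineering a system process, and none of this works with SIP enabled.

    import Foundation
    import ObjectiveC

    // Replace an instance method's implementation with a no-op block.
    func swizzleToNoOp(className: String, selectorName: String) {
        guard let cls = NSClassFromString(className),
              let method = class_getInstanceMethod(
                  cls, NSSelectorFromString(selectorName)) else { return }
        let noOp: @convention(block) (AnyObject) -> Void = { _ in }
        _ = method_setImplementation(method, imp_implementationWithBlock(noOp))
    }

    // Hypothetical names -- the real ones would come from reverse engineering.
    swizzleToNoOp(className: "SomeIndicatorClass",
                  selectorName: "drawIndicator")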
This, in my mind, is exactly what makes SIP brilliant as a default but optional feature. It's fantastic for 95% of Mac users, who can browse the web and write emails confident in the knowledge that their microphone is turned off. The remaining 5% need to do weird crap like run live music shows, and should take advantage of the escape hatch.
Unfortunately, "weird crap like [running] live music shows" is what Apple markets their devices for, and since they're selling machines that have thousand-dollar price premiums over their competitors, I should hope their attitude isn't "fix it for yourself". I know a number of Mac musicians who run live shows and can neither grok what you just suggested nor be bothered to disable SIP and lose all their iLok plugins before a show.
It seems like the core issue is trivial to fix: make the orange dot only appear on the primary display. I don't see any advantage to spamming the orange dot to every single display output, given that the primary display is the one the user is presumably primarily interacting with (somewhat by definition).
In addition to the sibling complaint about using a real monitor / closed laptop (which I do constantly, making my primary display a third-party panel from Samsung), I guess I disagree that a fix requiring retrofitting a new LED that didn't exist into a piece of hardware could ever be "trivial" in the way my suggested software update could be (but then again, I am not a hardware person, so maybe there is some kind of cheap, magic LED sticker they make these days that Apple could offer people for free at the Apple Store).
> And it does seem there could be a fix here; you already have to give applications permission to access your mic and camera, and it seems there should be some way for an app to disable the orange dot once its permissions are elevated with opt-in by the user.
The obvious caveat is that a hacker w/ local access can do that themselves to hide their system foothold, which is just par-for-the-course when it comes to physically compromised machines... except I'm willing to bet they're more concerned with "jealous ex" than they are with professional hackers.
Apple is trying to add features to protect against the unprotectable instead of just acknowledging that at some point local access means game over; the only way to do that is to make it virtually unbearable to use the OS as a regular user (see UAC in Windows Vista).
I just updated straight from macOS Catalina to Monterey, and one thing that's really bugging me is the Microphone Usage indicator in the top right corner. I have this app called "Background Music" to help amplify the otherwise terrible audio of my earbuds. I know it's using my microphone; I don't need this constant annoying overlay. Anyone know a solution?
Hm, I just tested this on my MacBook, and the way it seems to work certainly breaks live visuals, but at the same time does not protect people from being recorded: if I start an audio recording and then watch a film fullscreen, the orange dot does show, but then vanishes (while still recording the audio). Not sure if this is intentional or a bug, but it would certainly render my computer unusable for quite some work I do. Audiovisuals almost always are reactive to sound in one way or another, so you do record the ambient audio for your video or installation to be able to react to it.
People on Windows are about to go through a similar experience as Microsoft removes ways of capturing your desktop without an obnoxious orange border around the window being recorded. The logic is there…you want it obvious and in the user’s face if something malicious is watching them, but it makes benign use cases obnoxious.
Stuff like this is bad security design because it punts responsibility to the end user. It doesn’t actually stop anything bad from happening. If your design can’t decide whether to stop a program from capturing the mic, how do you expect my grandma to?
Is this dot anything more than a software feature? If so, its absence doesn't actually prove that sound isn't being monitored.
What you want is a dedicated LED that is routed directly to the sound input being on at the circuit board level: like the amplifier is on, or that path is enabled by hardware or whatever. Even then, if the meaning assignment is "LED glowing = sound monitored", you cannot trust it entirely: the LED being off could mean that the LED itself is faulty. But at least you know that the mechanism cannot be tampered with by software.
Yeah, I don't use a screen lock on my computer, because a sophisticated attacker would just pull my data off my SSD since the decryption key is stored in memory anyways.
It is much more valuable to have an indicator that your mic IS hot than having no indicator at all, not to mention that adding an LED indicator does fuck all for people who don't buy the $currentYear+1 laptop.
This thread has a lot of people arguing for the live visuals use case over protecting privacy or vice versa. Not only are they both important, but I think we can actually have both.
To easily solve both the privacy and the live visuals issue, Apple should make it so that the dot is only shown on a display if it is:
- the primary display (the one notifications appear on, since that's where the user typically checks for system status),
- the built-in display, if any,
- the display the cursor is on, or
- the display containing the frontmost window.
These are the only kinds of displays that make sense to have a persistent privacy indicator on. Anyone who's sitting at a computer is looking at at least one of these displays.
The secondary displays that artists use to show fullscreened live visuals fit none of these criteria.
> And it does seem there could be a fix here; you already have to give applications permission to access your mic and camera, and it seems there should be some way for an app to disable the orange dot once its permissions are elevated with opt-in by the user.
MacOS already has security mechanisms meant to prevent malware from e.g. installing a rootkit into the kernel, or reading keychain passwords - one of those mechanisms could be used to prevent programs from altering whatever setting controls "show orange notification dot" (which, in a sane design, would be opt-out - or, opt-in to "disable orange notification dot") on their own.
Hmmm, can you clarify what you mean by "malicious grants of permission"? Do you mean when a piece of software (malware, in this case) tells the OS to give it permission to hide the dot, when the user hasn't consented?
If that's the definition you're using - MacOS already guards against that, simply because the orange dot is already implemented in software in a way that is difficult/impossible for ordinary programs to change (but is controllable by the OS). And, from what I understand, MacOS already has many settings that are OS-controlled - you can't do certain things without authenticating yourself to the OS, and neither can software on your behalf.
If that's not quite right, I'll have to ask you to elaborate on what scenario you're thinking of.
You allow the user to keep a whitelist of apps they already knew will be using the microphone/line-in/whatever audio source and when those apps use any audio source, don't display the orange dot.
Add another checkbox next to each app in Preferences -> Security & Privacy -> Privacy -> Microphone (and camera) that allows the app to bypass the indicator. Users would have to go to this pref pane and enable that checkbox themselves (with instructions from the app, probably).
Because when you are running video out to a 150" LED wall, that orange dot is a very large dot. And even were it not, you are not building your presentational elements around "having some orange dot on screen".
"I mean, jeez. They just left the mouse cursor in the middle of the screen, what's the big deal?"
Anyone outputting to a 150" LED wall will have a hardware I/O card that allows them to control the output 100%. This doesn't happen in professional settings, only in consumer (and maybe prosumer) settings. Professionals don't leave things like that to chance.
I've been involved with several live productions in large arenas where there were multiple very large LED screens (48 feet wide I think) as well as being webcast to a significant audience. The A/V budget for these events was 7 figures. Several semi-trucks full of equipment for A/V. I don't know if that's professional or prosumer, but it seemed very professional to me.
100% of the computer graphics at these events came from PCs and Macs with their default video output. Some even came from the HDMI output on a Raspberry Pi.
I worked a lot with the production companies behind these shows and it seemed to be SOP to use a regular Macbook with Powerpoint to drive the displays.
Maybe at the super bowl, or the opening of the olympics, or at some major pop stars tour where everything is time coded and planned and is the same production every night, sure.
But at festivals? Clubs? Whatever NYE party you go to this year? No.
At festivals, clubs, and my own NYE party, a tiny dot in the corner is not a dealbreaker and wouldn't make the display "unusable". If it did, I would bring a dedicated hardware I/O device.
Then a small dot in the corner can't possibly be considered a use-case where it's "unusable", because using the computer with the default I/O in the manner you're suggesting means that those people are also OK with notifications, OS alerts, and any other OS chrome appearing on the screen. You cannot control the display of OS functions on the built-in I/O via an app.
That being said, I just don't believe you. A 7-figure budget with semi-trucks full of equipment that couldn't afford (or didn't think to get) a video display device is unbelievable if you want to suggest that a small dot makes this unusable.
> means that those people are also OK with notifications, OS alerts, and any other OS chrome appearing on the screen
All of those things can be controlled though, unlike this new dot. I've had my personal laptop hooked up to one of these huge screens with a live audience of 20,000. Yes, you better be careful to close out your messaging and Gmail and everything else. But also, since it's running as an extended display and the program is running in full-screen mode, the OS generally will not show the things you mentioned anyway, in my experience.
An orange dot would not have been tolerated in this environment. "Unusable" may not be the right word, but the people who set up these kinds of things are very particular about how things look, and so is the client who is paying millions of dollars.
If the client is paying millions of dollars, then the production should be using dedicated hardware I/O. You saying "the OS generally will not show you things" admits that it can and sometimes does show those things and those would absolutely not be acceptable in a million dollar production gig. That's why we use dedicated hardware.
I don't understand why you're so combative about this. You say "professionals never do this", but lots of people are telling you that sometimes they do. Perhaps it's time to consider that your experience is not universal?
In the cases I've been involved with, the computers generating the displays came from the clients, though the production company also used Macs with PowerPoint for creating lower-thirds for IMAG. The client-provided computers were running a custom software application designed for displaying data on a secondary screen. I'm not even sure Decklink can do that, as the software just expects to output to a secondary display (it does not know anything about Decklink).
I'm not being combative and you're (I'm guessing purposely) leaving out parts of my statements to try and argue a point I've never made. I know my experience isn't universal but, by definition, if people are saying that a dot in the corner of the display makes their setup "unusable" then their setup has always been unusable because the OS has access to put things on their display at any time.
If the computers came from the clients, then the clients didn't care whether something might pop up by accident. There's literally no way for someone on the production side to prevent that with a computer whose ins and outs they don't know, so it cannot be an issue, and that's not the type of situation I'm talking about.
I'm only talking about the people who are saying that this dot makes a Macbook "unusable" for the purpose of display. That's 100% not true and anyone that needs that level of precision, as a professional, uses dedicated I/O hardware to keep exactly that from happening.
I don't know what else to say. I work with people who put on live shows for 20,000+, they use PCs and Macs to drive the display WITHOUT Decklink or similar, and they would be upset if there was a persistent orange dot on the screen. If you want to split hairs about whether or not that makes the new macOS "unusable" in these situations, then fine, but in the environment I'm familiar with it would not be tolerated. They would replace the Mac device with a PC which doesn't have this issue because they would deem it unusable, even though it can technically still display images on a screen.
And despite your repeated comments on this article, it IS possible to configure a PC to not show any notifications for use in a live show, between a combination of changing OS settings, closing unneeded programs, and using the fullscreen APIs of the OS. This is something that can be done ahead of time and tested. I've written software that shows fullscreen on secondary displays on large projectors for presentation-like purposes and I have never had an OS notification pop up over the fullscreen software on a secondary display, even on other people's machines where they didn't take care to shut down programs and turn off notifications. Other commenters here seem to have similar experiences.
You don't know what else to say and I don't know what else to tell you. Those people would also be upset if anything else outside of a dot popped up on their display, wouldn't they? Or are they just against dots for some reason? Hardware I/O devices exist for precisely this reason. They literally only exist to be able to control what gets output outside of the OS. Just because you personally never had an OS alert or something else pop up doesn't mean it's not possible or that it doesn't happen. It does happen, usually unintentionally. That's why we have I/O devices.
It's not 100% unless you're using an input to record audio and then it's there for a reason. 99% of the people in these threads aren't going to see this during their PowerPoints because they're not recording anything and the 1% that do and care about the dot should be using a hardware I/O device anyways because the OS can do a lot more than just display a dot. Additionally, applications can also fix this themselves by using Apple's APIs.
> I don't understand why you're so combative about this. You say "professionals never do this", but lots of people are telling you that sometimes they do.
For what its worth, from someone who doesn't work in this space, there's something really confusing to me:
This thread is about professionals who:
1. Are displaying visuals at venues with million dollar A/V hardware budgets.
2. For which an orange dot (or anything other than pixel-perfect outputs) is a complete showstopper.
3. Who can't afford a $100 dongle mentioned upthread to output pixel-perfect graphics from their Macs.
I'm with you right up until point #3, but I'm really struggling with the last bit. $100 doesn't even sound like "prosumer" money to me. If $100 is really a show stopper for your million dollar business, you need to charge more.
The root of the issue is that things were working fine before this change for the particular use case of computers hooked up to A/V equipment. Perhaps a $100 dongle can fix it, but the point is that the $100 device wasn't needed before - people were happy with the way it was (again, for this use case - I see the value in the dot for other use cases obviously).
Furthermore, while I'm not an expert on these $100 dongles, my understanding is that they do not present as just another monitor (since that would defeat the purpose). Thus, you cannot just show anything on their outputs that you could otherwise show on a monitor, right? My understanding is the application has to be written specifically to output to the Decklink (but I may very well be wrong on this) - if that's the case then the $100 dongle does not fix every situation here since a lot of things that get presented may not support it.
Apparently not because the OS has always been able to do this. They're just not happy because the OS has now chosen something that they disagree with. Ignorance of the limitations is not the same as being "happy with the way it was" even if ignorance is bliss.
>written specifically to output to the Decklink
With the DeckLink specifically, yes. That's a choice made by Blackmagic, though. There are other I/O devices that let you output any window's content (like OBS does with its virtual cameras).
> people were happy with the way it was (again, for this use case - I see the value in the dot for other use cases obviously).
Sure, but this is a classic case of shifting the externalities of your industry onto the rest of the public. People who operate coal power plants were happy with generating tons and tons of smoke (but they do see the value of less smoke for the communities around the power plant).
You're asking everyone else to accept being a little less secure so that it can be easier for you to make money.
> if that's the case then the $100 dongle does not fix every situation here since a lot of things that get presented may not support it.
The most annoying part of this entire thread is how the goal posts keep shifting. The use-case that the $100 dongle doesn't fix is one that has all of these constraints:
1. Must use audio input on the same computer that is generating the visuals (so you get an orange dot)
2. Is generating the visuals from software (that can't be modified) that can only draw to plain old monitors
3. Is doing all the above in a professional environment with XX,000 people in attendance, where not looking professional (having an orange dot) is a show-stopper.
4. The venue and/or AV professionals have no facilities to crop / letterbox / pillarbox real-time video streams
If a customer hands you a mac with a PowerPoint deck, you break condition #1. If the use-case is an artist who wrote a custom sound->viz app, you break #2. If this is a church group or a VJ in an underground club, you break condition #3. I'm having trouble imagining situation where #3 and #4 are not mutually exclusive.
If a customer hands you a laptop at the event with custom audio->viz software that doesn't work with the dongle and this is a professional event with XX,000 attendees and you don't have a way to letterbox, I believe the correct answer is "We're the AV team, not tech support. Your equipment is creating the orange dot. We're just projecting what you provide".
In other words, I'll stipulate that the $100 doesn't cover literally all use-cases, but the cases it doesn't cover are the nichest of niche cases and ones where the customer has unrealistic expectations.
If you're expecting Apple to turn off a feature that is useful (and wanted) by millions of users just to help a number of users who can be counted on one hand... well, those handful of users better be buying a lot of Mac Pros.
To how many people are you going to tell that their concerns aren't valid and their experiences are irrelevant? Maybe it's you who is wrong, when so many people are telling you about the real world problems and uses that this update is going to ruin?
Please show me one place where I said their concerns weren't valid or their experiences irrelevant. I have never said that and you're being disingenuous to say that. The only thing I've said is that they're being hyperbolic by using the term "unusable" because, if it were truly unusable, they wouldn't be relying on something that was out of their control.
The mouse cursor on the side of the screen is stupid indeed. But it doesn't make presentations impossible. I think all the objections here are objecting to hyperbole rather than defending the dot.
Yeah, I get it. Not gonna be accepted in practice. Could be accepted though, it's not literally useless. Being not as bad as literally-useless doesn't mean anything about the situation is good.
What's the processing system going to do? Blur that side of the screen? I don't think Photoshop's content-aware fill is real-time. And both are hacks that will lead to situations of clearly wrong outputs.
Well, they may be processing the output, but the raw input has an orange dot, so at the very least this means additional labor to configure custom processing.
"In our particular case, this means that this orange dot appears on the stage output, which is totally unacceptable for anyone using macOS as a professional video tool that sends video output to a video projector."
Imagine an orange dot appearing on top of every video billboard in Times Square.
> Imagine an orange dot appearing on top of every video billboard in Times Square.
Seems unlikely that whatever's driving the billboards in Times Square needs to have the mic on...and that a tiny bit of postproduction is a burden on the people making said videos.
> Seems unlikely that whatever's driving the billboards in Times Square needs to have the mic on
There have been many interactive billboards in Times Square. Some of them are at ground level. The article even mentions this use case: "Those applications don’t even need to be obviously using audio; live visuals often use mic or line input to produce sound-reactive animation and the like"
> tiny bit of postproduction is a burden on the people making said videos.
What postproduction are you referring to? The dot is added to the video output by the Mac, not added to the video file. You can't edit it out.
> "Those applications don’t even need to be obviously using audio; live visuals often use mic or line input to produce sound-reactive animation and the like"
Why would anyone want to suppress a warning to the user that they are being surveilled?
Removing it in full screen ought to be the easy fix for Apple. By the time you've started the app and are done fiddling with it while setting it up for production use, it should no longer be a surprise that it's using the mic.
Also, AFAIK apps don't usually autostart in full screen (it might even be against design guidelines), so there should always be an opportunity to notice any spying in time.
1. Have you never seen a billboard or TV screen advertisement break before? 90% of the time it's running Windows, and you'll see a very zoomed-in top-left corner of the desktop.
It's absolutely not a stretch someone has a mac mini to run these.
That said, only a fool would upgrade to bleeding edge software in this usecase.
I’ve never seen any billboard or electronic advertising running on anything running MacOS. The only time I can think Macs were used for anything other than the creation of the content was when we used to put Mac minis into broadcasters to deliver the content to. But we put Windows on them.
About 10 years ago, I had a job with one of the largest outdoor advertising companies in Times Square, running several huge LED displays, including one that used a camera feed for interactive experiences with the audience.
I don’t know too much about the specifics, but at the time I worked there, all of their screens, including the one that used a camera for AR stuff, were driven by Mac Pro towers running OS X.
So while I can’t say if this is still the case or not, in 2012 many of the Times Square billboards were being driven by Macs.
I'm sure Parent read the article, but the question is why is this a big deal? For someone not in the live visual space, it's not immediately clear that an orange dot in the corner is "totally unacceptable".
> Imagine an orange dot appearing on top of every video billboard in Times Square.
As a New Yorker, I’d love to have that sort of visual feedback on the surveillance of Times Square, a public space. And whether I care is quite relevant.
> Live visuals aren't making a recording, they're reacting to sound inputs in realtime. That's not surveillance.
Potayto, potahto. If a billboard has a microphone attached, it's not unreasonable to put a visual tax on it. (Or the economic tax of using dedicated equipment. Nobody is running Times Square billboards off a Mac.)
It's their laptop, they are aware that something is capturing their audio input, and they don't want to be reminded of that in this particular instance.
Frankly, it doesn't matter if you would notice or care.
For the same reason why you may not want to see an orange dot when you watch a movie in a theater and also the reason art galleries don't put orange dots in front of artworks.
It is a special case where Macs connected to a projector are used to display live visuals. The microphone is on, most likely so that the visuals can respond to sound, and you get to show an ugly orange dot to your audience.
Macs are used a lot in the entertainment industry so it is not an insignificant problem.
From purely a privacy perspective, that is an interesting comparison. Some art installations that record their observers do have disclaimers that recording is happening, sometimes by law.
For a lot of the examples being given in this thread, like billboards and movies, I'd actually welcome an orange dot notifying me that audio is being recorded - since that's not an expectation I'd otherwise have. Though this probably isn't what Apple are trying to do, and would need to be enforced to prevent circumvention.
The initial example of live shows (specifically the subset of which where the same device is being used for some kind of audio input and visual output) is so far the most convincing example because the orange dot is redundant there.
I note "some kind of audio input" for that case. Notifying of audio-recording billboards/etc. is a separate case signalled by the dot (which I think could theoretically be a positive, but isn't practically achieved with just this change).
Looks unprofessional IMO if you were watching a live stream, and there was a permanent orange dot on your video. People may be screen capturing using an external output...
Would it bother you if Star Wars had an orange dot in the corner the whole time? People want to make professional videos and live streams, and not have a constant reminder that there's a Mac somewhere involved in the video... and this has hosed people's workflows.
Everyone focuses on the dot and can no longer listen to the content of the presentation? /s
My honest guess is that the author of the article is SUPER mad about this out of some principle-of-the-thing about how the design shouldn't do this, and they just ran with that in writing self-righteous nonsense about how this makes presentations impossible or something.
It's about as bad as having the cursor show up on the edge of a live presentation slide.
"Live presentation" like "the visuals on the video wall behind Mariah Carey while she sings live for 10 million people on TV", not "live presentation" like "your PowerPoint presentation to your 4 coworkers".
This is a news site for professional digital musicians and animators. The kind of people that are in charge of making the digital visuals behind a live performance like I described. This is not a "principle of the thing" argument, it's a "this update stops us from doing our job using Mac hardware" argument.
Yeah, I get it. Apple's design is stupid and shouldn't have happened. But obviously Mariah Carey CAN sing on TV with a little orange dot. Obviously nobody wants that, and in practice nobody will accept it. But it remains completely possible and would not completely destroy her performance.
The degree to which people are whinging about the use of “useless” to mean “customers who pay creators for this work will not accept the result this imposes on it” as opposed to “it is not theoretically possible to use the device for this purpose with the misfeature at issue” is... surprising.
If it were about ignoring their lovingly set font overrides in their browser or their terminal, they'd sing a different tune. Which is perhaps indicative of the empathy gap, but hey.
You've never been near a show production, have you? I was involved with an amateur theater that had some projections during one of their plays. Even they were very particular about what was projected, where, and how.
One of the projections was black-and-white archive footage. Yes, thank you, I'd love an orange dot there, it doesn't ruin it at all.
Honestly, I'm more interested in the way people aren't willing to laugh about it.
I mean, yeah, I don't want the dot there. It breaks the fourth wall in a way. But there's something to that too. Society of the Spectacle and so on…
It's FUNNY to notice how a little orange dot is taken as such a profound threat. People could do well to reflect on the whole context a bit. Our dependency on tech and the way we depend on a few companies who force things on us, it's all serious, scary, and absurd. And this orange dot business is not the ultimate example of the problem, it's a silly one, that yes, I acknowledge is a huge problem for some people's jobs in practice.
I've been around show production stuff, and I don't respect the way everyone takes themselves so damned seriously in that world. The dadaists were onto something.
They take things so seriously because a lot of money, and the livelihoods, reputations, and careers of a lot of people, depend on many, many apparently tiny details. Particularly in live performances: every show, you get one chance to get that show right. One. And then you have to do it again, and again. But getting it right for the next audience doesn't make it OK that you got it wrong for the last audience.
Yeah, I get it. People can be really mad when systems they rely on get messed up. It makes PERFECT sense that people are really reactive about this awful design decision and are saying hyperbolic stuff. The part that doesn't make sense is the refusal to acknowledge that it's hyperbole. Although, to be fair, when people are triggered and reactive, we're rarely in the mood to acknowledge such nuance.
In any production things go wrong in a million tiny details.
Anything that doesn't have a workaround gets thrown away. And no, that's not hyperbole. I've seen lamps discarded because they couldn't be made to work just right, and stage props removed entirely because they looked out of place on that particular stage.
"Something is shitty, let's laugh about it and continue regardless" is what often separates people who don't care from people who do. Unfortunately, seeing all the shit we have to put up with daily, those people that do are in the minority.
"Something is shitty, let's laugh about it" does NOT inevitably require "continue regardless"
The idea that you have to connect laughing with not fixing things is evidence of being in a triggered, reactive, threatened state.
Did I ever suggest that the problem shouldn't be fixed?
People who can laugh about something are people who aren't in a state of threat. And yes, those who don't care are less likely to be threatened. But we can also find ourselves in situations where we are willing to laugh and not be so defensive and self-righteous while we still care about things.
In our outrage-driven society, that's the real minority: people who CARE and are also willing to laugh and not be in a victim mentality. And to be clear, this isn't some fundamental feature like we simply are one way or the other. These are states we can shift between, and most people are shifting all the time.
I think the context is using a Mac to generate visuals that get projected on a screen during performances. I guess you now wind up with an orange dot in the top-right corner.
The orange dot will appear on the live output, which is fed to an LED video wall/projector/etc. on or near a stage. Completely unacceptable, and makes Macs entirely useless for live visuals.
My temporary fix has been to use an expensive hardware scaler to crop the output when I run into this. You'd think Apple would have plenty of production experts beta testing; I'm sure this will eventually affect their own live events.
Mic input is very useful for receiving timecode to keep production gear in sync.
Yeah, I considered that when I was reading the article. If your content can still look alright being scaled, that'll work. There is definitely some content that won't be quite so forgiving, though.
Because it's not part of the visuals, which should be the only thing on the output. Maybe it's because I've actually run systems for this exact purpose, but I'm kind of surprised at how difficult this seems to be for some to grasp.
But I don't understand why they're mirroring their desktop to some kind of visual display system. Wouldn't they be using something like a DeckLink for this use case?
Monitor outputs are for monitors. If you want an application-specific video output, you can get that; it won't be a desktop, so it won't have this problem.
If you mirror your desktop then yeah... you get whatever desktop UI chrome is on your desktop.
Because most of these displays look like monitors to your system. Using the existing rendering pipeline to drive a full-screen monitor is far easier, and less expensive, than custom hardware just to move pixels that the built-in graphics card is more than capable of moving.
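To make that concrete: the "existing pipeline" here is usually nothing more than a borderless window covering the second display's frame. A minimal AppKit sketch of the idea (the screen index and window level are assumptions; a real app would make those configurable):

    import AppKit

    // Minimal sketch: cover the secondary display with a borderless,
    // high-level window so no desktop chrome shows on that output.
    // Note: the recording indicator is composited by the system above
    // app windows, which is exactly why it leaks onto the output.
    let app = NSApplication.shared

    guard NSScreen.screens.count > 1 else {
        fatalError("no secondary display attached")
    }
    let projector = NSScreen.screens[1]   // assumption: screen 1 is the output

    let window = NSWindow(contentRect: projector.frame,  // screen coordinates
                          styleMask: .borderless,
                          backing: .buffered,
                          defer: false)
    window.backgroundColor = .black
    window.level = .screenSaver           // above the menu bar and Dock
    window.orderFrontRegardless()         // borderless windows can't become key

    app.run()

No capture cards, no extra code paths: the same GPU and compositor that drive your desktop drive the show.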
OK, so it's a quick hack for video output, but they're going to run into problems like this. Any notifications, system updates, Launchpad, whatever, would also appear. That's how you end up with goofy things like a sign showing a Windows 'needs to update now' message. If they were doing it properly, with a production video output, they wouldn't have this problem.
You can call it a "quick hack" if you'd like, but I can assure you that all of the potential pitfalls you mention are taken into account by the people who set up and run these systems.
You're not going to "gotcha!" people who have actually done this stuff.
Thank you! It's crazy to me that anyone who considers themselves a professional would mirror their desktop for the use cases these people describe. "Mariah Carey playing in front of millions on NYE"? Gimme a break. No one is mirroring the desktop from their MacBook for that performance.
That's how the orange dot is appearing - it's a UI element from the interactive user interface of macOS, but it's appearing on a video out because they're re-using a monitor signal designed for mirroring or extending your desktop.
That might be pedantic, sure, but the point is that extending is not mirroring.
For use cases like the one described in the article, that distinction matters. I'm not questioning the usefulness of the orange dot, but rather the people calling the workflow of using a second monitor as an output invalid and wrong.
You know what I mean and the distinction is meaningless. No one doing a show as big as that is doing it without the hardware necessary to make sure that they can control 100% of the output path.
You seem to be confusing two things (well, really, you're just being hyperbolic):
TRUE: the dot BEING THERE is useless (and undesirable and bad)
FALSE: the dot makes Macs useless for live visuals
Just because something is 100% bad design, bothersome, 100% negative, and should never have been allowed to happen, does NOT mean it destroys a product entirely.
When Mini Coopers were made with the stupidest turn signals ever (https://news.ycombinator.com/item?id=28661282) that doesn't make the car undrivable even though there's ABSOLUTELY no defense of the design.
The dot absolutely makes Macs useless for live visuals. Nobody worth a damn would ever use a system that required an orange dot to appear on the live output.
You can use them for live visuals. You WON'T (and I don't blame you), but you CAN. "Absolutely useless" obviously is intended to be "I'm super mad about this" rather than literally true.
You're picking a really odd hill to die on. Is it that hard to imagine that having random imagery (orange dot) showing on your live work (which a client has paid $$$ for) is a complete no-go for professionals (if you ever want to get hired again)?
Just mildly objecting to the strength of someone's hyperbole is life-threatening?
Just because other people are feeling super aggravated about this completely stupid design decision by Apple doesn't mean it's valid to project that on me as if I'm fighting some battle.
I don't really care about this. I get it, people are upset about a totally indefensible, short-sighted design that impacts people's ability to use Macs to do presentations that will be accepted in professional contexts.
Yes, there's some sense to the hyperbole that says something is "useless" when reality is "this will not be accepted in my field of work". It's still hyperbole. Admitting that it's hyperbole doesn't mean accepting the bad design.
You're just making a semantic quibble. What if the screen was half taken up by goatse and the other half what you wanted to display? Surely you CAN still use it for live video. But you can't.
The quibble is people saying something similar to "this orange dot is as bad as having half the screen taken up by goatse!" and then when someone else says "it's not actually that bad though, right?" they say "it IS THAT BAD!!" And I'm like, "I get that you're mad, but it's really not actually as bad as you're saying". That's not a semantics debate.
This is like when you crash a car. Insurance company considers it "totalled". But you can still drive it. The cost of fixing the car is more than the value of the car.
In this case, the negative effect of the orange dot outweighs all utility of the product for the use case. So it's useless in the sense that it's "worse than nothing" by some metric. It's useless in the same way that a shopping cart is useless for commuting to work. I mean, TECHNICALLY, you could commute to work in a shopping cart. But it would be worse than just not using it.
I think that's the point that people are objecting to.
The orange dot doesn't total the presentation the way a crash totals a car, at least not necessarily. It's NOT worse than nothing. It's not like a shopping cart for commuting to work.
The point of the replies here, overall, is that those things are hyperbole. It's NOT about nit-picking the language or saying "well, technically, the car will still drive in this case"; it's saying that the orange dot DOES NOT ruin presentations that badly. You can ACTUALLY and PRACTICALLY present with the orange dot.
A better analogy: your nice car gets a rock through a window with a noticeable hole and huge cosmetic crack. You say "I can't drive this now! My car is useless! It's totalled!" And people are like "Dude, it's not totalled." And you say, "This is my professional car for business, I can't show up with a cracked window!" and People are like, "well, you could actually…"
It's not like the software has a bug that inverts all the colors, and people are saying "you could in fact do an inverted-color presentation, it's possible". It's JUST a little dot, and it's bad, but it's NOT as bad as the hyperbole.
I'll take you at your word that it's not that bad for you. I'll take someone else at their word that it is that bad for them. It probably depends a lot on the context.
If a car with a shattered window is completely unacceptable for some situation for some reason, a person could say (still hyperbolically), "it might as well be totalled". It's still stupid to say "it's totalled!" (and that's not because I'm objecting to the insurance definition).
The orange dot being unacceptable to some people and situations is never something I've doubted in the slightest.
Some people can't seem to grasp that it's perfectly consistent to say "you're being hyperbolic, but your objection is fully sound". It's as though after a crash that results in a shattered window, someone says "it's totalled!" and then if someone else says "it's NOT totalled" they take that to mean that a shattered window is no real problem.
The orange dot is totally stupid, unacceptable in many situations, the decision was atrocious, AND the language people have been using about it is hyperbolic.
Visually distracting... a loss of control over the design... a reminder of Apple when you're displaying unrelated content... A live show doesn't need its visual art to have an orange dot. I'm surprised you can't see how that could matter.
It mostly concerns artists who present video, such as at art exhibitions or VJ live shows, I believe. Any unintended visuals kill the immersion and the aesthetics, and look unprofessional. Like leaving the mouse cursor or UI elements visible.
I see that some people above suggested a solution with an extra hardware device, but even though a lot of these creatives use Macs, they aren't especially tech-literate outside of their field and have to make do with venues that use quirky hardware, sometimes with very limited time to set up their gear.
My interpretation from the article is that you might imagine the external display output is a video wall for a concert with a full screen visual show being synced to music or something, thus the simultaneous audio capture going on.
Of course the correct security solution is to put a physical switch for the mic (that is not software mediated) on the hardware, together with an LED that behaves like this "orange dot."
Why is microphone and camera usage indicated through a software stack onto a display? Shouldn't there be an LED tied to the power-in pin on these peripherals?
Yes! Just as there is for the camera, there should be a hardware indicator for the microphone (perhaps enabled by software for an external mic; ditto for an external webcam.)
One of the best features on Macs for the past couple of decades has been the small green light. They should add this, as hardware, to iPhones and iPads. Software dots on the status bar indicate to me that the OS people view it as a key issue but the hardware folk haven't caught up to the issue yet.
That dot is there to let people know that there's a live mic listening to them and possibly recording them in case they don't want that, right?
So basically, what this guy is saying is that sometimes he wants to show an app that is listening to the audience, but without letting his audience know that they could be recorded by his application.
Here's the crux of his issue.
"In the interest of security and privacy, Apple on macOS Monterey has added a prominent orange dot to display outputs when audio capture is active. That renders their machines unusable for live visual performance, though, since it’s also shown on external displays. Dear macOS team – we urgently need a fix here.
The basic idea here is sound – to avoid software hijacking your camera and audio input and spying on you, essentially, there’s an orange dot to let you know recording is active. But this essentially makes the Mac unusable for live visuals, since it impacts external projectors and LED walls and the like. (Those applications don’t even need to be obviously using audio; live visuals often use mic or line input to produce sound-reactive animation and the like.)"
Here's where I feel he's being disingenuous:
"Those applications don’t even need to be obviously using audio; live visuals often use mic or line input to produce sound-reactive animation and the like."
So they're not using audio, but may be using the microphone or line input to access sound for "reasons". That's called "using audio". He's trying to draw a distinction that doesn't exist. He wants to make it clear he's using audio in a non-privacy-violating way. And I believe that he is. I do believe he's a good-faith actor who is just trying to use audio input to enhance certain presentations.
However, it has the same behavior as people who do not act in good faith. This is a prime example of dipshits ruining it for everyone. There can't really be an exception because then that exception just gets used by the bad faith actors in the space.
To solve his problem, what he really needs is a monitor/projector that doesn't show the top however-many pixels required by the menu bar. In a very amateur way, this could be accomplished with a piece of electrical tape over the monitor. For a projector, there could possibly be a special lens cap that cuts off the required area, like a matte box.
1- I doubt anyone at Apple intended the orange dot to be broadcast to everyone watching their television during a live musical performance on SNL with visuals in the background. It is not possible for the performance to be recording any of the viewers, so there's no possible "dipshit" to be protected from.
2- The dot appears on secondary displays running full-screen video: there is no menu bar to crop off. You can only crop off part of the visuals, or your visuals have to be rendered at a lower resolution to produce an artificial dead area to crop.
3- Apple has never given any indication that they intend to add indicators that someone else is being recorded by my computer. The indicators are for the computer owner, to indicate that software may be recording the owner without their knowledge. The parties are all different than you're supposing.
2 - The dot essentially forces a bar. It's not an overlay on the application's graphics. This is what is shown in the images. So there is something to crop off.
Not to mention, if it is known, it can be planned around. This is exactly what television has been doing for years. There are scan lines that don't get shown on screens despite being broadcast.
3 - This is almost as disingenuous as the article. The dot is there to inform the user of the device/screen that they could be recorded. The audience during a performance would qualify as a user in this case. It's basically an indicator saying, "if you can see this, it's possible the device showing this is recording your audio". That is completely relevant for an audience. If you think it's not, then why is it relevant for anyone else. There are thousands of excuses one can use to justify hiding this information from anybody. Once you are potentially violating my privacy, it is relevant to me.
I promise you that the people who do those visuals for SNL (and every other wacky scenario people are coming up with) are not mirroring or extending their desktops. They're using dedicated video hardware for this exact reason.
This has to be the weirdest requirement I've heard from someone who wants to read a blog post on someone's website. Why does it matter whether it's behind a CDN or not?
The problem is that the person hasn't handled even a small amount of load before; even the cheapest instance from DigitalOcean plus NGINX can handle most loads you throw at it if configured properly for static content, which it almost is by default when you install it.
Why though? It works for me, the page didn't struggle at all. I've seen all sorts of single hosts on underpowered machines handling the extra load just fine, as long as it didn't require much server-side (static websites). Take the solar-powered server from lowtechmagazine for instance.
It might be a conscious decision to support a decentralized web. We can't just host everything on the servers of Big Tech and act surprised when an outage takes down a huge chunk of the web.
HN: Centralization is the purest form of evil, you should self-host everything!
Also HN: If you don't put your site behind a CDN I won't even bother reading it because I will lose interest during the 90+ seconds it takes to load.
(And yes I know that these posts are not usually made by the same people, but it still amuses me to see posts with such radically differing views on the front page at the same time)
Well, congratulations, you are the one person in the world who bothered to dig into the network tab before opening the page and saved him one millisecond of CPU time.
This is yet another instance of the OS-reuse problem: airports, banks, medical terminals, etc. have all kinds of unwanted features because the software boots a full GUI desktop OS and uses hacks to make it display the program. Not valid engineering, of course.
Yes, it would obviously be much more "valid engineering" to have each of these devices have their own custom, invariably much buggier and crappier, OS.
> The basic idea here is sound – to avoid software hijacking your camera and audio input and spying on you, essentially, there’s an orange dot to let you know recording is active.
I'm all for shitting on Apple, but does the orange dot actually obstruct anything? It seems to just sit in the menu bar, out of the way.
Visual artists often use secondary monitor outputs to display media on projectors or large LED walls. Secondary outputs are also commonly used in broadcast for keyed font overlays among other things.
https://github.com/s4y/undot
EDIT: As far as I know, the best long-term answer here is for apps that present visuals full screen to "capture" the external display for exclusive use using an API (https://developer.apple.com/documentation/coregraphics/14562...), but that's not super common right now.
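For the curious, the capture approach looks roughly like this. This is a sketch only: picking the first non-built-in display is an assumption, and a real app would need proper error handling and a way for the user to choose the output.

    import CoreGraphics

    // Enumerate the active displays.
    var count: UInt32 = 0
    CGGetActiveDisplayList(0, nil, &count)
    var displays = [CGDirectDisplayID](repeating: 0, count: Int(count))
    CGGetActiveDisplayList(count, &displays, &count)

    // Assumption: the first non-built-in display is the projector/LED wall.
    guard let external = displays.first(where: { CGDisplayIsBuiltin($0) == 0 }) else {
        fatalError("no external display attached")
    }

    // Capture the display for exclusive use; while captured, the window
    // server stops compositing the regular desktop UI onto it.
    guard CGDisplayCapture(external) == .success else {
        fatalError("could not capture display")
    }

    // Draw directly into the captured display's context.
    if let ctx = CGDisplayGetDrawingContext(external) {
        ctx.setFillColor(CGColor.black)
        ctx.fill(CGRect(x: 0, y: 0,
                        width: CGDisplayPixelsWide(external),
                        height: CGDisplayPixelsHigh(external)))
        // ... render visuals here ...
    }

    CGDisplayRelease(external)   // hand the display back to the system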