It matters a lot to the business manufacturing the cameras. I worked as a PM at a smallish PTZ camera manufacturer in Minnesota, and we chased certification from MS, Zoom, and Google for years. Constant moving of goalposts, being straight-up ignored because our volumes were relatively low (we focused on high-end education and live-event cameras), and continual changes to the standard we were developing against were all common. These companies used certification to get access to data, keep their experience a walled garden, and disadvantage anyone who didn't play ball with the big soft-client owners.
We lost a lot of business because the people spec'ing projects would require certification. It was a no-brainer for them, because then they would not need to worry about compatibility. Execs, of course, got on the Teams bandwagon and wanted it for the same reason. I feel this kind of walled garden stunts innovation and prevents smaller manufacturers from entering markets, which will lead to a handful of companies controlling exactly what happens in the tech behind meetings. Not the end of the world when good enough reigns supreme. I just personally enjoyed it when meeting USB specs was the only hurdle we had to worry about.
Out of the business now. But figured I would share since I was on the receiving end of this anticompetitive marketing nonsense.
I wonder if it's that kind of business consolidation that causes, well... all consumer webcams to suck. In particular, it's super hard to find a quality lens with a narrow field of view. I don't need everybody in the meeting to see all my walls and hallways; I need a "50-70mm equivalent" lens to show my portrait, that's it! I have no idea why wide-angle lenses are the norm they are. Digital zoom, of course, just compounds the already low resolution and crappy quality of the pixels. And let's not even talk about noise and dynamic range :-/
A $30 webcam is a miracle of manufacturing efficiency and one shouldn't expect much out of it. But I'm in the "shut up and take my money" segment for a quality webcam (since I use it at least 4hrs a day, every day), and it just doesn't seem to exist (I did run a proper camera with a f/1.8 85mm lens for the first year of Covid, but as great as the results were, that was just a logistical pain and overkill).
If you're in the Apple ecosystem, their continuity camera feature is really slick. It appears to use the 2x lens by default on my iPhone 12 Pro which is a 52mm equivalent (newer iPhones have narrower telephoto). The quality is excellent and it's completely hands off. It just activates itself if enabled on the phone.
An Opal C1 might be what you want. It is expensive, however. You may also be happy with a DSLR hooked up to an appropriate lens. My problem with the Opal C1 is that it doesn't have a wide field of view: I use it in a meeting room, and I have to place it far back to get everyone in.
I bought an Opal and can't recommend strongly enough to buy literally any other camera. The software is hot garbage. Using it on my M1 mac for Slack, Teams, and Zoom is an exercise in frustration and futility - constant crashes, Opal software deciding to randomly open up and take over the video, killing the video feed in the soft clients, etc.
I could go on but this thing isn't worth $50 let alone $300. The only reason I still use it is because the picture is decent (not great) and I don't want to just trash a $300 camera.
Have you considered getting a ZV-E10 for this? That's one of the smaller Sony mirrorless cameras, and it behaves like any other webcam over USB, so it'd be just as simple to use.
Considering webcams are now reaching prices of $400 (e.g., https://www.elgato.com/en/facecam-pro ), the quality improvement of a proper mirrorless camera can't be dismissed.
> I did run a proper camera with a f/1.8 85mm lens for the first year of Covid, but as great as the results were, that was just a logistical pain and overkill
I always wondered if this was something for a Raspberry Pi and one of those Pro cameras they got. Is that something that can be done easily?
edit: Quick googling revealed that it is not that easy. The RPi can provide a video stream but that needs (more) software on the computer than just a USB webcam.
It's not terrible: you can make the Pi boot straight into showing the video over HDMI, then run an HDMI-to-USB capture dongle into OBS, which exposes a virtual camera (you still need to handle the audio separately, though).
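If you'd rather skip the dongle, a software-only route also works: stream from the Pi over the network and expose it as a virtual webcam on the computer. A rough sketch of the commands, assuming libcamera-vid on the Pi and v4l2loopback + ffmpeg on a Linux desktop (the port, device number, and <pi-address> are just placeholders):

    # on the Pi: serve the camera module as an H.264 stream over TCP
    libcamera-vid -t 0 --width 1280 --height 720 --inline --listen -o tcp://0.0.0.0:8554

    # on the computer: create a virtual webcam device (/dev/video10)...
    sudo modprobe v4l2loopback video_nr=10 card_label="PiCam" exclusive_caps=1

    # ...then feed the Pi's stream into it so Zoom/Teams/OBS see it as a normal webcam
    ffmpeg -f h264 -i tcp://<pi-address>:8554 -f v4l2 -pix_fmt yuv420p /dev/video10

Latency is noticeable but usually fine for meetings, and audio still has to come from somewhere else, same as with the HDMI route.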
In some cases the software compression offsets the camera quality anyway. Zoom, for example, only does 720p. You CAN get 1080p if you pay for a special license to flip a switch, but even that only works sometimes.
> Not the end of the world when good enough reigns supreme
I wouldn't put it that way. I'm perfectly willing to fork out for high quality audio or video, integration, and so forth - but the most important feature for any conferencing hardware is that it be seamless. I love that my headset has excellent audio quality, but I love just as much that I don't have to faff around with it; the user experience is nearly comparable to picking up the telephone.
> On its site, Google says certified peripherals must receive automatic firmware updates over the air and are tested for "quality, reliability, and interoperability."
What the hell. What about keeping hardware I/O thin, and handling "quality, reliability, and interoperability" at driver level?
Why do we keep adding more and more independent software stacks that require updates to our list of worries, and not fewer?
Sure, the OS handles drivers. However, nearly all USB peripherals we use every day are 100% standardized, or at least their core functions are: USB HID Class, USB Audio Class, USB Video Class.
Three things have special drivers (on Windows) in my current setup: The various USB controllers (inside the laptop and the Thunderbolt dock), the USB Ethernet controller in the Thunderbolt dock and the Bluetooth controller. That’s it. Keyboard, mouse, webcam, audio, even my wireless headset: All generic drivers.
This all means the devices themselves must implement the appropriate abstraction on top of their specific hardware.
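To make that concrete, here's a small sketch (my own illustration, not something from the thread) that walks the USB tree with pyusb and prints the interface class codes each device advertises. pyusb with a libusb backend is an assumption here, and you may need elevated permissions to enumerate devices:

    # enumerate USB devices and show which standard class codes they expose
    import usb.core

    CLASS_NAMES = {
        0x01: "Audio (UAC)",
        0x03: "HID (keyboard/mouse)",
        0x08: "Mass storage",
        0x09: "Hub",
        0x0E: "Video (UVC)",
        0xFF: "Vendor specific (needs a custom driver)",
    }

    for dev in usb.core.find(find_all=True):
        classes = set()
        for cfg in dev:          # configurations
            for intf in cfg:     # interfaces
                classes.add(intf.bInterfaceClass)
        names = ", ".join(CLASS_NAMES.get(c, hex(c)) for c in sorted(classes))
        print(f"{dev.idVendor:04x}:{dev.idProduct:04x}  {names}")

Anything reporting HID, Audio, or Video class is exactly the stuff the OS can drive with its built-in class drivers; the 0xFF (vendor-specific) interfaces are the ones that actually need custom software.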
> Keyboard, mouse, webcam, audio, even my wireless headset: All generic drivers.
Not if the vendors have their way. I plugged in an external Logitech mouse to my locked-down, corporate, non-admin permissions laptop only to see Windows automatically install a Logitech mouse utility. Not entirely sure what it does, but it also added a constantly running Logitech update service.
Given the strong security track record of hardware companies, I am sure it is fine that they get to automatically install software without any approval from me.
That’s something slightly different though. Microsoft had this crappy idea that drivers are no longer just drivers but instead you’ll also get all the usermode software you never wanted automatically.
IIRC you cannot disable this behavior without entirely disabling driver installs/updates from Windows Update.
> What about keeping hardware I/O thin, and handling "quality, reliability, and interoperability" at driver level?
Almost all operating systems come with generic PnP drivers that can handle most things with meh-OK compatibility. For example, the generic Windows 11 driver for Intel graphics causes a black bar at the bottom of my 4K screen until I install the official, everything-and-the-kitchen-sink Intel driver for my device; but it's serviceable.
I dream of a world where everything is handled by firmware in-device, so that we almost never need anything more than the basic PnP drivers that come with the OS (even though we are nowhere near that yet). You are asking why we can't go back to having customized drivers for everything. If we could have generic drivers for everything, bringing up new operating systems on hardware would be so much easier, and we wouldn't need to fork the kernel a bajillion times for everything. Niche operating systems like FreeBSD, Haiku, Redox, and ReactOS would also become so much more usable, considering they'll never have the manpower to write a proper driver for every device.
On top of that... drivers rarely survive operating system refactoring. Linux doesn't have a stable driver interface at all between versions - so it's either get your drivers upstream past nitpicking and sometimes political maintainers - or keep your proprietary fork up-to-date until you get bored because you've got a new product out this year (story of almost every cheap Raspberry Pi clone board). Windows has a stable driver interface... but they still change it every few releases (Vista was a big one, 10/11 are going in on DCH), and you can't use your Windows XP drivers on Windows 11 [Note]. Firmware-on-device means that if your driver doesn't get maintained or updated, you aren't SOL.
[Note] This is also why... "what's with hospitals still using Windows XP?" Well, their software only runs on Windows XP, and often, the drivers only run on Windows XP; so even if the software could run on Windows 11, it couldn't talk to the medical device, which could cost... tens of thousands to replace.
Firmware on devices and thin, consistent USB layers is a blessing, not a curse. Who is going to port your webcam’s fancy driver to Linux or FreeBSD?
Frankly, I wish as many devices as possible worked this way. Operating Systems would be so much more secure and easier to port if we didn’t have to worry so much about those darned drivers.
> Teams certification has its share of marketing tactics, too. Certification requires that products advertise Teams and encourage its use by featuring an integrated light-up Teams button. As someone who uses various meeting platforms, I can confirm that this feature doesn't feel like a customer-first initiative.
That tactic is not a new one; the reason most computer keyboards still have a "Windows logo" key is because, many years ago, Microsoft mandated the presence of that key, with that trademarked logo, in its certification standards.
> That tactic is not a new one; the reason most computer keyboards still have a "Windows logo" key is because, many years ago, Microsoft mandated the presence of that key, with that trademarked logo, in its certification standards.
Apropos of nothing - save for this comment - is there a reason why Microsoft hasn't switched the primary modifier for keyboard shortcuts to the Windows key? Apple created the Command key to get around having to use Control, long before NeXT and its UNIX roots were even a consideration.
Is the only reason a switch hasn't happened down to 40+ years of inertia?
> Is the only reason a switch hasn't happened down to 40+ years of inertia?
What you call "inertia" others would call "compatibility". We have 30+ years of muscle memory and working software to keep in mind. The same shortcuts which worked in Windows 3.x should mostly still work on Windows 1x. Apple created the Command key early in their ecosystem's lifetime, so they didn't have as much backwards compatibility to worry about.
And even back when the "Windows logo" key was introduced, there were already reasons not to make the change you propose: first, we already had a large amount of Windows 3.x software and users to consider, and second, and most importantly, Windows 9x could be used with keyboards that lacked the "Windows logo" key (and those were very common back then). Even today, that's still possible: you can use a keyboard with neither the "Windows logo" key nor the "Menu" key, and everything still works perfectly fine.
(I do agree that it would be better if clipboard shortcuts used something other than Control everywhere, but that ship has long sailed.)
The Windows key has a lot of system level shortcuts already. As callalex mentioned, the Super key is usually for system level shortcuts while the Ctrl key is mostly for application inputs.
There are a lot of system-level shortcuts, many of which I use all the time:
Win opens the Start menu and lets you start typing to search.
Win+Tab is the new task switcher UI. Useful for moving apps between desktops.
Win+L locks the computer, which I use almost every time I get up from my computer.
Win+M/Win+Shift+M minimizes/restores all windows.
Win+Home minimizes all but the currently selected window; pressing it again restores them.
Win+, shows the desktop until you release the Win key.
Win+E opens a new File Explorer window.
Win+{0-9} opens/switches to the application pinned to the taskbar at that position.
Win+Arrow Key maximizes or snaps windows to the top or sides.
Win+Shift+Arrow moves windows between monitors.
Win+R opens the Run dialog.
Win+K opens the Connect tool, which helps quickly connect to wireless displays, speakers, and headsets.
Win+Shift+S opens the screen snipping tool.
Win+V opens the clipboard history.
Win+P changes presentation modes, useful for projectors.
That's...certainly more than I recall, having left Windows behind for good in 2009 after using both a Dell Latitude and a 15" MacBook Pro in my last couple years of college.
I'm surprised that Win+P isn't "Print" by now. Sure, when you invoke it, you're telling the application to print the contents of a window or the document you're looking at, but to some people any application should be able to print.
The default macOS application project in Xcode still gives you a print menu item by default and I've even heard stories from people who have gotten rejected from the Mac App Store because they hadn't explicitly removed that item from the menu.
Just about every application that supports printing on Windows and Linux listens for Ctrl+P as print, and has since Windows first came out, when Super keys didn't even exist. Firefox, Edge, Chrome, Notepad++, Visual Studio Code, Outlook, Thunderbird, Word, WordPad, Excel, PowerPoint, OneNote, Paint, Photos, etc. all open their print dialog on Ctrl+P. Why would I need Win+P to print when Ctrl+P is practically already the universal shortcut for printing?
Meanwhile, when I'm wanting to change my presentation mode for a projector, I'll press Win+P to change from extending my displays to mirroring or have only my local screen or only my remote screen or connect to a wireless display.
Long story short, Microsoft chose the wrong horse in this race. The fallout from this decision should have been apparent when WSL came into being - I would be shocked if the WSL didn't use Ctrl-C to send a keyboard interrupt.
In your use case, I would put that down to "Enterprise entanglements" where the thing driving the development of Windows features is the needs of business customers and not regular users at home.
People need to print far more often than they need to deal with presentation modes in PowerPoint.
> People need to print far more often than they need to deal with presentation modes in PowerPoint.
They can; Windows apps have been able to print since Windows came out. I don't know why you feel that, because Super+P (Win+P) doesn't print, Windows users can't print things. They can do so with the same keyboard shortcuts they've used since before their keyboards even had a Super key.
And it's not for changing presentation modes in PowerPoint; it's for quickly changing your monitor arrangement system-wide.
> Microsoft chose the wrong horse in this race.
Microsoft (and practically every app developer for DOS/Windows) chose the Control key over the Super key because practically no users at the time had a Super key on their keyboard. Which is more user-hostile: forcing keyboard shortcuts onto a key the vast majority of users don't have, requiring them to buy new keyboards just to use keyboard shortcuts, or repurposing an existing key that a lot of users don't care much about and that many applications were already intercepting for keyboard shortcuts?
> The fallout from this decision should have been apparent when WSL came into being - I would be shocked if the WSL didn't use Ctrl-C to send a keyboard interrupt.
Linux apps behave practically the same way as in Windows. Guess what the print shortcut is in gedit? Guess what gedit's copy and paste shortcuts are? Guess Firefox's print or find shortcuts in Linux. Take a look at most of the keyboard shortcuts in tmux. Look up how to copy and paste in gnome-terminal. Guess what key is the most common one in Emacs shortcuts. Hint: they don't use the super key!
Control key is for sending control codes. Command key is for issuing commands. We're still dealing with the legacy of Windows 3.x using Ctrl where Apple realized that a special modifier key was needed from the beginning.
God, the non-Apple ecosystem is primitive and user-hostile.
> God, the non-Apple ecosystem is primitive and user-hostile.
Exactly which Windows users do you think... Send control codes on a regular basis?
'User-hostile' is exactly how I would describe being more concerned about conceptual and behavioural purity for a domain that ~none of the users have cared about for more than 2 decades.
(It's also pretty weird to champion Apple, here, when it's taken them how many years to figure out that maybe having a mouse with more than one button is desirable... And I also wonder how people with, say, only one functioning hand were expected to control-click in MacOS. Didn't see a lot of them in Apple commercials, though, perhaps they weren't the target audience...)
It is easy to make a webcam that performs poorly in certain scenarios, particularly with poor lighting, or when mounted on a monitor that has integrated speakers. Jitter in the audio causes problems with automatic echo cancellation (AEC).
It is much less important than in the 2010s. At that time, Skype certified cameras to provide H.264 encoding in the camera. Those encoded frames would then pass straight through to the network. That kind of encoding is technically demanding, so certification had plenty of value. Today's MJPEG cameras don't benefit as much.
Audio over USB is supposed to be isochronous (delivered at a steady rate), but if the webcam is buffering somehow, you could get jitter in the microphone signal. Some HDMI implementations have been known to have similar problems on playout (variable delay, sometimes with a snap-back). If playout and microphone don't have a relatively constant delay, the AEC adaptation goes wrong, and you'll get echo defects.
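If you want to see why, here's a toy numpy sketch (entirely my own illustration, not from this thread) using an NLMS adaptive filter, a common building block of echo cancellers. It cancels a fixed-delay echo almost completely, but the residual stays much higher once the echo path's delay starts wandering:

    # NLMS echo canceller on synthetic signals: fixed vs. jittery echo delay
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 16000
    far_end = rng.standard_normal(fs * 2)   # loudspeaker (far-end) signal
    taps = 64                               # adaptive filter length

    def run_aec(delay_fn):
        """Echo the far-end signal with delay delay_fn(n) into the mic,
        cancel it with NLMS, and return the residual echo power in dB."""
        w = np.zeros(taps)                  # adaptive filter weights
        mu, eps = 0.5, 1e-6
        residual = []
        for n in range(taps, len(far_end)):
            buf = far_end[n - taps + 1:n + 1][::-1]   # newest sample first
            d = 0.6 * far_end[n - delay_fn(n)]        # mic = delayed echo only
            e = d - w @ buf                           # residual after cancellation
            w += mu * e * buf / (buf @ buf + eps)     # NLMS weight update
            residual.append(e)
        r = np.array(residual[fs:])                   # skip the convergence period
        return 10 * np.log10(np.mean(r ** 2) + 1e-12)

    print("fixed 10-sample delay :", round(run_aec(lambda n: 10), 1), "dB")
    print("delay drifting +/- 3  :", round(run_aec(lambda n: 10 + int(3 * np.sin(n / 80))), 1), "dB")

The filter has no trouble learning a static echo path; what it can't do is keep re-converging on a delay that keeps moving underneath it, which is exactly what playout or capture jitter looks like to it.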
I'm pretty sure back in the day you could get keyboards with a "Works with Netware" sticker on them, so virtually meaningless certifications aren't a new thing of course.
MS Teams refuses to put some webcams in HD mode on Mac. Zoom may require some configuration for some HD webcams.
Certified stuff makes it easy for procurement groups to defend "IT's" purchase choices and easy to distribute across an enterprise when you know it'll mostly work out of the box.
So, marketing; but like any product cert, it's CYA for the buyer, with the hope of a simpler user experience for a distributed workforce. And, of course, an excuse to charge more.
I bought a 'Microsoft Modern Webcam' based upon some positive online reviews. This webcam has the bizarre stipulation that it is only certified for Teams if the microphone is turned off!
To make matters worse, this means that the camera defaults to the microphone being disabled, and you need to download their proprietary software to enable it. Which is Windows-only. I've got a mac. So I ended up having to install Windows in a VM just to let me enable the microphone on my webcam. (Fortunately the setting is persistent and you only need the VM once)
I think the auto light settings on the camera need the Windows software too, it's a remarkably bad camera. Avoid!
> I think the auto light settings on the camera need the Windows software too
Ew! That also means the light is software controlled rather than inherently tied to the camera being powered on. You want those on the same circuit so it's physically impossible for some malicious software to turn on the camera without the light!
Look up a few "Teams enabled" webcams on Microsoft's website. They all seem to be 1080p or better cameras, as the article described, and they have few negative reviews on their Amazon product pages. That's probably actually worth a look next time I want to buy a better camera for streaming or something else. I wonder why no product advertises itself as a "Teams enabled camera", since it actually needs to pass a somewhat strict test to get that badge.
For most, if not all, of us it doesn't matter, but to my 55-year-old and pretty much tech-illiterate father this is pretty major. The one to get is the one that works on his device.
On the same note, I find it very annoying that my AirPods don't work with MS Teams.
It used to work on my iPhone, but now even that doesn't work: I have to take one out, otherwise the AirPods are trapped in a disconnect-reconnect loop.
I feel teleported to the Stone Age every time I have to switch to my wired earbuds just to attend my daily.
And there's nothing I can do as a user other than buy another pair of earphones.
Are you sure you're not having some kind of weird hardware issue? I've been using AirPods Pro 1 & 2 for a few years and they work fine with Teams. I only have the Pro, not the regular version, so there's a small chance the problem is limited to the regular ones.
I think you may have some issue with your hardware. I have 3 generations of airpods and they all work with Teams across my iPhone, MacBook Pro and even on my Windows 10 desktop.
From the consumer side, it’s difficult to find meaningful specs. I can’t find a webcam that explicitly states that it has AV1 support, not even Logitech.
While I tend to agree with the author that there are far too many "certified/designed for" logos that are meaningless, I have always purchased devices that are certified/designed for Teams/Skype for Business/Lync/OCS R2/OCS/LCS[0].
Back in the OCS/OCS R2 days (2008ish) the market was still pretty impressed with Bluetooth Headsets[1] (and people walking around like they're talking to nobody). Many "computer headsets" were extremely low-end mono-headphones/mic combos with physical mute switches (that often didn't "fully mute"). The sound quality was on-par with a terrible analog POTS call.
Our company made the choice to ditch all IP (Cisco at the time) phones and legacy PBX equipment and replace "everything that was practical to do so with" with computer attached OCS R2 certified headsets.
Here's what you (often) get with certified vs non-certified:
- Assuming the device has a mute/unmute switch/indicator, it will match the mute/unmute state in the client. This was (probably still is, but my devices are certified) a huge problem. Non-certified devices still muted but other participants wouldn't see the mute icon. What's worse than waiting for someone to "unmute"? When they think they're unmuted but their second mute button is muted! My Jabra Speak 410 shows the mute state on the device, it matches what's in the OS and it matches what the client sees/reports. My old Polycom (similar) did the same and short of having to be oriented in a certain manner for Echo cancellation to work properly, it was a similarly trouble-free experience.
- It will always[2] pick the right device for the call and the ringer. If the certified device is attached when you join a call, it will pick that device. If the OS has something different set up for when you're playing games, it still picks right.
- The sound quality exceeds what the client is capable of delivering. I'm using a USB wired "speakerphone" device that I picked up for $150 in late 2012. Nobody can tell I don't have a headset unless I'm on video; it has always cancelled noise/echo and works brilliantly with "just me" or in a small conference room with several participants[3].
- Other features tie into the client correctly. If it's got a "call answer" or "hangup" button, it can be used to join or hang up a conference. Volume indicators match what the OS/Teams indicates.
Can I find a headset today that isn't Teams certified and does all that in Windows 11? Probably. I've heard that the microphone-state tracking the OS does now makes the mute/unmute situation a bit better. BT headsets and their support in Windows have improved enough that if I've got my high-end Bluetooth headphones connected, I'll use them. But for the last decade, picking a "certified" device (Teams/Lync/etc. only -- I can't speak to the others) meant a completely grief-free experience. Aside from "no drivers/no setup/It Just Works(tm)", the headsets are all at least mid-to-high-end.
[0] I think that covers the naming back to 2005?
[1] I remember my BlueAnt Z9
[2] Yeah, there's no such thing, but it's always worked for me across several devices.
[3] So well, in fact, that when we merged with a competitor and they opted to implement Lync (at the time), they stuck (and later secured) one of these in every conference room instead of purchasing much more expensive devices (opting, instead, to outfit some of the better rooms with very high-end solutions).