OBS (macOS) Virtual Camera (github.com/johnboiles)
395 points by mistersquid on June 3, 2020 | 155 comments



If you are a Linux user and own a nice camera, you can use gphoto2 and ffmpeg to create a virtual camera. I posted a howto on HN a couple of days ago[0][1]; here it is for anyone who might need it. I tried it with both a Sony RX100 VA and a Sony A7 III, and in both cases it works really well.

edit: forgot to mention that this works over USB, so you don't have to pay a crazy markup for a capture card

edit2: (because I'm so excited about getting this to work) here is a list of supported cameras[2] - sadly, I was not able to get my GoPro Hero 6 to work.

[0] https://www.crackedthecode.co/how-to-use-your-dslr-as-a-webc...

[1] https://news.ycombinator.com/item?id=23325143

[2] http://www.gphoto.org/proj/libgphoto2/support.php
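
For anyone who doesn't want to click through, the gist is roughly the following (a sketch based on the commands quoted elsewhere in this thread; it assumes the v4l2loopback kernel module is installed, and your loopback device may be a different /dev/videoN):

```shell
# Create a fake webcam device; exclusive_caps=1 helps browsers recognize it
sudo modprobe v4l2loopback exclusive_caps=1

# Pipe the camera's live view from gphoto2 into ffmpeg, which writes raw
# frames to the loopback device (adjust /dev/video0 to your loopback node)
gphoto2 --stdout --capture-movie |
  ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video0
```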


Hey! I was trying to do this a while back, and couldn't make it work.

I followed the instructions in your post, and (although it didn't work right away) it gave me the will to make it work ;)

A little advice: I fixed my setup by finding the correct v4l2 device, because video0 was already assigned. If you run:

v4l2-ctl --list-devices

it will tell you which device nodes exist on your machine, so you can put the correct one in the command (that was the only piece missing from my puzzle). If you already have a webcam in your computer, /dev/video0 will already be assigned to it, and the gphoto2 | ffmpeg pipeline gives overly cryptic error messages (it complains about the formats not being correct, when it should complain about the target not being a v4l2 loopback device).
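
Concretely, it looks like this (the device numbers here are just examples):

```shell
# See every video4linux device and which /dev/videoN each maps to;
# the v4l2loopback entry is the one ffmpeg should target
v4l2-ctl --list-devices

# If the loopback device turned out to be /dev/video2, point ffmpeg there
gphoto2 --stdout --capture-movie |
  ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -f v4l2 /dev/video2
```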


A shame, since there is a perfectly good standard for cameras... the USB Video Class... Why have all these camera makers decided to use a different set of protocols that don't work out of the box on any OS?


I'm guessing this is because proper live view video from a DSLR requires much higher bandwidth than USB can provide. The protocols used for remote capture and download over USB should be more standardized, for sure, but live view seems really hard.


It's actually much more because the DSLR companies, for the most part, are technologically-backwards, and don't get things like platforms, APIs, or similar. It's nineties-style closed thinking. It's a big part of why cell phones are now eating their lunch. I used to be a pretty serious photographer, and own probably $10,000 worth of camera equipment.

I mostly shoot with my cell phone these days, not because I mind spending money on cameras, but because it's a better device for most photography. It integrates with the world. Cameras integrate with their manufacturer's closed ecosystems.


The USB Video Class allows the device to provide a list of supported formats, and the host to choose one. That seems sufficient for the host to manage bandwidth across its USB links, even if other devices are also using bandwidth.
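
On Linux you can watch this negotiation happen for any UVC webcam; the formats, frame sizes, and rates the device advertises are all queryable (the device path is an example):

```shell
# Dump every pixel format, resolution, and frame interval a UVC
# webcam advertises to the host during format negotiation
v4l2-ctl -d /dev/video0 --list-formats-ext
```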


I got a GoPro Hero 8 Black into OBS (Linux, but should work anywhere) by connecting to it over WiFi and using its semi-documented API (https://github.com/KonradIT/gopro-py-api) to turn on its UDP video stream, then used that as the Media Source in OBS. I'm betting your Hero 6 will work this way too.
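
Roughly, the flow looks like this. This is only a sketch assuming the goprocam package from that repo; the method names here are from memory, so check the repo's README before relying on them:

```python
# Sketch only: assumes the goprocam package from KonradIT/gopro-py-api,
# and that this machine is connected to the GoPro's WiFi access point.
from goprocam import GoProCamera

gopro = GoProCamera.GoPro()   # talks to the camera over its WiFi AP
gopro.livestream("start")     # ask the camera to start its UDP video stream

# The stream then shows up at udp://10.5.5.9:8554 (the camera's usual WiFi
# address), which you can add as a Media Source in OBS.
```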


Is this possible on macOS at all? I have an RX100 V and an Elgato Cam Link HD, but would love to use that capture card with another cam and use the RX100 over USB simultaneously.


It works on a Mac:

  brew install gphoto2
  brew install ffmpeg --with-ffplay
  gphoto2 --abilities
  # Abilities for camera             : Sony Alpha-A6300 (Control)
  # ...
  gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | ffplay -
I'm piping it to ffplay, so this will at least let you test your camera, or you could also use it in OBS as a window source. Also, make sure your camera's USB mode is set not to "Mass Storage" but to "Remote Camera Control".


Thanks for the tip, really appreciate the actual commands. I'm wondering if anyone else is running into this:

    $ brew install ffmpeg --with-ffplay
    Usage: brew install [options] formula

    # Install flags here, nothing about --with.
    Error: invalid option: --with-ffplay


You're right, they removed that option: https://formulae.brew.sh/formula/ffmpeg. I guess ffplay is built by default now.


can you try piping that to gstreamer?


Sure. I'm not sure which gstreamer plugin/sink would create a loopback device, but this plays as well:

   gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | gst-launch-1.0 fdsrc fd=0 ! decodebin !  videoconvert ! videoscale ! autovideosink


per this - https://apple.stackexchange.com/a/356362

it should be,

  osxvideosink

I wonder if this works,

  gphoto2 --stdout --capture-movie | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0  -f matroska - | gst-launch-1.0 fdsrc fd=0 ! decodebin !  videoconvert ! videoscale ! osxvideosink


Looks like both gphoto2 and ffmpeg are available on Homebrew, worth giving it a shot for sure. FWIW, I did end up building my own ffmpeg because the Debian default didn't have NVIDIA support. Want me to give it a try?


I'd love it if you gave it a try! I've already spent enough hours re-compiling ffmpeg to get nvenc support :)


Spoke too soon: v4l2loopback-utils is the missing piece on macOS. Found this with a basic search[0], if anyone enterprising enough wants to take a crack at it.

[0] https://apple.stackexchange.com/questions/353168/how-can-i-c...


Yes please! Would love to use my X-T30 as a webcam on my mac.


Cascable Pro Webcam claims to support the RX100 V:

https://cascable.se/pro-webcam/

Compatibility table:

https://cascable.se/help/compatibility/


Here's a simple GUI app[0] that creates a Syphon stream; you can use that from an app called CamTwist, or possibly from OBS as well.

[0] https://github.com/v002/v002-Camera-Live


After going down this rabbit hole myself, I can't recommend this approach.

First of all, it works! And it's pretty cool to not need a capture card. But, for most cameras, you only receive at the resolution of the on-camera screen.

In other words, the video stream gphoto2 receives is intended for a camera remote preview screen. Check your camera's resolution before investing in this as a solution -- my very expensive 4k @ 60fps-capable mirrorless camera only produces a pretty poor 640x480 @ 50fps stream using gphoto2.
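
A quick way to check, as a sketch: run the live view through ffprobe and read off the stream parameters it reports:

```shell
# Inspect the actual resolution/framerate of the gphoto2 live-view stream
gphoto2 --stdout --capture-movie | ffprobe -hide_banner -i - 2>&1 | grep Video
```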

Additional video recording features like flicker reduction or IS seem to be lacking through this method as well.

In the meantime, I'm patiently awaiting the delivery of my 4k capture card :)


That's a dope trick. I only wish there were a hack as easy as this to turn my iPhone into my Mac Mini's camera/microphone.

I've been looking for something that does this for a while now, WFH on the Mac Mini without a camera or mic is just dreadful :(


OBS does let you use an iPhone as a camera. (Not sure about mic though...)


I'm not sure which of the solutions you're referring to, but after Googling I found these three solutions:

- https://obs.camera/ (which seems to be the most promising)

- https://www.newtek.com/software/ndi-camera/ (mentioned in a forum)

- and EpocCam https://www.kinoni.com/ (from the answer below)

I'll leave these here for people who stumble upon this answer, but if you were talking about another alternative I'd love to know about it. For now I'll give OBS.Camera a go and see if I like it. Thanks!


From memory, the solution I used was “plug iPhone into USB port, select iPhone as a camera source in OBS”

Pretty sure it “just works”.


Look at EpocCam. Works for me.


Oh no... I just got a capture card to use with my A7 III :(

Will give this a shot, luckily I'm still in the return window.

E: Works great! Make sure to enable "PC Control" in settings for other Sony cameras. I had it set to USB Mass Storage (which is maybe the default?)


What card did you get?


Razer Ripsaw HD - it's apparently equivalent to an Elgato HD60S, but most importantly it's actually in stock :)


Before you return your card, keep in mind that gphoto2 will only receive at the resolution of your camera's display. For example, my camera can shoot at 4k, but gphoto2 only provides 640x480.

Additionally, many camera features (flicker-reduction, electronic shutter options) are unavailable through this method.

This is really equivalent to a camera remote preview video, not intended to be used for actual video capture.


It is also possible to use the OBS output as a virtual webcam. All you need is v4l2loopback and the v4l2sink OBS plugin: https://github.com/CatxFish/obs-v4l2sink

It works perfectly, and the virtual camera can be used with Jitsi, BigBlueButton, and the like :).
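
For completeness, the loopback side of that setup looks roughly like this (the device number is arbitrary; the output device itself is selected in the plugin's dialog inside OBS):

```shell
# Create the virtual device the v4l2sink plugin writes into;
# exclusive_caps=1 helps browsers treat it as a real capture device
sudo modprobe v4l2loopback devices=1 video_nr=10 \
  card_label="OBS Cam" exclusive_caps=1
# Then in OBS, point the v4l2sink output at /dev/video10 and start it
```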


Hah, I just followed your guide after stumbling on it via Google yesterday to set up a Canon M50 on Linux Mint! It works incredibly well, head and shoulders above the video quality from a webcam, and now that I'm piping video through ffmpeg there's tons of potential to do some weird stuff with filters and swapping to pre-recorded video.


I wish I could take credit for this; Ben Chapman did all the work, I just happened to google it :)

And yeah, 100% agree on video quality, it is so much better than what you get from that potato sensor on MBP


My Sony a6300 is supported, but I can't get it to work. It doesn't even show up in lsusb when I connect in PC Remote mode, much less in gphoto2 --auto-detect. I'm stumped. Too bad, that would have been useful.


The USB mode has to be changed. Set it to “PC Remote” mode. https://helpguide.sony.net/gbmig/44840601/v1/eng/contents/TP...


Thanks. That's what I did. When set to PC Remote mode, the camera does not show up in lsusb or gphoto2. The other modes work, but don't support capture.


On the off chance that somebody stumbles on this: I got it to work in the end. After the initial experiments, I noticed the camera refused to charge from a bog-standard USB charger (with any cable). Removing and reinserting the battery restored charging functionality and, as it turns out, the camera also started being visible in PC remote mode. Serves me right for only using the regular on/off switch.


You should try another microUSB cable - my first microUSB cable did not work because it was charging-only.


Thanks. I tried several, but I suppose they could all be bad. Mass storage mode works, though.


Yet another way that my Panasonic GX-1 can't capture video :(. It has a mini HDMI port, so I thought I could do it there, but it doesn't do live-view over HDMI, just playback.


Do you have any idea if this works with the older GoPro models?

Plan to give it a go later tonight but wondering if anyone has any success stories.


This is what I found while I was trying to get GoPro to work with gphoto2[0] - would love to know if you do get it to work somehow.

[0] https://sourceforge.net/p/gphoto/mailman/message/36174298/



That table is why I posted the question.

It’s unclear if that’s the list of supported cameras or if it’s a list of cameras and only those with entries in the next two columns are supported.

Either way I can’t wait to get home and mess with it.


I have an old Canon PowerShot ELPH 300 HS; only newer models are on that support list. I'll have to try this anyway.


Where does the audio come from? From the DSLR? If you need a separate mic, is there an A/V sync issue?


I use a USB mic, and so far neither I nor anyone I've been on Google Meet with has noticed any issues.


You can also use MediaTek's NDI Virtual Input with some desktop apps (like Skype), but I personally use this with OBS to do two things:

- Send out a composite overlay (screen capture + webcam + lower thirds) on Teams/Skype/etc.

- Send out screen capture from another machine (usually OBS to OBS via NDI and then out via this plugin)

OBS is a lot of fun, but, alas, extremely demanding on system resources in some configurations, enough that I've started considering getting a new machine solely for video conferencing.


This works well, when it works, but at least with my setup it seriously exacerbated performance problems.

The older OBS -> Zoom Windows solution of swapping out a video API DLL ("virtual camera") never caused performance problems, but it stopped working when Zoom started integrity checking/whitelisting all libraries[1]. I switched to the NDI solution, which seems to be more "official", but gave up on it: it would consistently work fine for a while, and then the framerate would drop to <2 fps.

This was on a reasonably new/high-end machine (X1 Carbon 6th gen) with hardware video encoding in OBS, so it seems less like an absolute performance problem and more like some kind of lock contention. I didn't dig into it very deeply, though; it's possible the NDI stack was doing some software encoding I wasn't aware of, and that was just too much.

[1] this happened in the middle of all the zoom bombing, and I've seen an allegation that Zoom did this intentionally to nerf the OBS -> Zoom pathway because it was found to have been used by many zoom bombers, but I have no idea if this is true, so don't get out the pitchforks about it.


Ha, OBS was the straw that broke the camel's back on my work laptop, performance-wise. I built an AMD desktop system and it's been glorious; I forgot what I was missing. Now the laptop mostly sits in a bag.


I came here to say exactly this!


Searching for "MediaTek NDI Virtual Input" does not turn up results with "MediaTek" for me. Did you mean "NewTek" or another NDI tool?

I couldn't find an NDI product for macOS from NewTek.


NewTek does do NDI for Mac; the tools and SDK are both available:

https://ndi.tv/tools/

The OBS NDI plugin is not part of OBS proper, but available separately:

https://github.com/Palakis/obs-ndi/releases


Holy smokeballs, thank you so freaking much!!! This thread caught my eye as I've been wanting to customize my Zoom and MS Teams "stream" (ie. webcam + extra background/overlay goodies). I hate to say it, but I skipped past the OP's project and tried this suggestion of NDI first.

Amazing. It took a good 30 minutes to figure it all out, but I now have the output of OBS (in my case, just the preview itself without needing to stream/record) as a video/webcam input source for Zoom, Microsoft Teams, and within Firefox. Note: doesn't work with Discord or QuickTime's File > New Movie Recording.

And all this with me being on macOS. Not Windows, but macOS. Incredible.

Steps (should work for macOS and Windows, not sure about Linux):

1. Install OBS. Run it, and set up a basic scene for testing (eg. webcam and a text label).

2. Download/install NDI Tools for your OS from https://ndi.tv/tools/#download-tools (note: system restart required). You only need the "NDI Virtual Input" app; on macOS each app had its own .pkg file bundled in the single .dmg archive; on Windows I assume it's an install wizard with checkboxes for each component. Again, only need "NDI Virtual Input" app/component.

3. Run the NDI Virtual Input application installed in step 2. It should live in your systray (without doing anything useful yet).

4. Download/install the obs-ndi plugin for your OS from https://github.com/Palakis/obs-ndi/releases - right now for Windows or macOS it's version 4.9.0 (expand the "Assets" link). There's a 4.9.1 update specifically for Ubuntu/Debian, but I'm not sure how those OS's are supported when there is no NDI Tools for Linux in step 2.

5. Run OBS. If you're lazy and didn't read the GitHub release notes in step 4, starting OBS should popup with a direct link to the NDI runtime you also need to download/install; then restart OBS.

6. In OBS, go to Tools > NDI Output Settings. If you want to "clone" the OBS output to the NDI virtual device only when you start streaming/recording in OBS (ie. to disk or to a streaming platform like Twitch), check "Main Output". Otherwise check "Preview Output", in which case your OBS preview will be output to the virtual video device at all times without having to start OBS streaming/recording.

7. In your operating system's systray (ie. top-right on macOS, bottom-right on Windows), you should have an NDI icon living there as started in step 3. With OBS running and configured according to step 6, you should be able to click the systray icon and select that output source as the input source for the NDI virtual device.

8. Open Zoom, Teams, or hopefully other apps which will work. Wherever you configure which input source to use for video/camera within that app's settings, there should be an "NDI Video" source. Select that… and BAM – your OBS canvas is now your input source!!

That took so long to type out, I hope someone manages to make use of it. :)


Thank you for sharing this. Your step-by-step worked perfectly for me (on Mac), and now I've got the OBS -> Hangouts setup I was looking for.


Yay!!! There were a couple of steps that took me time to debug how to make it work, so I'm ecstatic that at least one person found my steps useful!!! I don't have recent experience with Google Hangouts, so I'm curious to know whether that was a native app (does Hangouts have a macOS app?), or which browser you used (Firefox, Chrome, or…)?


Just got this up and running following your steps with great success! Enough to bring me back to say thanks!


Sadly I can't get this to work on macOS. OBS works, NDI seems to be installed and configured. It even shows up in Chrome, but I can't select it as an option.


Just for you I installed Chrome to check. It worked for me. I wrote a few things to check below; I suspect your problem is item 'c'.

a) Make sure Chrome has access to cameras (System Preferences > Security & Privacy > Camera > Google Chrome checked).

b) In step 6 from my comment, make sure to try "Preview Output" in OBS's Tools > NDI Output Settings, as "Main Output" requires you to start OBS streaming/recording (which you probably don't want to do).

c) After setting up that OBS setting, clicking the NDI systray icon should show a dropdown list with a single item. You actually need to click that item in the list (it will checkmark it). If you don't do this, you'll get a black screen when trying to pull from NDI.

d) I used https://webcamtests.com/ to test – maybe try that site.

Otherwise I'm sorry, I can't imagine what the problem is.


https://ndi.tv/tools/

He's talking about the poorly named NDI Scan Converter


Pretty sure he's talking about the NDI Virtual Input (which acts as a virtual camera on Windows)


Yes, it's NewTek. Specifically you're looking for this page: https://ndi.tv/tools/#download-tools


I've been running OBS on my 2012 MBP with three to four live feeds at a time. My fans spin up and the CPU gets warm, but I haven't been dropping frames or having any major issues.


Did @johnboiles get the $10k bounty that was put up (I think by the Shopify CEO) for this plugin?


No but iirc the terms from @tobi were that the feature needs to be cross-platform and merged into the main OBS codebase. There is work happening on both of these fronts, and absolutely my goal is to get this plugin merged into OBS eventually. We'll figure out some sort of splitting scheme when that time comes.


More info can be found in the recently merged RFC for those interested: https://github.com/obsproject/rfcs/pull/15


I’ll defer to Ben from OBS on how to allocate the bounty.


Been hoping to use this, but it just brings my 2019 MBP to its knees; it actually feels dangerously hot about 30 seconds after activating it.

The Windows version doing the same thing feels light as a feather (3% CPU, <10% GPU usage).


FWIW there's definitely some performance optimization that could be done in my plugin. Three things I know of:

1) Potentially an entire framebuffer memory copy could be avoided if we could get CMBlockBufferCreateWithMemoryBlock to work. See the CMSampleBufferCreateFromDataNoCopy method (currently unused -- linked below) in my code. It mostly worked but the virtual camera video wouldn't show up at full resolution in OBS, which is how I typically test while developing. Not sure why it wasn't working; possible it's an obscure OBS bug.

https://github.com/johnboiles/obs-mac-virtualcam/blob/master...

2) It might also be possible to get the virtual camera to advertise one of the pixel formats that OBS supports natively which would avoid the pixel format conversion in the CPU. I _bet_ this is where the majority of the performance hit from my plugin happens. I'm not sure if this is possible, however. Maybe OBS doesn't natively support any formats you can use for virtual cameras.

https://github.com/johnboiles/obs-mac-virtualcam/issues/102

3) If #2 isn't possible, maybe the pixel format transformation could happen on the GPU? I don't know much about GPU programming but maybe this would help.


May go without saying but disabling the video preview really helps resource usage.


Whoa I didn't even know this was an OBS feature and (obviously) I've spent a good bit of time with OBS :)

I'll definitely be using this!


My experience has been that my MBP gets hot if I try to share a specific window. Switching from a window as a source, to a full screen source has improved OBS performance quite a bit.

And I do use this plugin.


Tangentially related, I've seen that behaviour with Google Meet as well. Sharing a specific tab takes a much bigger performance hit than sharing the whole screen.


Tab sharing has hooks deep in the Blink rendering engine... to the extent that it's actually possible to share a specific <div> or other HTML element, even if it isn't visible! (Not sure if you can do that from JavaScript, but you can totally do it from C++.)

The side effect seems to be that a bunch of the code that prevents the same content from being re-rendered every frame if it hasn't changed gets bypassed, and I'd bet that kills performance.


Make sure the power adapter is connected at the right (not left) side of the MBP.


i'm marveling at how well this comment exemplifies everything that's wrong with USB-C.


That's not related to problems with USB-C, but to Apple messing up their hardware design. This was discussed recently here: https://news.ycombinator.com/item?id=22957573


i didn't realize; thank you for the context.

i'd still point back to USB-C's complexities, though. apple isn't the only major company to have gotten USB-C wrong; look at the nintendo switch.


Don't have a choice, my MBP only has two ports on the left side.


Right now there is an open issue about performance degradation in the latest versions of OBS for macOS.

https://github.com/obsproject/obs-studio/issues/2841

I don't know if it is correlated with your problem though.


What surface were you using your laptop on? IMO Apple made the design choice to run hot as hell on some surfaces as a trade-off for other advantages, so maybe some of the fault is not with this software.


Wooden table.


You can lower the size of the canvas to use fewer resources.

I use OBS a lot with a mid-2012 MacBook Pro, but I stay at 720p.


13" or 15"? Dedicated GPU? Looking for some more info before I attempt to use it on my machine.


13" no GPU.


For anyone looking to set up a virtual OBS webcam in Arch Linux, here's how I did it:

1. Install headers for your Linux kernel:

  - sudo pacman -S linux56-headers
2. Install v4l2loopback-dkms from AUR:

  - git clone https://aur.archlinux.org/v4l2loopback-dkms.git

  - cd v4l2loopback-dkms

  - makepkg -scCi
3. Create a virtual video capture device:

  - sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1
4. Set up a virtual audio device to avoid latency:

  - sudo modprobe snd-aloop index=10 id="OBS Mic"

  - pacmd 'update-source-proplist alsa_input.platform-snd_aloop.0.analog-stereo device.description="OBS Mic"'
5. Run ffmpeg:

  ffmpeg -an -probesize 32 -analyzeduration 0 -listen 1 -i rtmp://127.0.0.1:1935/live/test -f v4l2 -vcodec rawvideo /dev/video10
6. Setup OBS to stream to ffmpeg:

  - File > Settings > Stream, set Service to "Custom..." and "Server" to `rtmp://127.0.0.1:1935/live/test`
7. Setup low latency streaming:

  - File > Settings > Output, set Buffer Size to 0, CPU Usage Preset to "ultrafast" and Tune to "zerolatency".
8. Start Streaming in OBS.

9. Select your virtual camera and audio devices in Google Meet / Zoom / etc.

I get virtually no latency with this setup but I'm running an AMD Ryzen 7 2700X with 32GB of RAM. As always, YMMV.
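
The module-loading and ffmpeg steps above (3-5) can be bundled into one script; a sketch using the same device numbers as the walkthrough:

```shell
#!/bin/sh
set -e

# Step 3: virtual video capture device
sudo modprobe v4l2loopback devices=1 video_nr=10 \
  card_label="OBS Cam" exclusive_caps=1

# Step 4: virtual audio device, relabeled for PulseAudio
sudo modprobe snd-aloop index=10 id="OBS Mic"
pacmd 'update-source-proplist alsa_input.platform-snd_aloop.0.analog-stereo device.description="OBS Mic"'

# Step 5: receive the RTMP stream from OBS and write raw video to the loopback
exec ffmpeg -an -probesize 32 -analyzeduration 0 \
  -listen 1 -i rtmp://127.0.0.1:1935/live/test \
  -f v4l2 -vcodec rawvideo /dev/video10
```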


Or you can use the obs-v4l2sink plugin to replace steps 5 and 6: https://github.com/CatxFish/obs-v4l2sink


Thanks! I’ll try this out.


This should work in Zoom, as the latest Zoom 5 release has re-enabled virtual webcam support.


The Zoom release notes are a bit misleading. They didn't enable all virtual cameras, only a very specific list of them. Here's how you can see that list:

strings /Applications/zoom.us.app/Contents/Frameworks/nydus.framework/Versions/A/nydus | grep "Developer ID Application"

A number of folks have reached out to Zoom support to request that my plugin be added. See the latest in https://github.com/johnboiles/obs-mac-virtualcam/issues/4 for details.

I wouldn't fault Zoom for being restrictive about what they include in their allow list, since every addition adds risk, but it looks like they've already included closed-source applications from individuals as of 5.0.5. At least my code is open source and auditable!


Doing that grep interestingly includes "Developer ID Application: NewTek, Inc. (W8U66ET244)", which may match the other comment thread[1] talking about NewTek's NDI Virtual Input (note: the top comment there mistyped "NewTek" as "MediaTek"). So using the NDI solution may work with Zoom without editing entitlements.

[1] https://news.ycombinator.com/item?id=23406438


I'm unfortunately not seeing it in Zoom, even after reinstalling zoom.

I see "OBS Virtual Camera" in the source list of Video Capture Device, so I know it's running.

I do see Snap Cam (from SnapChat) listed in Zoom, so it doesn't seem to be a global virtual webcam block.


Why can zoom block a virtual camera? Couldn’t the virtual camera just pretend to be like any other real camera?


Because an app can refuse to load libraries/plugins signed with a different developer ID certificate, and it seems CoreMediaIO wants to load the plugin in the actual app process rather than a helper process.


It's seemingly a macOS code signing limitation.

https://github.com/johnboiles/obs-mac-virtualcam/issues/4


An optional restriction, not a limitation.


The limitation is that CoreMediaIO plugins still run in-process (and not just in some daemon but in arbitrary application processes!), when Apple's direction for over a decade has been to move all plugin mechanisms towards an out-of-process model. There's nothing about video plugins in particular that would prevent them from being out-of-process; in fact, most of Apple's CoreMediaIO plugins already have the bulk of the logic in a separate "assistant" process, but the IPC layer is reimplemented by each individual plugin rather than being done generically by CoreMediaIO itself. It's clear what has to happen, but Apple hasn't done it yet.


This is the downside of a walled garden and locked-down devices. Of course many will say it's for security and it's wonderful, but how much freedom are you willing to give up for security?


DroidCam (works on iOS too despite the name) https://www.dev47apps.com/ works great on Windows if you're looking for an alternative on a business / gaming OS and you have a smartphone (or tablet).

I've used it with Zoom / Google Meet / Discord and it's never failed me.


I use Iriun Webcam to use my (leftover) Android phone as a webcam into my MacBook Air and iMac, with Zoom, OBS and Quicktime. It works pretty decently, the only gripe is that I can't control exposure from the phone camera so it needs the right lighting to look nice.

There are also other apps out there, besides DroidCam and Iriun, that support different phone-computer connections.


Finally. I've been waiting/looking for a decent solution for quite a while now...


What was wrong with CamTwist? I've been using OBS+CamTwist to do this for years now. I mean, this will save a couple minutes setup time, but it's not like that wasn't decent.


CamTwist requires disabling macOS SIP. For many, this is too high a bar.


I find that CamTwist destroys the i7 on my 2015 13" MBP as well, slowing it down so much that I can barely use Google Sheets at the same time.


Canon recently released a driver that lets you use an EOS camera as a USB webcam.

It's available for Mac and Windows.

So you can use it directly as a normal webcam or pull it into OBS and manipulate it with filters and text etc. then use OP's tool to output the OBS processed version of your EOS.

https://www.usa.canon.com/internet/portal/us/home/support/se...


I'm excited about this... but I wonder how many resources it takes to stream 720p 30fps into my older Macs without jitter and dropped frames. I had higher hopes for the Elgato Cam Link, if it were in stock anywhere, since it does the processing on-chip, but that's just speculation: not a lot of people measure performance on these setups, either because it all works so well on Windows and/or because they have high-end machines for gaming (also Windows). Very few streamers/reviewers use a Mac, as it is underpowered and undersupported in the video streaming arena. It's really frustrating to realize this right in the middle of this "new normal", where webcams and video capture devices are all sold out globally and one finds oneself with a couple of video-worthless Macs.

Ironically Apple came up with FireWire a long time ago to bring (mostly prerecorded) video faster into Macs and now they lag far behind in every aspect related to video. This includes their webcams, which are terrible. Now that's a different story on iOS devices...


Holy macaroni this looks exciting!! Sadly it only supports a rather short list of cameras, their latest and greatest by the looks of things. The whole 5D range normally gets this sort of thing but this time it starts on the IV unfortunately.


I've been using this repo for a while now and it's a champ. It has been making my webcam rock, as OBS does a great job chroma-keying the green screen behind me. Plus I can add effects as I wish.

Thanks for the reminder; I'll make a donation to this person. I hope this becomes the actual implementation and he gets the $10k bounty from @tobi.


I recently tried to do some work on making a virtual camera, and was shocked to see just how difficult it is, let alone making something cross-platform. Anyone know of any projects that are trying to make this easier to hack on?


Not sure about cross-platform, but for macOS I started this plugin by creating a minimal virtual camera: https://github.com/johnboiles/coremediaio-dal-minimal-exampl...

@seanchas116 made a Swift port of my minimal example https://github.com/seanchas116/SimpleDALPlugin

Overall it _was_ very difficult! Apple's documentation and sample code for CoreMediaIO DAL (virtual camera) plugins are terrible. I just brute-forced it for hours, trying all sorts of different combinations of things before I got something to work.


this is fantastic, i wish i had your example when i worked on mine (closed source)... it was the same for me, total brute force

... and apples examples are in c++ but i ended up doing it in straight c out of frustration since their (c++) examples were so convoluted


Very very convoluted. I found it near impossible to read.


This project seems like it's the farthest along in this endeavor on Linux:

https://github.com/umlaeute/v4l2loopback


Zoom appears to have released an update fixing virtual cam support.

"Changes to existing features

    Re-enable virtual camera support
    Support for virtual cameras will be re-enabled for users on client version 5.0.4."


https://support.zoom.us/hc/en-us/articles/201361963


Those release notes are a bit misleading. See my comment above


Thanks for taking the time to bring my attention to it.


Most meeting software already allows you to broadcast your webcam and a screen-share session at the same time. So this is only necessary if you want your webcam feed to be embedded inside your screen-share, and you want your screen-share to be your webcam feed. For example, this sounds like it would be a horrible experience in something like a Zoom gallery view. Am I understanding correctly?


OBS can do a lot more than embed your webcam over your screen share.

You can:

- combine and arrange multiple portions of the screen as you like

- filter things

- switch between different layouts whilst you're in a call


I'm asking specifically about the virtual camera. Why not just screen-share OBS' preview window?


I believe you aren't. OBS + virtual cam means you can run a live, multifeed show without having to rely on Zoom's auto-switching. Essentially, you can spotlight your feed, use Zoom and other software to feed into OBS, and then have a talking-heads experience with multiple people on the same view at the same time without the jumping jitters between two speakers. You can also then embed anything else you want, like video files, images, text, etc.

Essentially, it lets you run a live production over Zoom and it is a blast. I've run two shows myself and helped with tech on another two with this setup and they've all been fun.


The advantages of OBS are clear, but I still don't understand the virtual camera. Why not just screen-share OBS' preview window?


Good question. For me, it has just been experience.

I've found that the preview window tends to be glitchy. I don't know what causes it, but I've had obs crash on me multiple times when using that interface.

Also, the virtual cam allows you to output at a given resolution, but if you want to do the same with the preview window, you need to scale it up, which just eats more desktop space and may also just eat more resources.

Finally, it is just a more direct connection. I would rather get a feed directly out of a program than have to go the window-capture route, as that just adds an additional layer of possible bugs, jitter, and failure.


Slight tangent, does anyone know of any virtual "audio" device (in Mac OS or Windows) that you can attach VSTs / AUs to? I know that technically Zoom allows you to share computer audio but a controlled audio device would allow more granular control of the input.


BlackHole [1] is a modern alternative to Soundflower.

[1] https://github.com/ExistentialAudio/BlackHole
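If it helps, a quick sketch of verifying BlackHole is visible and capturing from it with ffmpeg (the ":BlackHole 16ch" device name is an assumption — use the exact name or index from the listing on your machine):

```shell
# List the audio/video devices avfoundation can see;
# BlackHole should appear among the audio devices
ffmpeg -f avfoundation -list_devices true -i ""

# Record whatever apps are routing into BlackHole for 10 seconds
# (device name is an assumption -- substitute yours from the listing)
ffmpeg -f avfoundation -i ":BlackHole 16ch" -t 10 capture.wav
```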


Seconding this, I recently switched to BlackHole, I think because Soundflower is no longer being developed.


Seconding blackhole, in my experience it works as well as Soundflower worked 5 years ago


Soundflower + AULab (download from https://developer.apple.com/download/more/), both free.

Alt AULab link, without the requirement of an Apple ID to download: https://www.apple.com/ca/itunes/mastered-for-itunes/


Unfortunately I think you need to use both Audio Hijack and Loopback together. It's a combo from the same developer.

It's quality software but a bit unintuitive to set up. First you create a virtual audio source in Loopback, then with Audio Hijack you route the adjusted audio to that source (you can't route to normal outputs). Also you need to have Hijack's "record" turned on for the effects to work.

But after that it works pretty well and you can add pretty cool things: you can record both the raw input and the processed input and route to your destination at the same time. Or you can mix in audio from other apps, like music, into the recording/routing.

Sidenote: I've also tried BlackHole and unfortunately didn't have too much luck with it. It somewhat worked as a Loopback alternative, but I think Loopback / Hijack share the same audio drivers/code, so their integration seems smoother.


Sorry to hijack your thread, I have nothing to help you on this. Instead, I need some help with my setup and I wonder if you can help me! I started doing some remote piano classes. I have a MIDI controller and use Logic's piano. I want to play with 'local' latency, but still send the Logic sound via Zoom (or any other app). To do that today, I have to use an 'Aggregate Device', which introduces ~60ms of latency, making it super hard to practice. Did you face this problem as well? If yes, did you find a solution? Thanks!


If you use Loopback (and you may also use Audio Hijack for finer-grained control) you can create virtual devices that can be selected from within Zoom (for example) while being monitored through, say, a pair of headphones.

Here's a crude diagram that shows how this might work:

  +--------------------------------------------------------------------------------------+
  |                             Loopback: virtual_keyboard                               |
  |                                                                                      |
  | +-------------------+      +-----------------------+    +--------------------------+ |
  | |MIDI / Logic piano |      |output channels: L & R |    | monitors: headphones     | |
  | | (Pass+Thru)       +----->+                       +--->+                          | |
  | +-------------------+      +-----------------------+    +--------------------------+ |
  +--------------------------------------------------------------------------------------+
  
  +---------------------------------+
  | Zoom                            |
  | +---------------------------+   |
  | | input: virtual_keyboard   |   |
  | |                           |   |
  | +---------------------------+   |
  +---------------------------------+
I believe this would reduce the latency from your keyboard to your monitors, while any lag Zoom has processing the audio (and probably video) would be slightly increased.


thanks, I'll try that!



One of these (probably Loopback?) should work https://rogueamoeba.com/

I don't know of free, good alternatives.


Free: iShowU Audio, Soundflower

Paid: Loopback, Audio Hijack, Sound Siphon


Any idea if we could just simplify this by using a virtualbox VM setup with the right software?

Then it'd be cross platform and just a matter of running an optimized VM that can run on a limited amount of resources.


Camtwist can be another option if you want a virtual cam on Mac http://camtwiststudio.com/


Thank you! This is very relevant! Now I get a zoom meeting with my friends aaaand, chatroulette, here we go!


Finally!


I don't understand what problem OBS solves. It records video, but Windows and macOS already come with video recorders built in.

So OBS is better somehow. How? Why is OBS better than built-in OS recorders?


It's not about recording so much as managing a broadcast. You can configure many inputs which include external video (e.g. a webcam), media on the machine (recorded video or images), and screens on the device, and outputting this as a unified stream. This stream may be passed on as a virtual video device, which other programs can use, or to a stream host (e.g. Twitch), or be recorded.

These various inputs can be arranged into "scenes" for easy management and switching among them.

If all you need to do is record the raw video from a video device, you don't need OBS.


Game streaming (like Twitch) is an extremely common use case for OBS, and it's a great demonstration of what's possible. A lot of streamers will broadcast a composite of their screen, their webcam (often with green-screen masking), a feed of their chat, and various other graphics. All of this is pretty straightforward to do with OBS, and would be a significant effort to build otherwise.


OBS is essentially a live video production studio. Multiple scenes, multiple sources, title cards, etc. can all be controlled with OBS on the fly as you stream.


"OBS is essentially a live video production studio. Multiple scenes, multiple sources, title cards, etc. can all be controlled with OBS on the fly as you stream."

This. And you can have multiple layouts (scenes) and quickly switch between them. It's a video switcher on steroids.


OBS does much more than let you record with your webcam. You can add a Window Capture source, or a Display Capture source, or set up a combination of different sources (Display Capture with Webcam Capture overlaid in the corner, for example, like most modern tutorials).

OBS is feature rich, the native video recorder is no comparison.


OBS lets you edit video on the fly and then pipe the output somewhere else. For example, you can take the video of you playing a game, add a video of your face from your webcam in the corner, add text overlays of your channel name, and then send the result to Twitch.
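As a rough analogue of what OBS does under the hood, here's a sketch using ffmpeg that overlays a webcam on a screen capture and pushes the composite to an RTMP endpoint (the display, device path, and URL are placeholders for illustration):

```shell
# Linux example: grab the desktop, grab the webcam, scale the cam down,
# overlay it bottom-right, and stream the composite as flv over RTMP.
# :0.0, /dev/video0, and the rtmp:// URL are assumptions.
ffmpeg \
  -f x11grab -framerate 30 -i :0.0 \
  -f v4l2 -i /dev/video0 \
  -filter_complex "[1:v]scale=320:-1[cam];[0:v][cam]overlay=W-w-20:H-h-20[out]" \
  -map "[out]" -c:v libx264 -preset veryfast -pix_fmt yuv420p \
  -f flv rtmp://live.example.com/app/STREAM_KEY
```

OBS essentially manages pipelines like this for you, with a GUI for scenes, transitions, and text overlays.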


It does more than record video. The use case I have for it is applying a green screen to a camera feed, and then being able to pipe that video into an app that doesn't natively support green screens (e.g. Google Meet).


Is the project linked here what you use to add OBS output to a video call or is there another way?

What would I do if I want the enhanced stream to be piped to screen share instead of webcam? Most video chat solutions will show webcam and shared screen differently


VirtualCam was already a thing, but only for Windows (not sure what the backstory is behind the more recent version being maintained by someone else; I think I'm using catxfish's version): https://obsproject.com/forum/resources/obs-virtualcam.539/

OBS has full desktop as well as window capturing capabilities, so if you only want to show the contents of a single window, you would add a "window capture" item to your scene: https://github.com/obsproject/obs-studio/wiki/Sources-Guide#...

...then select "VirtualCam" from the Tools menu, and click "Start" to start sending images to the selected virtual camera. These virtual devices appear in eg. Zoom, so if you select it and enable video within the app, others will see 5 faces and Visual Studio Code having a conversation.

(Of course you can add more sources to the scene, aligning them appropriately, so you can have your face in the corner alongside the captured window.)


OBS and SLOBS are internal capture++. They allow for more complex multi-scene/input (mic and video) setups and let you configure "scenes" composed of multiple things. Think of them as the difference between a stream or recording that is just you or your screen, versus a professional-style multi-input shoot like a news channel with tickers and graphics included.


OBS is primarily for streaming.

It also has a lot of advanced features beyond simple recording of a video source or your screen.

ALSO the name is literally "Open Broadcaster Software"

Maybe take two seconds to investigate what you're talking about before making comments like this. Sheesh.


This is good news.


Use Wirecast instead. It works better.

OBS just isn’t for macOS users. If you record video with it it’ll be corrupted. The performance is bad. It’s for people who stream to Twitch on a Windows computer.

The real problem is that the macOS security model broke virtual cameras in the latest version of Zoom.


Can you substantiate any of your claims? I've used OBS on macOS for a long time and your experience doesn't correspond to mine.

It's true that there are some performance concerns, and hardware encoding isn't available for streaming purposes on macOS due to Apple's Video Toolbox API not exposing the appropriate encoder options.

However, I doubt that OBS would be responsible for corrupted video. It uses the industry-standard x264 encoder — if there's a problem with the video, it derives from (A) your settings, and (B) x264. I'm more inclined to believe A than B.

I'm more than happy to use Wirecast if you'll pay for a licence. I and most other people, mostly amateurs, don't have a spare US$599 lying around.


> If you record video with it it’ll be corrupted

You state this like a fact. I have recorded video, it was not corrupted. OBS works fine for my needs.


I am a happy OBS user on macOS and use it both for streaming and recording.

I never had any issue playing an OBS recording on macOS or uploading it to YouTube.



