If you are a Linux user and own a nice camera, you can use gphoto2 and ffmpeg to create a virtual camera. I posted a howto on HN a couple of days ago[0][1]; here it is for anyone who might need it. I tried it with both a Sony RX100VA and a Sony A7III, and in both cases it works really well.
edit: forgot to mention that this works over USB, so you don't have to pay a crazy markup for a capture card
edit2: (because I'm so excited about getting this to work) here is a list of supported cameras[2] - sadly I was not able to get GoPro Hero 6 to work.
Hey! I was trying to do this a while back, and couldn't make it work.
I followed the instructions in your post, and although it didn't work right away, it gave me the will to make it work ;)
A little advice: I fixed my setup by finding the correct v4l2 device, because video0 was already assigned. If you run:
v4l2-ctl --list-devices
it will tell you which v4l2 device nodes exist on your machine, so you can enter the correct command (that was the only missing piece of my puzzle). If you already have a webcam in your computer, /dev/video0 will already be assigned, and the gphoto | ffmpeg pipe gives overly cryptic messages (it complains about the formats not being correct, when it should complain about the target not being a v4l2 device).
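Putting it together, a sketch of the full pipeline; the /dev/video2 node and the ffmpeg options here are examples, so substitute whatever node v4l2-ctl reports as free on your machine:

```shell
# List v4l2 device nodes; the v4l2loopback device usually shows up
# as "Dummy video device" rather than a real webcam
v4l2-ctl --list-devices

# Pipe the camera's live view into that node; /dev/video2 is a placeholder
gphoto2 --stdout --capture-movie \
  | ffmpeg -i - -vcodec rawvideo -pix_fmt yuv420p -threads 0 -f v4l2 /dev/video2
```

If ffmpeg complains about formats here, double-check the target is actually the loopback node and not a real webcam.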
Shame, because there is a perfectly good standard for cameras: the USB Video Class. Why have all these cameras decided to use a different set of protocols that don't work out of the box on any OS?
I'm guessing this is because proper live view video from a DSLR requires much higher bandwidth than USB can provide. The protocols used for remote capture and download over USB should be more standardized, for sure, but live view seems really hard.
It's actually much more because the DSLR companies, for the most part, are technologically-backwards, and don't get things like platforms, APIs, or similar. It's nineties-style closed thinking. It's a big part of why cell phones are now eating their lunch. I used to be a pretty serious photographer, and own probably $10,000 worth of camera equipment.
I mostly shoot with my cell phone these days, not because I mind spending money on cameras, but because it's a better device for most photography. It integrates with the world. Cameras integrate with their manufacturer's closed ecosystems.
USB Video class allows the device to provide a list of formats supported, and the host to choose one. That seems suitable for the host to manage bandwidth across its USB links, even if other devices are also using bandwidth.
I got a GoPro Hero 8 Black into OBS (Linux, but should work anywhere) by connecting to it using WiFi and its semi-documented api (https://github.com/KonradIT/gopro-py-api) to turn on its UDP video stream, then used that as the media source in OBS. I'm betting your Hero6 will work this way too.
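A rough sketch of that setup, assuming the `goprocam` package from the linked repo and the GoPro's usual WiFi address and port (10.5.5.9:8554); both the method name and the URL are assumptions, so check the repo's docs for your model:

```shell
# Ask the camera over WiFi to start its UDP stream
# (livestream method name per the goprocam README)
python3 -c "from goprocam import GoProCamera; GoProCamera.GoPro().livestream('start')"

# View the stream locally, or point OBS's media source at the same URL
ffplay -fflags nobuffer -f mpegts -i udp://10.5.5.9:8554
```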
Is this possible in MacOS at all? I have an RX100 V and an Elgato Camlink HD, but would love to use that capture card w another cam, and use the RX100 over USB simultaneously.
I'm piping it to ffplay, so this will at least let you test your camera, or you could also use it in OBS as a window source. Also, make sure your camera's USB mode is not set to "Mass Storage" but to "Remote Camera Control".
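As a quick sanity check, the ffplay route needs no virtual-camera plumbing at all, something like:

```shell
# Preview the live view directly; no v4l2 device or capture card involved
gphoto2 --stdout --capture-movie | ffplay -
```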
Looks like both gphoto2 and ffmpeg are available on Homebrew, so it's worth giving it a shot for sure. FWIW, I did end up building my own ffmpeg because the Debian default didn't have NVIDIA support. Want me to give it a try?
Spoke too soon - v4l2loopback-utils is the missing piece on macOS. Found this with a basic search[0], if anyone is enterprising enough to take a crack at it
After going down this rabbit hole myself, I can't recommend this approach.
First of all, it works! And it's pretty cool to not need a capture card. But, for most cameras, you only receive at the resolution of the on-camera screen.
In other words, the video stream gphoto2 receives is intended for a camera remote preview screen. Check your camera's resolution before investing in this as a solution -- my very expensive 4k @ 60fps-capable mirrorless camera only produces a pretty poor 640x480 @ 50fps stream using gphoto2.
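One way to verify this up front is to probe the stream before wiring it into anything; a sketch, with ffprobe reading the piped stream from stdin:

```shell
# Report the actual resolution/framerate the camera's live view delivers
gphoto2 --stdout --capture-movie | ffprobe -hide_banner -i -
```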
Additional video recording features like flicker reduction or IS seem to be lacking through this method as well.
In the meantime, I'm patiently awaiting the delivery of my 4k capture card :)
I'll leave these here for people who stumble upon this answer, but if you were talking about another alternative I'd love to know about it. For now I'll give OBS.Camera a go and see if I like it. Thanks!
Before you return your card, keep in mind that gphoto2 will only receive at the resolution of your camera's display. For example, my camera can shoot at 4k, but gphoto2 only provides 640x480p.
Additionally, many camera features (flicker-reduction, electronic shutter options) are unavailable through this method.
This is really equivalent to a camera remote preview video, not intended to be used for actual video capture.
It is also possible to use the OBS output as a virtual webcam. All you need is v4l2loopback and the v4l2sink OBS plugin: https://github.com/CatxFish/obs-v4l2sink
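For anyone setting this up, loading the loopback module looks something like this; the module options are assumptions that vary by v4l2loopback version:

```shell
# Create one loopback device at /dev/video10, labeled so apps can identify it;
# exclusive_caps=1 helps browsers recognize it as a capture device
sudo modprobe v4l2loopback devices=1 video_nr=10 card_label="OBS Cam" exclusive_caps=1
```

Then point the v4l2sink plugin at /dev/video10 in OBS.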
It works perfectly, and the virtual camera can be used with Jitsi, BigBlueButton and the like :).
Hah, I just followed your guide after stumbling on it on Google yesterday to set up a Canon M50 on Linux Mint! It works incredibly well—head and shoulders above the video quality from a webcam, and now that I'm piping video through ffmpeg there's tons of potential to do some weird stuff with filters and swapping to pre-recorded video.
My Sony a6300 is supported, but I can't get it to work. Doesn't even show up on lsusb when I connect in PC remote mode, much less in gphoto2 --auto-detect. I'm stumped. Too bad, that would have been useful.
Thanks. That's what I did. When set to PC Remote mode, the camera does not show up in lsusb or gphoto2. The other modes work, but don't support capture.
On the off chance that somebody stumbles on this: I got it to work in the end. After the initial experiments, I noticed the camera refused to charge from a bog-standard USB charger (with any cable). Removing and reinserting the battery restored charging functionality and, as it turns out, the camera also started being visible in PC remote mode. Serves me right for only using the regular on/off switch.
Yet another way that my Panasonic GX-1 can't capture video :(. It has a mini HDMI port, so I thought I could do it there, but it doesn't do live-view over HDMI, just playback.
You can also use MediaTek's NDI Virtual Input with some desktop apps (like Skype), but I personally use this with OBS to do two things:
- Send out a composite overlay (screen capture + webcam + lower thirds) on Teams/Skype/etc.
- Send out screen capture from another machine (usually OBS to OBS via NDI and then out via this plugin)
OBS is a lot of fun, but, alas, extremely demanding on system resources in some configurations, enough that I've started considering getting a new machine solely for video conferencing.
This works well, when it works, but at least with my setup it seriously exacerbated performance problems. The older OBS -> Zoom Windows solution of swapping out a video API DLL ("virtual camera") never caused performance problems, but it stopped working when Zoom started integrity checking/whitelisting all libraries[1]. I switched to the NDI solution, which seems to be more "official", but gave up on it: it would consistently work fine for a while, and then the framerate would drop to <2 per second. This was on a reasonably new/high-end machine (X1 Carbon 6th gen) with hardware video encoding in OBS, so it almost seems less like an absolute performance problem and more like some kind of lock contention. I didn't dig into it very deeply, though - it's possible the NDI stack was doing some software encoding I wasn't aware of and that was just too much.
[1] this happened in the middle of all the Zoom-bombing, and I've seen an allegation that Zoom did this intentionally to nerf the OBS -> Zoom pathway because it had been used by many Zoom bombers, but I have no idea if this is true, so don't get out the pitchforks over it.
Ha, OBS was the straw that broke the camel's back on my work laptop, performance-wise. Built an AMD desktop system and it's been glorious - I forgot what I was missing. Now the laptop mostly sits in a bag.
Holy smokeballs, thank you so freaking much!!! This thread caught my eye as I've been wanting to customize my Zoom and MS Teams "stream" (ie. webcam + extra background/overlay goodies). I hate to say it, but I skipped past the OP's project and tried this suggestion of NDI first.
Amazing. It took a good 30 minutes to figure it all out, but I now have the output of OBS (in my case, just the preview itself without needing to stream/record) as a video/webcam input source for Zoom, Microsoft Teams, and within Firefox. Note: doesn't work with Discord or QuickTime's File > New Movie Recording.
And all this with me being on macOS. Not Windows, but macOS. Incredible.
Steps (should work for macOS and Windows, not sure about Linux):
1. Install OBS. Run it, and set up a basic scene for testing (eg. webcam and a text label).
2. Download/install NDI Tools for your OS from https://ndi.tv/tools/#download-tools (note: system restart required). You only need the "NDI Virtual Input" app; on macOS each app had its own .pkg file bundled in the single .dmg archive; on Windows I assume it's an install wizard with checkboxes for each component. Again, only need "NDI Virtual Input" app/component.
3. Run the NDI Virtual Input application installed in step 2. It should live in your systray (without doing anything useful yet).
4. Download/install the obs-ndi plugin for your OS from https://github.com/Palakis/obs-ndi/releases - right now for Windows or macOS it's version 4.9.0 (expand the "Assets" link). There's a 4.9.1 update specifically for Ubuntu/Debian, but I'm not sure how those OSes are supported when there is no NDI Tools for Linux in step 2.
5. Run OBS. If you're lazy and didn't read the GitHub release notes in step 4, starting OBS should pop up a dialog with a direct link to the NDI runtime you also need to download/install; then restart OBS.
6. In OBS, go to Tools > NDI Output Settings. If you want to "clone" the OBS output to the NDI virtual device only when you start streaming/recording in OBS (ie. to disk or to a streaming platform like Twitch), check "Main Output". Otherwise check "Preview Output", in which case your OBS preview will be output to the virtual video device at all times without having to start OBS streaming/recording.
7. In your operating system's systray (ie. top-right on macOS, bottom-right on Windows), you should have an NDI icon living there as started in step 3. With OBS running and configured according to step 6, you should be able to click the systray icon and select that output source as the input source for the NDI virtual device.
8. Open Zoom, Teams, or hopefully other apps which will work. Wherever you configure which input source to use for video/camera within that app's settings, there should be an "NDI Video" source. Select that… and BAM – your OBS canvas is now your input source!!
That took so long to type out, I hope someone manages to make use of it. :)
Yay!!! There were a couple of steps that took me time to debug how to make it work, so I'm ecstatic that at least one person found my steps useful!!! I don't have recent experience with Google Hangouts, so I'm curious to know whether that was a native app (does Hangouts have a macOS app?), or which browser you used (Firefox, Chrome, or…)?
Sadly I can't get this to work on macOS. OBS works, NDI seems to be installed and configured. It even shows up in Chrome, but I can't select it as an option.
Just for you I installed Chrome to check. It worked for me. I wrote a few things to check below; I suspect your problem is item 'c'.
a) Make sure Chrome has access to cameras (System Preferences > Security & Privacy > Camera > Google Chrome checked).
b) In step 6 from my comment, make sure to try "Preview Output" in OBS's Tools > NDI Output Settings, as "Main Output" requires you to start OBS streaming/recording (which you probably don't want to do).
c) After setting up that OBS setting, clicking the NDI systray icon should show a dropdown list with a single item. You actually need to click that item in the list (it will checkmark it). If you don't do this, you'll get a black screen when trying to pull from NDI.
I've been running obs on my 2012 mbp with three to four live feeds at a time. My fans spin up and the cpu gets warm, but I haven't been dropping frames or having any major issues.
No but iirc the terms from @tobi were that the feature needs to be cross-platform and merged into the main OBS codebase. There is work happening on both of these fronts, and absolutely my goal is to get this plugin merged into OBS eventually. We'll figure out some sort of splitting scheme when that time comes.
FWIW there's definitely some performance optimization that could be done in my plugin. Three things I know of:
1) Potentially an entire framebuffer memory copy could be avoided if we could get CMBlockBufferCreateWithMemoryBlock to work. See the CMSampleBufferCreateFromDataNoCopy method (currently unused -- linked below) in my code. It mostly worked but the virtual camera video wouldn't show up at full resolution in OBS, which is how I typically test while developing. Not sure why it wasn't working; possible it's an obscure OBS bug.
2) It might also be possible to get the virtual camera to advertise one of the pixel formats that OBS supports natively which would avoid the pixel format conversion in the CPU. I _bet_ this is where the majority of the performance hit from my plugin happens. I'm not sure if this is possible, however. Maybe OBS doesn't natively support any formats you can use for virtual cameras.
3) If #2 isn't possible, maybe the pixel format transformation could happen on the GPU? I don't know much about GPU programming but maybe this would help.
My experience has been that my MBP gets hot if I try to share a specific window. Switching from a window as a source, to a full screen source has improved OBS performance quite a bit.
Tangentially related, I've seen that behaviour with Google Meet as well. Sharing a specific tab takes a much bigger performance hit than sharing the whole screen.
Tab sharing has code deep into the Blink rendering engine... To the extent that it's actually possible to share a specific <div> or other HTML element, even if it isn't visible! (Not sure if you can do that from javascript, but you can totally do it from C++)
The side effect seems to be that a bunch of the code that prevents the same thing being re-rendered with every frame if it hasn't changed gets bypassed, and I'd bet that kills performance.
What surface were you using your laptop on? IMO Apple made the design choice to be hot as hell on some surfaces as a trade off for other advantages, so maybe some of the fault is not on this software.
The Zoom release notes are a bit misleading. They didn't enable all virtual cameras, but instead a very specific list of virtual cameras. Here's how you can see that list
strings /Applications/zoom.us.app/Contents/Frameworks/nydus.framework/Versions/A/nydus | grep "Developer ID Application"
I wouldn't fault Zoom for being restrictive about what they include in their allow list, since every addition adds risk, but it looks like they've already included closed-source applications from individuals as of 5.0.5. At least my code is open source and auditable!
Doing that grep interestingly includes "Developer ID Application: NewTek, Inc. (W8U66ET244)", which may match the other comment thread[1] talking about NewTek's NDI Virtual Input (note: the top comment there mistyped "NewTek" as "MediaTek"). So using the NDI solution may work with Zoom without editing entitlements.
Because an app can disable loading libraries/plugins signed with a different developer ID certificate, and it seems Core Media IO wants to load the plugin in the actual app process; it doesn't use a helper process.
The limitation is that CoreMediaIO plugins still run in-process (and not just in some daemon but in arbitrary application processes!), when Apple's direction for over a decade has been to move all plugin mechanisms towards an out-of-process model. There's nothing about video plugins in particular that would prevent them from being out-of-process; in fact, most of Apple's CoreMediaIO plugins already have the bulk of the logic in a separate "assistant" process, but the IPC layer is reimplemented by each individual plugin rather than being done generically by CoreMediaIO itself. It's clear what has to happen, but Apple hasn't done it yet.
This is the downside of a walled garden and locked-down devices. Of course many will say it's for security and it's wonderful, but how much freedom are you willing to give up for security?
DroidCam (works on iOS too despite the name) https://www.dev47apps.com/ works great on Windows if you're looking for an alternative on a business / gaming OS and you have a smartphone (or tablet).
I've used it with Zoom / Google Meet / Discord and it's never failed me.
I use Iriun Webcam to use my (leftover) Android phone as a webcam into my MacBook Air and iMac, with Zoom, OBS and Quicktime. It works pretty decently, the only gripe is that I can't control exposure from the phone camera so it needs the right lighting to look nice.
There are also other apps out there, besides DroidCam and Iriun, that support different phone-computer connections.
What was wrong with CamTwist? I've been using OBS+CamTwist to do this for years now. I mean, this will save a couple minutes setup time, but it's not like that wasn't decent.
Canon recently released a driver that lets you use an EOS camera as a USB webcam.
It's available for Mac and Windows.
So you can use it directly as a normal webcam or pull it into OBS and manipulate it with filters and text etc. then use OP's tool to output the OBS processed version of your EOS.
I'm excited about this... but I wonder how many resources it requires to stream 720p 30fps into my older Macs without jitter and dropped frames. I had higher hopes for the Elgato Cam Link, if it were in stock anywhere, since it does the processing on-chip. But that's just speculation, since not a lot of people measure performance on these setups, either because it all works so well on Windows or because they have high-end machines for gaming (also Windows). Very few streamers/reviewers use a Mac, as it is underpowered and undersupported in the video streaming arena. It's really frustrating to realize this right in the middle of this "new normal", where webcams and video capture devices are all sold out globally and one finds himself with a couple of video-worthless Macs.
Ironically Apple came up with FireWire a long time ago to bring (mostly prerecorded) video faster into Macs and now they lag far behind in every aspect related to video. This includes their webcams, which are terrible. Now that's a different story on iOS devices...
Holy macaroni this looks exciting!! Sadly it only supports a rather short list of cameras, their latest and greatest by the looks of things. The whole 5D range normally gets this sort of thing but this time it starts on the IV unfortunately.
I've been using this repo for a while now and it's a champ. It has been making my webcam rock, as OBS does a great job chroma-keying the green screen behind me. Plus I can add effects as I wish.
Thanks for the reminder - I'll make a donation to this person. Hope this becomes the actual implementation and he gets the $10k bounty from @tobi.
I recently tried to do some work on making a virtual camera, and was shocked to see just how difficult it is, let alone making something cross-platform. Anyone know of any projects that are trying to make this easier to hack on?
Overall it _was_ very difficult! Apple's documentation and sample code for CoreMediaIO DAL (virtual camera) plugins are terrible. I just brute forced it for hours trying all sorts of different combinations of things before I got something to work.
Most meeting software already allows you to broadcast your webcam and a screen-share session at the same time. So this is only necessary if you want your webcam feed to be embedded inside your screen-share, and you want your screen-share to be your webcam feed. For example, this sounds like it would be a horrible experience in something like a Zoom gallery view. Am I understanding correctly?
I believe you aren't. OBS + virtual cam means you can run a live, multi-feed show without having to rely on Zoom's auto-switching. Essentially, you can spotlight your feed, use Zoom and other software to feed into OBS, and then have a talking-heads experience with multiple people on the same view at the same time without the jumping jitters between two speakers. You can also then embed anything else you want, like video files, images, text, etc.
Essentially, it lets you run a live production over Zoom, and it is a blast. I've run two shows myself and helped with tech on another two with this setup, and they've all been fun.
Good question. For me, it has just been experience.
I've found that the preview window tends to be glitchy. I don't know what causes it, but I've had OBS crash on me multiple times when using that interface.
Also, the virtual cam allows you to output at a given resolution, but if you want to do the same with the preview window, you need to scale it up, which just eats more desktop space and may also eat more resources.
Finally, it is just a more direct connection. I would rather get a feed directly out of a program than have to go the window-capture route, as that just adds an additional layer of possible bugs, jitter, and failure.
Slight tangent, does anyone know of any virtual "audio" device (in Mac OS or Windows) that you can attach VSTs / AUs to? I know that technically Zoom allows you to share computer audio but a controlled audio device would allow more granular control of the input.
Unfortunately I think you need to use both Audio Hijack and Loopback together. It's a combo from the same developer.
It's quality software, but it's a bit unintuitive to set up. First you create a virtual audio source in Loopback; then with Audio Hijack you can route adjusted audio to this source (you can't route to normal outputs). Also, you need to have Hijack set to "record" for the effects to work.
But after that it works pretty well, and you can add pretty cool things: you can record both the raw input and the processed input and route them to your destination at the same time, or you can mix in other apps, like music, into the recording/routing.
Side note: I've also tried BlackHole and unfortunately didn't have much luck with it. It somehow worked as a Loopback alternative, but I think Loopback and Hijack share the same audio drivers/code, so the integration seems smoother.
Sorry to hijack your thread - I have nothing to help you with on this. Instead, I need some help with my setup, and I wonder if you can help me! I started doing some remote piano classes. I have a MIDI controller and use Logic's piano. I want to play with 'local' latency, but still send the Logic sound via Zoom (or any other app). To do that today, I have to use an 'Aggregate Device', which introduces ~60ms of latency, making it super hard to practice. Did you face this problem as well? If yes, did you find a solution? Thanks!
If you use Loopback (and you may also use Audio Hijack for finer-grained control) you can create virtual devices that can be selected from within Zoom (for example) while being monitored through, say, a pair of headphones.
Here's a crude diagram that shows how this might work:
I believe this would reduce the latency from your keyboard to your monitors, while any lag Zoom has processing the audio (and probably the video) would be slightly increased.
It's not about recording so much as managing a broadcast. You can configure many inputs which include external video (e.g. a webcam), media on the machine (recorded video or images), and screens on the device, and outputting this as a unified stream. This stream may be passed on as a virtual video device, which other programs can use, or to a stream host (e.g. Twitch), or be recorded.
These various inputs can be arranged into "scenes" for easy management and switching among them.
If all you need to do is record the raw video from a video device, you don't need OBS.
Game streaming (like Twitch) is an extremely common use case for OBS, and it's a great demonstration of what's possible. A lot of streamers will broadcast a composite of their screen, their webcam (often with green-screen masking), a feed of their chat, and various other graphics. All of this is pretty straightforward to do with OBS, and would be a significant effort to build otherwise.
OBS is essentially a live video production studio. Multiple scenes, multiple sources, title cards, etc. can all be controlled with OBS on the fly as you stream.
"OBS is essentially a live video production studio. Multiple scenes, multiple sources, title cards, etc. can all be controlled with OBS on the fly as you stream."
This. And you can have multiple layouts (scenes) and quickly switch between them. It's a video switcher on steroids.
OBS does much more than let you record with your webcam.
You can import a Window capture device, or a Display capture device, or set up a combination of different windows (Display capture with Webcam capture overlaid in the corner for example, like most modern tutorials)
OBS is feature rich, the native video recorder is no comparison.
OBS lets you edit video on the fly and then pipe the output somewhere else. For example, you can take video of you playing a game, add a video of your face from your webcam in the corner, add text overlays of your channel name, and then send the result to Twitch.
It does more than record video. The use case I have for it is applying green screen to a camera feed, and then being able to pipe that video into an app that doesn't natively support green screens (e.g. Google Meet)
Is the project linked here what you use to add OBS output to a video call or is there another way?
What would I do if I want the enhanced stream to be piped to screen share instead of webcam? Most video chat solutions will show webcam and shared screen differently
VirtualCam was already a thing, but only for Windows (not sure what is the backstory behind the more recent version being maintained by someone else; I think I'm using catxfish's version): https://obsproject.com/forum/resources/obs-virtualcam.539/
...then select "VirtualCam" from the Tools menu, and click "Start" to start sending images to the selected virtual camera. These virtual devices appear in eg. Zoom, so if you select it and enable video within the app, others will see 5 faces and Visual Studio Code having a conversation.
(Of course you can add more sources to the scene, aligning them appropriately, so you can have your face in the corner alongside the captured window.)
OBS and SLOBS are internal capture++. They allow for more complex multi-scene/input (mic and video) setups and allow you to setup configured "scenes" that are comprised of multiple things. Think of them as the difference between either having a stream or a recording that is just you or your screen versus having a professional style multi-input shoot like a news channel with tickers and graphics included.
OBS just isn’t for macOS users. If you record video with it it’ll be corrupted. The performance is bad. It’s for people who stream to Twitch on a Windows computer.
The real problem is that the macOS security model broke virtual cameras in the latest version of Zoom.
Can you substantiate any of your claims? I've used OBS on macOS for a long time and your experience doesn't correspond to mine.
It's true that there are some performance concerns, and hardware encoding isn't available for streaming purposes on macOS due to Apple's Video Toolbox API not exposing the appropriate encoder options.
However, I doubt that OBS would be responsible for corrupted video. It uses the industry-standard x264 encoder — if there's a problem with the video, it derives from (A) your settings, and (B) x264. I'm more inclined to believe A than B.
I'm more than happy to use Wirecast if you'll pay for a licence. I and most other people, mostly amateurs, don't have a spare US$599 lying around.
[0] https://www.crackedthecode.co/how-to-use-your-dslr-as-a-webc...
[1] https://news.ycombinator.com/item?id=23325143
[2] http://www.gphoto.org/proj/libgphoto2/support.php