FWIW there's definitely some performance optimization that could be done in my plugin. Three things I know of:
1) Potentially an entire framebuffer memory copy could be avoided if we could get CMBlockBufferCreateWithMemoryBlock to work. See the CMSampleBufferCreateFromDataNoCopy method (currently unused -- linked below) in my code. It mostly worked, but the virtual camera video wouldn't show up at full resolution in OBS, which is how I typically test while developing. I'm not sure why it wasn't working; possibly it's an obscure OBS bug. There's a rough sketch of the no-copy approach at the end of this comment.
https://github.com/johnboiles/obs-mac-virtualcam/blob/master...
2) It might also be possible to have the virtual camera advertise one of the pixel formats that OBS supports natively, which would avoid the pixel format conversion on the CPU. I _bet_ this is where the majority of my plugin's performance hit comes from. I'm not sure if that's possible, though; maybe OBS doesn't natively support any formats you can use for virtual cameras. The second sketch at the end of this comment shows one way the OBS side might ask for such a format.
https://github.com/johnboiles/obs-mac-virtualcam/issues/102
3) If #2 isn't possible, maybe the pixel format transformation could happen on the GPU instead? I don't know much about GPU programming, but maybe that would help; the third sketch at the end of this comment shows one hardware-assisted option.
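For #1, here's a minimal sketch of what the no-copy path looks like. This is not the plugin's actual code: the pixel format (UYVY, '2vuy'), dimensions, and timing are placeholder assumptions, and WrapFrameNoCopy is a made-up name. The key trick is passing kCFAllocatorNull as the block allocator so CoreMedia wraps the existing memory instead of copying it:

    #include <CoreMedia/CoreMedia.h>

    // Sketch: wrap an existing frame buffer in a CMSampleBuffer without
    // copying. Assumes UYVY ('2vuy') frames; the real format, size, and
    // timing would come from OBS.
    static CMSampleBufferRef WrapFrameNoCopy(void *frameData, size_t frameSize,
                                             int32_t width, int32_t height,
                                             CMSampleTimingInfo timing)
    {
        CMBlockBufferRef blockBuffer = NULL;
        // kCFAllocatorNull as the block allocator means "don't copy or
        // free"; the caller keeps ownership of frameData.
        OSStatus err = CMBlockBufferCreateWithMemoryBlock(
            kCFAllocatorDefault, frameData, frameSize,
            kCFAllocatorNull, NULL, 0, frameSize, 0, &blockBuffer);
        if (err != noErr)
            return NULL;

        CMVideoFormatDescriptionRef format = NULL;
        err = CMVideoFormatDescriptionCreate(
            kCFAllocatorDefault, kCMVideoCodecType_422YpCbCr8,
            width, height, NULL, &format);
        if (err != noErr) {
            CFRelease(blockBuffer);
            return NULL;
        }

        CMSampleBufferRef sampleBuffer = NULL;
        size_t sampleSize = frameSize;
        err = CMSampleBufferCreate(
            kCFAllocatorDefault, blockBuffer, true, NULL, NULL,
            format, 1, 1, &timing, 1, &sampleSize, &sampleBuffer);
        CFRelease(format);
        CFRelease(blockBuffer);
        return (err == noErr) ? sampleBuffer : NULL;
    }

The catch with this approach is that frameData has to stay alive until the sample buffer is released.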
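For #2, the OBS side does at least have a hook for requesting a delivery format: obs_output_set_video_conversion. Here's a sketch, assuming UYVY is a format the virtual camera could advertise; the helper name and the choice of UYVY are mine, not from the plugin, and I'm not sure whether this avoids the conversion entirely or just moves it into libobs:

    #include <obs-module.h>

    /* Sketch: ask libobs to deliver frames to the output as UYVY ('2vuy'
     * on the CoreMedia side) so the plugin doesn't have to convert pixel
     * formats itself. Call from the output's start callback. */
    static bool request_uyvy_frames(obs_output_t *output)
    {
        struct obs_video_info ovi;
        if (!obs_get_video_info(&ovi))
            return false;

        struct video_scale_info conversion = {
            .format = VIDEO_FORMAT_UYVY,
            .width  = ovi.output_width,
            .height = ovi.output_height,
        };
        obs_output_set_video_conversion(output, &conversion);
        return obs_output_begin_data_capture(output, 0);
    }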
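For #3, short of writing a GPU shader, VideoToolbox's VTPixelTransferSession is one way to hand the conversion to the system, which can use hardware-accelerated paths. Sketch only: ConvertToUYVY is a hypothetical helper, and you'd create the session once with VTPixelTransferSessionCreate(kCFAllocatorDefault, &session) and reuse it per frame:

    #include <VideoToolbox/VideoToolbox.h>

    // Sketch: offload pixel format conversion to VideoToolbox instead of
    // a hand-rolled CPU loop. Source buffer and target format are
    // placeholders.
    static CVPixelBufferRef ConvertToUYVY(VTPixelTransferSessionRef session,
                                          CVPixelBufferRef source)
    {
        CVPixelBufferRef dest = NULL;
        CVReturn cvErr = CVPixelBufferCreate(
            kCFAllocatorDefault,
            CVPixelBufferGetWidth(source),
            CVPixelBufferGetHeight(source),
            kCVPixelFormatType_422YpCbCr8, // '2vuy' / UYVY
            NULL, &dest);
        if (cvErr != kCVReturnSuccess)
            return NULL;

        // The session performs the actual conversion between the two
        // buffers' formats.
        OSStatus err = VTPixelTransferSessionTransferImage(session, source,
                                                           dest);
        if (err != noErr) {
            CVPixelBufferRelease(dest);
            return NULL;
        }
        return dest;
    }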