Hah, we wrote one of these too. It's fun! Our code for it is super clean and documented if you want to learn how to use WebRTC: https://github.com/trailofbits/tubertc
The hardest part is getting the signaling to work correctly over the internet (STUN/TURN/etc), which is why we advertise Tuber as only working on LANs. We tried to get some non-profit funding to finish that part up to no avail. I wonder how Hublin solved that problem?
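(For anyone who hasn't been down that road: "signaling" just means getting the SDP offer/answer and ICE candidates between the two browsers over some channel you provide yourself. Very rough sketch below, assuming a generic WebSocket server and a public STUN server -- hypothetical glue, not how tubertc or Hublin actually wire it up:)

    // Hypothetical glue: WebRTC handles media and NAT traversal,
    // but the signaling channel is yours to build (here, a plain WebSocket).
    const signaling = new WebSocket("wss://signal.example.com");

    const pc = new RTCPeerConnection({
      iceServers: [
        { urls: "stun:stun.l.google.com:19302" }, // public STUN: learn your public address
        // { urls: "turn:turn.example.com", username: "u", credential: "p" }, // TURN relay fallback
      ],
    });

    // Trickle our ICE candidates to the remote peer as they're gathered.
    pc.onicecandidate = (e) => {
      if (e.candidate) signaling.send(JSON.stringify({ candidate: e.candidate }));
    };

    // Caller side: attach local media, then create and send an offer.
    async function call(localStream: MediaStream) {
      localStream.getTracks().forEach((t) => pc.addTrack(t, localStream));
      await pc.setLocalDescription(await pc.createOffer());
      signaling.send(JSON.stringify({ sdp: pc.localDescription }));
    }

    // Both sides: apply whatever the other peer sends over the channel.
    signaling.onmessage = async (msg) => {
      const data = JSON.parse(msg.data);
      if (data.sdp) {
        await pc.setRemoteDescription(data.sdp);
        if (data.sdp.type === "offer") {
          await pc.setLocalDescription(await pc.createAnswer());
          signaling.send(JSON.stringify({ sdp: pc.localDescription }));
        }
      } else if (data.candidate) {
        await pc.addIceCandidate(data.candidate);
      }
    };

Getting that exchange to survive real-world NATs is where STUN/TURN come in, and it's the part that's easy to skip when you only target a LAN.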
I am really glad to see one of these finally go fully open-source. It has been stupid to see conferencing app after conferencing app only available through a hosted website. How are you supposed to deprecate expensive, proprietary enterprise videochat platforms if you can't deploy behind a firewall!?
That is really cool. Thanks for sharing. For someone with no knowledge of WebRTC, it would be helpful to have a blog post that explains the architecture of tubertc and how it functions.
Video chat systems/protocols look to be somewhat stagnant/immature. I just wanted to do 720p 30FPS screen sharing across the internet (Europe to America), and this can be done with $10,000 dedicated hardware encoders and a ridiculously expensive, mile-for-mile dedicated layer 2 network route.
-OR-
Alternatively, we could purchase a ~$200 elgato HDMI capture box, and create a free youtube account to stream 60FPS 1080p across the planet.
Currently there is no in between for high FPS video streaming systems on the internet.
There's also https://jitsi.org/ and https://obsproject.com/ , but they don't seem to allow P2P connections with buffering options. The trade off for low/unreliable bandwidth is allowing larger buffers which would impose greater latency.
If you want a dedicated circuit, then of course it's going to be expensive. But there's really no reason to spend that much money on an encoder, unless you're doing something like integrating an ASIC [1].
> Video chat systems/protocols look to be somewhat stagnant/immature
Eh. There's a lot of work being done with WebRTC, but I'm not going to say that's significantly advancing the state of the art with respect to video transport -- SDP, RTP, DTLS have all coexisted for a long time.
What has happened, though, is that by abstracting away media handling, the traditional "control plane" is being disintermediated. As a result, we're seeing 1) a lot of reinventing the wheel, and 2) more siloing among service providers. Both lead to feature fragmentation among providers and slow the progress of video-based services.
While I'll be the first to admit that SIP (and XMPP and H.323 and BFCP and ...) is rather too involved to be used outside a fully federated telecom environment, it's a bit of a shame that a nicer signaling protocol hasn't really gained any traction in its stead. Having most services work from a higher level baseline could reduce feature fragmentation and speed up the perceived progress of video services. And hopefully encourage more interoperability, but that's probably a lost cause...
[1] I'm aware there are encoders that cost in that range, eg VBrick, and it's crazy -- you don't need to spend that much for video conferencing.
Your two examples are not equivalent at all. Video conferencing is generally designed for very low delays, whereas Youtube live streaming has a many second delay. This is critical to improving quality - not only can fancier video compression schemes be used with a large delay (B frames etc), but larger buffers mean that spikes in bitrate can be smoothed out. This is especially noticeable for screensharing where screen data updates tend to come in huge spikes with lulls of no changes in between.
> The trade off for low/unreliable bandwidth is allowing larger buffers which would impose greater latency
Video conferencing systems mostly prioritize low latency, so as bandwidth decreases (cross-continent internet connections currently almost always have latency/bandwidth issues) either frames will drop, or resolution will need to drop. For our purposes of screen sharing/broadcasting we didn't care about low latency so much as we wanted high FPS and high resolution.
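(For what it's worth, on the browser side you can at least ask for that trade-off at capture time; whether the encoder holds it over a lossy transatlantic link is another story. Rough sketch, assuming a browser that supports getDisplayMedia -- and note these constraints are treated as hints, not guarantees:)

    // Ask for 720p30 screen capture. The browser treats these as ideal values,
    // not hard requirements, and may still downscale or drop frames under congestion.
    async function shareScreen(): Promise<MediaStream> {
      const stream = await navigator.mediaDevices.getDisplayMedia({
        video: {
          width: { ideal: 1280 },
          height: { ideal: 720 },
          frameRate: { ideal: 30 },
        },
        audio: false,
      });
      // Hint that this is screen content where detail matters more than smooth motion.
      stream.getVideoTracks().forEach((t) => (t.contentHint = "detail"));
      return stream;
    }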
> Video chat systems/protocols look to be somewhat stagnant/immature. I just wanted to do 720p 30FPS screen sharing across the internet (Europe to America)
And this can't be done with any of the current clients based on webrtc, such as Hublin?
Maybe it's just me, but I find it a bit ... odd? to have my video conferencing tool running inside the browser.
I have Skype running as a separate application and I have its icon in the Dock and I can switch to it, it runs in the background, etc.
I know that this is the tool responsible for my communication.
My contacts are also running it and it usually starts automatically on their system so most probably they're online.
A conference tool running inside the browser doesn't have these properties. I look at it as a website which I can visit or not.
This, I guess is a problem with all WebRTC implementations out there - it might be great in terms of capabilities, but it's not very practical in terms of heavy daily usage.
You should take a look at tools like https://appear.in. They make it so easy to create ad-hoc conference rooms for people coming from different mediums (for example, sales pitches with many people across the world, organised in under a minute).
These tools are great if you want to pull together several people from Slack, Skype and email (customers) into one call.
Sigh. We used to make a really good tool, but it was all proprietary. It supported up to 20 participants with full-duplex audio and multiple doc sharing. But it was part of a full product for collaborative teams, and we couldn't figure out the sales process fast enough.
It scaled well - hundreds of different conference participants per server node; media nodes that supported hundreds of simultaneous streams. Stream switching was on the order of scores of milliseconds (not seconds like all the webrtc stuff).
Did it work inside the browser? As a Linux user, that's a big plus: not having to be stuck with some lame version like in Skype's case, or not have any working version at all.
No, that's the kiss of death for performant conferencing. At least at the levels we were reaching for. And we only had a headless version of the media engine on Linux, which was used for 'bot testing and for the media node (MCU in webrtc parlance). In other words, the cloud version.
We had a port to Android for a little while. Then a Wine version. But the native Linux one was an orphan.
https://www.freeswitch.org does all this and then some, but does so in a traditional MCU role. It can bring WebRTC and SIP endpoints together in the same video conference, transcoding between VP8, H.264 and H.263.
Only supports up to 8 participants, unfortunately. Continually on the hunt for something that takes >15 (like Hangouts) and doesn't cost a bomb and is truly cross-platform (Linux, Windows, OS X, Android, iOS).
Good question. Until now, with WebRTC, you always needed a STUN, TURN and signaling server [0]. And I would be very surprised if they somehow managed to create a whole new architecture. It certainly seems like these servers are run centrally by Hublin.
In a WebRTC architecture, the media (video call stream) gets sent over a TURN server. I don't see where they say that you will set up your own TURN server.
My understanding is that STUN only coordinates connections between peers. A TURN server is required if that fails. So presumably they don't have a TURN server and only support clients that can peer with each other.
TURN's only needed if both sides have particularly restrictive NATs. There are also some public TURN servers. It's fairly harmless, as the media's all encrypted.
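(If you're curious which path a given call actually took -- direct, STUN-derived, or relayed through TURN -- you can dig it out of the ICE stats. A rough sketch, assuming the standard getStats() candidate-pair reports:)

    // Inspect which ICE candidate pair was actually selected for the call.
    // candidateType is "host" (direct), "srflx" (via STUN), or "relay" (via TURN).
    async function selectedCandidateType(pc: RTCPeerConnection): Promise<string | undefined> {
      const stats = await pc.getStats();
      let type: string | undefined;
      stats.forEach((report) => {
        if (report.type === "candidate-pair" && report.nominated && report.state === "succeeded") {
          const local = stats.get(report.localCandidateId);
          if (local) type = local.candidateType;
        }
      });
      return type;
    }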
I'm curious as well. My understanding is that WebRTC and group chat is a pain when done strictly p2p. I could see perhaps sending a small compressed version of the stream and increasing that when requested (like clicking on someone's feed and making it larger) but even in that scenario it seems like bandwidth could become a problem quickly. Maybe it elects a leader with the most bandwidth and they act as a relay or mux the streams?
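(For reference, the naive strictly-p2p version is a full mesh: every participant keeps a separate RTCPeerConnection -- and a separate upstream encode -- to every other participant, which is exactly why upload bandwidth blows up past a handful of people. Rough sketch of the bookkeeping; sendOffer and renderRemoteStream are hypothetical placeholders for your own signaling and UI:)

    // Placeholders for your own signaling and UI code.
    declare function sendOffer(peerId: string, sdp: RTCSessionDescription | null): void;
    declare function renderRemoteStream(peerId: string, stream: MediaStream): void;

    // Full-mesh group call: one RTCPeerConnection per remote participant.
    // With N participants, each client uploads its stream N-1 times, which is
    // why pure mesh topologies tend to fall over beyond roughly 4-6 people.
    const peers = new Map<string, RTCPeerConnection>();

    async function addParticipant(peerId: string, localStream: MediaStream) {
      const pc = new RTCPeerConnection({
        iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
      });
      localStream.getTracks().forEach((t) => pc.addTrack(t, localStream));
      pc.ontrack = (e) => renderRemoteStream(peerId, e.streams[0]);
      await pc.setLocalDescription(await pc.createOffer());
      sendOffer(peerId, pc.localDescription);
      peers.set(peerId, pc);
    }

The "elect a leader as relay" idea is essentially what an SFU does, just run as a dedicated server instead of on whichever participant happens to have the fattest pipe.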
It is nice to see all these WebRTC solutions appear. For teams wishing to do teleconferencing using any platform (mobile and desktop) this is good news.
I wonder how long before a FOSS self-hosted solution is available?
What I couldn't figure out from http://hubl.in, though, is how Linagora is going to cover the costs of (and presumably raise a profit from) this service. Is it a freemium model, or is the current phase simply intended to garner interest and gain a foothold in the market?
WebRTC is very promising. "Hello" in Firefox is surprisingly stable (although I know a lot of people don't like Mozilla partnering with Telefonica), and on occasion my remote teammates have found that using Hello gives them better screen resolution and latency. In one case, I was able to troubleshoot an end-user problem in real time over "Hello." Typically we'd use Skype or Webex.
Does anyone know whether WebRTC is used in HipChat and Slack video feature?
ps. our logo is better than yours heh: https://blog.trailofbits.com/2015/12/15/self-hosted-video-ch...