Welcome to Project Lightspeed. This is a project that allows anyone to easily deploy their own sub-second latency live-streaming server. In its current state you can stream from OBS [1] and watch that stream back from any desktop browser.
This has been a super fun project which has taught me more than any other project I have done. It uses Rust, Go, and React and can be deployed fairly easily on a very lightweight server. For example, I have been doing my test streams on a $5 Digital Ocean droplet and the CPU usage sits at around 20%. Granted, not a lot of people are watching, but it is still more lightweight than a solution such as Janus.
The point of this project is twofold. First, I wanted to learn more about WebRTC and real-time communication in general. Second, I wanted to provide a platform where people can set up their own little live-stream environment. Maybe you just want a place where you and some friends can hang out, or you are sick of the mainstream platforms.
Anyhow, as of writing this post it is v0.1.0 and considered an MVP release. In the coming months I (and hopefully some of you :)) will be adding more features and trying to flesh this out into as much of a full-featured platform as possible. Feel free to take a look at the repo and let me know what you all think :)
Awesome! I was looking for something like this when trying to play a local-multiplayer game over the Internet during an early lockdown.
There are, or were, no good turnkey solutions for this. Twitch and YouTube have 5-10s latency, which is often not good enough. Mixer promised (and presumably delivered) ~1s latency using the FTL protocol you use, but they had a wait list of a couple of days or weeks, and of course now they don't exist anymore. Even Steam Play Together, ostensibly built for this purpose, wasn't low-latency enough in my limited experience (this really surprised me, so maybe I'm doing it wrong).
The easiest solution, using the desktop-sharing function of whatever video-conference tool, almost works, but they universally seemed to reduce the frame rate, which is OK for presentations but unsuitable for games (also, no audio). My solution was to output OBS to a virtual webcam device and use Jitsi Meet. A bit roundabout, but it worked wonderfully.
Ideally, I'd forgo the DO droplet and just run everything locally. 20% of a small droplet is even less of a modern desktop computer's CPU. That leaves upload bandwidth for broadcast, which depends on your connection and how many people you need to be able to stream to.
Yes, I forgot about Parsec; that's a good suggestion. I remember trying it and not getting it to do what I wanted; unfortunately, I don't remember why. I think I was stuck in the "Arcade" when all I wanted was to share my desktop or one window. It certainly looks like exactly what I was looking for.
In-home streaming with Parsec, for me, over a MoCA/Ethernet connection typically has 1-2ms of network latency. Over Wi-Fi in-home it's more, closer to 20-30ms with a mediocre laptop Wi-Fi card. Playing online with my brother, who lives 35 miles away and uses an Ethernet connection, I typically see 15-25ms latency, not much worse than a 'meh' Bluetooth controller. It's likely worth noting that my brother and I both have the same cable internet provider, but we also sometimes play with my brother-in-law, who lives another 40 miles from me (~60 miles from my brother), and we can all play games like Streets of Rogue together from my brother's PC without issue.
Wow, that is better than I'd hoped! Thanks so much for your response! (I know it's just an anecdote, but because I'm looking to see what its perf limits might be, even one data point like this is very helpful.)
> Even Steam Play Together, ostensibly built for this purpose, wasn't low latency enough in my limited experience (this really surprised me, so maybe I'm doing it wrong).
I've had good experience with Steam Play Together, mostly playing Unrailed (a hectic game in the style of Overcooked). I definitely forget about the remote connection while playing. We were 1000 km apart, but had quite a good connection (100 Mbit, 15-20 ms ping).
I have a pretty low-latency setup for that but it wasn't completely turnkey. First you set up nginx with the rtmp module[1]. Then you can use OBS to stream your desktop to the RTMP server. I set OBS to send a keyframe every 1 second.
On the client side you have two options:
1. For low-latency game streaming, I would suggest watching through the RTMP stream. The RTMP module for nginx will re-broadcast your RTMP stream to all the clients that connect. I was able to get a latency of around 1 second by watching through:
I would expect better latency from a webrtc solution like Lightspeed but 1 second latency is pretty good for only having to install nginx.
2. HLS/DASH. The nginx RTMP module will also expose the video stream as HLS/DASH, which is just cutting the stream up into files and serving them over HTTP. Personally I set my segment size to 1 second and my playlist size to 4 seconds. Through this I get approximately a 4-second latency. Not great for competitive multiplayer games like Jackbox, but if you're playing something like a world-building game with friends then it's acceptable. The real benefit to HLS/DASH is you can easily watch it through an HTML5 web video player or even a Chromecast[2].
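For reference, a rough sketch of what the nginx-rtmp-module config for the setup described above could look like (paths are illustrative; the 1s fragment and 4s playlist lengths match the values mentioned):

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;                  # accept RTMP publishers (e.g. OBS) and relay to connecting viewers

            # HLS output: 1-second segments, ~4-second playlist
            hls on;
            hls_path /var/tmp/hls;    # illustrative path; a tmpfs mount works well here
            hls_fragment 1s;
            hls_playlist_length 4s;

            # DASH output is analogous
            dash on;
            dash_path /var/tmp/dash;
        }
    }
}
```

Viewers on option 1 connect to the `live` application directly over RTMP; viewers on option 2 fetch the playlist and segments from whatever HTTP server you point at the HLS/DASH directories.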
Bits you can add on top:
- I put my HLS/Dash directories in a tmpfs mount for speed and reduced wear on the drives
- I put the nginx stream module in front of my rtmp module so that it can handle TLS (making it RTMPS)
On FreeBSD it was just a checkbox in the nginx port, so the work involved may vary by distro.
[2] I haven't attempted to play the RTMP stream through Chromecast, so for all I know that might be supported too. All I've tested so far on Chromecast is an HLS stream using the "castnow" CLI program. The Shaka player, which is a web player, will support chromecasting an HLS stream from your browser, but I've only tested their demo videos, not my personal streams, and I had to use official Google Chrome, not Chromium; it worked on both Android and Linux.
I confirm that Mixer delivered on the sub-second latency claims. As far as I know, FTL's performance is in line with WebRTC's performance. As long as the servers in-between are fast, a good WebRTC implementation should match it.
Unfortunately, with Mixer's death, I don't think there are any major turnkey players left with sub-second streaming. I'd probably use Discord as a primary alternative, which uses WebRTC with Discord's servers in between.
If your home connection has the bandwidth to support the load of multiple users, a service that does direct P2P like Parsec will probably give the best performance.
From my experience it largely depends on the stream - for some I can easily get <2 seconds, others will be >10s. I'm not sure what causes this difference (ingest server?).
We were playing the Jackbox series of games together, the other folks were participating in the game with their phones. There are various minigames with 5-50 second timers, so 10s latency is a lot. Some of the games have a special streaming mode which extends the timers, but not all of them and it's best played with regular timers anyway. Obviously for true action games, you absolutely need sub-second latency, preferably <100ms.
Discord mostly works for my friend group to play Jackbox games, though sometimes it's still noticeably slow, so OP's project is definitely an improvement.
Except that for Jackbox Party games it's by far the best fit. It's even the recommended way to play online by Jackbox themselves, and I've hosted Jackbox via both Zoom and MS Teams and it worked perfectly fine that way.
Other online games wouldn't fare so well, but the dropped frame rate in Jackbox Party games does not hamper the playability of their games at all.
They recommend doing it that way, because what else are they going to do, post a tutorial on how to do it via OBS? I don't think so.
Maybe Zoom and MS Teams offer(ed) better fidelity. For one thing, Zoom lets you share desktop audio along with the screen. In fact, apparently these days Jitsi can do that too; that definitely wasn't possible when I tried it early last year. At that time, at least, the experience of OBS -> Jitsi was definitely much better than just Jitsi. (And note that all of this was on Linux.)
You scoff, but they did link to a tutorial on how to do it via OBS in their guide[1]. They just made video conferencing the first suggestion.
Also in that guide were Discord and Steam Remote Play. It's a surprisingly technical guide (but in a good way) considering the average audience that might read it. It feels to me that some genuine thought went into that document.
> Maybe Zoom and MS Teams offer(ed) a better fidelity.
Maybe. Anecdotally, I've not had any issues with Zoom, whereas Google Meet often feels like it's both heavier on the CPU and the feeds seem worse. However, that's running Meet on Firefox (Linux); it might perform better in Chrome.
> They recommend doing it that way, because what else are they going to do, post a tutorial on how to do it via OBS? I don't think so.
I think you're being rather unfavourable there. The Jackbox developers have been pretty responsive to user feedback in the past. For example, Linux support was added after several requests on Steam forums. They've also added other features like subtitles specifically for streaming via video-conferencing solutions. So if Zoom / Teams / etc didn't work well, then you can bet they'd have posted another workaround and/or a game patch, since the alternative is they'd lose a lot of potential business in 2020.
As I said, I'd used it fine over both Zoom and Teams (multiple times on both, in fact), and the only reason I even bought Jackbox Party games was because several different work colleagues (I think it might have been 3 different people) recommended it to me after they had played their own games (individually) over Zoom and other conferencing solutions.
I don't have any experience with Jitsi, so maybe the issues you were having were Jitsi-specific? Maybe, being a techie, Jitsi was already "good enough" but you thought you could improve upon it a little and ended up over-engineering a solution? (We've all fallen into that trap; when you spend your entire life building enterprise solutions it's sometimes hard to take a step back, particularly when it's something as fun as OBS.) Or maybe there was some issue with Linux? All I know is that myself and everyone I know have had zero issues hosting using Jackbox's recommended approach.
The best streaming experience for both streamer and viewers is when they can interact, and any latency over 500ms or so makes that a true challenge if you're trying to have a conversation where context is important.
Being an introvert that doesn't like it when people pay attention to them at all for any reason, I haven't really experienced this, but that's what everyone says.
There is no realtime interaction on stream anyway if you have more than a handful of active chatters. And lag is high by default if the streamer is doing more than just chatting, like playing a game, building some stuff, or reacting to a video.
I don't think you know just how low latency WebRTC is.
CPU usage on the streaming PC does not increase latency unless the PC is severely under-spec. CPU usage increases CPU usage; that's it. Encoding usually happens on a GPU, and scene composition happens on the CPU, which is either a zero-copy routine or a very fast memcpy.
My point is that from what you're saying, it seems clear to me that you are not aware of just how good WebRTC is at this kind of thing.
I hardly ever see use cases for live streaming where latency doesn't matter... the only one that comes to mind is non-interactive television? But this is the Internet, and people usually want live responses and chat with the audience... the difference between even two seconds of latency and sub-second latency fundamentally changes how the audience interacts with you.
I'll bite. I play D&D remotely with my friends. I need to be able to have low latency voice and video communication, but I also need control over the audio codec and bit rate. Zoom and other video conferencing solutions use codecs optimized for voice, which makes music and sound effects sound like a Himalayan AM radio broadcast. Twitch and Youtube give me control over the video and audio quality, but the latency is 5+ seconds even on low latency mode. I tried running voice over zoom and video and music over youtube, but then drawing on the map is 5+ seconds out of sync with me saying "look here".
When you are in voice with (some of) your viewers, having a 10s delay is shit for everyone. A lower delay is just a better experience for any kind of viewer input. That aside, it's easily possible, so this whole "why do you want this, you don't need this" smacks of Apple tech support - it's nice that you don't need it, but evidently you are not the only use case on earth.
This is patently gaslighting. It is on me to try to read you in a positive light and it is on you to write language that supports what you are thinking.
I think it was a matter of what libraries were available where. Namely, lightspeed-webrtc uses the extremely popular and robust Go library Pion[1] for WebRTC. It's a little over 500 lines.
The Rust lightspeed-ingest[2] server is also ~500 lines of code, and primarily handles the handshake of the FTL protocol used to communicate with OBS.
There is a Pion port to Rust[3] that is in progress. I am not sure of the state of this work. Pion is used quite extensively by many, many projects; I'm not sure if the Rust webrtc-rs port has any notable users yet. As I began by saying, I expect the trustability and extensiveness of Pion is what led to lightspeed-webrtc being written in Go.
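For a flavour of why a ~500-line Go server is feasible, here is a minimal sketch of generic Pion v3 usage (this is illustrative, not Lightspeed's actual code, and the track/stream IDs are made up): a peer connection plus a local track that accepts raw RTP packets directly.

```go
package main

import (
	"fmt"

	"github.com/pion/webrtc/v3"
)

func main() {
	// Create a peer connection with default settings (no STUN/TURN configured).
	pc, err := webrtc.NewPeerConnection(webrtc.Configuration{})
	if err != nil {
		panic(err)
	}
	defer pc.Close()

	// A local track that accepts raw RTP packets. An RTP-native ingest (like
	// FTL) can write its packets here directly -- no transcoding or
	// repacketizing needed.
	videoTrack, err := webrtc.NewTrackLocalStaticRTP(
		webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264},
		"video", "lightspeed-sketch", // illustrative track ID and stream ID
	)
	if err != nil {
		panic(err)
	}
	if _, err := pc.AddTrack(videoTrack); err != nil {
		panic(err)
	}
	fmt.Println("track added; ingested RTP would be forwarded via videoTrack.Write")
}
```

A real server would still need signaling (the SDP offer/answer exchange with each browser) before media flows; this only shows the track plumbing.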
Hoping to see OP answer this; in particular, I would like to see them comment on how they divide the codebase between these two languages: which parts are implemented in which language, and why.
You should know the OBS team plans on deprecating FTL, since Mixer was the only major player who ever used it, not to mention that the server side of the tech is closed. The FTL implementation in OBS is buggy, and keeping it maintained is not worth the effort for a non-standard transport protocol.
Yes, I am aware; however, there is a new service called Glimesh that is utilizing FTL, so I don't think it is going to disappear tomorrow. Also, I have implemented the server side, so it does not matter that it is closed.
Well, I think it's possible FTL will be gone from OBS by the end of 2021 so I guess they need to figure out what they are doing sooner rather than later. There will probably be a post about it on the OBS Github soon.
The ingress component is interesting to me. It takes the OBS stream, via the FTL protocol, and converts it into something for WebRTC to use, yes? What drove you to use the FTL protocol for ingestion? Did you consider alternatives like RTMP, which I believe OBS also supports?
I hadn't heard of FTL before. Apparently it was a protocol used in Microsoft's now-defunct game-streaming service, Mixer. I found some discussion of the various streaming protocols here[1], which included some description of FTL.
I guess it comes down to latency. That would still have left SRT on the table, yes?
Great project. Such a key area of connectivity for us all. So glad you did this. Thanks.
I went with FTL instead of RTMP for the sake of latency. Also, FTL gives me a stream of RTP packets which can go directly into WebRTC, meaning I have to do zero processing of the packets, whereas with RTMP I would have to convert them into RTP packets. SRT is interesting, but it is wildly complicated and does not use RTP, meaning I would have to figure out how it works and then convert whatever it gives me into RTP packets for WebRTC.
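For context on why an RTP-native ingest is so convenient: the RTP fixed header (RFC 3550) is only 12 bytes, so a relay can read what it needs and pass the packet on untouched. A minimal sketch in Go (field and function names are mine, not from Lightspeed's code):

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
)

// rtpHeader holds the fields of the 12-byte RTP fixed header (RFC 3550).
type rtpHeader struct {
	Version     uint8
	Marker      bool
	PayloadType uint8
	Sequence    uint16
	Timestamp   uint32
	SSRC        uint32
}

// parseRTPHeader decodes the fixed header; it ignores CSRC entries and
// header extensions for brevity.
func parseRTPHeader(pkt []byte) (rtpHeader, error) {
	if len(pkt) < 12 {
		return rtpHeader{}, errors.New("packet too short for RTP header")
	}
	return rtpHeader{
		Version:     pkt[0] >> 6,          // top 2 bits of byte 0
		Marker:      pkt[1]&0x80 != 0,     // top bit of byte 1
		PayloadType: pkt[1] & 0x7F,        // low 7 bits of byte 1
		Sequence:    binary.BigEndian.Uint16(pkt[2:4]),
		Timestamp:   binary.BigEndian.Uint32(pkt[4:8]),
		SSRC:        binary.BigEndian.Uint32(pkt[8:12]),
	}, nil
}

func main() {
	// A hand-crafted packet: version 2, payload type 96 (dynamic range).
	pkt := []byte{0x80, 0x60, 0x1A, 0x2B, 0x00, 0x00, 0x30, 0x39, 0xDE, 0xAD, 0xBE, 0xEF}
	h, err := parseRTPHeader(pkt)
	if err != nil {
		panic(err)
	}
	// prints: v=2 pt=96 seq=6699 ts=12345 ssrc=DEADBEEF
	fmt.Printf("v=%d pt=%d seq=%d ts=%d ssrc=%08X\n", h.Version, h.PayloadType, h.Sequence, h.Timestamp, h.SSRC)
}
```

With RTMP, by contrast, the media arrives wrapped in FLV-style messages over TCP, so a server has to depacketize and re-wrap it as RTP before WebRTC can use it.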
This project is pretty cool, I’ll tinker around with it tomorrow.
As an aside, I’ve noticed you’re building out your own stream protocol stack (FTL/LightSpeed). What’s the reasoning there? Seems slightly inconvenient to have to “hack” OBS to make the output stream work. Will FTL support be merged into OBS in the future?
If you're just trying to avoid the latency of RTMP then I might suggest considering the existing SRT protocol[1]. It's been open source for a while and is well-established (native support in OBS core and optional in FFmpeg). It seems to already solve a lot of the transport-level stuff that you're working on with FTL.
So FTL is supported by OBS and was used by Mixer. I'm interested in moving to SRT in the future since you're correct, FTL support will be going away eventually.
Also, the work required to adapt what I have to SRT is non-trivial, and I would rather have something that works right now and then build in SRT support in the future.
Got it. I’m fresh on FTL, first I’ve really dug into it so apologies for the ignorance. I’m in the industry of stream transport and mostly work with SRT.
Yes to the first half of this at least (KCP is not something I've heard of).
Having recently attempted to use SRT for an event's backend restreaming stack, the low latency was nice but it's a pain in the ass. It's really not designed for links where latency isn't known and consistent. You have to bake an expected latency amount into the initial protocol negotiation or you'll end up with problems, and OBS's support is quite poor (failure to establish a connection for whatever reason is likely to freeze it up completely, it sucks up a lot of CPU vs. RTMP, etc.).
And the other end of the stack is either pretty immature and kind of wonky (Haivision's own software, srt-live-server), or requires you to pay to use it, or is very closed source.
WebRTC of some sort is definitely the future of this, imo. Even if the stack kind of sucks right now, the results are fabulous (Discord's video streaming, for example, is WebRTC-based and is easily the lowest-latency free screen sharing I've seen outside share-my-desktop stuff like Parsec or RDP).
It looks like this uses the already-built-into-OBS support for the FTL protocol Mixer used and Microsoft killed. That's actually really clever, and honestly I think this is far more appealing than SRT.
What sucked about the 'WebRTC stack'? I think the situation is much better these days. We have multiple options; SRT only has one implementation, with lots of bindings. With WebRTC you could use any of these!
Any way to push non-OBS content to Lightspeed? I've been trying to run a sub-second latency game-livestream (ala twitch plays), and I'd rather run it on a headless server without OBS.
It should work, as long as you're sending it to the server over FTL[1], which is a pretty new and uncommon protocol developed by the now-defunct Mixer.
Would it be possible to use WebRTC to have listeners become seeders/repeaters, so you could have more listeners without impacting your own CPU too much? Something like AceStream.
A very interesting project. Can you elaborate more on how it is getting sub-second latency, and why YouTube/Twitch seem to have more than a few seconds of delay?
HTTP video streams are split into segments, and those segments are delivered whole. Larger segments are easier to cache and scale to a bigger number of viewers. The other factor is that YouTube and Twitch spend more CPU time on compression to achieve lower bitrates.
YouTube and Twitch use RTMP, which operates over TCP. TCP guarantees in-order delivery, so a lost packet stalls everything behind it until it is retransmitted, which adds latency overhead. Lightspeed uses the FTL protocol, which operates over UDP, thus reducing that overhead.
So, assuming you have packet loss, you just get a paused/blank stream until the flow continues? How does it handle any network issues?
Would it ever be possible to route two streams via different paths to the client, and let the client just accept the first packet from either and drop the other, in order to add some redundancy to delivery?
This wouldn't actually solve anything, since WebRTC can handle packet loss. The loss is going to be coming from OBS -> server, and unfortunately there isn't much I can do about that since it uses UDP.
Multiple streams are BAU with RTP - using SMPTE 2022-7, or most of the time just firing the same packets at different times over different routing tables.
Sometimes network paths die. This could be a dodgy router in a third party network that drops streams for 150ms at a time, or a bgp recalculation that knocks it out for maybe a minute or so.
In both cases you need to have multiple routes to keep your latency low.
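The "accept the first copy, drop the rest" idea described above is essentially what an SMPTE 2022-7 style receiver does. A minimal sketch in Go, deduplicating by RTP sequence number (names are mine; a production receiver would use a bounded sliding window, since the 16-bit sequence number wraps around):

```go
package main

import "fmt"

// deduper drops packets whose sequence number has already been accepted.
// NOTE: an unbounded map is fine for a sketch, but a real receiver needs a
// sliding window because uint16 sequence numbers wrap.
type deduper struct {
	seen map[uint16]bool
}

func newDeduper() *deduper {
	return &deduper{seen: make(map[uint16]bool)}
}

// accept reports whether this is the first copy of the given sequence number.
func (d *deduper) accept(seq uint16) bool {
	if d.seen[seq] {
		return false
	}
	d.seen[seq] = true
	return true
}

func main() {
	// Simulate the same stream arriving over two network paths, one lossy.
	pathA := []uint16{1, 2, 4, 5} // packet 3 lost on path A
	pathB := []uint16{1, 3, 4}    // slower path, but it carries packet 3
	d := newDeduper()
	var delivered []uint16
	for _, seq := range append(pathA, pathB...) {
		if d.accept(seq) {
			delivered = append(delivered, seq)
		}
	}
	fmt.Println(delivered) // first-seen copies only: [1 2 4 5 3]
}
```

The point is that either path alone would have a gap, but merging both and keeping the first copy of each packet papers over a dead or flapping route with no retransmission round-trip.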
[1] Open Broadcast Software (OBS): https://obsproject.com/