Scalable Live Video Streaming Using NGINX and MPEG-DASH/HLS (nginx.com)
140 points by slederer on June 1, 2016 | 52 comments



"but for ease of reading we refer to NGINX Plus throughout"

Uhuh. Suuuure. :p


I also would hesitate to call it "Live" without showing how to adjust for GOP sizes, key frames, B and I frames, etc.


The one thing I've always wanted from nginx-rtmp is the ability to pull from a source input instead of running a separate process with ffmpeg to push the stream over. If they could get that working, my life would quickly become less complicated.


yeah I use nginx-rtmp for local video capture (basically the functionality of a mirror), and the lag introduced by re-encoding, chunking, and starting playback from the beginning of the chunks in the playlist is meh. I think I'm going to transition my project over to WebRTC though, since the mobile device I'm using to display the video can handle that now.


iOS still doesn't support webrtc...


Neither does a lot of Android! Luckily the product I'm using uses https://crosswalk-project.org/ -- that sounded like an ad!


> [I'd prefer] the ability to pull from a source input instead of running a separate process with ffmpeg to push the stream over.

Could you elaborate on this? I don't understand what you mean by 'push' and 'pull' in this context.


Currently, to get the stream from ffmpeg into nginx you have to do the following:

    ffmpeg -re -i input_file -c copy -f flv rtmp://nginx_server_url:1935/app

It would be nice to have the ability to set up an app block that looked kind of like this:

    application source1 {
        exec ffmpeg -f decklink -i 'DeckLink Quad (1)@8' -f flv rtmp://localhost/app/$name;
    }

ffmpeg is still pushing the stream to nginx, but nginx is in charge of starting that process.
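
For context, a minimal receiving-side config for the stock nginx-rtmp-module might look something like this (app name and port just mirror the push command above; this is a sketch, not a full config):

    rtmp {
        server {
            listen 1935;
            application app {
                live on;       # accept live streams pushed into this application
                record off;    # don't write the incoming stream to disk
            }
        }
    }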


I think you might be looking for the exec_pull (or exec_static) directive:

https://github.com/arut/nginx-rtmp-module/wiki/Directives#ex...
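
Roughly, that would look something like this (untested sketch based on the wiki; device, app, and stream names are placeholders):

    application source1 {
        live on;
        # started once by nginx itself at startup, pushing the capture back into this application
        exec_static ffmpeg -f decklink -i 'DeckLink Quad (1)@8'
                           -c:v libx264 -f flv rtmp://localhost/source1/stream;
    }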


I've tried, and it doesn't work so well. I don't think it works for every OS either.


Is there an advantage that an nginx process like that has over streaming servers like liquidsoap?


liquidsoap is audio only, right?

Most streaming servers are bloated and slow. If nginx can do for video streaming what it's done for web servers, it would be great.


Nope, liquidsoap does video. http://liquidsoap.fm/doc-svn/video.html


I wish there were a decent low-latency (< 2s) HTML5 solution. Still, nothing comes close to RTMP + Flash.


The scalability of DASH/HLS comes from the fact that the video segments are just static files sitting on an HTTP server, so they can be cached and distributed with the many techniques for serving static files over HTTP.
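
To illustrate the "just static files" point, the HTTP side of a typical nginx HLS setup is nothing more than a plain static location (paths and the output directory here are assumptions):

    location /hls {
        types {
            application/vnd.apple.mpegurl m3u8;
            video/mp2t ts;
        }
        root /tmp;                            # assumes the rtmp block writes segments to /tmp/hls
        add_header Cache-Control no-cache;    # playlists change constantly; the segments themselves cache fine
    }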

If you don't need the scalability of DASH/HLS you can use the WebRTC apis for low latency streaming, usually <1s. WebRTC can be used for client-server applications just as easily as peer-to-peer using a gateway like Janus[1].

[1] https://janus.conf.meetecho.com/


Yeah, I've looked into WebRTC before, but the browser support isn't good enough yet :(


What browser doesn't support WebRTC but does support flash/RTMP ?


Internet Explorer 9, 10, 11...



Not in browsers; those experiments were done using MP4Client as the client.


HD RTMP to the client is extremely difficult to scale without an enormous investment in infrastructure. AFAIK BAM/MLB.TV are the only ones doing it.


Abusing WebRTC can be a viable option in latency-critical applications, but it brings a bunch of other concerns with it (browser support, no way to leverage a CDN, etc).


The setup seems rather similar to https://github.com/arut/nginx-rtmp-module, which I've used in the past. Perhaps they've bundled it with NGINX Plus without crediting the original authors? The project has a bit too permissive a license for my taste.


Roman Arutyunyan is (or was previously) an engineer at nginx when he developed nginx-rtmp.

https://www.youtube.com/watch?v=1boJWioxsWc


I work at Nginx, although not on the core development team, and to the best of my knowledge Roman Arutyunyan is still an engineer with us.


> Both NGINX and NGINX Plus support the features we’re discussing


But the nginx-rtmp-module does not support DASH with ABR, to the best of my knowledge. It still seems to be an open issue: https://github.com/arut/nginx-rtmp-module/issues/480


This looks cool. I'm very interested in live video but can someone enlighten me on the creation side of this equation?

How do I create an RTMP stream in the first place?

What camera can/should I use? Can I use webcam? Is there an iOS app that can do it?

What hardware/software is needed to create this RTMP stream that I'll be pushing to nginx?


Streamers typically use something like OBS [1] or FMLE [2] to stream their desktop or webcam to an RTMP ingest server like Twitch, YouTube, Streamup, Streamboat.tv, etc. You can even just use ffmpeg (rough sketch after the links below).

[1] https://obsproject.com/

[2] http://www.adobe.com/products/flash-media-encoder.html
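
For the ffmpeg route, something along these lines works on Linux (device names, encoder settings, and the ingest URL are all assumptions):

    ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default \
           -c:v libx264 -preset veryfast -g 50 \
           -c:a aac -ar 44100 \
           -f flv rtmp://live.example.com/app/stream_key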


ffmpeg, VLC, or OBS Studio (this one is probably easiest; you can just add an image or a webcam source, etc., then configure an RTMP URL to push to)


Wirecast from Telestream is a good paid option as well.


Wowza is another one. It scales very well.


Lots of people in here commenting about nginx plus, but can anyone recommend a free alternative to BITMOVIN?


I wondered the same thing, and through some digging found videojs-contrib-dash [1], which uses dash.js and is referenced on the Bitmovin site [2]. I'm going to give this a go and see what does and doesn't work.

[1] https://github.com/videojs/videojs-contrib-dash

[2] https://bitmovin.com/mpeg-dash-open-source-player-tools/


video.js with the HLS plugin is fine if you don't need DASH (there may be a Flash fallback), but obviously that's very bare bones. It seems like ad support, among other features, is something you get with the suggested player.


Anyone have any insight on RTMP + HLS / MPEG-DASH -> WebRTC? A la www.beam.pro (what they call FTL)?


Is there a way to mux subtitles written on the fly?

For example, I want to mux a subtitle stream, with the real time clock in it, into a video stream.


ffmpeg can do that fairly easily. To adjust the command in the example, you need to take the .srt file and pass it as a second -i input file. There are some helpful guides on how to do that if you search around; rough sketch after the link below.

e.g. https://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/s...
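
Something like this, as a rough sketch (file names are placeholders; text subtitle support depends on the container, e.g. MP4 wants mov_text):

    ffmpeg -i input_file -i subs.srt -c:v copy -c:a copy -c:s mov_text output.mp4

For a live FLV/RTMP output there's no text subtitle track, so you'd typically burn the subtitles in with the subtitles filter instead (-vf subtitles=subs.srt, which needs an ffmpeg built with libass).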


But I don't have a subtitles file. I would need it to read from a stream, not a static file.


Try having it read from a fifo instead.


But how? I've searched the documentation.


How does this compare to Wowza?


It's not running on Java so it should be faster with the same hardware.


There may be a lot to complain about with Java, but it is insanely efficient when it comes to serving data out of a network port. Wowza will scale well beyond most companies' needs; I've used their stuff for years and it is the one program that convinced me that Java actually can be very efficient for applications like these.


Basically the same. I find nginx + ffmpeg to be easier and simpler than deploying Wowza. As well as free-er.


Working on a similar product, but using peer-to-peer live video streaming. Since the data does not touch the server, there are no scaling problems.


There are a couple of cool P2P companies out there like http://www.streamroot.io/ or https://www.peer5.com/


Who's the target market? I think I'm a bit jaded, as my first thought was that this would open the streamer up to getting DDoSed every time they try to stream.


One-to-many or one-to-one?


one to many


FYI: the RTMP module broke with the recent nginx 1.11.0 release.




