
3.8Mbps 1080p is hardly 1080p. Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

Ideally in the future you would A) be broadcasting directly to the other participants rather than going through Zoom's servers (which might multiply your upload bandwidth needs, since you send a copy per participant), and B) be broadcasting at several 10s of Mbps per stream.

I definitely prefer higher bandwidth vconf. I’m not worried about blemishes, but I want to see people’s minute facial expressions better.




> Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

That's so incorrect I don't even know where to begin.

I regularly record my own "talking head" in 1080p from OBS in h.264 at CRF 14 with the "ultrafast" preset, which is extreme quality/space overkill (the image is literally indistinguishable from the original, and CPU usage is very low), and that averages 34 Mbps.

I then immediately recompress the resulting enormous file to a still extremely high-quality CRF 21 in "veryslow" to be suitable for editing and subsequent recompression later, and that results in an average of 2.4 Mbps.

For comparison, high-quality 1080p h.264 movie and television releases on torrent sites are usually around 5 Mbps. The difference is that half my frame is a plain/stationary background while movies and TV have complex backgrounds, hence double the bitrate.

I have to ask -- where did you get the idea that "high quality 1080p is in the 200-400Mbps range"? That's off by two entire orders of magnitude. It's almost all the way to raw uncompressed 1080p, which is 1,500 Mbps.
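
(Back-of-the-envelope check of that raw figure, sketched in Python and assuming 8-bit 4:4:4 at 30 fps, which is what gets clarified further down the thread:)

    # raw, uncompressed 1080p: 8-bit 4:4:4 at 30 fps
    width, height, channels, bit_depth, fps = 1920, 1080, 3, 8, 30
    raw_mbps = width * height * channels * bit_depth * fps / 1e6
    print(raw_mbps)  # ~1493 Mbps, i.e. roughly the 1,500 Mbps quoted above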


> high-quality 1080p h.264 movie and television releases on torrent sites are usually around 5 Mbps.

BRRips are frequently 30+GB for a 1-2 hour movie. Do the math. You’re off by like 5-8x.
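
(The math, sketched in Python with an assumed 2-hour runtime so the comparison against the 5 Mbps figure is explicit:)

    # 30 GB of video + audio spread over a 2-hour movie
    size_bits = 30e9 * 8
    duration_s = 2 * 3600
    avg_mbps = size_bits / duration_s / 1e6
    print(avg_mbps)  # ~33 Mbps, i.e. roughly 6-7x the 5 Mbps claimed above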

> I have to ask -- where did you get the idea that "high quality 1080p is in the 200-400Mbps range"?

From actually filming and editing video.

Keep in mind that real-time encoders (such as the one in a video recorder, or the one Zoom has to use for reasonable latency) are pretty constrained and will generally achieve worse ratios. If you need to hit 3Mbps in real time on a laptop, your only option is basically to quantize the shit out of the video. Releases that can encode slower than real-time can use longer GOPs, B-frames, etc.

> It's almost all the way to raw uncompressed 1080p, which is 1,500 Mbps.

10bit 444 1080p60 is 3.7Gbps.
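
(Same kind of back-of-the-envelope check, in Python:)

    # raw, uncompressed 10-bit 4:4:4 1080p at 60 fps
    width, height, channels, bit_depth, fps = 1920, 1080, 3, 10, 60
    raw_gbps = width * height * channels * bit_depth * fps / 1e9
    print(raw_gbps)  # ~3.73 Gbps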


> BRRips are frequently

That's something entirely different, not what I was talking about. Also, if a Blu-Ray gives you the space, there's no reason not to use it. That doesn't mean you need it, which is precisely why the rips that people commonly share don't use it.

> From actually filming and editing video.

That's for recording frames independently for editing on a professional camera. Not for anything you'd ever transmit live to a consumer in a million years.

> will generally achieve worse ratios

Worse than 200-400Mbps? What are you even talking about? Even an iPhone encodes to just 16 Mbps. Which is definitely not greater than 400.

> basically to quantize the shit out of the video.

Looks fine to me. It doesn't need to be lossless. It just needs to be good. I've never heard anyone complain about an iPhone "quantizing the shit" out of their video. To the contrary, people love iPhone-quality video.

> 10bit 444 1080p60 is 3.7Gbps.

Obviously I'm talking about 8-bit 30fps (444).


Where you go wrong here is that a webcam image is pretty stable. Same background, person moving a little.

Sure, it can't be as efficient as a non-real-time compressor, and will therefore need more bandwidth, but high compression ratios on such an image aren't difficult. It's not valid to compare professional film work, or even BluRay encodes, to what's required for your typical home video call.


That’s a fair point.


>BRRips are frequently 30+GB for a 1-2 hour movie. Do the math. You’re off by like 5-8x.

No. First, "1-2 hours" is itself a factor-of-two spread. 15GB for a 2-hour movie is realistic at very high quality (CRF 18 or 19), which works out to ~15Mbps of video once you account for the audio. Going further does not increase quality, but does increase file size.
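
(Worked out in Python, assuming a ~1.5 Mbps lossy audio track purely for illustration:)

    # 15 GB over a 2-hour runtime, then subtract an assumed audio track
    total_mbps = 15e9 * 8 / (2 * 3600) / 1e6  # ~16.7 Mbps for video + audio
    audio_mbps = 1.5                          # assumed, for illustration only
    print(total_mbps - audio_mbps)            # ~15 Mbps left for the video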

It seems to me that you want to transfer intermediate-codec (or even raw) footage over the Internet.


What kind of camera is recording 1080p at 200-400Mbps?

For example, at https://www.red.com/recording-time, if I put in 8K, 60fps and choose the lowest compression ratio, it still says it will only use 270Mbps. At 2K the highest bitrate seems to be 41Mbps.


> Decent quality starts at several 10s of Mbps, and high quality 1080p is in the 200-400Mbps range.

What? No. 10 Mbps is already plenty for high quality 1080p with the latest codecs. Even 1080p Blurays only run 20-40 Mbps and they're typically very conservatively compressed.


I think a lot of people might take issue with the claim that 10 Mbps is "already plenty". This type of statement comes across a lot like the ISPs constantly decrying "25 down is plenty! It's only pirates who need more than that!", and it was only a few years ago that ISPs were spouting that insanity.

Using your numbers (and we'll pretend we haven't already sailed past 1080p as a standard), if we expect better than garbage Zoom-tier quality at 20-40 Mbps for one stream, most of our ISP connections are currently woefully slow on the upload side.

Many, many people have others living in their household, whether roommates, partners, or kids. So if you have 3 people on video chat and want high-quality streams, that need jumps from 20 to 60+. If you have 3 kids on video chat or streaming for school or play, plus a parent or two working from home, suddenly you're at 100+ Mbps of upload (see the quick sum below).
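
(The quick sum, in Python, taking the 20-40 Mbps per-stream figure from above and an assumed five simultaneous streams:)

    # assumed household: 3 kids on calls/streams + 2 parents working from home
    streams = 5
    low, high = 20, 40                    # Mbps per stream, per the figures above
    print(streams * low, streams * high)  # 100 to 200 Mbps of upload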

And that’s ignoring that 1080p is becoming yesterday's standard. I haven’t looked recently, but when buying my last monitor set a year ago, 4k seemed to be by far the most commonly available.

This whole "X amount of bandwidth is plenty" argument almost always seems to ignore a) the real-life situations of most families/roommates and b) where technology is heading.


Where are you coming up with these 20-40 Mbps numbers for a single video stream?

That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

There's really no point in 1080p streaming video above Zoom's 3.8 Mbps. Even that's already overkill, which is why almost nobody uses it. The "20 Mbps" you're talking about is already past the threshold of visually lossless for human perception, and 40 is even further past it.

And beyond 1080p? There's virtually no use for 4K video in personal communications; nobody wants to see the pores in your nose. It's not about technological limitation, it's about use cases. And a 20 Mbps uplink handles the real-life situations of most families and roommates just fine, when you use actual real numbers and not ones you're just making up.


If you can’t tell the difference between 20Mbps and 200Mbps, you need glasses. This isn’t even some audiophile-tier argument about an extra 60kbps in MP3 or whatever; there are obvious compression artifacts in 20Mbps real-time-encoded CBR video, especially on any kind of remotely high-entropy scene.

> high-quality rips of 1080p movies tend to be 5 Mbps

This is nonsense. A decent 20-30GB BRRip will be at least 25-30Mbps. Also, it’s not a fair comparison because it’s not real-time encoding. If you can afford to encode at 5% of real-time speed, you can get much better compression ratios.


> If you’re can’t tell the difference between 20Mbps and 200Mbps you need glasses.

I think you're the one imagining things here.

Like I said in an earlier comment, I record h.264 in "ultrafast" at CRF 14. Every guide out there says that's already past the point of being indistinguishable from raw footage, and my own eyes agree after extensive side-by-side comparisons. I go as low as 14 because it's overkill, but there's simply no visible difference. And it averages 34 Mbps for the kind of stuff I shoot, which as I said is overkill already.

200 Mbps is insanely unnecessarily high, even for real-time encoding on a crappy CPU.


> But there's simply no visible difference.

What are you filming? It sounds like you’re streaming a video of your face; how are you capturing it? If your video is blasted with shot noise from a 10mm^2 sensor behind a 4mm-wide lens, where “1080p” is more of a recording convention than a faithful description of the resolving power of the system, I can see how 34Mbps might look as good as whatever you see uncompressed.


>If you’re can’t tell the difference between 20Mbps and 200Mbps you need glasses.

Can you provide examples?

https://www.screenshotcomparison.com is especially suited for this.


> especially on any kind of remotely high-entropy scene

Unfortunately my job is pretty boring. Most of my video chats are low entropy.


>That's insane. Like I mentioned in another comment, high-quality rips of 1080p movies tend to be 5 Mbps, and that's generally with a lot more detail and motion than what your camera is filming at home.

I would double that, at least for movies. Not sure you're going to see an increase in quality with smartphone-style cameras.


The family that can equip their kids with production-grade 4k video kit can probably afford 100Gbit business internet service to their house tbh.

4k UHD Netflix stream is ~20Mbps. 1080p is usually about 5-6Mbps, and 99% of people say that it looks great and is all they want.

4k UHD is not needed for effective video chats for most business and personal use. And a video call wouldn't even need as much bandwidth as a movie stream, since it's a relatively static image and thus easy to compress.

Your image is typically a little square on the screen too (not the full display size). It is highly unlikely consumers will ever shell out for the camera equipment to create high quality images that need such bandwidth, even if such bandwidth becomes common.

Moore's law will maybe push this all forward in time, but what you describe is a total exaggeration of the current situation.


None of the video streaming software is set up for that, because nobody's internet can upload at that rate. The best I can do is a 1080p SLR ($350, once) plus clicking the HD button in Zoom, and most of the improvement comes from the better optical system. All the low frame rates, micro-stutters and so on still exist, adding to Zoom fatigue.


I don’t understand why everyone is supposing that an entire household should be fine with the ability to send at most a single video stream out. What if both my wife and I have separate calls we need to be on? Or my kids want to play Fortnite while I’m uploading data for work? 10mbps up is 1990s-era tech.


I don't think anyone suggests that. But if we look at a "modern" asymmetrical connection such as 250/50, 500/100, 600/300, or 1000/100, then "one HD stream is sub-5Mbps" still means an asymmetric connection fits lots of these streams!

Obviously a 100/10 or 25/5 connection isn't going to cut it. I think the gist of the article is really "You need enough bandwidth in both directions". That's it. If you have 100, 200, 500, or 1000 Mbit down with 100+ Mbit up, the exact split matters less. "Symmetry" doesn't matter; it's enough bandwidth in both directions that matters.

The reason "enough in both directions" and "symmetric" have been conflated is that, for a stretch of history, only symmetric connections had enough upload bandwidth. With gigabit and multi-gigabit download speeds, there is less need for symmetry as long as you have a fast upload.


On a technical level, most of the asymmetric techniques just carve up the spectrum/bandwidth (in MHz) to give more of it to downstream. Or timeslots, or whatever.

I fully agree with the EFF that you need a decent upload to support the applications of today and the next few years. But going fully symmetric actually means lowering the download in favour of upload, allocating exactly half of the spectrum to each.

So absolutely, once you reach a certain threshold I think most users are going to opt to carve up the spectrum in a way that favours download, just based on typical needs.


Dynamically allocating bandwidth from a fixed budget would probably be a great improvement. Five separate Zoom calls at once? More upload! Netflix time? More download, less upload! 3am backup window? Mostly upload!


BR is different because the encoder doesn’t have to be real-time. It’s also just medium quality. 1080p60 h.264 or h.265 encoded at 200+Mbps is what you get out of decent video cameras.


I believe such a high bitrate for 1080p is I-frame only. It's useful for video editing, but not suitable for this context.


Many mid-range video cameras support “long GOP”, which means P-frames. They’ll still spit out however many hundreds of Mbps you ask for.


You can be conservative and assume that a live encode will require double the bitrate for the same quality. So for a very-high-quality 15 Mbps target you would need 30.


60fps looks disgusting though.


I agree it looks weird for cinema, but I prefer it for video chats. It feels more authentic.


No. Even Blurays don't have that. A high-quality rip is somewhere around 20Mbps, and that's on the high end with H264.



