Hacker News

Content over 4K is readily available and even cheap to produce, but most of the good stuff is still being produced in 4K. Why is that?



As someone who used to run a photography company: the total cost of producing 4K content (camera system, computer setup to handle file management and editing, time required to process and manage content, storage, headaches, etc.) is absolutely enormous compared to 1080p. Higher-resolution content like 8K seems like a nightmare for everyone but the consumer. I do think and hope we’ll get to the point where it’s not a big deal, but even today a brand-new MacBook Pro will struggle to render a basic 1080p composite in After Effects. Even basic 1080p footage editing in Premiere Pro can overwhelm reasonably modern machines.

(Yes, I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up! Especially for professional workloads where people love to push the bar higher and higher.)


Also, from experience: the bitrate and bandwidth of streaming services are so bad that the actual resolution doesn't really matter.

A good high-quality 1080p export will have significantly higher perceived quality than a "4K" (UHD) video on YouTube, Netflix, Amazon Prime or Apple TV.

I've decided to produce all content in 1080p for now, even though I'm recording video in DCI 4K oversampled from a 6K sensor. No viewer is ever going to see the difference anyway.
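For a rough sense of why the encode can matter more than the pixel count, compare the bits each stream gets to spend per pixel. A back-of-envelope sketch in Python; the 20 Mbps and 45 Mbps figures are assumed, illustrative bitrates, not any service's published numbers:

```python
# Back-of-envelope: average bits spent per pixel per frame for a
# "good" local 1080p export vs. a typical streamed 4K (UHD) video.
# Both bitrates are assumptions for illustration only.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: int) -> float:
    """Average bits available for each pixel of each frame."""
    return bitrate_mbps * 1e6 / (width * height * fps)

good_1080p = bits_per_pixel(20, 1920, 1080, 30)   # assumed high-quality export
streamed_4k = bits_per_pixel(45, 3840, 2160, 30)  # assumed UHD streaming bitrate

print(f"1080p @ 20 Mbps: {good_1080p:.2f} bits/pixel")
print(f"4K    @ 45 Mbps: {streamed_4k:.2f} bits/pixel")
```

Under these assumptions the 1080p export gets nearly twice the bits per pixel of the "4K" stream, which is why it can look sharper in practice.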


There are several studies which say that viewer emotional engagement caps out at 1080p, and the only thing that drives more engagement beyond 1080p is HDR/WCG and HFR.

Customers are also not as willing to pay for 4k as those invested in UHD would like to think. It's a "nice to have" and having big numbers makes people feel good, but it doesn't actually make them love the content more.


I know that you do X on your Y machine without problem, but my point is that it’s easy to forget to account for all these little costs, and they pile up!

Cloud-based After Effects render farms do help significantly though.


You still don't seem to get the point.


I think I do. The point is that you can build your video in After Effects locally at 1080p (or less) on a standard MacBook, and then render the video in the cloud at 8K really easily and at quite a low cost. You don't need to "do X on your Y machine". You can rent someone else's machine for that.


> and at quite a low cost

So, let's actually look at that. For a "low-budget" 8K pipeline you'd first need a camera that can shoot at 8K or more. Ideally you'd want more so you can crop-in or do stabilization in post. So you'd be shooting on the Blackmagic Ursa Mini Pro 12K (~$6k) [1].

Usually you're using 5-10% of the material that you shot in the final edit, for some movies that can go as low as 1% (e.g., Apocalypse Now [2]), but let's calculate with a 45min documentary and 7.5% material used.

That means you've now got 30 TB of raw material [3]. You'll obviously use proxies for editing, but at least for color grading and for delivery you'll need access to the full material.

So now you'll need to store 30 TB of raw material somewhere in the cloud accessible to the machine that's doing the delivery. Even assuming you've got a symmetric fiber connection so we can disregard potential traffic limits and transfer speeds for initially uploading the material, you'll still need to pay for the cloud storage.

Creative Cloud has a storage limit of 100 GB for individuals or 1 TB for business plans [4], which is obviously far too limiting, so you'd need to use something like LucidLink. As you'd be working with large video files, you'd need high-performance storage, so you'd have to calculate with their performance plan, which is another $80 per TB per month, so $2400 per month just for this one single project [5].
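The arithmetic above can be checked with a short back-of-envelope script. The ~50 GB per minute data rate for 12K BRAW is an assumption roughly read off the capacity tables in [3]; the $80/TB/month matches the performance tier in [5]:

```python
# Back-of-envelope for the hypothetical 8K/12K cloud pipeline above.
# Assumptions: ~50 GB per minute of 12K Blackmagic RAW footage [3],
# and $80 per TB per month for high-performance cloud storage [5].

final_runtime_min = 45    # finished documentary length
usage_ratio = 0.075       # 7.5% of shot material ends up in the edit
gb_per_min = 50           # assumed 12K BRAW data rate

shot_minutes = final_runtime_min / usage_ratio   # 600 min of raw footage
raw_tb = shot_minutes * gb_per_min / 1000        # total raw material in TB
storage_usd_per_month = raw_tb * 80              # monthly cloud storage bill

print(f"Raw material: {raw_tb:.0f} TB")
print(f"Cloud storage: ${storage_usd_per_month:.0f}/month")
```

With these assumptions you land at 30 TB of raw material and $2400/month, matching the figures above.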

________________________

[1] https://www.bhphotovideo.com/c/product/1578059-REG/blackmagi...

[2] https://books.google.de/books/about/?id=wB7cAAAAMAAJ

[3] https://www.braw.info/capacity/

[4] https://www.adobe.com/creativecloud/plans.html

[5] https://www.lucidlink.com/pricing


And of course all of the above may be "peanuts" for professional studios, but I'm not aware of any hardware/workflow/money/things you can throw at the problem to make the editing and production experience smooth. There are tons of inconveniences, practical barriers, and bandwidth issues (and I'm not even talking about network/cloud bandwidth, that's a whole other thing): even just disk I/O, moving stuff around via USB-C, backing up stuff, etc. It's all super labour-intensive and annoying.


Well, if you're editing with proxies (which is really easy with Resolve or Media Composer), the editing experience is really smooth. But that doesn't help for grading or delivery, where you'll still need the full resolution files.

Even with Gen 4 NVMe storage you'll quickly hit bottlenecks at those resolutions.
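To see why, consider the raw data rate of uncompressed full-resolution frames. A sketch using 8K 10-bit 4:4:4 as a hypothetical worst case, against the ~7 GB/s sequential throughput of a typical Gen 4 NVMe drive:

```python
# Uncompressed data rate for 8K 10-bit 4:4:4 video -- a hypothetical
# worst case showing where even fast local NVMe storage struggles.

width, height = 7680, 4320   # 8K UHD frame
channels, bit_depth = 3, 10  # 4:4:4, 10 bits per sample

bytes_per_frame = width * height * channels * bit_depth // 8
gbps_at = lambda fps: bytes_per_frame * fps / 1e9

print(f"Per frame: {bytes_per_frame / 1e6:.1f} MB")
print(f"At 24 fps: {gbps_at(24):.2f} GB/s")
print(f"At 60 fps: {gbps_at(60):.2f} GB/s (beyond a ~7 GB/s Gen 4 NVMe drive)")
```

Real codecs compress heavily, of course, but once you stack a few streams, caches, and intermediate renders, the headroom disappears fast.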


You'd be surprised at how much buffering and loading is needed just to play back 1080p content (even directly from your local machine's built-in SSD) in the video editor before it's rendered. It's incredibly frustrating to do video work on "regular"/prosumer (MacBook Pro) hardware.


> After Effects

A lot of that is on Adobe more than anything else, mind you. Same with Premiere. These tasks could be a lot faster than Adobe's software allows them to be. But there is a limit, of course.


Agreed 100%, and I hear Final Cut Pro has much better performance.

But realistically, a lot of work happens on Adobe or other (Autodesk, Houdini, etc...) products that also have their own issues.


Asking as a person completely out of that world: can DaVinci Resolve or Final Cut Pro do the same as After Effects and be faster?


I don't think these products are directly comparable. I think Final Cut Pro probably maps better to Adobe Premiere Pro... But even then, it's not a complete overlap. There are things you can easily do in Premiere that you can't easily do in FCP, and vice-versa. In general, I find that Premiere is a bit more powerful/flexible than FCP, but FCP is much better software and does what it does much smoother and much faster.


No, they're for completely different use cases. While it's technically possible to edit a video in After Effects, its main use case is compositing VFX and advanced motion graphics.


The demand for it would be from a minority.

A lot of people are watching content on their phones, for one thing.

I create some travel video content, and up until last year most of my clients weren't even fussed about 4K. I had been shooting at 5.1K for most of last year, then dropped to 4K this year because no one has needed it, and the storage and delivery costs are higher.


Cost of delivery. Most stuff is being streamed these days, and nobody wants to pay for the cost of streaming 100 GB files. Plus, the cost of commercial videography equipment is insane. Why upgrade before you have to?


The better the capture, the longer the shelf life of that footage.

So, while the pipeline might not run at more than 4K, the source material ideally is at a higher resolution.

And you need to make sure it still looks good on lower quality displays.


Nobody is making movies anymore that anyone would really care to preserve for the coming decades anyway.

Besides which, we took a big step back when we switched from 35mm to digital. Digital has only recently reached parity.


The BBC probably thought that when they recorded over old Dr. Who episodes. It’s up to our grandchildren to decide what they want to see and what they don’t.


Because 4k (and even 1080p) is good enough for most people?


Yeah, for TVs that sounds right, but not for desktop displays. I wonder if desktop display pixel density has stagnated only because the TV industry has so much influence on the momentum, and the TV industry thinks 4K is good enough.


I think that influence is pretty clear. In my neck of the woods, when 16:9 PC monitors became a thing, the marketing copy was basically about how great they were for watching movies.

Only now are some manufacturers starting to put out taller monitors, but they're still rare, at least in my market.


Is it even needed on desktop displays? My experience has been that at anything under 32” you need to use scaling to make anything readable on a 4K display. I’m currently using a 43” 4K display as my main monitor; sitting not very far from it, I can’t distinguish individual pixels as it is, and frankly even this is uncomfortably large. I have gutters set up around the edges so that windows maximise to a comfortably viewable size.


That's why I'm happy to see new 8K TVs being released. I think it's overkill for 99% of TV usage, but it's needed for PC displays.


Also, if you compare the same laptop available with a 1080p screen or a 4K screen, the former gets much better battery life.


Bandwidth. ISPs already throttle streaming and have data caps.


Every time you increase the resolution, all the requirements (CPU, RAM, bandwidth, storage) explode, for production, media distribution, and end users alike.

It is not just replacing one screen with another on the client's side.
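The scaling is roughly quadratic in linear resolution, which a quick pixel-count sketch makes concrete:

```python
# Pixel counts scale with the square of the linear resolution, so
# every step up multiplies every per-pixel cost in the whole chain.

resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:>10,} pixels ({w * h / base:.0f}x 1080p)")
```

So 4K is 4x the pixels of 1080p, and 8K is 16x, before accounting for higher bit depths or frame rates on top.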



