
People are obsessing over the lens, but the really hard part of video production is the workflow: timecode, track syncing, color grading, AI, storage. Apple is solving it piece by piece; I can see an iOS-based workflow becoming the norm in the not-distant future.
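For anyone wondering what the timecode part actually involves: SMPTE timecode is just frame counting, and syncing tracks means lining those counts up. A minimal sketch in Swift (non-drop-frame rates only; the function name is mine, not from any Apple API):

    import Foundation

    // Convert SMPTE timecode ("HH:MM:SS:FF") to an absolute frame index
    // at an integer frame rate. Drop-frame rates like 29.97 need extra
    // correction and are out of scope for this sketch.
    func frameIndex(timecode: String, fps: Int) -> Int? {
        let parts = timecode.split(separator: ":").compactMap { Int($0) }
        guard parts.count == 4 else { return nil }
        let (h, m, s, f) = (parts[0], parts[1], parts[2], parts[3])
        guard f < fps else { return nil }
        return ((h * 60 + m) * 60 + s) * fps + f
    }

    // frameIndex(timecode: "01:00:00:12", fps: 24) == 86_412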



This.

How close to OOC (out of camera) is the end result? What processing was done in Apple software?

It appears the post-processing was some combo of Mac and iPad... was that Final Cut, or Adobe, or something not readily available to consumers?

What I'd love to see is a dual video - one done with a Panavision rig and whatever workflow Apple's media team uses, and a second that's the same scenes but shot with only an iPhone, with post-processing done only in Apple's consumer software ecosystem. Zooming with the cameraman's feet, etc.


After color grading there's going to be a difference in sharpness, but most won't notice. The biggest visual difference is depth of field. Without a specialty lens the iPhone has to rely on a depth matte, which is not very clean with current-gen hardware. With a Panavision lens you can manually pull focus; it looks better but requires a dedicated person on set.
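For context on where that matte comes from: this is roughly how an app asks AVFoundation for depth data alongside video. A hedged sketch, not Apple's actual Cinematic mode pipeline; error handling and format selection are elided:

    import AVFoundation

    // Sketch: capture session that delivers depth data alongside video.
    // The software blur is built from this per-frame depth matte.
    let session = AVCaptureSession()
    session.beginConfiguration()

    // Dual/TrueDepth/LiDAR-capable cameras expose depth-capable formats.
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input)
    else { fatalError("No depth-capable camera available") }
    session.addInput(input)

    let depthOutput = AVCaptureDepthDataOutput()
    if session.canAddOutput(depthOutput) {
        session.addOutput(depthOutput)
        // Filtering smooths holes in the matte -- the "not very clean"
        // part the parent comment is talking about.
        depthOutput.isFilteringEnabled = true
    }
    session.commitConfiguration()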


It’s Final Cut. I believe they showed that the last time they released a video about shooting an event on an iPhone.


This. The lens is sort of a pointless gimmick they use to sell the actual value proposition.


Different lenses are needed for different jobs. That's why most production shops just rent the lenses they need for the shoot.



