People are obsessing over the lens, but the really hard part about video production is the workflow: timecode, track syncing, color grading, AI, storage. Apple is solving it piece by piece, and I can see an iOS-based workflow becoming the norm in the not-too-distant future.
How close to OOC is the end result? What processing was done in Apple software?
It appears the post-processing was some combo of Mac and iPad... was that Final Cut, or Adobe, or something not readily available to consumers?
What I'd love to see is a dual video - one done with the Panavision rig and whatever workflow Apple's media team uses, and a second with the same scenes, but shot with only an iPhone and post-processed only in Apple's consumer software ecosystem. Zooming with the cameraman's feet, etc.
After color grading there's going to be a difference in sharpness, but most won't notice. The biggest visual difference is going to be depth of field. Without a specialty lens, the iPhone has to rely on a depth matte, which is not very clean with current-gen hardware. With a Panavision lens you can manually pull focus. It will look better, but it requires a dedicated person on set.
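To make the depth-matte point concrete, here's a toy sketch (nothing to do with Apple's actual pipeline; `depth_blur` is a made-up helper) of how synthetic shallow depth of field works: each pixel gets a blur radius proportional to how far its matte depth is from the focal plane. Any error in the matte shows up as blur bleeding across subject edges, which is why a clean matte matters so much.

```python
# Toy 1-D "cinematic blur" driven by a depth matte.
# Pixels at the focal plane pass through sharp; pixels far from it
# get a wider box blur. A noisy depth map would misassign radii at
# subject edges, producing the halo artifacts people notice.

def depth_blur(pixels, depth, focal_depth, strength=0.5):
    out = []
    n = len(pixels)
    for i in range(n):
        # Blur radius grows with distance from the focal plane.
        radius = int(abs(depth[i] - focal_depth) * strength)
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Subject (depth 1.0) stays sharp; background (depth 5.0) is averaged
# with its neighbors.
sharp = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]
depth = [5.0, 5.0, 1.0, 1.0, 5.0, 5.0]
result = depth_blur(sharp, depth, focal_depth=1.0)
```

A real optical focus pull does this "for free" in the glass, continuously and with no matte errors - which is exactly why it still needs a dedicated focus puller on set.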