Apple’s “Let Loose” iPad event was shot on iPhone with Panavision lenses (prolost.com)
49 points by robenkleene 48 days ago | 83 comments



The attempt to stave off criticism with "So is it fair to say “Let Loose” was “Shot on iPhone” if it was done with the help of gear the average iPhone owner could never afford? Of course it is." is very disappointing. At least be honest with the question there. Can you reproduce their ad/event with just an iPhone? Absolutely not. Can you still make some impressive things with one? Absolutely. Will we someday be able to do the full thing with more compute and less gear? Probably.


When "Don’t You Start Internet" is one of your article's subheadings, it should prompt a deep consideration of whether or not it's actually worth uploading online.


An average iPhone user isn’t going to shoot a commercial, so it doesn’t seem like a particularly interesting question.


By that metric, it doesn't seem like a particularly interesting claim for the advert either?

And don't be rude to the audience coming to the article. It's specifically engaging with people who have these interests.


If an ad says "shot on iPhone," I would expect to be able to get the same results if I bought one.


Ehhhhhhh. The same thing applies to every camera. You could say “shot on Sony A7iv,” and you’d still need very expensive lenses, lighting, and other gear. Sure, it’s more impressive to say “shot with the kit lens,” but a lot of the point here is that the device’s camera system is fully capable of advanced shoots. That means you can do a lot with it with AND without expensive gear.


The difference is expectations - people buying professional cameras expect and are used to needing to buy additional gear. In this case, "shot on iPhone" is obviously intended to deceive consumers into believing something that isn't true.


The difference is in who they are selling the product to. The iPhone has a very broad audience, some of whom may want to do photography with very little understanding of the field. They see the product. They see the results. They don't see the context. Someone buying an A7iv probably has a stronger understanding of the field to start with. They are likely looking for a specific product for a specific reason. If they don't or aren't, they almost certainly would have a better idea of what they are getting into by the time they actually make the purchase. (Perhaps through their own research; if not, the up-sell would clue them in.)


A Sony A7iv has threads for attaching lenses. An iPhone does not.

It's like if I feed you a six-course gourmet meal and tell you "I made it on a camping stove", when in fact the camping stove was just used to keep one of the sauces warm while everything else was made in a professional kitchen. That's what is happening here; there are some shots that the iPhone is physically incapable of making, so they used something that is capable of it and filmed that on the iPhone.

"Shot on iPhone" is implicitly promising that you can make what you're seeing if you have an iPhone. NOT that you could make it if you have an iPhone plus some additional equipment that is capable of making those shots, that nobody would even imagine using. For the most part, none of us even knew such a thing existed, although it's not hard to imagine theoretically.

"Shot on Sony A7iv", on the other hand, is pretty obvious to anyone at all familiar with photography that it is only telling you what camera body was used. In fact, people with any level of familiarity would respond with "ok, what's your point?" because the model of camera body is a relatively small detail if you're trying to describe what you used to make an image. What lens, exposure, postprocessing, etc? It's like admiring someone's work in a gallery and asking "what did you use to make this landscape scene?" and getting back "paint."


> You could say “shot on Sony A7iv,” and you’d still need very expensive lenses, lighting, and other gear

Not really. A very cinematic look is fairly achievable on any recent DSLR with a pretty run-of-the-mill 'holy trinity' set of lenses—14-24mm ƒ/2.8, 24-70mm ƒ/2.8, 70-200mm ƒ/2.8—and a cloudy day.

Of course, each of these lenses still costs a couple thousand (less if bought used), but they're workhorses, and nearly every semi-pro photographer probably has at least one. I own a 70-200 ƒ/2.8 myself.

But compared to real cine gear, they're about an order of magnitude less costly.


Not necessarily wrong, but still a deflection. If the defense is purely that "advertising misleads," sure. But that's why I say you should at least be honest with the defense here.


People are obsessing over the lens, but the really hard part of video production is the workflow: timecode, track syncing, color grading, AI, storage. Apple is solving it piece by piece; I can see an iOS-based workflow becoming the norm in the not-too-distant future.


This.

How close to out-of-camera (OOC) is the end result? What processing was done in Apple software?

It appears the post-processing was some combo of Mac and iPad... was that Final Cut, or Adobe, or something not readily available to consumers?

What I'd love to see is a dual video - one done with Panavision rig and whatever workflow Apple's media team uses. And a second that's the same scenes, but shot with only an iPhone and the only post-processing is in Apple's consumer software ecosystem. Zooming with the cameraman's feet, etc.


After color grading there's going to be a difference in sharpness, but most won't notice. The biggest visual difference is going to be depth of field. Without a specialty lens, the iPhone has to rely on a depth matte, which is not very clean with current-gen hardware. With a Panavision lens you can manually pull focus. It will look better, but it requires a dedicated person on set.


It’s Final Cut. I believe they showed that the last time they released the video about shooting an event on an iPhone.


This. The lens is sort of the pointless gimmick they use to sell the actual value proposition.


Different lenses are needed for different jobs. That's why most production shops just rent the lenses they need for the shoot.


Looking at the size of that rig, "Shot on iPhone" is super misleading. Might as well use any modern full-frame (FF) body without the need to "rent" a specialized Panavision rig.

Edit: because it apparently needs explaining: "that rig is so big yet comes with so many limitations that it's foolish to not just use any normal camera body"


But your own sentence makes the exact comparison the article wants: if the end result is all the same, then the iPhone is providing a good enough experience to do this kind of work.

Yes other rigs might make more sense, but that’s not the point. The point is that they’re putting their money where their mouth is, and proving that it’s capable enough.


No. The point is that it is deceptive advertising, and it is against Apple's self interest to do so. If customers are sold the idea that Apple marketing producers did all of this on an iPhone and not a big Hollywood studio camera lens setup, they expect to be able to do that at home. Same when they sell the iPad M4 as some kind of revolution like the first iPhone or the MacBook Air. They are squandering their hard earned reputation.


And what about lighting, and sound design and makeup etc?

Factually, it is shot on an iPhone. This is no more a lie than when a film says it’s shot on an Arri or if a Sony commercial says it’s shot on an FX3.

None of the results are possible without the rest of the equation.


> Factually, it is shot on an iPhone. This is no more a lie than when a film says it’s shot on an Arri or if a Sony commercial says it’s shot on an FX3.

That's so disingenuous. The iPhone has a lens, and no one expects to put another lens on it. In fact, I'd wager a large amount that >99.9% of consumers have never even considered that it's possible. Whereas 100% of FX3 customers expect to use a lens.

I could probably make a pretty sweet setup to record a Netflix movie in awesome quality on my iPhone camera and there would be the perfect depth-of-field that comes with the professional gear that Netflix uses. And it would be factually shot on an iPhone.


You are comparing this to makeup? Really? You don't get it. No amount of "well actually"-ing is going to make consumers feel less deceived.


Really? You don’t get it? No amount of “well you need a better lens, so it’s a lie” would get you the same results anyway.

It’s a professional shoot. End of story. It’s factual that it’s shot on an iPhone. If you replaced it with a full frame body, you’d still not achieve this.

If your entire point is that a customer might buy the device and think they can achieve the same results, then why does that not extend to anything else?


I don't have a problem with this Shot on iPhone ad, because it clearly shows what kind of rig is needed behind the scenes to achieve the results, assuming it wasn't doctored and the footage is real: https://www.youtube.com/watch?v=jELeFXNJUOE


It's not the same because I don't need to rent a special rig; I just grab the camera off my shelf.


But you also grab a lens off your shelf. Do you grab a $0.10 lens that goes into a cell phone, or do you grab a lens that rivals the cost of the camera body?

I see Apple's point as being that an iPhone can compete with standalone camera bodies of today if you put a good lens on it. Something that was not true for phones 10 years ago, and only slightly longer ago for any camera.


If that's their point, then they should say that, instead of saying something else. "Shot on an iPhone" is a lie in context.


No more than saying "shot on a Sony" is a lie if you don't specify the lens used.


The lens is doing 99% of the work here. Also the lighting. That a phone (not just the iPhone) is good enough is not news; it hasn't been for a while.


I won't downplay the heavy lifting the lens is doing, coming from almost 20 years with photography as my main hobby, but there's a lot more here that's worth mentioning that the lens isn't responsible for. Among others:

- Dynamic range

- Colour science

- Video output (as in, what codec is actually worked on in the editing room)

The idea that even a higher-end phone can put out ~12 stops of dynamic range while outputting ProRes with a log colour profile is notable, no matter how much work a Panavision lens is doing. It makes editing with industry-standard tools and getting a product comparable to what you'd get with dedicated photo/video equipment easier.
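For a rough sense of scale (my own back-of-the-envelope, not a spec-sheet number): each stop is a doubling of light, so 12 stops of dynamic range corresponds to a scene contrast ratio of

    2^{12} = 4096:1

which is exactly the kind of range a log profile exists to squeeze into the recorded codec.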


"Shot on DSLR" films are also like this. There's a camera body in there somewhere, and $100k worth of junk hanging off it in every direction, and the focus is being operated remotely by a guy with a 30" display.


The situation isn't the same because anyone buying a DSLR with swappable lenses should understand that they're buying the body and not the lenses. It's fair game to use the best possible lens in that situation because then the camera body capabilities are the limiting factor in the test.

When you buy an iPhone for photography and video, you're buying the sensor, lenses, shutter, focusing system, and onboard image processing in a single package. Changing that package by using an aerial image from a different lens is a material change in the system because whole chunks of the iPhone image system are no longer doing the work.


The Creator was pretty awesome. Sony FX3, a $300 Kowa 75mm anamorphic, a Ninja monitor/recorder ($800?), a Ronin RS2 stabilizer arm ($300). Plus some cheese plates, batteries, audio interface. But the $3k camera was probably the bulk of their cost for the rig. https://www.visuals.co.uk/blog/news-ep358/

There's been a ton of press about this because, yeah, it is hella rare. And yeah, usually there's some dude pulling focus on a $20k+ monitor (the brightness these things need to work in even shaded daylight is absurd).

I think it's worth assessing what impacts the look and feel versus what is there for production's sake; there are a lot of ways to spend more, but most of them help production rather than affect the look. A talented camera operator could probably hold focus themselves on many scenes without a focus operator (especially with modern autofocus) and get similar output. But if you're making a $100m+ movie, heck yeah you rent the gear & hire the focus puller: way less risk (although the autofocus is probably more precise & reliable in many ways now!), way less need to make sure you pick the right expert operator. Ditto for a lot of this stuff; just rent the most badass stabilizers and camera mounts for that little extra edge of stability. Having good/expensive synced timecode, having better position tracking: they speed production, not necessarily super-improve its output.

It's pretty awesome seeing the Netflix-approved pipelines keep getting more in reach. There are really good options now. Even a $1000 lens can be super impressive. There are really good budget cine lens sets now. But yeah, the rest of the production stuff is never cheap. Lighting is the key thing that keeps coming to mind for me; the light budget can easily reach the camera budget and then some, and boy does that make a huge difference: it improves the scene that your camera is set to capture.


If you use anything else, the reality distortion field around iPhone cameras collapses!


"Shot on iPhone" (10% by weight)

Assessing by weight seems like a pretty obvious way to do it.

We could also do it by volume? That would probably be even worse for the iPhone's claim; I feel like phones are pretty dense? Lenses certainly can be, filled with heavy glass, but nowadays there's also a big drive to make lenses lighter; reviews these days often have people surprised at how relatively lightweight even a sizable lens is. It'd be fun to have total volume as a standard consumer spec item, so we could calculate average density.


Shooting on any FF body is totally missing the point. What in the world are you getting at?

Regardless of what you are thinking, the "not a full frame sensor" in the phone is kind of the point. As well as all of the "AI" processing done to the photons captured by whatever piece of glass is in front of it. It's this processing that makes the phone they are showcasing different from the "modern FF body" that you are suggesting.

You've clearly missed the point.


Perhaps you've missed the point.

The iPhone has multiple limitations that make no sense in a use case where it's a glorified sensor.

You can't swap the memory out, the memory is limited to begin with, you can't use attachments like monitors and other gear, etc.

If you're going to end up renting a huge professional rig to attach the sensor to in order to get the same results as a typical FF or even APS-C camera, you've lost the message.

In other words, "that rig is so big yet comes with so many limitations that it's foolish to not just use any normal camera body"


The point is that it is the internal recording that is being touted. It captures log. It captures ProRes. Even without the special lens, it still does all of that. To show that off, they put a really nice piece of glass in front of it. Even if they disable all of the AI processing, it's still really clean video.

Today, what is a normal camera body? Blackmagic's cinema cameras are not much larger than an iPhone Pro Max Ultra+ model. I have seen the BMD 16mm format camera with giant Angénieux lenses attached; the camera is barely noticeable. I'm not really sure what your point about the camera size is, or why you're so focused on it. The fact that a small device like a phone can create a clean log ProRes file is the point.


It's not misleading.

iPhone has a lot of things that other bodies won't have. A chip powerful enough to do on-device editing, a top-tier calibrated OLED screen with 1600 nits HDR brightness, an extensive suite of apps, on-board AI, etc etc.

I think it's really cool to see these devices become even more capable.


Professionals will never use a rig like that, and the implication when they say "shot with iPhone" is that by buying an iPhone you'll be able to shoot something like that, so it is misleading advertising.


Never say never. Steven Soderbergh shot Unsane with phones and I wouldn't be surprised if other commercial movies have been shot on phones.


True, it would be better to say 99% of professionals won't be using rigs like that for the time being, I was generalising.


The Venn diagram of people who

- can afford the rig showcased here

- use an iphone to shoot professional videos

- edit their videos in a mobile device

is empty


I mean, it can play Fortnite for all the good it does a cameraman. At the end of the day I don't think on-device editing and apps/AI is really a killer feature at all. People want the raw footage, and Apple pushed to make the iPhone capable of that early.

For that matter, does Apple still not offer a way to get regular RAW files from your phone, or are we still stuck in proprietary candyland?


Assuming you're talking about ProRAW, it's not proprietary or a closed format. They're just DNG files that adhere to the DNG 1.6 standard.

https://petapixel.com/2020/12/21/understanding-apple-proraw/
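As a quick sanity check (my own sketch, not from the linked article; the filename is hypothetical), a ProRAW capture opens in any DNG-aware RAW pipeline, e.g. with the open-source rawpy library:

    # Open an iPhone ProRAW capture like any other DNG file.
    # Assumes the rawpy (LibRaw bindings) and imageio packages are installed;
    # "IMG_0001.DNG" is a hypothetical filename.
    import rawpy
    import imageio

    with rawpy.imread("IMG_0001.DNG") as raw:
        # Demosaic and render to an 8-bit sRGB array, same as any RAW file.
        rgb = raw.postprocess()

    imageio.imwrite("IMG_0001.png", rgb)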


No, it was actually shot on Panavision lenses with iPhones attached. Big difference.


If I shot the event on Panavision lenses with my four-year-old Galaxy A30, would the quality match that of the iPhone?

If yes, then you are correct to point out the difference. If no, then the way it was phrased was not that wrong after all. Using the iPhone did make a difference.


Not sure why anyone would compare a new flagship with a 5-year-old budget phone (at least if we're trying to avoid bias). Apple's cameras are good but not the best (if there is such a thing as "best"), and competition is tight: https://www.dxomark.com/smartphones/#sort-camera

The main difference is nobody's running massive marketing campaigns for "Shot on Huawei" or "Shot on Pixel"


Xiaomi is running a massive "Shot on MI $X $Y-AI camera" watermark campaign, but they aren't restricting it to flagship models, so you'll also see it on barely passable shots from $50 phones.


If you record it raw and do some post-processing, you'd get very close.

The lenses are the most relevant part of a camera. Outside of that, the next most relevant part of the iPhone camera is software. (And I'm sure there's post-processing on that video too.)


As I recall from a very good write-up of the last shot-on-iPhone event, massive LED lighting panels were a big part of the equation, as were, of course, an expert video team who knew how to use the gear, stage it, and process the footage.


The latest iPhones heavily manipulate images, so the A30 would definitely produce more-usable footage requiring less post-production work.


That's not how these things are ever described.

We say movies are "shot on IMAX" or "shot in 70mm," not "shot on Carl Zeiss."


Pretty much; slapping a good lens in front of almost any decent-ish sensor will make good images.


I find the practice deceptive as marketing, but this is an interesting read from the technical side.

>Feel free to re-read this: What Does and Doesn't Matter about Apple Shooting their October Event on iPhone 15 Pro Max — but in short, the fact that Apple can drop an iPhone into an otherwise-unaltered professional film production and match the look of their previous videos without anyone even noticing is meaningful.

https://prolost.com/blog/scarybts


This is what stood out to me too. Clearly the iPhone can do it, so the claim seems fair.

This reminds me a bit of the stuff last year when the Halloween event was shot on iPhone.

“They cheated! They used cranes and dollies and light boxes and microphones and…”

No one complains something wasn’t “shot on a RED camera” because of that stuff. It’s just standard fare for professional work.

The iPhone is acting like a camera body. It happens to have an integrated lens they’re using with the Panavision, but otherwise it’s being used much like they’d use any other camera body.


It's deceptive marketing because the target audience is people who don't even know what a camera lens is or why a lens is necessary, not professional photographers and videographers. Their target audience is people who think they can do your job with an iPhone, even though they can't for reasons obvious to everyone working in the industry. Nobody expects to be able to replicate a “Shot on RED” on their RED camera body without any lens or other accessories, whereas everyone expects to be able to replicate a “Shot on iPhone” on their iPhone alone.


I don't even understand why they'd do this. Nobody expects a phone to have the same quality as a cinema-level setup, and the iPhone's cameras are genuinely great. Why fake it?


Is attaching a lens to a camera to achieve a shot really considered faking it?


Yes, if what you're selling is the performance of the camera without the lens.


Are they? These are some shots from a presentation video about iPads with one sentence at the end saying it was shot with iPhones. They never said anything about it in the video.

If they shot it with RED and a similar lens for those shots and said "shot with RED camera" would that be considered faking it?


I think you're hiding the ball with these questions. Nobody expects something shot with a RED, with whatever lens, to look bad. They're some of the best cameras ever made.

Phones, on the other hand, have long not been considered equal to dedicated camera hardware. Apple would like to change that perception, so they hitch the dedicated fancy hardware to their phone and say "look how good our phone is," without properly calling attention to the fact that it doesn't look the same without the fancy hardware they are suggesting you shouldn't need to use.

Further, the average person would see "shot on iPhone" and think "they held up an iPhone and recorded this," not "they hooked the iPhone up to professional-grade equipment and recorded this."

The language is deliberately misleading, and the efforts to insist that it somehow isn't just because it is technically true are deeply unconvincing.


You miss the point with the RED: it's normal with any camera to use a different lens to achieve certain shots, and nobody calls it fake.

They did not say "look how good our phone is" for those shots or suggest they were made just by holding the phone up. That's an absurd injection. It's weird to call something deliberately misleading when you're the one saying it.

It's like saying they're not allowed to use a tripod, a dolly, or any lights, or do any color grading, etc., without it being called fake. It's literally just how you do photography.


An important distinction is that this isn’t like swapping lenses on a DSLR or mirrorless camera. This was an additive optical path in front of the phone’s internal sensor and lenses.

It’s getting a lot of criticism, but I don’t see this as being all that different from the myriad other things they used: from expensive lighting to special mounts, nothing about this was like handheld shooting, and we knew that from the start. This add-on was just one of many.


So they used a converter with an optical element? Just like the good old SLR-era adapters that let you put Canon lenses on a Nikon body? Or a speed booster adapter for F-mount lenses on an E-mount mirrorless body? That's hardly innovative.


Not like that. SLRs don't have a built-in lens to contend with, and merely putting another lens over the iPhone's integrated one won't have the desired effect.

There are adapters where the integrated lens is focused on a flat disk of ground glass that's vibrated to move the grain around. The principal lens then projects an image onto the ground glass for the camera to record. This technique was used way back with camcorders and such to get a full frame effect on the cheap, before the DSLR video phenomenon happened.

While it's unclear if Panavision's implementation uses vibrating ground glass, the fundamentals will be the same: the iPhone is recording an image that's been projected onto an external surface.
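In thin-lens terms it's just two imaging stages chained together (a textbook sketch, not a claim about Panavision's actual design), each obeying

    \frac{1}{f} = \frac{1}{s_o} + \frac{1}{s_i}

The cine lens forms a real image at its image distance s_i, on the ground glass (or floating in air), and the iPhone's own lens, focused at that plane, simply treats the projected image as its subject.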

Incidentally, I think an advert for one of those camcorders back in the day that used this technique would decidedly have been judged deceptive!


I hadn't heard of adapters with ground glass projection; thanks for that info. It seems like a lot of effort to get an image of inferior quality. The more pieces of glass light passes through, the more chromatic aberration and optical distortion affect it. Also, the ground glass itself would potentially degrade image quality.


It's got to be more than just a typical speedbooster, given that they have to adapt a comparatively huge focal plane to a lens aperture about 3mm in diameter. The article speculates that they use some sort of lens relay system, perhaps with an aerial image rather than a ground glass intermediate, but doesn't provide any more information. I'd love to see a detailed technical explanation.
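Some rough numbers on why an off-the-shelf focal reducer wouldn't cut it (my own estimate; I'm assuming a full-frame-ish image circle from the cine lens, and the sensor figure is approximate): a full-frame image circle has a ~43.3mm diagonal, while the iPhone's main sensor (1/1.28" type) is around 12mm, so the relay would need a magnification of roughly

    m \approx \frac{12\ \text{mm}}{43.3\ \text{mm}} \approx 0.28

versus the roughly 0.64x to 0.71x of typical speed boosters. That's a far stronger demagnification, which is consistent with the article's guess at some sort of relay system.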


No, this would be an adapter that creates a virtual image (the article calls this "aerial image", which is wrong), which you can then in turn look at with your eyes or another camera. Nikon used to sell this, search for "Nikon Lens Scope Converter". The difference with a speed booster is that those still create a real image, just with a different focus.


Question for any video professionals out there:

Obviously, Apple dogfooding with their own cameras is good PR, but does anyone else actually shoot like this for professional work? Specifically the combination of extensive professional rigging, lights, lenses etc and iPhones/iPads.

It just seems like once you're at this level, surely there are better digital cameras to use?

But they do seem to put a lot of technical effort into supporting things like ProRes video and high data streaming rates, so I guess someone must be doing it?


This new presentation style of theirs is so alienating. It's like watching a bunch of AI generated demigods on a holodeck aboard a UFO hovering in earth orbit, gracefully bestowing upon the masses a bounty of new advanced technology to consume. So far gone from the days of a guy just standing on a stage showing off a new product to an enthusiastic crowd of human beings. Not just them but everyone's been doing this since COVID, and it's kind of a perfect illustration of what's happened to us as a society. No audience, no comments. You will take what we give you and that will be that.


There’s something of a trade off between optimizing for the chosen few in a room and optimizing for the much larger audience watching virtually. (Though companies generally have gotten pretty good at streaming live keynotes.)


At the end of the event, it also said "Edited on Mac and iPad".

I wonder why it didn't say "Edited on Mac and iPad with Final Cut Pro"...


I thought they said they used Final Cut Pro when they released a making-of video for the Halloween event.

Or they might’ve just let the people use whatever they normally used instead, even if that was a competitor’s product.

I’m not sure there are other video editing apps on the iPad that a professional would use, are there?


Well, there is DaVinci Resolve for iPad: https://apps.apple.com/us/app/davinci-resolve-for-ipad/id158...

I mean, it's absolutely possible, of course, that they used Final Cut. But then it would be kinda funny if they didn't highlight that, in a video that was more or less specifically about the iPad Pro and also had a section about Final Cut on iPad.

> I thought they said they used Final Cut Pro when they released a making of video for the Halloween event.

If you mean this video here: https://www.youtube.com/watch?v=V3dbG9pAi8I

then what you see for editing is Adobe Premiere, and what you see for color grading is DaVinci Resolve.


That was the one. Guess I remembered wrong.


Don't care. An M3.5 in an iPad of all things is a waste.


Guess what it wasn’t shot on… the iPad


So an uber-expensive lens attachment was used.


Museum walls will be lined with iPads of various sizes.


With AI generated art. Future is looking good.


I feel bad for the Panavision lenses for having been subjected to that



