
I still don't understand why Apple has chosen to include such a sophisticated and expensive sensor of so little use to average consumers in its flagship products.

They're probably setting the stage for some future technology, maybe AR related?




Why do you think it has so little use?

One thing that comes to mind is portrait background blur. So far, the computational implementations on smartphones that I've seen have not been very good; I suspect this can improve effects like that a lot, which seems relevant to consumers.

On the forward-looking side, I think app developers can also get a lot of value out of this. E.g. (just a thought): changing the way clothes or shoes are bought by letting you scan body parts to determine fit.

And this can generate a lot of data feeding into their vision models, plus what you already mentioned about AR/VR.


Well, it doesn't seem to me that Apple has backed this new technology with any significant app that makes use of it. And I don't see any serious app in the App Store that makes use of LiDAR either.


I think you dismissed the portrait mode thing, which I think is by far the biggest use. Two other things I love -- "measure". It's very accurate. Also, there are several apps that allow you to scan an area and then send that to people. I've used it to show people an area (my office, a cool brewery I went to, etc). "3d scanner app" is one that I've used. Interestingly, I also used it to scan a piece of furniture I liked, which let me get precise measurements at a later date.

Edit: I forgot one super cool use of measure. Let's say you have a basement with an open ceiling. You can use the measure app to, say, measure a pipe. Then, when you walk upstairs, you can still see the pipe in 3D. So you can then understand just exactly under the floor where the pipe is. Very useful for things like pipes and electric wires.


> Edit: I forgot one super cool use of measure. Let's say you have a basement with an open ceiling. You can use the measure app to, say, measure a pipe. Then, when you walk upstairs, you can still see the pipe in 3D. So you can then understand just exactly under the floor where the pipe is. Very useful for things like pipes and electric wires.

That is really interesting. What app/service did you use to do this?


Just "Measure" on the iPhone. For instance, just measured the back wall of a closet. Then went to the next room and could see where the back wall of the closet was in AR.


Samsung, at least the Ultra line, has LiDAR too, and exactly the same type of measure app (on top of some AR doodling). Just tried it, and it's actually pretty precise (on the S22 Ultra). I wouldn't build a shelf based on just that data, but otherwise it's a nice little addition to the toolset.


Measure, like he said. It's built in.


Wow, I completely missed that. I reread the comment about 3 times before I recognized it. Need more coffee, I guess.


The last one is pretty cool indeed. Still, I feel like Apple still has some "one more thing" planned for LiDAR in the future.


They specifically called out the Camera app in all of the promos. It’s the reason I bought the Pro over the “base” model. It’s used for range-finding in the camera, both for portrait and low-light. It’s actually really neat and low-light performance has been drastically improved. (Which was one of my biggest gripes moving from flagship Google phones)

Any app used on half their billboards ("Shot on iPhone 13 Pro") and featured on the Lock Screen seems "significant" to me. :)


I bought an Apple device specifically because of LiDAR. I've found easy, convenient, "good enough" 3D scanning to be very useful for both recreational and professional purposes.

Other uses of LIDAR (measurement/data collection, AR, photography) are also likely to become more popular over time.


Can I ask, what sort of things are you 3D scanning, and for what purpose? I have an iPhone 12 Pro that I'd like to utilize for similar uses, but I can't think of a practical output for them.


I'm not always entirely sure why I'm scanning things. I occasionally upload things to Sketchfab. I have vague plans to make a diorama out of pieces to view in VR.

Turning the question round - why take photos of things? A 3D scan isn't that different. It's a way to capture a memory and to show things to other people. It's so quick to capture something that sometimes I just do it on a whim when I find something interesting.

I'd like to make an art piece out of them at some point but I'm still waiting for the right inspiration to come along.


Polycam is used in the 3D modeling / VFX space to photoscan assets; there are some others like it too that all use LiDAR. Another use could be to preserve historical artifacts, similar to how it's used in this video: https://youtu.be/k1uXppV6TeA?t=225


IMO phones are so stagnant now that manufacturers just tack on random things to see what sticks. Lidar! Radar! Refresh rate! A 7th camera! Oooh.

Who the hell uses any of that stuff? But it lets them keep releasing new versions and charge huge amounts for new flagships.

Those gimmicks help to prevent the phones from becoming the next race to the bottom that the PC and cheap Android markets have become.


> Lidar! Radar! Refresh rate! A 7th camera! [..] Who the hell uses any of that stuff?

Well... you're right about the lidar, but I absolutely care about the camera and refresh rate. As a lifelong photography geek I basically upgrade for the camera improvements every year..

Camera, messaging, maps and dating apps are basically the only things I use my phone for these days...


LiDAR is actually really useful for portraits. The Samsung S22 Ultra uses it extensively and can pick out a single hair strand and avoid blurring it into the background. Everything looks more realistic compared to pure software processing, where I can spot errors quickly.

As a frequent user of my good old Nikon full frame (D750), I am really astonished by the output these tiny cameras produce. Sure, it's not for big screens or prints unless we're talking about a sunny day. But for everything else, it makes fantastic portraits. I got fed up with needing to carry a big heavy pouch around all the time, and missed way too many pictures of my kids to rely on it anymore.

It also sees in the dark much better than my eyes do, also thanks to LiDAR. The pics I take on night walks, where there is basically no light source for hundreds of meters around and I am in the forest (yes, my night walks sometimes take me to interesting places; I normally don't use any light), turn a very dark scene into one full of (usually) true colors and details simply invisible to me.

The 10x zoom on my phone allows me to read signs not readable to my eyesight (digital 'AI' zoom is very useful up to about 30x). I was skeptical too, and it's true that full frame is still far ahead, but at what cost: bulk, weight, and the need to spend hours post-processing a batch of photos instead of making quick edits on the phone in a few seconds.


Yeah, portrait mode is basically Face ID in reverse and works a lot better than room scanning.

How does the lidar help with night shots though? Maybe focusing, but otherwise isn't it just a long exposure with algorithmic corrections?


Well, that focusing part is pretty important :) The rest is mostly about gathering enough light and compensating for hand shake, probably with some ML.

For a photo you don't need much more. I can tell you that handheld shots are much better and easier compared to a full-frame DSLR.


What do you suggest for a good camera phone on a tight budget? I used to have a Samsung Galaxy S20 FE and it was just amazing. I'm trying to buy a new Android, preferably from the "flagship killer" type of phones.


That's cool. I'm glad there are still people interested in photography for its own sake instead of posing for Instagram!


"I basically upgrade for the camera improvements every year"

No wonder the earth is warming... You really need a new phone every year just for the camera? I really don't get why anyone would do that without an actual professional need.


It’s partially because the cameras aren’t good enough one year but do fulfill an edge case, and the following year they fulfill another. This reduces the need for a dedicated camera, especially a bulky interchangeable-lens one; other times they fulfill an edge case that the bulky dedicated camera can’t handle well either.

That makes them intriguing and fun to test out creatively. I can see the temptation.

The 13 has some things built in/enabled that the 12 doesn’t have, which I had always wished the 12 had, such as video portrait mode. But I personally decided it wasn’t good enough to upgrade again, since it’s a safe assumption that future models will also have that.


> The 13 has some things built in/enabled the 12 doesn’t have

Night mode in wide angle! Although it's quite a bit less capable than the other lenses. Maybe in the 14 ; )

And @eole666 of course I don't literally throw the old phone away every year, there's a whole downchain of very willing recipients of lightly used iPhones. Hell, my old iPhone 6 is still being used for QA.


Sometimes I forget some people don't buy things to use them for years, but prefer buying the latest novelties every year. It's kind of a weird consumerist way of life.

Yes, smartphone cameras are awesome now, but it's sad that an awesome product like that will be thrown away after a year of use.


With Apple supporting iPhones with iOS upgrades for years and security updates even longer, you don’t throw away an old iPhone; you either hand it down to someone else or you sell it.

Even an iPhone 6s from 2015 with a new battery is faster than most low end Android phones and it is still fully supported.


That's actually why I didn't upgrade from the 12 to the 13: I wouldn't be doing the trade-ins. They were tempting offers, but I hate being locked in to the same device for 24-30 months, which is a condition of most of the trade-in offers. I would rather pay for the device outright and figure out what to do with my prior device, and so this newest iPhone wasn't compelling enough for that.

(For context, my prior upgrades were through a more lenient trade-up program, which by coincidence kept me up to date. I'm not trying to pretend that skipping a one-year upgrade is some major sacrifice; it's just what happened.)


If you own the phone outright, you can either sell it directly or trade it in to Apple.


Yes, that's what I meant by figuring out what to do with my prior device, but that's alongside making sure I have backups of everything.

I've had some apps that stored files within them, where the app was no longer available for the newer version of iOS that a new phone shipped with. I would have been royally screwed if I had sold or traded in my prior phone, because I wouldn't have noticed until I needed the files.


Why is 3d scanning any more of a gimmick than photography? When I travel I scan things for the same reason I photograph things.


The lidar plays a trivial role in that. Most of it is just algorithmic photogrammetry. If you turned the laser off you'd probably still get similar results.

And it's a gimmick because it's a tiny niche of people who do that, and I wish I didn't have to pay extra for a phone with LiDAR. I'd take a fingerprint scanner any day. Or a headphone jack.


> Most of it is just algorithmic photogrammetry. If you turned the laser off you'd probably still get similar results.

This is simply untrue. Most 3D scanning apps on iOS make no use of photogrammetry, or they offer it as an alternate mode for smaller objects. I don't know of one 3D scanning app that combines LiDAR with photogrammetry; they use one or the other depending on the scale of the object you're scanning.
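For what it's worth, the LiDAR scanning path in ARKit is a distinct API from any photogrammetry pipeline. A minimal sketch (assuming iOS 13.4+ and a LiDAR-equipped device) of how a scanning app turns it on:

```swift
import ARKit

// Sketch: enabling LiDAR-driven scene reconstruction in ARKit.
// supportsSceneReconstruction(.mesh) is only true on LiDAR devices,
// so this scanning path simply doesn't exist without the sensor --
// it is not photogrammetry with the laser as a minor assist.
func makeScanningConfiguration() -> ARWorldTrackingConfiguration? {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        return nil  // non-LiDAR device
    }
    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .mesh
    return config
}
```

Each ARMeshAnchor the session then delivers carries vertex/face geometry that scanning apps stitch into a model.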

> And it's a gimmick because it's a tiny niche of people who do that,

It's not the reason Apple added it. They added it for AR and photography. Arguably the former is a gimmick, but the latter is future-proofing and might end up being a game-changer.


> Those gimmicks help to prevent the phones from becoming the next race to the bottom that the PC and cheap Android markets have become.

See: it allows them to keep increasing the cost of their devices while avoiding having to pass the savings from their streamlined manufacturing on to the customer.


Glad they didn't include a laserpointer.


hopefully thermal imaging next


Yeah, that and UV would be cool for looking at plant life.

The Seek and FLIR imagers are cool for IR.


UV seems more possible now that I think about it; thermal would present huge issues with export regulations.


I'm an iOS dev, and there are a few limitations:

1) Biggest one: LiDAR is only on the iPhone Pro models, in contrast to TrueDepth, which landed in all iPhone devices after the iPhone X

2) LiDAR has very low resolution

I wish they added LiDAR to all iPhones and combined it with TrueDepth (on both the front and back of the iPhone). TrueDepth has better resolution but is slower (only 30fps, plus higher latency) and only works well at distances < 1 meter.
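For context on the resolution point: ARKit exposes the LiDAR depth map via the frame's sceneDepth, and it is tiny compared to the color image. A rough sketch, assuming iOS 14+ and a running ARSession:

```swift
import ARKit

// Sketch: opting into LiDAR depth when configuring the session.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)  // LiDAR devices only
    }
    return config
}

// Sketch: inspecting the depth buffer delivered with each frame.
func logDepthResolution(for frame: ARFrame) {
    guard let depth = frame.sceneDepth?.depthMap else { return }
    // The depth buffer is on the order of 256x192 pixels --
    // far below the 12 MP color frame it accompanies.
    print("LiDAR depth: \(CVPixelBufferGetWidth(depth)) x \(CVPixelBufferGetHeight(depth))")
}
```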


Focusing a camera in low light is still not great with traditional methods such as phase detection. LiDAR focus is spot on every time.

The quality of the camera is one of the main selling points of the iPhone (e.g. the ad campaign on billboards is "Shot on iPhone"). So I would argue it is a very appropriate feature.


Could the 3D model be used to interpolate the shape of an animal behind a cage?

I'm imagining a panda enclosure with lots of people taking photos. The 2D images are obscured by lines where the cage is. Our eyes can correct for it by moving our heads around, but the photos can't.

There is an opportunity for cage removal to be solved with software (guessing, or taking multiple photos and stitching them together) or simpler hardware (a camera on the bottom of the phone instead of the top, using parallax like two eyes). But the LiDAR option might work too.
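The depth route could look something like this: treat any pixel much nearer than the subject as cage and mask it out for later inpainting. A toy sketch with a made-up cageMask helper (not any real API), just to illustrate the thresholding idea:

```swift
// Toy sketch: thresholding a per-pixel depth map (in meters) to find
// near occluders like cage bars. `cageMask` is a hypothetical helper,
// not an Apple API; a real app would read depths from the LiDAR buffer.
func cageMask(depth: [[Float]], cageMaxDepth: Float) -> [[Bool]] {
    // true = pixel is closer than the cutoff, i.e. part of the cage
    depth.map { row in row.map { $0 < cageMaxDepth } }
}

// Example: cage bars at ~0.5 m, panda at ~4 m.
let depths: [[Float]] = [[0.5, 4.0],
                         [4.1, 0.5]]
let mask = cageMask(depth: depths, cageMaxDepth: 1.0)
// mask == [[true, false], [false, true]] -- those pixels would be
// dropped and filled in from other frames or parallax views.
```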


There's some practical stuff, like mapping a room so that you can see wallpaper or paint choices on your own walls. Or getting clothing sizes more accurately, etc.



