Samsung caught faking zoom photos of the Moon (theverge.com)
39 points by carlycue on March 13, 2023 | 29 comments



Samsung has a history of deceiving people about the performance of its phones.

>Earlier this week, we were made aware of Samsung’s Game Optimizing Service (GOS) and how it throttles the performance of games and applications. GOS decides to throttle (or not to throttle) applications using application identifiers and not application behavior. We view this as a form of benchmark manipulation as major benchmark applications, including Geekbench, are not throttled by this service.


So... basically the same thing all GPU drivers do on PC, then?


Yeah, they can't even invent their own way of cheating.

That's why the only valid GPU reviews are video reviews you can watch, not metrics.


> It’s this viral appeal that’s gotten the company into trouble. Without properly explaining the feature, Samsung has allowed many people to confuse its AI-improved images for a physics-defying optical zoom that cannot fit in a smartphone

I wouldn't say 'caught' or 'in trouble'. They're applying some AI black box to their photo tech, and the lack of transparency is annoying people, so people make viral posts like this claiming they 'caught' Samsung (when in reality it reads like an ad for Samsung).


> physics-defying optical zoom that cannot fit in a smartphone

Is it actually "physics-defying", or just something that can't be done with available technology?


It's physics-defying. Optical resolution is directly tied to the lens diameter; there's no way around this.

Images are formed by interference of plane waves propagating at different angles. To resolve finer details, your optical system needs to collect waves that were propagating away from the source (the moon) at larger angles. This means you can either get closer to the moon or increase the lens diameter. It doesn't matter how the lens works, you could use any photonics magic trick you want, you're still limited by the (aperture) size of the optical system.

Samsung's lens is purportedly too small to resolve the level of detail produced in the moon images.
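
Back-of-envelope, with assumed numbers (the ~5 mm aperture and green light are my guesses, not figures from Samsung):

    # Rayleigh criterion applied to a phone camera aimed at the Moon.
    # All inputs are assumptions for illustration.
    WAVELENGTH = 550e-9      # m, green light, middle of the visible band
    APERTURE = 5e-3          # m, a generous guess for a phone tele lens
    MOON_DISTANCE = 3.844e8  # m, mean Earth-Moon distance

    # Smallest resolvable angle: theta ~ 1.22 * lambda / D
    theta = 1.22 * WAVELENGTH / APERTURE    # ~1.3e-4 rad
    feature = theta * MOON_DISTANCE         # smallest resolvable lunar feature

    print(f"smallest resolvable feature: {feature / 1000:.0f} km")  # ~52 km

Anything finer than roughly 50 km in the photo wasn't resolved optically.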


It's fair to say physics-defying.

Diffraction sets a fundamental lower limit on the lens aperture diameter required for a given resolution.

See: https://en.m.wikipedia.org/wiki/Angular_resolution#The_Rayle...

Also: https://en.m.wikipedia.org/wiki/Dawes%27_limit (simplified, for the more common use case)
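
Plugging the same hypothetical 5 mm aperture into the simplified Dawes formula (R = 116/D, with R in arcseconds and D in mm):

    # Dawes' limit for an assumed 5 mm phone aperture.
    APERTURE_MM = 5.0
    MOON_DISTANCE = 3.844e8          # m
    ARCSEC_TO_RAD = 1 / 206265

    r_arcsec = 116 / APERTURE_MM     # ~23 arcsec
    feature = r_arcsec * ARCSEC_TO_RAD * MOON_DISTANCE

    print(f"~{feature / 1000:.0f} km per resolution element")  # ~43 km

Same order of magnitude as the Rayleigh figure: tens of kilometres per resolved detail, far coarser than the crater texture in the photos.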

>> Samsung’s process is more intrusive than this: it doesn’t just improve the sharpness of blurry details — it creates them.


I don't understand. Is there a difference between those two statements?

There is no known technology that would make it physically possible to fit such a camera in a mobile phone, just as there is no known technology that would make it physically possible to travel faster than light.


Yes, there's an obvious difference. One just implies our tech isn't there yet but could be; the other implies we don't think it's possible at all, with any technology.


The truth is slightly murkier: engineering improvements did bring us much closer to the physical limits between, say, my original Motorola Razr flip phone and its modern incarnation.

But yes, manufacturers are now squeezing the last few percent out of the diffraction limit rather than leaving most of it on the table, so the year-by-year leaps and bounds in camera quality are mostly coming from AI enhancement, not optics or sensors.


The article does a bad job of explaining itself. The problem is that the AI only works on the one Moon we have; it would not pick up a meteor impact, a Starlink satellite, the ISS, a lunar lander, or whatever.

Same as if they used the same trick with a model Coke can, and you tried to take a blurry photo of an insect on your can.

With the moon it's especially ridiculous, because no one wants Samsung's prefab moon photo; we have tons of those. They want to see the phone zoom in on what they are actually seeing, just for fun. I don't know, maybe if you photograph the moon behind a tree, you'd like a more detailed moon. But it's Moon "Zoom", not Space Zoom, and it's virtual zoom. It's like "Just Egg", the vegan egg, but Just Egg knows you know it's not a real egg. Samsung just keeps lying and pretending we're stupid, instead of advertising the nifty feature for what it is. That's tacky.

Samsung's argument is "if you can't tell, does it matter?"

This is the new post-truth AI world.


Samsung is cheesy and misleading for doing this, but it is also strangely clever.


They seem to have taken the mantra "fake it till you make it" very seriously.

A smart move, actually, since most people only care that their phone takes decent selfies at Instagram resolution. Heck, I don't even know what each of the three cameras on the back of my new phone is for. I stopped keeping track of megapixels once they went above 10MP or so.


I'm pretty sure that all big phone producers use some ML stuff to improve image quality, so I wouldn't call it "faking". But the marketing might've been a bit deceptive.

I wonder what people really expect, though. AFAIK the camera has 10x optical zoom, and after that up to 100x digital zoom. What does digital zoom mean? I'd say using normal scaling and other algorithms to improve quality.
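
Roughly, plain digital zoom is just crop-and-upscale. A minimal sketch with Pillow (the 2x factor and file names are placeholders):

    from PIL import Image

    def digital_zoom(img, factor):
        w, h = img.size
        cw, ch = int(w / factor), int(h / factor)
        left, top = (w - cw) // 2, (h - ch) // 2
        # Crop the centre, then resample back to the original size.
        crop = img.crop((left, top, left + cw, top + ch))
        # Lanczos is the "normal scaling" part: it interpolates between
        # captured pixels; it never recovers detail the sensor missed.
        return crop.resize((w, h), Image.LANCZOS)

    zoomed = digital_zoom(Image.open("shot.jpg"), factor=2.0)
    zoomed.save("shot_2x.jpg")

Everything beyond that kind of scaling is where the "other algorithms" (and the controversy) come in.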


So it's taking that thing where the fresh photo you took at a popular tourist spot is one that many people have taken before, to a whole new level. So I should be able to go into Google Maps, aim, and shoot to take a photo with ML.


Interesting take. It somehow feels different when you take a picture yourself. I wonder if people will feel less of this ownership aspect as the technology progresses.

I'm sure professional photographers already feel that way, as they often don't take photos taken with a phone seriously as photography.


I didn't buy my S23 because of this feature, but it's a neat trick and I've taken a few moon photos while out walking at night. Maybe you could argue the marketing is misleading, but AI-assisted photography isn't unique to Samsung.


Are they trying to link this product to the faking of the moon landing? Is this a weird viral marketing thing?


I realize now it definitely reads like I think the moon landing was faked. Derp.


I don't think it's bad at all. No other tech companies are making an effort. I'm a hard-core Samsung fan, and it definitely beats Apple, which is overpriced and decades behind on features.


It's just a clickbait article, like many others written on this topic. Samsung doesn't replace the image of the moon with some pre-saved picture they already have. They take the actual picture captured by your camera and then enhance it, whether with an AI algorithm or not, adding detail to make it look sharp. If you use a picture of a moon that's wrong (craters in the wrong place, etc.), it will just sharpen that up; it won't replace it with a "correct" version of the moon. How is that different from the selfie algorithms present on pretty much any phone, which make you look great by smoothing out your face, removing blemishes, etc.?
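
For reference, the conventional "enhance" step looks like an unsharp mask (a Pillow sketch; file names are placeholders):

    from PIL import Image, ImageFilter

    img = Image.open("moon_capture.jpg")
    # Unsharp mask: boosts local contrast at edges already in the capture.
    sharp = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
    sharp.save("moon_sharpened.jpg")

Whether Samsung's pipeline stops at this kind of filtering or layers model-generated texture on top is exactly what the argument is about.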


It's an important distinction as we move into a statistically-usually-true AI future (vs. the deterministically-true recent past). It doesn't matter until it does.

Showing pretty pictures of the moon on Instagram? Doesn't matter.

Planning a rover trip across the surface of the moon? Does matter.

And while it may be absurd to expect anyone is planning the latter, the problem is that nowhere in the marketing or documentation of the feature as it exists is there sufficient information to determine which use cases it should not be used for.

Or, in other words, if Samsung were doing the exact same thing for cancer detection microscopy, they'd be killing people...


>>Or, in other words, if Samsung were doing the exact same thing for cancer detection microscopy, they'd be killing people...

Now that's some major whataboutism if I've ever seen it.

Of course, if they use this tech for cancer detection, I will happily join the protests.

But they aren't. They take pictures you take with your camera and make them look sharp. If anything, it's super cool the way they do it without resorting to pre-saved pictures of the moon. It's absurd to me that people are getting annoyed at this, when literally all phones do image processing of one kind or another.

>>the problem is that nowhere in the marketing or documentation of the feature as it exists is there sufficient information to determine which use cases it should not be used for.

Really? That's the problem people are having with it? That a mode exclusively demonstrated and marketed for taking pictures of the moon doesn't come with sufficient information to establish which use cases it should not be used for? I find that extremely hard to believe; it sounds like a made-up problem for clicks, which is exactly what The Verge is doing (they are kind of famous for it).


Have you seen the original Reddit post?

You're right (I assume) that it is an ML thing and not a "replacement" of the image, but it takes a 170x170 px, intentionally blurry blob and turns it into a detailed moon photo. Very, very little of that photo is the result of light captured via the camera sensor.

This is the original poster's side-by-side:

https://imgur.com/ULVX933
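
For anyone who wants to replicate it, the setup the Reddit user described amounts to this (a sketch; the blur radius and file names are my guesses):

    from PIL import Image, ImageFilter

    # Shrink a real moon photo to 170x170 and blur it so no fine detail
    # survives, then display it full-screen and photograph it with the phone.
    src = Image.open("moon_hires.jpg").convert("L")
    blob = (src.resize((170, 170), Image.LANCZOS)
               .filter(ImageFilter.GaussianBlur(radius=3)))
    blob.save("moon_blob.png")

Any crater texture in the phone's shot of that blob was generated, not captured.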


Also, I just wanted to add to this: MKBHD has just posted his video about this, and he wasn't able to reproduce what the Reddit user did. Samsung did improve the photo of the blurry moon, but nowhere near as much as the Reddit post shows. That's not to say he's lying, of course; it's just that maybe we need to take it with a grain of salt.

https://www.youtube.com/watch?v=1afpDuTb-P0


Yes, I have seen it, and I disagree with this statement:

"Very, very little of that photo is the result of light captured via the camera sensor."

The photo is entirely the result of the light captured by the sensor. After image processing, yes, detail is added that physically couldn't have been captured by the camera, but like I said above, that's literally all mobile photography nowadays. I don't see the problem; if anything, I'm extremely impressed by the result.


I don't see a problem either, but the fact of the matter is that the moon in that picture is much closer to generated than captured.


HN is funny... first there were links [1, 2] to the Reddit posts (where the person doing the investigation posted their results), and a few days later, the link to the mainstream media coverage (which is also based on the Reddit posts).

[1] https://news.ycombinator.com/item?id=35107601 [2] https://news.ycombinator.com/item?id=35123389


It's just randomness.



