
> starting around the iPhone 13, phones got really good.

ILCs (interchangeable-lens cameras) also got really good.

I'm not at all doubting that smartphones are the primary cannibalization vector. But even for people who prefer cameras with significantly larger sensors than a smartphone can fit, the cameras themselves reached a point years ago where it became difficult even for ILC enthusiasts to justify regular upgrades, because what people already owned was "Good Enough". I reached this point with the Sony A7R Mk III; other people probably reached it sooner.

For years it felt like Canon and Nikon were aware this would happen with ILCs and were dragging their feet on camera body tech, making incremental upgrades behind where the technology should have been if they were competing at full speed. Then other vendors like Sony, who weren't part of this implicit agreement, came smashing in and pushed camera body tech along extremely quickly for a few years (with Canon/Nikon having to follow along to some degree to keep up). It didn't take many iterations of pushing the technology to where it could be for ILC camera bodies to become something you feel no itch to upgrade from year to year, because the shiny new thing is an extremely marginal upgrade.

So the cannibalization of the market probably had two fronts, the larger one from smartphones, and a smaller but still significant one from "Good Enough" (which is an issue smartphones are starting to run into as well).




Yep, my wife and I used to do wedding photography, and from that period we have 2 Canon 5D Mk IIs and 2 40Ds along with 3 L lenses, some primes, and a few cheaper zooms. There's absolutely no reason we would ever need another camera. Even people with a single body wouldn't ever need another one unless they broke it.


I'm guessing you don't record videos then.

In terms of video capabilities, the Canon 5D Mk II is limited to 8-bit 4:2:0 1080p H.264 recording at 30fps, maxing out at 12 minutes per clip. That is a far cry from the 10-bit (or 12-bit) 4:2:2 4K, 6K, or even 8K RAW or ProRes at 120fps or higher, with unlimited recording, that you get from a similarly priced camera in today's money.
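To get a feel for the gap, here is a rough back-of-the-envelope sketch of uncompressed data rates (the helper function and figures are illustrative, not manufacturer specs):

```python
def uncompressed_mbps(width, height, bits, chroma_samples, fps):
    """Approximate uncompressed video data rate in Mbit/s.

    chroma_samples: Y'CbCr samples per pixel after subsampling
    (4:2:0 -> 1.5, 4:2:2 -> 2.0, 4:4:4 -> 3.0).
    """
    return width * height * bits * chroma_samples * fps / 1e6

# 5D Mk II era: 8-bit 4:2:0 1080p at 30 fps
old = uncompressed_mbps(1920, 1080, 8, 1.5, 30)
# A modern spec: 10-bit 4:2:2 UHD 4K at 120 fps
new = uncompressed_mbps(3840, 2160, 10, 2.0, 120)
print(f"1080p30 8-bit 4:2:0: {old:,.0f} Mbit/s uncompressed")   # 746
print(f"4K120 10-bit 4:2:2:  {new:,.0f} Mbit/s uncompressed")   # 19,907
print(f"ratio: {new / old:.0f}x")                               # 27x
```

Compression changes the absolute numbers, but the roughly 27x difference in raw pixel throughput is why the newer modes need entirely different processing silicon and storage.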

(It's limited in terms of RAW photos as well, though: the best recording option is 8-bit 10MP RAW.)

No phone comes anywhere near that either, not to mention that phone lenses can't compete with real interchangeable lenses. The difference probably doesn't matter to someone who is just going to record their baby walking around and watch it on a 7-inch screen, but of course that's not the target audience for those cameras.


10-bit 4:2:2 is only for editing in post; the majority of consumers don't need that.


If you read the last sentence of the post that you're replying to, I already said that it won't matter to most people.

That being said, "is only for editing in post" isn't really true: banding is an issue in high-dynamic-range scenes at 8-bit, not just skies but also strong lights and deep shadows. And even if it were true, it doesn't mean people won't want it. Around 10-15 years ago, in the age of single-digit-GB slow SD cards and weaker camera/phone processors, that's what people used to say about RAW photos. Now RAW is mainstream even in phones, with built-in editing apps and easy-to-use desktop programs with a few knobs. So editing in post isn't itself a barrier to mainstream adoption. The issue is that video editing currently has a high barrier: it's essentially impractical on portable devices, the programs have learning curves, and the whole stack requires some financial investment.
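On the banding point: it's ultimately just quantization. A small sketch (NumPy assumed; the ramp endpoints are arbitrary) of how many distinct codes the same smooth gradient gets at each bit depth:

```python
import numpy as np

# A smooth brightness ramp over part of the tonal range (endpoints arbitrary).
ramp = np.linspace(0.25, 1.0, 4096)

def distinct_levels(signal, bits):
    """Quantize to the given bit depth and count distinct codes used."""
    return len(np.unique(np.round(signal * (2**bits - 1)).astype(int)))

print(distinct_levels(ramp, 8))   # 192 steps across the ramp
print(distinct_levels(ramp, 10))  # 768 steps: 4x finer gradation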


Majority of displays are 8bit and it will probably stay as standard for while.

Btw, over 10 years old Canon 5DMIII can shoot RAW video with MagicLantern. Manufacturers should open/update code to their old cameras that are capable do this. Its really disappointing when marketing ruin whole product. No wonder that camera market dying.


Even when targeting 8-bit displays, recording 10-bit is still beneficial. Besides obvious benefits in editing and encoding, simply playing a 10-bit video file straight from the camera on an 8-bit screen is useful when applying any common "effects" (brightness, contrast, LUT, colorspace transformation, gamma correction, tonemapping, etc etc).

> Btw, over 10 years old Canon 5DMIII can shoot RAW video with MagicLantern

Not sure why that is relevant in this context, but any digital camera would be capable of shooting RAW video with hacks: they all have photo-sensors and RAW simply means dumping the digitized signal data in a suitable format. It's a matter of hacking the device. But it doesn't mean you should do it, especially when that's not what they're designed for. Unsurprisingly, in the case of Canon 5D Mark III (which is a photo-oriented camera lacking a stabilizer, you can read about further limitations such as the under-utilized sensor in video mode [which typically happens due to hardware limitations] here https://www.dpreview.com/reviews/canon-eos-5d-mark-iii/25), a lot of potential problems await apparently: https://www.cined.com/consider-this-before-you-shoot-raw-on-... For RAW video, at the very least, you need a more reliable storage hardware hooked to your device with sufficient capacity for recording (meaning CFExpress or NVMe via USB, not SDXC), and possibly active cooling, both missing from that camera so it would require some hardware modding.

That being said, modern video cameras can also do more than the trivial task of recording RAW: they can handle processing and encoding of higher quality videos (resolution, bpp, frame rate) in real time, which requires specialized silicon missing from Canon 5D Mark III.

> Manufacturers should open/update code to their old cameras that are capable do this. Its really disappointing when marketing ruin whole product. No wonder that camera market dying.

1. The camera hardware isn't actually designed for it (by the way, even with new video cameras, there are usually trade offs, you turn one feature on and another becomes inaccessible) 2. that's not the reason why the consumer camera market is shrinking, and 3. doing that would shrink the market volume even further.


Temporal Dithering also known as Frame Rate Control is very often used in 8 bit panels to allow them to display almost as many colours as a 10bit panel.

From the input perspective you're running it as a 10bit panel


Videos are not what most people mean by "photography", 10-bit is mostly a gimmick (there are situations where it gives a real advantage, but they're niche), and higher-than-1080p resolutions are honestly pretty marginal a lot of the time. 30fps is pretty awful though.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: