I love Halide but never remembered to open it. Luckily, iOS 14 Shortcuts automations make it possible to open Halide whenever the Camera app is opened. It now acts like my default camera app.
I'm pretty into photography and love to shoot manual on my mirrorless and film cameras, so I was naturally very intrigued by Halide. But unless I need to take a photo at a fast shutter speed, most photos look better taken through the stock camera with all the AI/ML processing Apple does. I guess I could also use Halide to shoot in RAW and edit later, but I don't really care to do that with iPhone photos.
Curious what you're shooting where you feel you need the fine-tuned control Halide provides, and whether you actually find the results better than the stock camera app?
For me, Halide lets me shoot regular RAWs, whereas the stock camera doesn't — I don't like working with ProRAW because it renders buggily in Photos and Lightroom and I don't find it worth the file size.
The tone map "flickers" on/off unpredictably and in a non-undoable way (when it's off it appears to just be a flat plain RAW). If I actually try editing a ProRAW file, the whole photo area gets covered in green or red rectangles on every slider tick — looks like a graphics driver bug.
Agreed. While I can _sometimes_ get better results out of ProCamera (especially if I'm doing digital "zoom" cropping), honestly the default camera app does better 90% of the time.
If I want the best results, I'll bring out my Fuji X70 or my mirrorless, both of which connect to my phone directly and automatically transfer shots (with geotags and everything).
This matches with my experience. I love shooting full manual on my mirrorless, even dialing in manual focus on a fast prime with some assistance from focus peaking, but when it comes to iPhone photography, I prefer shooting hands-off. It feels like the ML engine just knows the limitations of the sensor better than I do and can give me the most to work with in Photos' streamlined develop mode.
We're glad you dig it, and thanks for sharing this! Every year at WWDC, we hope that Apple will allow users to change their default camera app, but we aren't holding our breath. This is probably the best workaround.
Other users have found luck assigning the "back tap" accessibility gesture to a similar shortcut, though we've heard that gesture can be flaky.
>We set out to build something much more than a large iPhone app on a bigger screen. We had to completely rethink our design for the big, bigger and biggest portable screen.
>The best UI is usually as little as you can get away with, so just like on Halide for iPad, our exposure dials only appear while adjusting exposure. They fade away into the background after a few moments.
we had to think about what to do with all of this extra screen space, so we decided to hide even more of the UI
Exactly this. Hiding a UI behind a UI behind a UI is nothing but painful and slow.
I downloaded the app and was met with two subscription options and a purchase option. You cannot try the app free without choosing one of two subscription options or a rather silly purchase price of $40 USD.
Requiring a subscription signup before using software is Scuzzware...they are betting you forget you downloaded it and signed up for the sub. I have an extremely dim view of this practice. Strike 1.
They have UI hidden under UI. Strike 2.
UI buttons unmarked with no clear indication as to function. Strike 3 and uninstalled.
Previously, we offered a pay-once price with no trial option. With subscriptions, trials are straightforward, solve the App Store’s lack of upgrade options, and open the door to features with ongoing costs. It would be nice if Apple offered Sketch style licensing, but it is what it is.
If you don’t want to go with a subscription, we still offer a pay-once choice. Just be aware that you won’t get features with ongoing costs.
If the price doesn’t make sense to you, then it sounds like you’ll be happier with a cheaper app, and we wish you the best.
I was under the impression that the app store payment model makes it impossible to do a time-limited trial. You either have a lite version that lasts forever, or a pro version which requires up-front payment, no tryout possible.
Those trials only work with subscriptions, at least as far as I understand it. The trial just delays billing. This makes Sketch style pricing and trial behaviour difficult.
As mentioned in the post, when the viewfinder fills the screen, controls have to overlap the viewfinder. This makes it harder to judge your composition. That's why it still makes sense to keep your UI from looking like an airplane cockpit.
(Edit: You edited your comment so now I am forced to edit mine. Yours previously said something like "that's an intentional feature called 'ProView' described in the article".)
I'm disputing this part of the article:
> The best UI is usually as little as you can get away with
I think this is what designers honestly believe, and obviously you do too. I just don't think it's true. I know that undiscoverable UIs have been a trend for a while now, and I wonder how much damage they have done to human productivity and the reputation of software developers in general.
It's possible your goal is to make an intentionally difficult-to-use app, so that people are forced to learn the tricks and secrets, and then eventually feel "invested" in the app after they have done so, making it less likely for them to switch. Snapchat does this.
But this is a photography app, not a trendy social network for Generation Z, and so the decision to make it intentionally hide UI controls from you is, in my opinion, completely crazy.
"As little as you can get away with," includes a discoverable UI. We go to great lengths to make things discoverable. For starters, there's a chevron providing a visual affordance, reminding you the QuickBar and Honeycomb are expandable with a swipe.
But we go even further. The first few times you launch the app, the UI starts with the QuickBar expanded. After a few moments, it closes, which will likely draw the user's attention. Once you've used the app a few times, we stop showing that helpful animation reminding you that the QuickBar is expandable.
The client also tracks if you haven't launched the app in a while, and brings back that helpful animation. (This is all tracked client-side in the name of privacy.)
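For the curious, here's a rough sketch of how that kind of client-side hint tracking could work (the names and thresholds are made up for illustration; this is not Halide's actual code), with everything stored locally in UserDefaults:

```swift
import Foundation

/// Illustrative onboarding-hint policy. `QuickBarHintPolicy`, the keys,
/// and the thresholds are all hypothetical, not Halide's implementation.
struct QuickBarHintPolicy {
    private let defaults = UserDefaults.standard
    private let launchCountKey = "quickBarHint.launchCount"
    private let lastLaunchKey = "quickBarHint.lastLaunchDate"

    /// Show the expand/collapse animation for the first few launches,
    /// or again if the app hasn't been opened in a while.
    func shouldShowHintOnLaunch(now: Date = Date()) -> Bool {
        let launchCount = defaults.integer(forKey: launchCountKey)
        let lastLaunch = defaults.object(forKey: lastLaunchKey) as? Date

        // Record this launch; nothing ever leaves the device.
        defaults.set(launchCount + 1, forKey: launchCountKey)
        defaults.set(now, forKey: lastLaunchKey)

        if launchCount < 3 { return true }                    // first few launches
        if let last = lastLaunch,
           now.timeIntervalSince(last) > 30 * 24 * 60 * 60 {  // ~a month of inactivity
            return true
        }
        return false
    }
}
```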
> It's possible your goal is to make an intentionally difficult-to-use app, so that people are forced to learn the tricks and secrets, and then eventually feel "invested" in the app after they have done so, making it less likely for them to switch. Snapchat does this.
If that were my goal, I would be a moron.
Snapchat is successful in spite of its incomprehensible UI due to network effects and its value proposition. If you've watched Snapchat's UI evolve over the years, they've trended toward more discoverability. You can only go so far with the novelty of a mystery box.
We have no network effects. We don't lock away your data. We live and die by our UI. There are many other apps with features similar to ours, and many customers tell us they tried all the other apps before settling on us.
If you don't like the UI, that's perfectly fine. You're welcome to your opinion. It just doesn't line up with the opinion of our customers.
> We did prototype a machine learning-based Concert Mode that detects when you are taking shots at a show and disables the camera automatically, but we decided that was perhaps a bit too much.
As much as I hate that person at a concert holding their iPad up to take photos, I respect the decision to let their users make the choice for themselves and to not become a gatekeeper (and, almost certainly, block use of the app in legitimate situations due to false positives).
I just don't understand the hate around concert photos/videos.
I always have a bunch of friends who wanted to see the show but couldn't make it. Being able to send them a few photos and a couple 30s clips lets us all enjoy it a bit and have a good message back and forth.
If it's a 2h show, I'm still completely "present" for 99% of it. It's not detracting from my enjoyment.
And I actually think it's pretty cool when there are moments when everyone is sticking up their phone to record -- it's this shared experience where everyone is expressing yes, this is the moment. You see it's not just important to you but to others. And all the glowing screens is aesthetically similar to the way people used to hold up their lit lighters when people smoked.
If I'm on stage and everyone's pulling out their phones to record at a certain moment, that's when I know I'm making magic and we're all sharing it. People are appreciating it so much it's worth saving.
I saw someone hold up an iPad for an entire Bjork show on the row in front of me, live-streaming the gig to their friends on Facebook. Didn’t totally block my view but was definitely annoying and I thought especially rude as there were notes saying Bjork had specifically requested no cameras be used (it was a gig with just an orchestra so she wanted people to focus on the music). My friend had a word with them at the interval and they did move it out of our line of sight at least, lol
You kind of do, right? If you had €50,000,000, you could have a private concert. You don't, so the general public is there, doing general public things.
It's nice if people are nice, and we should strive to be nice. I am not sure how that translates to a paid camera app that sees your child taking her first steps, runs it through a machine learning model that classifies it as a concert, and then prevents you from taking a video of it. Whatever infinitesimal amounts of concert jackassery it prevents comes with a downside of literally losing a once-in-a-lifetime event. That is the very definition of shitty software, and we should reject it aggressively.
If you want people to not block your view at concerts, demanding that they have a machine learning model control what they can take videos of is not the solution. Hire some guy to say "hey, stop filming on your iPad or you're out."
>I just don't understand the hate around concert photos/videos.
A quick snap and the phone going back to your pocket, sure.
Photos/videos with hands held up to shoot throughout the whole concert, or several songs? For these, the hate is easy to understand.
(a) Watch the fucking concert you're present at - don't be a subpar cameraman for it.
(b) Don't obstruct the others' view.
(c) Most likely you ain't gonna watch it again anyway. You barely watched it when it was actually live in front of you (instead you were concerned with taking video).
(d) There are video/photo pros at the concert, plus at least 100 other regular fans who will post their photos/videos anyway. Why exactly do you need yours? Because 20+ rows back from the stage is such a great position to be filming in the dark with a mobile phone that it shouldn't be missed?
>If I'm on stage and everyone's pulling out their phones to record at a certain moment, that's when I know I'm making magic and we're all sharing it. People are appreciating it so much it's worth saving.
OTOH, it tells me people don't care for the show, they just want to post that they were there too.
Absolute, reverent silence or heavy dancing/headbanging (depending on the genre) are the things that would actually make me feel like people feel they're listening to magic.
>You can enjoy a performance, and be in the moment, while still recording it.
Yes, just not as much, and with your mind occupied by another activity.
>Not to mention that there are artists that embrace bootleg recording, like Dead & Company
That's neither here nor there. Bootleg recordings were just some fans bringing in audio recorders (which you just keep on you). They don't put a screen between you and what you're there to see, nor have you busy trying to frame shots.
Just to give you a different perspective: in Japan, before COVID hit of course, I attended quite a few concerts, and none of them had any person holding up their phone or iPad to record anything, ever. It's completely banned at all the concerts I went to, and it's against the rules to the point that event staff will confiscate your phone and delete the recorded material. Not that I've seen anyone trying to break it. Maybe the ones done by amateurs at small stages don't have that rule, but even then, I've never seen any.
But what we did carry were penlights/Cyalume lights. And each song has a set of calls people do at specific parts of the song. And for the rest of it? We set our penlights to the color matching the artist on stage, and we wave them, and sing along and make the screaming calls when it's appropriate. I would feel this crazy emptiness in my hands with one penlight, so I'd bring two for both hands. The making-of-magic moment happens when a few thousand fans scream "Encore" at the end, and when we do this penlight waving and all the stuff together. Phones don't even come into the picture. They are in vibrate mode. Can't have one ringing when the MC is speaking and everyone else is quiet, that'd be rude as hell.
I guess it's totally different elsewhere, but I'd be in for a culture shock if I saw that many people trying to "capture the moment" on a device screen, instead of in their hearts and minds. For me, those are memories I treasure.
I do understand that I cannot share the experience with my family/friends, but I get to share the experience with the guy sitting/standing next to me, and maybe the guy I exchanged Twitter details with so we could talk about it afterwards. Met quite a few great people during those concerts, and some I talk with from time to time on Twitter even now. :)
But I'm precisely pushing back on the dichotomy you're presenting -- it's not an either/or choice about a screen vs hearts and minds, or sharing with friends vs. with the person next to you.
You can have both -- you can have all of it. Record a couple of minutes, share with friends, and make friends next to you as well the entire rest of the concert and have those memories. Plus, you can multitask. Even if I'm filming a quick bit of a song, I'm sure as heck enjoying the music and experience too! It's not like I stop listening or looking. :)
That's not necessarily false, but it is missing the point.
The problem artists and other people have isn't that the individuals recording are having less fun, it's that they're ruining the show for everyone else.
If I'm seeing a band I love live, I want to see the band. I want to pay attention to the way the frontperson interacts with the audience and the expressions they make as they sing. Those are subtle details, and someone holding a bright shiny rectangle above their head and blocking my view is distracting.
OTOH, I spent a night at one of the largest nightclubs in Tokyo with a crowd that was basically standing still, lightly fistpumping while they were recording the whole set on their phones. Really killed the mood.
I feel the issue in this case is more a tablet being used to record a show versus a phone. The larger tablet blocks more of the view of people behind you. This does kind of fall apart with the blurred distinction between large phones and small tablets though, so in this case I feel like their decision was the right way to go (in addition to the other issues like false positives, complexity, etc).
Same. I thought this was a Halide integration on iOS. Interesting how products in intersecting spaces converge on names.
I am curious though if any tech on iOS leverages Halide and whether it's even feasible. It would be cool to get Halide generators to wrap some of Apple's special image processing and ML hardware.
Could you please add a digital zoom option in Halide? I bought the app to have full manual control over the camera and I don’t understand why it leaves out this standard feature. One of my use cases is that I often use my phone with one of those clip-on macro lenses or with pocket microscopes, and there you just have to zoom in digitally. And even for everyday shooting I sometimes just want to zoom in slightly. Of course everything is possible in post, but since you are boasting full manual control, I also expect the app to have all the features of the default iPhone camera.
Doing digital zoom "right" involves some serious thought.
One issue is technical. The iPhone cannot capture a RAW photo with digital zoom enabled. It's restricted at an API level. The reasoning is that it wouldn't make sense to crop a RAW file, and there are privacy concerns around saving data in a photo that the user thought was cropped.
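For anyone curious what that looks like in code, here's a minimal AVFoundation sketch of Bayer RAW capture (illustrative only; `RawCaptureController` is a made-up name, not Halide's implementation). The zoom factor is left at 1.0, in line with the restriction described above:

```swift
import AVFoundation

// Hypothetical sketch of configuring un-processed (Bayer) RAW capture.
final class RawCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }
        session.sessionPreset = .photo

        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        // Leave digital zoom alone: RAW capture assumes the full, uncropped
        // sensor readout (the API-level restriction mentioned above).
        try device.lockForConfiguration()
        device.videoZoomFactor = 1.0
        device.unlockForConfiguration()
    }

    func captureRaw() {
        // Pick a Bayer RAW format (as opposed to Apple ProRAW). iOS 14.3+.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
            .first(where: { AVCapturePhotoOutput.isBayerRAWPixelFormat($0) }) else { return }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let dngData = photo.fileDataRepresentation() else { return }
        // `dngData` is a DNG container; write it to a file or the photo library here.
        print("Captured \(dngData.count) bytes of DNG data")
    }
}
```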
Another issue is that digital zooming is awful. You're typically looking at simple interpolation, whereas cropping the photo afterwards in any half-decent editor offers better algorithms. Very smart software can apply super-resolution.
Then there's added UI complexity. On hardware that supports optical zoom, you want to clearly surface to the user whether digital or optical zoom is engaged. Many users are caught by surprise by how the first-party camera app quietly switches between the telephoto lens and a crop of the wide-angle lens when it needs to focus on macro shots; the wide-angle lens has a closer minimum focus distance.
From talking to users, we found many folks use digital zoom as a focus aid. That's why MkII included a focus loupe, which solved that problem in a much more elegant way.
We do have some ideas on how to do digital zoom well. Stay tuned.
> So I remember very clearly at the 2012 Olympics in London, if you looked around the stadium, you saw a lot of people using an iPad as a camera and generally that was people that just needed to have a bigger viewfinder for vision reasons, etc.
For sporting events we use iPads for videos and photos. The reason is that the quality is "good enough" and the large screen makes it so much easier to actually follow the action on the screen. When we tried to use phones, people were much more likely to watch the action directly rather than on the screen, and then would miss the shot because they couldn't get the camera aligned fast enough.
When you say "we" do you mean a professional outfit, or just as a group of consumers at a sports event? I thought pros were all about DSLRs and long lenses at sporting events. What kind of action can you actually detail with an iPad's lens?
For my mom, her iPad (Mini) is her only computer, including for taking photos. She much prefers the large screen (vs a phone or camera-sized screen) and wouldn't know how to transfer photos over from a traditional camera.
I imagine my mom is pretty close to the average demographic that goes to theme parks in this way.
My coach uses it to film climbing and it's really helpful for instant feedback. There are also apps for more in-depth analysis, for instance drawing geometry over your movement to understand and improve.
A long time ago, before iPads, we had this fancy slow-motion camera that felt pretty awesome at the time. But the cameras on Apple devices now are so amazing, with high FPS and super high quality, that it's crazy to look back at what we used to have!
That other app link also looks interesting; I think the idea of a team repository and archives would bring value to small teams. European football has this: players spend time each day with a program that tags parts of the game on timelines to view, annotate, and so on, which is super cool.
I saw something on HN a while back about ML trained to find the optimal pole vault body position. That would be super cool for other sports. And maybe pro-am apps that analyze, for instance, your running form with some ML tool that auto-detects your spine. There's the obvious team-strategy math that was being done before ML, but this is more about movement and body position. I hate it (and a lot of other climbers do too), but speed climbing could probably get faster with some new tricks to get a straighter center line up the wall.
In photography you’ll hear similar arguments about people using the camera display instead of the viewfinder. The viewfinder provides the best view of what the image will look like, but people would rather hold their camera out in front of them than stick their eye up to the viewfinder.
I think the argument can be made that the ProView mode offered by Halide would be better for image composition than the stock camera app, but as mentioned, using the whole screen won’t allow you to truly see everything.
Taking a photo with an iPad is somewhat surreal. Typically when you take photos with anything else, you need to bring them back to your PC and open them on a large screen to realize they're slightly out of focus or just too noisy. With the iPad, especially post-Retina display, you get an almost 1:1 relationship between the preview and the final result. From that perspective, an iPad is a better way to take a photo if the camera quality is the same.
Most of the users who pay money for software are on iOS. Combine that with the difficulty of working in the Android camera ecosystem, and you're left with a lot of risk for uncertain rewards.
Specific instructions: https://www.daviddegner.com/blog/change-default-camera-app-o...