Hey HN! I recently launched this app, and I thought HN might find it interesting.
I'm primarily a web developer, but I decided to learn Swift and build it as a native app instead of doing it with Electron and a JS framework.
The main goal with this app was to save myself time creating screencasts, but I've heard from people doing YouTube and podcasts that it's saving them a lot of time too.
Part of that goal involved keeping the app itself fast, which was a big part of why I chose to go the native route. (AVFoundation was another big reason!)
One thing I'm pretty proud of is the waveform rendering. It loads the whole file, downsamples it to several resolutions, and switches between them as you zoom in and out. Zooming feels very snappy because the app goes out of its way to avoid drawing more than necessary. The trial is free (no email required) if you want to see how it works.
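If you're curious, the core idea looks roughly like this -- a simplified sketch, not the app's actual code: precompute min/max buckets at a few resolutions up front, then pick whichever level gives about one bucket per on-screen pixel when drawing.

    import Foundation

    // Simplified sketch of the multi-resolution idea, not the app's actual code.
    // Each level stores one (min, max) pair per bucket of samples; coarser levels
    // cover more samples per bucket, so there's less to draw when zoomed out.
    struct WaveformLevel {
        let samplesPerBucket: Int
        let buckets: [(min: Float, max: Float)]
    }

    func buildLevels(samples: [Float],
                     bucketSizes: [Int] = [256, 1_024, 4_096, 16_384]) -> [WaveformLevel] {
        bucketSizes.map { size -> WaveformLevel in
            var buckets: [(min: Float, max: Float)] = []
            var start = 0
            while start < samples.count {
                let chunk = samples[start..<min(start + size, samples.count)]
                buckets.append((min: chunk.min() ?? 0, max: chunk.max() ?? 0))
                start += size
            }
            return WaveformLevel(samplesPerBucket: size, buckets: buckets)
        }
    }

    // When drawing, pick the coarsest level that still has at least one bucket
    // per on-screen pixel, so zooming never draws more detail than is visible.
    func levelToDraw(_ levels: [WaveformLevel], samplesPerPixel: Double) -> WaveformLevel {
        levels.last { Double($0.samplesPerBucket) <= samplesPerPixel } ?? levels[0]
    }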
Happy to field any questions (or feature requests for that matter) :)
The video [1] on your page is a well-designed demo. It speaks for itself, gets to the point, and shows off what the app does in a clear and practical way, in 1.5 minutes.
Yeah this was a tricky decision early on. Going with web tech seemed like the most obvious path to MVP since I've got that experience.
I started by doing some proof-of-concept work in Electron + Svelte, but pretty quickly it felt like I was hitting a use case that HTML video just isn't built for. It's not good at frame-accurate... anything. There was a big old GitHub issue discussing web support for editors like this, and it seemed like that was a long way off, if it was coming at all.
So then I started looking into native stuff. The AVFoundation library looked like a massive leg up. It takes care of reading and writing video and audio, and it's got some good functionality around managing editor concepts like tracks, clips, etc.
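To give a sense of what that buys you, the editing primitives look roughly like this (a toy sketch under my own naming, not code from the app): a composition holds tracks, and a clip is just a time range of a source asset inserted into one of them.

    import AVFoundation

    // Toy sketch of AVFoundation's editing primitives, not code from the app:
    // an AVMutableComposition holds tracks, and clips are time ranges of source
    // assets inserted into those tracks.
    func makeComposition(clipURLs: [URL]) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        guard let videoTrack = composition.addMutableTrack(
            withMediaType: .video,
            preferredTrackID: kCMPersistentTrackID_Invalid
        ) else { return composition }

        for url in clipURLs {
            let asset = AVURLAsset(url: url)
            guard let source = asset.tracks(withMediaType: .video).first else { continue }
            // Append the whole clip at the current end of the timeline.
            try videoTrack.insertTimeRange(
                CMTimeRange(start: .zero, duration: asset.duration),
                of: source,
                at: composition.duration
            )
        }
        return composition
    }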
I looked at SwiftUI first, and that felt sort of familiar coming from a React background -- one-way data flow and such -- but it also didn't seem fully baked yet, and it would limit the app to macOS 10.15 and newer. I know plenty of devs who hang on to old MacBooks and previous macOS versions for a long while (I'm one myself!), so I didn't want to rule out a huge swath of folks by choosing SwiftUI either.
That left regular Swift and the old standby AppKit stuff. I used the book Hacking with macOS to get going, which was an amazingly helpful guide.
So far I think Swift was a good choice. Time will tell, I guess :)