Hey HN! I recently launched this app, and I thought HN might find it interesting.
I'm primarily a web developer, but I decided to learn Swift and build it as a native app instead of doing it with Electron and a JS framework.
The main goal with this app was to save myself time creating screencasts, but I've heard from people doing YouTube and podcasts that it's saving them a lot of time too.
Part of that goal involved keeping the app itself fast, which was a big part of why I chose to go the native route. (AVFoundation was another big reason!)
One thing I'm pretty proud of is the waveform rendering. It loads the whole file, downsamples it to several resolutions, and switches between them as you zoom in and out. The result is that zooming feels very snappy, because the app goes out of its way to avoid drawing more than necessary. The trial is free (no email required) if you want to see how it works.
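If you're curious how that works under the hood, here's a rough Swift sketch of the multi-resolution idea. The names and bucket sizes are invented for illustration; this is not the app's actual code:

    // Each level stores per-bucket peaks so drawing never touches
    // more samples than there are pixels on screen.
    struct WaveformLevel {
        let samplesPerBucket: Int
        let peaks: [Float]  // max(abs(sample)) per bucket
    }

    func buildLevels(samples: [Float],
                     bucketSizes: [Int] = [256, 1_024, 4_096, 16_384]) -> [WaveformLevel] {
        bucketSizes.map { size in
            var peaks: [Float] = []
            peaks.reserveCapacity(samples.count / size + 1)
            var i = 0
            while i < samples.count {
                let bucket = samples[i ..< min(i + size, samples.count)]
                peaks.append(bucket.reduce(0) { max($0, abs($1)) })
                i += size
            }
            return WaveformLevel(samplesPerBucket: size, peaks: peaks)
        }
    }

    // Pick the coarsest level that still gives at least one bucket per
    // pixel, so zooming swaps arrays instead of re-reading the file.
    func level(for visibleSamples: Int, pixelWidth: Int,
               from levels: [WaveformLevel]) -> WaveformLevel {
        let target = max(1, visibleSamples / pixelWidth)
        return levels.last { $0.samplesPerBucket <= target } ?? levels[0]
    }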
Happy to field any questions (or feature requests for that matter) :)
The video [1] on your page is a well-designed demo. It's self-evident, gets to the point, and shows off what the app does in a clear and practical way, in 1.5 minutes.
Yeah this was a tricky decision early on. Going with web tech seemed like the most obvious path to MVP since I've got that experience.
I started by doing some proof-of-concept work in Electron + Svelte, but pretty quickly it felt like I was hitting a use case that HTML video is not meant for. It's not good at frame-accurate... anything. There was a big old GitHub issue discussing web support for editors like this, and it seemed like it was a long way off, if ever.
So then I started looking into native stuff. The AVFoundation library looked like a massive leg up. It takes care of reading and writing video and audio, and it's got some good functionality around managing editor concepts like tracks, clips, etc.
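For a feel of what it buys you, here's a minimal sketch (illustrative only, error handling trimmed; not the app's actual code) of stitching a list of keep-ranges into a timeline with AVMutableComposition:

    import AVFoundation

    // AVMutableComposition acts as the timeline; each insertTimeRange
    // call drops a clip (video + audio tracks) onto it in order.
    func makeRoughCut(from asset: AVURLAsset,
                      keeping ranges: [CMTimeRange]) throws -> AVMutableComposition {
        let composition = AVMutableComposition()
        var cursor = CMTime.zero
        for range in ranges {
            try composition.insertTimeRange(range, of: asset, at: cursor)
            cursor = CMTimeAdd(cursor, range.duration)
        }
        return composition
    }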
I looked at SwiftUI first, and that felt sort of familiar coming from a React background -- one-way data flow and such -- but it also didn't seem fully baked yet, and it would limit the app to macOS 10.15 and newer. I know plenty of devs who hang on to old MacBooks and previous macOS versions for a long while (I'm one myself!), so I didn't want to rule out a huge swath of folks by choosing SwiftUI either.
That left regular Swift and the old standby AppKit stuff. I used the book Hacking with macOS to get going, which was an amazingly helpful guide.
So far I think Swift was a good choice. Time will tell, I guess :)
Your demo video looks really great. I hope no one uses 0 seconds. That's one thing I find annoying about a lot of YouTube channels: they talk very fast and edit out all the spaces. My brain simply can't keep up with it. It's too bad, because the content is interesting but it's completely unintelligible and exhausting. I watch most of my videos via Chromecast, but Chromecast doesn't allow you to play videos at slower speeds. And it wouldn't add back the spaces anyway.
What if your tool were available in real time for people who wanted it? Then I could have normal videos.
Hah, yeah. I don’t like it cut that close together either.
I don’t know if anyone is using it for this yet, but it’s kinda fun to drag in a recording of a Zoom call or something and play it back with the silence (mostly) cut out. Not quite real time yet, since it has to load the file first, but this could be done.
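The detection side is conceptually simple. Something like this hypothetical RMS-window scan (not the app's actual algorithm) gets you most of the way there:

    // Walk the decoded samples in fixed windows and keep the ranges
    // whose RMS rises above a threshold; playback then skips the gaps.
    func audibleRanges(samples: [Float], sampleRate: Double,
                       windowSize: Int = 2_048,
                       threshold: Float = 0.01) -> [ClosedRange<Double>] {
        var ranges: [ClosedRange<Double>] = []
        var start: Int? = nil
        var i = 0
        while i < samples.count {
            let window = samples[i ..< min(i + windowSize, samples.count)]
            let rms = (window.reduce(0) { $0 + $1 * $1 } / Float(window.count)).squareRoot()
            if rms >= threshold {
                if start == nil { start = i }      // sound begins
            } else if let s = start {              // sound just ended
                ranges.append(Double(s) / sampleRate ... Double(i) / sampleRate)
                start = nil
            }
            i += windowSize
        }
        if let s = start {                         // file ends mid-sound
            ranges.append(Double(s) / sampleRate ... Double(samples.count) / sampleRate)
        }
        return ranges
    }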
Neat app, I could see myself using this. However, one thing that makes me hesitant is bitrates and deeper information on file exports. Does it maintain the same bitrate as the original video? I see you have a lot of information (Cmd + F) on "export" but nothing regarding bitrates. An FAQ page would be useful instead of the long single page. I work in video production, so bitrate is an important thing to know before purchasing. Anyway, cool app.
Yeah, good call, I’ll add something about this. Right now it uses AVFoundation’s “highest” preset which seems just about as fast as the “passthrough” one I was using originally, so I suspect it’s leaving the video (mostly?) alone. I’ll try to get some actual numbers on that though.
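For reference, the two presets are just different AVAssetExportSession preset names. A sketch of the difference (not the app's actual export code):

    import AVFoundation

    // Passthrough copies the compressed frames as-is (no re-encode, so
    // bitrate is untouched); HighestQuality re-encodes at the encoder's
    // top setting.
    func export(_ composition: AVComposition, to url: URL,
                reencode: Bool, completion: @escaping (Error?) -> Void) {
        let preset = reencode ? AVAssetExportPresetHighestQuality
                              : AVAssetExportPresetPassthrough
        guard let session = AVAssetExportSession(asset: composition,
                                                 presetName: preset) else {
            completion(CocoaError(.featureUnsupported)); return
        }
        session.outputURL = url
        session.outputFileType = .mov
        session.exportAsynchronously {
            completion(session.error)  // nil on success
        }
    }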
I don't know anything about how to do it with AVFoundation (or if it's possible, or if you're already doing this), but to some approximation depending upon codecs, you can make a lossless recode of the video. E.g., what Rogue Amoeba's Fission does, but for both video and audio.
You could also provide a way to export a list of cut points suitable for feeding to some other workflow. E.g., ffmpeg.
These may go beyond the simplicity and spirit of your app, which seems very nice.
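As a sketch of what that could look like (the select/aselect filters are real ffmpeg; the Swift wrapper is hypothetical):

    // Turn a list of keep-ranges (in seconds) into an ffmpeg command
    // that drops everything outside them and rebuilds the timestamps.
    func ffmpegCommand(input: String, output: String,
                       keep: [(Double, Double)]) -> String {
        let expr = keep.map { "between(t,\($0.0),\($0.1))" }.joined(separator: "+")
        let vf = "select='\(expr)',setpts=N/FRAME_RATE/TB"
        let af = "aselect='\(expr)',asetpts=N/SR/TB"
        return "ffmpeg -i \(input) -vf \"\(vf)\" -af \"\(af)\" \(output)"
    }

    // e.g. ffmpegCommand(input: "in.mov", output: "out.mov",
    //                    keep: [(0, 4.2), (6.8, 12.0)])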
Oh yeah, the cut list is exactly what the app does now :) It can export XML, EDL, or a ScreenFlow project. That was the first feature I built, and the main thing I wanted. I think the video example demos better, but to me, the cut list is way more useful.
Damn, awesome work! Out of curiosity, do you have paying customers yet? I was working on a similar thing in this niche and found it hard to find customers who were willing to pay.
I’m mainly a web developer creating content for web developers, and so most of my following wouldn’t really want this tool. But I’ve gotten to know a handful of other content creators that could use something like this, and I think that probably helped to spread the word on Twitter a bit.