Hacker News
Show HN: A music player that creates real-time music videos (hyperchroma.app)
61 points by aeroheim on Nov 27, 2021 | hide | past | favorite | 55 comments



I've been working on a music player that combines audio analysis and shaders to create music videos that play alongside your music. The videos are similar in style to the edited music videos you can often find on YouTube and the like, but are done in real-time!

edit: to clarify some confusion, this app is basically a music player similar to Wallpaper Engine but with more focus on visualizations. The trailer contains an example of the videos produced - you load your images into the app, and it uses them to create a visualization synchronized with the music, similar to the style of edited music videos.


I really like it. I think a lot of the criticisms could be solved by finding a bunch of royalty-free images to have preloaded, neutral patterns/graphics so it's not restricted to working for specific genres, and across the images put some subtle text saying "Demo graphics, replace with your own..."

You basically want to show the best possible example of it working, right out of the box, as that very first impression is what sticks.

For your demo video I'd also have it change music genres during the demo, and with each change also change the images to suit that genre. Show 10 seconds of generic rock and have the images be a concert, then switch to 10 seconds of classical music and have the images be nature/space, etc.

I think I'm one of the few who still have a large local music library and listen to it lots, usually while working on hardware projects or writing/sketching ideas, so I'd happily use something like this up on my monitor while doing so.


Are there any demos available? Like, examples of what the music videos look like?

I'm a little wary of downloading what appears to be closed-source software and running it, especially if I don't know if it's something I'd be interested in using.

Otherwise, it's a really cool idea (I LOVE the old winamp visualizers).


Hey, thanks for your interest!

The footage in the trailer shows what the music videos look like. All of it is recorded directly from the app, and there's a section where the UI is hidden and it's mostly effects and transitions going on (that's the music video part). In hindsight I might not have made that part clear enough.

Also, the old winamp style visualizations were super badass! They played a huge part in my interest in audio visualization :-)


Not to be hyper-critical, but that video was a bit underwhelming. It might be helpful to find a more impressive example to highlight.


> The footage in the trailer shows what the music videos look like

What, the static anime girl with a Ken Burns pan and zoom?


Are you talking about the anime girl?

If so, does it do other things or just anime girls?


You can use whatever images you want. You load your images into the app along with your audio and it uses those images. There's a bunch of effects like filters, transitions, particles, etc. that you can customize to adjust the visualizations.


I guess what I'd really like to see is a demo of the different effects and filters and whatnot. Nothing extreme, but show how they can make the same pictures into a different experience.


I’m on iOS, the trailer from the website does not load for me. Do you have the video on YouTube?


I do have a Twitter video available (no YouTube unfortunately, sorry!)

https://twitter.com/hyperchroma/status/1464656712544174083

I'm not sure if the trailer loading is an issue on the site - could you check the console for me and see what it says?


I’m on iOS too. From experience it’s usually because the video container, codec, or options you are using for the video are not supported by iOS.

In your case I see you are using a webm container. iOS prior to version 15 does not support webm natively. Then earlier this year they started adding support for parts of webm in iOS 15. https://9to5mac.com/2021/08/10/apple-adding-webm-audio-codec...

I recommend converting the video using ffmpeg.

The following should give you a video file that will be playable on most mobile devices, even quite old ones.

    ffmpeg -i preview.webm \
      -c:v libx264 -crf 23 -profile:v baseline -level 3.0 -pix_fmt yuv420p \
      -c:a aac -ac 2 -b:a 128k \
      -movflags faststart \
      preview.mp4
The HTML5 video tag supports having multiple video sources so you can still keep the webm as one alternative and browsers will pick either the webm or mp4 depending on support.

See https://developer.mozilla.org/en-US/docs/Web/HTML/Element/so... for details on multiple sources in the video element, i.e. what changes you need to make to your HTML.
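For reference, the markup would look something like this (assuming the converted file is named preview.mp4 and sits next to the webm; the attributes are a suggestion, not taken from the site):

```html
<video controls playsinline>
  <!-- Browsers try sources in order; Safari on iOS skips the
       webm it can't decode and falls back to the mp4. -->
  <source src="preview.webm" type="video/webm">
  <source src="preview.mp4" type="video/mp4">
</video>
```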


I’m on the latest version of iOS (15.1.1) and it doesn’t load for me either. Perhaps the server hosting the video file is overloaded?


My comment originally only said that webm was not supported prior to iOS 15. But webm videos are not fully supported in iOS 15 either afaik, so I updated the comment to specify that they started adding support for parts of it. You may have seen an earlier version of my comment without that qualifier, in which case I apologize for any confusion.


Ah, thank you very much!


Just gave this a whirl and it's pretty neat! Some feedback:

- App looks very pretty. Highlights are the subtle animations when scrolling long lists in the Library, and using album art in individual rows.

- I like the output of the visualizer. The glitch transition between tracks looks great.

- Having the visualizer in the background while browsing music is a bit distracting.

- Definitely needs some default imagery for the visualizer, I expected it to work out of the box.

- Took a really long time to index my music collection (about 20 minutes for metadata and another 5 minutes to generate thumbnails). This is for 81,000 songs w/ a high-end desktop computer.

- Under library > artist, `Unknown Artist` has almost all albums attributed to it. Most of my collection is MP3 with ID3v2.4 tags (artist field always populated, album artist populated when applicable). In Hyperchroma, most artists are listed with the correct number of songs, but 0 albums.

Probably not what you're after, but I think this could be very successful as a free/cheap alternative to After Effects specifically to generate the music videos. (rather than a music player)


Thank you for the detailed feedback!

Some of the things you mentioned are definitely bugs. Do you mind submitting them to the issue tracker?


The demo here is underwhelming compared to old-school winamp visualizers.

But I think that with deep learning, there is now a real opportunity to make a kickass visualizer. Take a GAN and set the latent vector according to the DFT of the audio stream -- it would be super trippy. You could even use WaveNet and CLIP to make the visuals match the lyrics.
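As a rough sketch of the latent-vector idea (the GAN itself is stubbed out; the frame size, latent dimensionality, and log-magnitude mapping are all assumptions for illustration, not anything the app does):

```python
import numpy as np

def audio_frame_to_latent(frame: np.ndarray, latent_dim: int = 512) -> np.ndarray:
    """Map one frame of PCM samples to a GAN latent vector via the DFT.

    The magnitude spectrum is log-compressed and resampled to the
    latent dimensionality, so loud frequency bands push the latent
    point around and the generated imagery follows the music.
    """
    spectrum = np.abs(np.fft.rfft(frame))      # magnitude spectrum
    log_spec = np.log1p(spectrum)              # tame the dynamic range
    # Resample the spectrum to latent_dim entries by interpolation.
    src = np.linspace(0.0, 1.0, num=log_spec.size)
    dst = np.linspace(0.0, 1.0, num=latent_dim)
    latent = np.interp(dst, src, log_spec)
    # Normalize to roughly unit scale, as most GANs expect z ~ N(0, 1).
    return (latent - latent.mean()) / (latent.std() + 1e-8)

# A 1024-sample frame of a 440 Hz tone at 44.1 kHz:
t = np.arange(1024) / 44100.0
z = audio_frame_to_latent(np.sin(2 * np.pi * 440.0 * t))
# Each video frame, z would then be fed to generator(z).
```

A real version would smooth z across frames (e.g. an exponential moving average) so the visuals don't flicker on every spectral spike.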


Yeah, I can understand that.

I had a lot of fun listening to music in winamp when I was younger with milkdrop. But I think my tastes have changed now to where I prefer visualizations with more concrete imagery and variety instead of the usual variations of abstract effects (they are still dope as hell though).

I tried to design this app with that in mind, but I definitely see why some people don't find this style of visualization interesting. Maybe someday someone will put some serious effort into bringing demoscene-style visualizations into the modern era?


Demoscene style: https://github.com/XT95/VisualLiveSystem (of course the demosceners got you covered)


The winamp ones were done by genius demosceners


Yeah, to this day I think of Winamp visualizers as the best music visualization ecosystem to date. iTunes had (has?) visualization plugins, but they didn't compare to the heyday of Winamp. It seems like people just stopped trying in this category? Not sure where the audio visualization scene is at.


I remember back in the winamp days thinking "this is awesome, I can't wait to see how much more awesome this is in 15 years!".

Alas progression in tech can't be taken for granted.


Don't mean to hijack this thread, but I built out a mini Spotify player with visualization support: https://github.com/dvx/lofi

Even had a few folks contribute a couple of neat visualizations[1] :)

PS: Visualizations were unfortunately deprecated on Macs since like 1.5.0 because of the incredible difficulty of getting the audio stream data. Pretty sure there's no way of doing it these days outside of a kext.

[1] https://github.com/dvx/lofi/tree/master/src/visualizations


Actually, the new Music.app will load visualization plugins on SIP-disabled systems from a specific folder in a specific format, although it's been far too long now for me to remember where that folder was. I'm sure some keen reversers can take a look at Music.app and figure it out again. I for one absolutely love the iTunes visualizers; they even have hidden key commands to manipulate them.


What resources have you found on algorithmic music visualization?


Metagroove is a pretty great visualizer that has the same vibe as Winamp's, but with a modern touch -- and more importantly, a modern resolution.

https://marumari.itch.io/metagroove

https://www.reddit.com/r/WeAreTheMusicMakers/comments/jqda6f...


The idea seems quite nice but the execution isn't.

This uses crazy amounts of resources for what it offers. Electron just isn't the right platform for such a thing, imho.

Should be a nice Rust / OCaml / C++ / "whatever native" app with max. 10% the size and 1% of the current resource usage.

Maybe it's better on other systems regarding resources but the Linux version I've tried made my laptop fan go into fighter-jet mode. That's not appropriate for playing music and showing some "shaken images". When using my desktop media player (smplayer) and projectM as visualization there is hardly any load on the same system (and the fan remains of course completely off).

I'm not sure the media player component in this project is even needed. If the visualization needs to analyze larger parts of the audio, it could delay and buffer the signal (which would work well enough for longer-running background music), or just analyze the files on disk upfront and save the results externally.

At least on Linux there is a generic audio player interface that shows, for example, a widget in the task bar (I don't know the details of how this works, but it works). Information from this interface could be used to know which track is currently playing and to get notified about track changes. Since the generic widget can show the title and cover art and offer play/stop/pause/forward/rewind buttons, the needed player state is clearly available, and likely obtainable from other apps as well. I would guess such an audio player interface also exists on other desktop operating systems, so such a visualization could likely be made cross-platform.

Just my 2¢.


Does it have integration with "popular music streaming services"? Nowadays it is rare for users to have their music in MP3 format.


It doesn't at the moment. I would like to, but it's quite difficult getting access to PCM audio data from services for audio analysis due to DRM (see Widevine).


You could take a look at the Spotify Web API. It provides rough audio track analysis data at the /audio-analysis/ endpoint. Not sure if other services have anything similar.
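For anyone curious, fetching that data is a plain authorized GET. A minimal sketch (the track ID and token are placeholders, and the request itself isn't executed here):

```python
def audio_analysis_request(track_id: str, access_token: str):
    """Build the GET request for Spotify's audio-analysis endpoint.

    The response JSON contains beats, bars, sections, and per-segment
    timbre/pitch vectors - coarse but useful for driving visuals.
    """
    url = f"https://api.spotify.com/v1/audio-analysis/{track_id}"
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = audio_analysis_request("TRACK_ID", "ACCESS_TOKEN")
# e.g. requests.get(url, headers=headers).json() would return the analysis.
```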


Just seeing the title made me think of the mid-90s DOS MOD music player "Grind": https://youtube.com/watch?v=dJHnAoeQPGU


Are real-time videos just photo transitions with effects?


Yeah. It's most similar to the kind of edited stuff you see people making with programs like Adobe After Effects. The effects and transitions are synchronized with the currently playing audio, so there is a real-time aspect to it.


I just tried this, and I only got the big logo slowly floating in the middle of the screen, no matter what music I play. Am I missing something?


Hey,

You need to add images to it for it to visualize. You could try with some of your favorite wallpapers perhaps.

The docs on the site cover how to do it, but I'm going to work on making it clearer.


Ah, I see. It's a little unintuitive, to be honest, that a music visualization needs the user to supply an image. Perhaps supply a default image, or use the album art if the music file has one embedded?


I really like how this looks, and it'd be pretty nice to have it running on the monitor I have above my main monitor. The problem is that I only have s3m, it, xm, etc. music as actual local files, which this player doesn't seem to support. Most other music (like the background jazz I listen to) is streamed.


This crashes on macOS for me:

> … not valid for use in process: mapped file has no cdhash, completely unsigned? Code has to be at least ad-hoc signed.


Oof, thanks for reporting this. I think something might've broken in the code signing/notarization process for macOS recently.

Could you post an issue with the details on the issue tracker? https://github.com/Hyperchroma/hyperchroma/issues


Issue should be resolved now. Apologies!


same issue


Hey,

Just wanted to let you know that the issue should be resolved now. Apologies for the inconvenience.


Any demo video to understand what kind of output it will be?


The trailer contains footage of the music videos produced (the part with the fancy effects and transitions synchronized with the music). It's basically a real-time video that plays within the app alongside your music - it doesn't output videos as files. I probably need to make that part clearer somehow.


"real-time music visualization"


> it doesn't output videos as files

Er, might I ask why not? Be a lot cooler if it did...


Yeah, it would be pretty cool.

I think the main reason is that the visualizations are driven by audio analysis, which is often inferior to actual videos edited by hand. This isn't designed to compete with videos made by people, only to provide a good-enough substitute.

I could implement a video editor to let people have more control over the visualizations, but people can already use existing video editors to do that instead if they really wanted to. So I decided that it wasn't worth the effort.


Project looks great!

Tho, out of curiosity: Is it open source/will be?


Thank you :)

I haven't decided yet on whether or not I want to open source it. But I am definitely open to it and may do it in the future.


Ooh, glad to hear it. I wish you all the luck with the project as well :D


+1 for open source. Good luck OP!


Is it built on electron?


Yes.


Kudos for the effort. If you have a beast processor like the M1, I presume it would run OK, but this makes my 5+ year old machine cook an omelet, so I would never use it. As mentioned elsewhere in the comments, dvx/lofi is the same story: Electron and WebGL shaders, just not acceptable performance for such a tiny thing. The old-school demoscene coders are who we should learn from about writing efficient software for tasks like this.



