I've been working on a music player that combines audio analysis and shaders to create music videos that play alongside your music. The videos are similar in style to the edited music videos you can often find on YouTube and the like, but they're done in real-time!
Edit: to clarify some confusion, this app is basically a music player, similar to Wallpaper Engine but with more focus on visualizations. The trailer contains an example of the videos produced: you load your images into the app, and it uses them to create a visualization synchronized with the music, similar to the style of edited music videos.
I really like it. I think a lot of the criticisms could be solved by preloading a bunch of royalty-free images (neutral patterns/graphics, so it's not restricted to specific genres) and overlaying some subtle text on them saying "Demo graphics, replace with your own..."
You basically want to show the best possible example of it working, right out of the box, as that very first impression is what sticks.
For your demo video I'd also have it change music genres during the demo, and with each change also change the images to suit that genre. Show 10 seconds of generic rock and have the images be a concert, then switch to 10 seconds of classical music and have the images be nature/space, etc.
I think I'm one of the few who still have a large local music library and listen to it lots, usually while working on hardware projects or writing/sketching ideas, so I'd happily use something like this up on my monitor while doing so.
Are there any demos available? Like, examples of what the music videos look like?
I'm a little wary of downloading what appears to be closed-source software and running it, especially if I don't know if it's something I'd be interested in using.
Otherwise, it's a really cool idea (I LOVE the old winamp visualizers).
The footage in the trailer shows what the music videos look like. All of it is recorded directly from the app, and there's a section where the UI is hidden and it's mostly effects and transitions going on (that's the music video part). In hindsight I might not have made that part clear enough.
Also, the old Winamp-style visualizations were super badass! They played a huge part in my interest in audio visualization :-)
You can use whatever images you want. You load your images into the app along with your audio and it uses those images. There's a bunch of effects like filters, transitions, particles, etc. that you can customize to adjust the visualizations.
I guess what I'd really like to see is a demo of the different effects and filters and whatnot. Nothing extreme, but show how they can make the same pictures into a different experience.
The HTML5 video tag supports multiple video sources, so you can still keep the webm as one alternative and browsers will pick either the webm or the mp4 depending on support.
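For illustration, here's the fallback markup that suggestion amounts to, generated by a small helper (the trailer filenames are hypothetical):

```python
def video_tag(sources, attrs="controls"):
    """Build an HTML5 <video> tag with multiple <source> children.

    The browser walks the <source> list in order and plays the first
    format it supports, so listing webm before mp4 keeps webm as the
    preferred alternative with mp4 as the fallback.
    """
    lines = [f"<video {attrs}>"]
    for src, mime in sources:
        lines.append(f'  <source src="{src}" type="{mime}">')
    lines.append("</video>")
    return "\n".join(lines)

# Hypothetical filenames, for illustration only
html = video_tag([("trailer.webm", "video/webm"),
                  ("trailer.mp4", "video/mp4")])
print(html)
```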
My comment originally only said that webm was not supported prior to iOS 15. But webm videos are not fully supported in iOS 15 either afaik, so I updated the comment to specify that they started adding support for parts of it. You may have seen an earlier version of my comment without that qualifier, in which case I apologize for any confusion.
Just gave this a whirl and it's pretty neat! Some feedback:
- App looks very pretty. Highlights are the subtle animations when scrolling long lists in the Library, and using album art in individual rows.
- I like the output of the visualizer. The glitch transition between tracks looks great.
- Having the visualizer in the background while browsing music is a bit distracting.
- Definitely needs some default imagery for the visualizer, I expected it to work out of the box.
- Took a really long time to index my music collection (about 20 minutes for metadata and another 5 minutes to generate thumbnails). This is for 81,000 songs w/ a high-end desktop computer.
- Under library > artist, `Unknown Artist` has almost all albums attributed to it. Most of my collection is MP3 with ID3v2.4 tags (artist field always populated, album artist populated when applicable). In Hyperchroma, most artists are listed with the correct number of songs, but 0 albums.
Probably not what you're after, but I think this could be very successful as a free/cheap alternative to After Effects specifically to generate the music videos. (rather than a music player)
The demo here is underwhelming compared to old-school winamp visualizers.
But I think that with deep learning, there is now a real opportunity to make a kickass visualizer. Take a GAN and set the latent vector according to the DFT of the audio stream -- it would be super trippy. You could even use WaveNet and CLIP to make the visuals match the lyrics.
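A minimal sketch of the audio-to-latent mapping described above, assuming a pretrained generator with a 512-dimensional latent space (the generator itself is omitted; only the DFT step is shown, and the normalization scheme is an arbitrary choice):

```python
import numpy as np

def audio_frame_to_latent(frame, latent_dim=512):
    """Map one frame of PCM samples to a GAN latent vector via the DFT.

    Takes the magnitude spectrum of the frame, resamples it down to
    latent_dim bins, and normalizes to unit length so the generator
    sees inputs at a consistent scale.
    """
    spectrum = np.abs(np.fft.rfft(frame))      # magnitude spectrum
    bins = np.interp(                          # resample to latent_dim bins
        np.linspace(0, len(spectrum) - 1, latent_dim),
        np.arange(len(spectrum)),
        spectrum,
    )
    norm = np.linalg.norm(bins)
    return bins / norm if norm > 0 else bins

# You would feed z to a generator each frame; here a sine burst stands in
# for a real audio frame.
t = np.linspace(0, 1, 2048, endpoint=False)
z = audio_frame_to_latent(np.sin(2 * np.pi * 440 * t))
```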
I had a lot of fun listening to music in winamp when I was younger with milkdrop. But I think my tastes have changed now to where I prefer visualizations with more concrete imagery and variety instead of the usual variations of abstract effects (they are still dope as hell though).
I tried to design this app with that in mind, but I definitely see why some people don't find this style of visualization interesting. Maybe someday someone will put some serious effort into bringing demoscene-style visualizations into the modern era?
Yeah, to this day I think of Winamp visualizers as the best music-visualization ecosystem to date. iTunes had (has?) visualization plugins, but they didn't compare to the heyday of Winamp. It seems like people just stopped trying in this category? Not sure where the audio-visualization scene is at.
Don't mean to hijack this thread, but I built out a mini Spotify player with visualization support: https://github.com/dvx/lofi
Even had a few folks contribute a couple of neat visualizations[1] :)
PS: Visualizations were unfortunately deprecated on Macs since like 1.5.0 because of the incredible difficulty of getting the audio stream data. Pretty sure there's no way of doing it these days outside of a kext.
Actually the new Music.app will load visualization plugins on SIP disabled systems from a specific folder and specific format, although it's been far too long now for me to remember where that folder was. I'm sure some keen reversers can take a look at Music.app and figure it out again. I for one absolutely love the iTunes visualizers, they even have hidden key commands to manipulate them with.
The idea seems quite nice, but the execution doesn't live up to it.
This uses crazy amounts of resources for what it offers. Electron just isn't the right platform for such a thing, imho.
It should be a nice Rust / OCaml / C++ / "whatever native" app with at most 10% of the size and 1% of the current resource usage.
Maybe it's better on other systems resource-wise, but the Linux version I tried made my laptop fan go into fighter-jet mode. That's not appropriate for playing music and showing some "shaken images". When using my desktop media player (smplayer) with projectM as the visualization, there is hardly any load on the same system (and the fan of course stays completely off).
I'm not sure the media player component in this project is even needed. If the visualization needs to analyze larger parts of the audio, it could delay and buffer the signal (which would work well enough for long-running background music), or just analyze the files on disk upfront and save the results externally.

At least on Linux there is a generic audio-player interface that shows, for example, a widget in the task bar (I don't know the details of how it works, but it works). Information from this interface could be used to know which track is currently playing and to get notified about track changes. Since the widget can show the title and cover art and offer play/stop/pause/forward/rewind buttons, the needed player state is available and can likely be obtained from other apps too. I would guess such an interface also exists on other desktop operating systems, so a visualization like this could likely be made cross-platform.
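The delay-and-buffer idea amounts to a delay line: hold the signal back by a fixed number of samples so the analyzer can look at audio that hasn't been played yet. A toy sketch (the delay length here is arbitrary; real audio would use something like half a second of samples):

```python
from collections import deque

class DelayLine:
    """Delay audio by `delay` samples: what goes in now comes out later.

    The visualizer can analyze the newest samples while playback reads
    the delayed ones, giving the analysis a look-ahead window.
    """
    def __init__(self, delay):
        # Pre-fill with silence so the first `delay` outputs are zeros
        self.buf = deque([0.0] * delay)

    def process(self, sample):
        self.buf.append(sample)
        return self.buf.popleft()

# 3-sample delay for illustration
line = DelayLine(3)
out = [line.process(s) for s in [1.0, 2.0, 3.0, 4.0, 5.0]]
print(out)  # [0.0, 0.0, 0.0, 1.0, 2.0]
```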
It doesn't at the moment. I would like to, but it's quite difficult to get access to PCM audio data from streaming services for audio analysis due to DRM (see Widevine).
You could take a look at the Spotify Web API. It provides rough audio-analysis data at the /audio-analysis/ endpoint. Not sure if other services have anything similar.
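Fetching that analysis is just an authenticated GET against the track id; a minimal sketch using the standard library (the track id and token below are placeholders, and the request is only built, not sent):

```python
from urllib.request import Request

def audio_analysis_request(track_id, token):
    """Build the GET request for Spotify's audio-analysis endpoint.

    The JSON response describes beats, bars, segments and tatums with
    timestamps and confidences -- coarse data a visualizer could sync to.
    """
    url = f"https://api.spotify.com/v1/audio-analysis/{track_id}"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

# Placeholder track id and OAuth token, for illustration only
req = audio_analysis_request("PLACEHOLDER_TRACK_ID", "YOUR_OAUTH_TOKEN")
```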
Yeah. It's most similar to the kind of edited stuff you see people making with programs like Adobe After Effects. The effects and transitions are synchronized with the currently playing audio, so there's a real-time aspect to it.
Ah, I see. It's a little unintuitive, to be honest, that a music visualization needs the user to supply an image. Perhaps supply a default image, or use the album art if the music file has one embedded?
I really like how this looks, and it'd be pretty nice to have it running on the monitor I have above my main monitor.
Problem is that the only actual local files I have are s3m, it, xm, etc. module music, which this player doesn't seem to support.
Most other music (like the jazz I keep playing in the background) is streamed.
The trailer contains footage of the music videos produced (the part with the fancy effects and transitions synchronized with the music). It's basically a real-time video that plays within the app alongside your music; it doesn't output videos as files. I probably need to make that part clearer somehow.
I think the main reason is that the visualizations are driven by audio analysis, which is often inferior to actual videos edited by hand. This isn't designed to compete with videos made by people, only to provide a good-enough substitute.
I could implement a video editor to let people have more control over the visualizations, but people can already use existing video editors to do that instead if they really wanted to. So I decided that it wasn't worth the effort.
Kudos for the effort. If you have a beast processor like the M1 I presume it would run OK, but it makes my 5+ year old machine cook an omelet, so I would never use it because of this. As mentioned elsewhere in the comments, dvx/lofi is the same story: Electron and WebGL shaders, just not acceptable performance for such a tiny thing. Old-school demoscene coders are who we should learn from when it comes to writing efficient software for such tasks.