Synchronizing Pong to music with constrained optimization (victortao.substack.com)
321 points by platers 73 days ago | 37 comments



This reminds me a bit of a project I coded the audio for, an art exhibition piece around 20 years ago built for multiple simultaneous players. As the game got more intense it became apparent that the ball/wall sounds were playing music, and that the three players were all actually playing one musical piece.

It was based around 3 arcade cabinets pointing together, so the players couldn't see what was on each other's screens.

This was achieved by modifying the ball speed/direction slightly so that it arrived at the bat/wall at a musically relevant point and triggered the correct sound.

Ah, here you go, Josh has a reference to it on his site: https://www.autogena.org/work/ping


That’s neat. So it was a static composition that the gameplay was tweaked into fitting?


Somewhere in between. It was a static composition (as in, it had a start and an end) but there were multiple loops per instrument, so more competent players would reach a more frenetic musical performance of the piece if they kept more of the balls in play.

The audio engine (the bit I worked on) was in effect a stem-based mixer, but with a state transition diagram per stem, with multiple loops available. Depending on the route through the transitions (triggered by events from the gameplay, for example how many balls the player is keeping in play) it was possible to reach more complex or simpler performances of the same musical piece, so the players' decisions and ability would affect the performance of the music, not the music itself, if that makes sense?
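Roughly this shape, if it helps (a made-up Python sketch, not the original engine; the loop names and the transition rule are purely illustrative):

  # Each stem owns a small state machine over its available loops;
  # gameplay events push it toward simpler or more frenetic material.
  class StemStateMachine:
      def __init__(self, loops):
          self.loops = loops  # ordered from simplest to most complex
          self.level = 0

      def on_game_event(self, balls_in_play):
          # Hypothetical transition rule: keeping more balls in play
          # unlocks a busier loop for this stem.
          self.level = min(balls_in_play, len(self.loops) - 1)

      def current_loop(self):
          return self.loops[self.level]

  stems = {
      "bass": StemStateMachine(["bass_sparse.wav", "bass_busy.wav"]),
      "lead": StemStateMachine(["lead_sparse.wav", "lead_busy.wav", "lead_frenetic.wav"]),
  }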


Very cool! As a further variation on this idea, I'm imagining training a reinforcement learning agent on Atari games / Super Mario, but with an additional music-based reward/input, to try to get a "musical"-looking playthrough... (Not sure how good it would look / whether it would be worth it though...)


I'm a novice at machine learning, but OpenAI made a Python library for reinforcement learning in video games, and a fork of it is still actively maintained [1]. It's been a few years, but I remember being able to get it up and running in a day or two (maybe a weekend). It used the RetroArch emulator, which supports a huge number of emulator cores and consoles.

[1] https://github.com/Farama-Foundation/Gymnasium
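For the music-based reward idea, a minimal sketch of what reward shaping could look like as a Gymnasium wrapper (the beat times, the 50 ms window, and the bonus value are all made up for illustration):

  import gymnasium as gym
  import numpy as np

  class BeatBonusWrapper(gym.Wrapper):
      """Adds a small reward bonus when a frame lands near a beat."""

      def __init__(self, env, beat_times, fps=60.0, bonus=0.1):
          super().__init__(env)
          self.beat_times = np.asarray(beat_times)
          self.fps = fps
          self.bonus = bonus
          self.frame = 0

      def reset(self, **kwargs):
          self.frame = 0
          return self.env.reset(**kwargs)

      def step(self, action):
          obs, reward, terminated, truncated, info = self.env.step(action)
          self.frame += 1
          t = self.frame / self.fps
          if np.min(np.abs(self.beat_times - t)) < 0.05:  # within 50 ms of a beat
              reward += self.bonus
          return obs, reward, terminated, truncated, info

  env = BeatBonusWrapper(gym.make("CartPole-v1"), beat_times=[0.5, 1.0, 1.5, 2.0])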

There's also SethBling's excellent video on YouTube about machine learning specifically with Super Mario World:

https://www.youtube.com/watch?v=qv6UVOQ0F44

I encourage you to give it a try! I feel that video games are a bit underrated in the current AI buzz; I think there's lots of potential for a machine to learn a skill through playing a game, and lots of potential for a game being selected or even created with the goal of teaching a specific skill. However, at that point maybe it's better to forgo the audio and visuals and speak to the machine in text or with pure data.

On the other hand, I have seen a video about convolutional neural networks that feed on each pixel of an image. So perhaps training with sound data, or with the pixels of a spectrogram, could have some positive results. And certainly I would be amused to see a game played in time with music, or possibly even dancing with a song's melody, harmony, and story as well.

Anything that's ever been created by humans existed first in the imagination of a human brain. And you've got one of those. A mental vision pursued and brought from the mind into physical reality is a beautiful thing, a gift for all of humanity in my eyes. I think it's quite worthwhile. But that's just my perspective. Thank you for sharing your imagination. Have a nice day.


Crypt of the NecroDancer explores this idea of rhythmically timing your character's movement to get bonuses in game.


Don't forget the Legend of Zelda spinoff "Cadence of Hyrule", which I'm pretty sure was made by the same people as Crypt of the NecroDancer.


Bit.Trip Beat is a game based on almost exactly the concept in the linked page: https://youtu.be/LHbg-sNqe4w?t=47 , as well as its later sequel Flux.


Not very good. Mario doesn’t have enough rhythm to evoke a lot of musicality.

The original game's sound was tied to the frame rate, so this vaguely happened by default. Later PAL ports broke this because they ran at a lower frame rate.


This reminds me of those polyrhythm visualizations on YouTube (check out LucidRhythms for some great examples).

https://www.youtube.com/@LucidRhythms

Probably almost impossible to adapt written works 'backwards' into a visualization but it might be fun to have different bars represent different notes and have the balls split for chords.


Having N paddles would be cool as well, perhaps with a cost for paddles getting too close (to avoid trivialities), and optimizing which note each paddle represents/beats, maybe imposing a cost on drastic pitch changes so that each specializes in a pitch range.


This is so freaking cool! I was mesmerized watching the paddles move as the beat progressed. There are certain things that just look right, which makes them beautiful. This project is one of them!


Prior art: Eisenfunk - Pong (https://www.youtube.com/watch?v=cNAdtkSjSps)


Bit different though! In your example, the video is made by manually syncing with the song's BPM, as the beep is at a constant rate. It's basically just a hand-made visualization of (every other) kick drum.

The submission, on the other hand, has notes that aren't at a basic 1/4 tempo, and is automatically "animated" based on the constrained optimization. Also leads to a much more interesting visualization :)


No constraint optimization can replace Pentafunk Jenny ;)


I love that video. Weird, but catchy.


Came here to post that. Danke!


Really nice stuff. I can't send a heart without subscribing, which doesn't feel right to me.


While technically okay, there are multiple cases where a paddle and the ball move at almost identical speeds and it looks like the paddle is pushing the ball the whole time. (By the way, p[i] = 0 should be disallowed for this reason.) This is inevitable when a large d[i] is immediately followed by a very small d[i+1], but it might be possible to avoid it whenever feasible.


You could add a penalty to the objective function proportional to d'[i].
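Something like this, perhaps (a rough cvxpy sketch; the base objective, sizes, and bounds are stand-ins for whatever the post actually uses, the only point is the penalty on consecutive d[i] differences):

  import cvxpy as cp

  n = 32                                         # number of notes/hits (assumed)
  d = cp.Variable(n)                             # inter-hit distances, as in the post (assumed)
  base_objective = cp.sum_squares(d)             # stand-in for the original objective
  smoothness = cp.sum(cp.abs(cp.diff(d)))        # penalize |d[i+1] - d[i]|
  lam = 0.1                                      # penalty weight, tune to taste

  problem = cp.Problem(cp.Minimize(base_objective + lam * smoothness),
                       [d >= 0.05])              # also keeps d[i] away from zero
  problem.solve()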


Imagining an `installation` in my space, using both my MT-80S and a display. Can I even reason about this, the timing? I'm not smart here, just interested

https://www.matrixsynth.com/2014/07/roland-mt-80s-midi-playe...


Really interesting. For some reason my brain really really hates this. I think it screws with my internal model of causality or something and I find it difficult to watch. Odd


For me the most conspicuous thing missing is dynamics... particularly when there are "ghost notes" in between much louder ones during a fast passage, something feels off.

That said, however, I find it to be oddly satisfying to watch. Curious if experience playing different instruments has anything to do with it. To me, something like a xylophone or a steelpan feels pretty analogous to this.


I think it may be synthesized – I thought the reverb was a little off, and noticed there was little to no change in timbre when the dynamics changed.


It seems like the ball bounces off the center of the paddle, not the edge, which always makes it look wrong. Maybe you're seeing the same problem?


Atari had a video music "visualizer" device back in the late 1970s, designed by one of the developers of the Pong game. It was one of the first consumer products of its kind, if not the first.

https://en.wikipedia.org/wiki/Atari_Video_Music

If you've seen the movie Over the Edge, Claude and Johnny have one at their house.


Awesome work!

How are the beats used to sync the pong chosen? For Bad Apple!, for example, especially around 1m55 (https://www.youtube.com/watch?v=bvxc6m-Yr0E), it seems off.

Good suggestion from a YouTube commenter, pasting it here:

> This is pretty cool.. it would be cooler if there were multiple pongs and paddles for each type of beat (like high beats and low beats)


This is really cool. I took an optimization class a few years back, but haven't made the time to do anything fun with it since. This inspires me to do it.

I do kind of wish that the last note corresponded to a game over, though, and I wonder if a smaller screen or faster ball would widen the playing field a little. Maybe I'll fork the code and try some of those out myself.


Delightful. Part of the fun is that the game is a background to the music, rather than the other way around, which is what we're used to.


Absolutely wonderful!

> "We obtain these times from MIDI files, though in the future I’d like to explore more automated ways of extracting them from audio."

Same here. In case it helps: I suspect a suitable option is (Python libs) Spleeter (https://github.com/deezer/spleeter) to split stems and Librosa (https://github.com/librosa/librosa) for beat times. I haven't ventured into this yet though, so I may be off. My ultimate goal is to be able to do it 'on the fly', i.e. in a live music setting, generating visualisations a couple of seconds ahead of what's being played, along with the track.
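For the beat-times half, something like this is what I have in mind with Librosa (the filename, and running it per Spleeter stem rather than on the full mix, are just my assumptions):

  import librosa

  # e.g. one stem produced by Spleeter, rather than the full mix
  y, sr = librosa.load("drums.wav")
  tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
  beat_times = librosa.frames_to_time(beat_frames, sr=sr)
  print(tempo, beat_times[:8])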

Not sure if this is unsavory self promotion (it's not for commercial purposes, just experimenting), but I am in the middle of documenting something similar at the moment.

Experiments #1 - A Mutating Maurer Rose | Syncing Scripted Geometric Patterns to Music: https://www.youtube.com/watch?v=bfU58rBInpw

It generates a mutating Maurer Rose using react-native-svg on my RN stack, synced to a music track I created in Suno AI *. Manually scripted to sync up at the moment (not automatic until I investigate the above Python libs).

Not yet optimised, proof of concept. The geometric pattern (left) is the only component intended to be 'user facing' in the live version, but the manual controls (middle) and the svg+path HTML tags (right) are included in this demo to show some of the 'behind the scenes'.

Code not yet available, app not yet available to play with. Other geometric patterns in the app that I have implemented:

- Modified Maurer

- Cosine Rose Curve

- Modified Rose Curve

- Cochleoid Spiral

- Lissajous Curve

- Hypotrochoid Spirograph

- Epitrochoid Spirograph

- Lorenz Attractor

- Dragon Curve

- Two Pendulum Harmonograph

- Three Pendulum Harmonograph

- Four Pendulum Harmonograph

This is the TypeScript Maurer Rose function (used with setInterval plus an object array of beat times that determines when to advance the 'n' variable):

  export const generateGeometricsSimplemaurer = (n: number, d: number, scale: number = 1) => {
      const pathArray: TypeSvgPathArray = [];
      for (let i = 0; i <= 360; i += 1) {
          const k = i * d;
          const r = Math.sin(n * k * (Math.PI / 180));
          const x =
              r *
              Math.cos(k * (Math.PI / 180)) *
              40 * // base scale
              scale +
              50; // to center the image
          const y =
              r *
              Math.sin(k * (Math.PI / 180)) *
              40 * // base scale
              scale +
              50; // to center the image
          pathArray.push(`${i === 0 ? "M" : "L"} ${x} ${y}`);
      }
      const pathString: string = pathArray.join(" ");
      return pathString;
  };
setInterval isn't an appropriate solution for the long term.

The geometric patterns (with their controls) will have a playground app that you can use to adjust variables... As for the music sync side, it will probably take me a long time.

*Edit: I just noticed that the author (Victor Tao) actually works at Suno


Pretty neat! It feels like if you used only spaced-out "important" beats instead of most of them, and shrunk the play area so the paddles are larger, it would have an even more interesting effect.


If only operations research courses could have been this fun many years ago… excellent write-up!


Neat, love it.

Now try synchronising the music to the game.

You could use our Bungee library for the audio processing.


Would be cool if someone did this for Lichess.


Very cool project! Love your idea.


> Synchronizing pong to music with constrained optimization

Nothing new. Apparently there are references to people doing this in ancient and medieval times.

https://en.wikipedia.org/wiki/Flatulist


Awesome work bro! Is your company hiring? I'd be super thrilled to work with you.




