Hacker News

But musical interpretation is something you develop from your education and your lived experiences. It is meaningless to agglomerate the interpretations of thousands of pianists from an artistic perspective.

If you want to simulate human physical limits, then model muscle agility, not interpretation.




There is no intent to simulate human physical limits. The goal of "humanization" of MIDI music or musical notation is to produce output that sounds "better" and more "human-like" than the default (very bad-sounding) option: playing the notes at the exact timing and velocity that follows only the explicit notation (e.g. forte or crescendo) and is otherwise uniform.

Attempts to agglomerate the interpretations of thousands of pianists can produce a better-sounding outcome than pure random variation (the simplest 'humanizer' algorithms in various tools) or a deterministic algorithm with hand-crafted heuristics, so they are useful and meaningful for that reason.
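For context, the "pure random variation" baseline mentioned above can be sketched in a few lines. This is a minimal illustration, not any specific tool's algorithm; the note representation and jitter parameters are assumptions for the example:

```python
import random

def humanize(notes, time_jitter=0.010, vel_jitter=6, seed=None):
    """Naive random 'humanization' of a note list.

    Each note is a (start_seconds, pitch, velocity) tuple. Timing is
    nudged by Gaussian noise (std dev in seconds) and velocity by
    Gaussian noise rounded to an integer, clamped to the MIDI range 1-127.
    """
    rng = random.Random(seed)
    out = []
    for start, pitch, vel in notes:
        new_start = max(0.0, start + rng.gauss(0.0, time_jitter))
        new_vel = min(127, max(1, round(vel + rng.gauss(0.0, vel_jitter))))
        out.append((new_start, pitch, new_vel))
    return out

# Four quarter notes at 120 BPM, all at identical velocity:
mechanical = [(i * 0.5, 60, 80) for i in range(4)]
humanized = humanize(mechanical, seed=42)
```

A learned model replaces the independent Gaussian noise with variations conditioned on musical context, which is what makes the result sound like phrasing rather than sloppiness.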


But why strive to humanize something mechanical? Why pretend it's as if a human played it? Why exclude humans rather than plug them in as collaborators? I'm a bit dismayed with the current direction: tools are no longer developed for humans to use, but for humans to be excluded by. I hope it's a temporary direction, because if this becomes a general trend it will be depressing for humanity. Many more things will be 'humanized' while actual humans will be few and far between.


Because it sounds better in many contexts, and people who use tools to make music or parts of it (e.g. drum machines, automatic accompaniment, electronic music, synthesis of instrumental parts) want those parts to sound good, 'humanization' has been a key feature of these tools for decades. The concept is not new; it's widely used and widely accepted. The article is simply about how modern tools can do it slightly better than before.

It's perfectly normal for someone to want to, for example, use a drum machine instead of needing the collaboration of a human drummer - not needing collaboration lowers the barrier of entry and opens up possibilities of independent creativity.

It's also perfectly normal for someone composing an orchestra-like part for a project to want to synthesize it without actually needing to involve (and pay for) a full orchestra of musicians. The orchestra might interpret the score and play the parts better than the synthetic instruments, so for large-scale projects that extra effort and cost is often appropriate, but not for all projects.


I do make music and am aware of what 'humanize' means in the context of arpeggiators and sequencers: not too great, but okay, and it leaves room for my own input. From a tool like that to a tool producing a 'perfect' humanized sound is quite a step toward making the human redundant. Where are my options as an artist, other than picking a few options from the bucket of a trained AI? I understand the allure of letting everyone compose music, sing, draw, paint, and do all kinds of artful things at the click of a button, with zero time spent understanding the space; it is indeed a very attractive idea...



