Hacker News

Because it sounds better in many contexts. People who use tools to make music or parts of it (e.g. drum machines, automatic accompaniment, electronic music, synthesis of instrumental parts) want those parts to sound good, so 'humanization' has been a key feature of such tools for decades. The concept is not new; it's widely used and widely accepted. The article is simply about how modern tools can do it slightly better than before.
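For context, the classic 'humanize' feature in sequencers usually just means adding small random offsets to note timing and velocity so a programmed pattern sounds less machine-perfect. Here's a minimal sketch of that idea; the function name, the note representation as (start_ms, velocity) pairs, and the jitter ranges are all illustrative assumptions, not taken from any specific tool:

```python
import random

def humanize(notes, timing_jitter_ms=10.0, velocity_jitter=8, seed=None):
    """Classic 'humanize': nudge each note's timing and velocity randomly.

    `notes` is a list of (start_ms, velocity) pairs. The jitter ranges are
    illustrative defaults, not taken from any particular sequencer.
    """
    rng = random.Random(seed)
    out = []
    for start_ms, velocity in notes:
        # Shift timing by up to +/- timing_jitter_ms, never before time zero
        jittered_start = max(0.0, start_ms + rng.uniform(-timing_jitter_ms, timing_jitter_ms))
        # Shift velocity by up to +/- velocity_jitter, clamped to MIDI's 1..127
        jittered_vel = min(127, max(1, velocity + rng.randint(-velocity_jitter, velocity_jitter)))
        out.append((jittered_start, jittered_vel))
    return out

# A rigid eighth-of-a-bar hi-hat grid at 120 BPM (500 ms per beat), all velocity 100
grid = [(i * 500.0, 100) for i in range(8)]
loose = humanize(grid, seed=42)
```

Modern 'AI humanization' effectively replaces those uniform random offsets with deviations learned from recordings of human performances, which is the "slightly better" part the article describes.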

It's perfectly normal for someone to want to, for example, use a drum machine instead of needing the collaboration of a human drummer - not needing collaboration lowers the barrier to entry and opens up possibilities for independent creativity.

It's also perfectly normal for someone composing an orchestra-like part for a project to synthesize it without actually needing to involve (and pay for) a full orchestra of musicians. The orchestra might interpret the score and play the parts better than the synthetic instruments, so for large-scale projects the extra effort and cost is often worthwhile, but not for all projects.




I do make music and am aware of what humanize means in arpeggiators and sequencers: not great, but okay, and it leaves room for my own input. Going from a tool like that to a tool that produces 'perfect' humanized-sounding output is quite a step toward making the human redundant. What options do I have as an artist other than picking a few presets from the trained AI's bucket? I understand the allure of making everyone able to compose music, sing, draw, paint and do all kinds of artful things at the click of a button, with zero time spent understanding the space - it is indeed a very attractive idea...



