Thuistezien 217 — 27.03.2021
We often tap our foot when listening to music, frequently without realising we do it. It seems such a natural thing to do that we barely need to think about it, nor do we feel any particular pride in being able to pull it off. Yet achieving it requires a notable amount of musical information to be processed very quickly. Over large parts of our lives we slowly internalise a significant body of complex and nuanced information, which allows us to digest music in a meaningful way. Every listener, including every non-musician, possesses a substantial music-related skill set, even if they don’t realise it; without it, music would be meaningless to listen to. As with any skill we are barely conscious of, it can prove difficult to explain how we do it, especially in a detailed and thorough manner that relies on language. Now try explaining the principle of ‘foot-tapping to music’ to someone who doesn’t even know what music is…
This is why it becomes such a deeply fascinating question for some computer scientists as they try to ‘teach’ a computer to recognize where the ‘foot-tapping’ moments of a piece of music are. Once that problem is solved, the natural next step is to build computer software that also strives to extract the harmonic essence of a song, which of course soon proves even more complex.
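To give a flavour of what ‘teaching’ a computer to find the foot-tapping moments involves, here is a minimal, illustrative sketch of one classic idea from music information retrieval: estimate the tempo by autocorrelating an onset envelope (a curve marking where musical events begin) and picking the lag that repeats most strongly. The toy envelope, frame rate, and BPM range below are assumptions for the demonstration; real beat trackers work on actual audio and are considerably more involved.

```python
# Sketch: tempo estimation by autocorrelation of an onset envelope.
# Purely illustrative; the envelope here is synthetic, not real audio.

FPS = 100  # assumed envelope resolution: frames per second

def estimate_bpm(envelope, fps=FPS, bpm_range=(60, 180)):
    """Return the tempo (BPM) whose beat period best matches the envelope.

    Tries every candidate lag (beat period in frames) inside the BPM
    range and scores it by the envelope's autocorrelation at that lag.
    """
    shortest = int(fps * 60 / bpm_range[1])  # fastest tempo -> shortest lag
    longest = int(fps * 60 / bpm_range[0])   # slowest tempo -> longest lag
    best_lag, best_score = shortest, float("-inf")
    for lag in range(shortest, longest + 1):
        # Autocorrelation at this lag: how well the envelope lines up
        # with a copy of itself shifted by one candidate beat period.
        score = sum(a * b for a, b in zip(envelope, envelope[lag:]))
        if score > best_score:
            best_lag, best_score = lag, score
    return 60.0 * fps / best_lag

# Toy onset envelope: a "click" every 0.5 s for 10 s, i.e. 120 BPM.
env = [1.0 if t % 50 == 0 else 0.0 for t in range(1000)]
print(estimate_bpm(env))  # → 120.0
```

On this clean synthetic input the strongest self-similarity occurs at a lag of 50 frames (0.5 s), so the estimate comes out at 120 BPM; on real recordings, with expressive timing and noisy onsets, the problem becomes far harder, which is part of what makes it interesting.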
These are the kinds of questions that arise for Dr. Anja Volk as she explores ways of building music-related technologies. As an Associate Professor at Utrecht University, Dr. Volk conducts research at the intersection of Computer Science, Mathematics, Cognition and Music, and aims to develop computer systems that offer new ways of engaging with music. To achieve this, she relies heavily on research into how humans process musical information. In turn, the technologies she works on have a habit of casting new light on the infinitely complex question of how humans perceive music, which leads her back to the software she has been developing, to improve it with the new insights she has uncovered.
These are some of the research processes that Dr. Volk reveals in her talk, as she offers an overview of several of the music-technology projects she has worked on. In this video we see her give a short ‘lightning talk’ at the ‘Instrumental Shifts Symposium’, which took place at West as part of the 2019 Rewire Festival. The seven-hour symposium explored new music technology, computer-generated music and other future technologies, and their effect on music creation. Dr. Volk’s talk was featured in the symposium’s ‘lightning round’, which gave the stage to researchers based in The Netherlands to present their work.
Text: James Alexandropoulos-McEwan