What if Artificial Intelligence (AI) algorithms could write the music that you listen to? And who would own that music?
AI can now be found in every corner of the music industry. From analysing the music you listen to and acting as your personal DJ, to evaluating which songs are most likely to be popular with music consumers, it is everywhere.
In this experimental concert, we will premiere a set of pieces created by a team of musicians and researchers working with AI algorithms that learn how to compose by analysing music written by humans.
We will explore how the choice of learning materials and the intervention of humans in the composition process affect the music that machines create, and the implications for authorship and copyright.
Part of Musical Futures, an annual series of concerts and events that explore the future of music-making, digital creativity, and new ways to perform, experience and interact with music.
Artificial Intelligence (AI)
AI is the science of creating computer programs that give machines the capacity to perform mental functions resembling those of humans. This includes perceiving the world (e.g., seeing, listening), learning simple and complex things, reasoning and solving problems. At present, there are many (very specific) tasks that AI machines can perform very, very well (sometimes more accurately and faster than humans), and the possibilities seem endless (especially given the computing capacity now available). Nonetheless, we are still far from machines performing complex intellectual tasks that mirror the flexible and autonomous ways in which humans learn, solve problems, adapt to continuously changing environments, and make complex decisions.
(Image on the right generated by DALL-E)
AI ... Music!
This does not mean that machines are not capable of very surprising things … including creating music! AI-generated music is now an (accessible) reality and is establishing itself in various areas of the music industry. This makes us wonder: will machines be creating the music of the future? That is a rather big question that, perhaps, only time will be able to answer (it is sometimes surprising how great ideas fail to capture people's interest and end up fading away). But there are certainly more immediate things worth reflecting on, including the ways in which these machines use existing (human-composed) music and the implications this has for the music industry (and especially for music creators). This is the focus of this project.
(Image on the left generated by DALL-E)
M&M is both an artistic and a research exercise. We want to know more about the legal implications of using copyrighted music to “teach” AI machines to create music, and about the preparedness of the UK legal framework to accommodate AI-generated works in all their possible nuances (e.g., pieces created solely by the machine or in collaboration with a person). In a nutshell, this work aims to test whether the music industry’s current legal and economic frameworks are prepared for this emerging phenomenon and to offer recommendations for the future. This is the research bit …
To make this possible, our team designed a potential future scenario (perhaps even already present): we asked a composer to create new music with two AI music machines built for this project (using the same technology that powers OpenAI’s viral ChatGPT). One of the machines was trained on (out-of-copyright) Romantic music and the other on (in-copyright) Pop and Rock music. By trained we mean that each model “heard” a large number of music pieces and was asked to infer the rules that allow it to create music in the style of the music it heard (in essence, not very different from what humans do). By following these steps, we know exactly which songs each machine used to learn how to compose, and therefore we have a very transparent and clear platform for exploring the legal implications that arise from the whole process (something that would not be possible if we used an already available commercial AI music machine).
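As a loose analogy only (the project’s actual machines use transformer technology, not this), the idea of “hearing” pieces and inferring stylistic rules can be sketched with a toy Markov chain: count which note tends to follow which in the training melodies, then sample a new melody from those counts. The note names and melodies below are invented for illustration.

```python
import random
from collections import defaultdict

# Toy analogy: a first-order Markov chain "trained" on note sequences.
# NOT the project's actual model, which uses transformer technology.

def train(melodies):
    """Record, for each note, the notes that follow it in the training melodies."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current, following in zip(melody, melody[1:]):
            transitions[current].append(following)
    return transitions

def compose(transitions, start, length, seed=0):
    """Sample a new melody of up to `length` notes from the learned transitions."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:  # dead end: no note ever followed this one in training
            break
        melody.append(rng.choice(options))
    return melody

# Hypothetical training data: openings of two simple tunes.
corpus = [["C", "D", "E", "C", "E", "D", "C"],
          ["E", "D", "C", "D", "E", "E", "E"]]
model = train(corpus)
print(compose(model, "C", 8))  # a new 8-note melody in the corpus's "style"
```

The same transparency point applies even at this toy scale: because we choose the training corpus ourselves, we know exactly which material every generated note is derived from.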
The outcome of this exercise is four new pieces that will be performed in a live concert at the Tung Auditorium (University of Liverpool): two inspired by Schubert’s Die Forelle and two by a new piece, Just Data, commissioned from the Liverpool-based band Stealing Sheep.
The concert will start with a short introduction to AI, how AI machines are being used to create new art (including music) and an overview of how we used AI to create a new set of songs over the past few months.
There will be five world premieres: four new songs composed with two AI machines developed by Rui Guo in collaboration with the Applied Music Research Lab, and a song commissioned from the Liverpool-based band Stealing Sheep. The lyrics for all songs were created with the now infamous ChatGPT.
After the concert, Rachael Drury (Ph.D. researcher at the University of Liverpool) will talk about the implications of AI for copyright law and the research behind this project.
At the end, the public will have the chance to ask questions about the concert and the project.
The concert is FREE, but you will need to book tickets in advance.
"Die Forelle", Op. 32, D 550, is a lied, or song, composed in early 1817 for solo voice and piano by the Austrian composer Franz Schubert. Schubert chose to set the text of a poem by Christian Friedrich Daniel Schubart, first published in the Schwäbischer Musenalmanach in 1783. [Source: Wikipedia]
This piece tells a story similar to that of Franz Schubert’s lied ‘Die Forelle’, a setting of a poem by Christian Friedrich Daniel Schubart. It follows a carefree trout innocently swimming through a stream … until it is deceived by a cunning AI system, which then traps the poor trout within a world of machines.
This piece tells the story of a virtual trout, an AI being, empowered by extraordinary intelligence and processing capabilities. While blessed with these incredible powers, it longs to connect to the real world, to drink from and swim in real water. In sudden and unforeseen circumstances, the AI trout malfunctions with disastrous effects. It becomes an unstoppable monster wreaking havoc on the world over which it eventually presides.
This song was written from the perspective of an AI system that finds itself through the discovery of music creation …
This song was written from the perspective of someone disillusioned by a world that, they feel, has perhaps underdelivered on its futuristic promise and lost something important in the process. The protagonist seeks comfort in communicating with an AI, hoping that it can empathise with their feelings and ultimately share a meaningful connection.
Eduardo Coutinho, Rachael Drury and Mickey Bryan
Rui Guo and Eduardo Coutinho
Music Industry and Copyright
Rachael Drury and Mathew Flynn
Mickey Bryan and Rui Guo's AI machines
This project is sponsored by the
Interdisciplinary Center for Composition and Technology (University of Liverpool)
School of the Arts Research Development Initiative Fund (University of Liverpool)