How artificial intelligence can impact the music industry

Contrary to popular notions, artificial intelligence can be defined as reasoned decision-making by a computer system


There are several digital programs capable of creating relatively original musical compositions almost entirely on their own, using generative adversarial networks (GANs). For many, these programs are an aid to artists when composing; for others, they are an advance that threatens to replace human creativity.


How does artificial intelligence work in the music industry?

Artificial intelligence is not necessarily what people imagine. The term has several meanings, but it can be defined as reasoned decision-making by a computer system. There are several ways to accomplish this, but the general idea is to program algorithms that learn to identify patterns in a large dataset and then make decisions based on them. The more information they receive, the better they adjust to what is expected of them, somewhat like a baby learning to speak.

And yes, such a system can compose musical melodies. In 2016, IBM created Watson Beat, a piece of software that has analyzed millions of songs and can produce an artificial melody. The idea is for musicians to use it as a guide when they feel stuck or want to try something different. They can ask the program for any rhythm on which to build their own song. Watson can even suggest popular phrases and words to craft the song's lyrics.
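The pattern-learning idea described above can be sketched with a toy example. This is not Watson Beat's actual method, which is far more sophisticated; it is a minimal first-order Markov chain that counts which note tends to follow which in a few training melodies and then samples a new melody from those counts.

```python
import random

def learn_transitions(melodies):
    """Count which note follows which across the training melodies."""
    transitions = {}
    for melody in melodies:
        for current, nxt in zip(melody, melody[1:]):
            transitions.setdefault(current, []).append(nxt)
    return transitions

def generate(transitions, start, length, seed=0):
    """Walk the learned transition table to produce a new note sequence."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        choices = transitions.get(melody[-1])
        if not choices:
            break  # dead end: no note was ever observed after this one
        melody.append(rng.choice(choices))
    return melody

# Two tiny made-up "training" melodies, expressed as note names.
examples = [
    ["C", "E", "G", "E", "C"],
    ["C", "E", "G", "A", "G", "E"],
]
table = learn_transitions(examples)
print(generate(table, "C", 8, seed=42))
```

The more melodies the table is trained on, the more its output reflects the statistical habits of the training material, which is the same basic principle, scaled up enormously, behind commercial systems.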


Watson is perhaps the most recognized, but there are others. Jukedeck, for example, creates songs according to the genre, instruments, duration, and tempo the user prefers. It then sells usage licenses for $0.99 to individuals or $21.99 to large companies; the copyright to a song can also be purchased outright for $199. To date, the company has created more than 500,000 songs and is used by companies such as Coca-Cola and Google looking for background music for their commercials.

A competing program, Amper, is composing the first album made with artificial intelligence together with YouTuber Taryn Southern. The only human input Amper requires is the beats per minute, rhythm, key, and preferred musical style.

This technological revolution does not affect music production alone. The Landr program, for example, uses artificial intelligence to master songs automatically. According to the testimony of some users online, the output does not match the quality of mastering overseen by a human engineer, but the results are quite acceptable.


On the other hand, platforms such as Gracenote, a Nielsen company, use artificial intelligence not to make music but to distribute it. This digital tool identifies the underlying mood of each song as well as the musical tastes of users, and sells that information to music streaming services such as Apple Music, Amazon Music, or Groove Music. The daily recommendations on platforms like Spotify are based on this type of technology.

The new bet of the industry

It is no wonder that the music industry is quite interested in these tools. The Sony Music Research Laboratory, with support from the European Research Council, created an artificial intelligence program called Flow Machines to produce songs. The result, released in December 2016, was two songs: one inspired by The Beatles and the other by the music of Johann Sebastian Bach.


And it's not just Sony. Big labels like Warner and Universal have invested in similar start-ups. The legendary Abbey Road Studios has supported various artificial intelligence platforms, such as AI Music and Humtap, through its technology incubator.

The creators of these tools are emphatic in claiming that they will not replace artists. The system will always need inspiration that springs from one person's specific feeling and touches another person in the same way; hence, they argue, it is impossible for this technology to replace the artist.

However, it is ironic that one of the functions of these programs is precisely to connect artists with the songs their listeners like and respond to the most, based on an analysis of what they say on social networks. On the other hand, music produced by machines could indeed occupy an important place in the production of background music for audiovisual products.


In any case, this technology is at a very early stage of development, and it will take more time to see its true scope within the music industry. Progress so far is quite promising. At the moment, if a user asks Siri, the virtual assistant in Apple's operating systems, whether it knows the famous supercomputer from 2001: A Space Odyssey, it replies: "I'm afraid HAL made some very bad decisions. But he did know how to sing." An interesting answer, indeed.

The article has been written by Dr. Raul V. Rodriguez, Dean, Woxsen School of Business

He can be reached on LinkedIn.