We have all heard of Mozart, Chopin, and Beethoven, but not all of us know Google’s artificial intelligence. Yes, a robot has joined the club. And yes, it plays music. (If Kraftwerk’s “The Robots” is playing in your head right now, that is completely normal, don’t worry.) This new robot artist stirring up so much debate is called Magenta. You might have seen in my previous article about Facebook’s artificial intelligence how machine learning works on images and videos. This article covers a concept that is similar yet different. The main question here is: can machine learning be used to create a piece of music? That is exactly what I will explore in this article.
Google’s Magenta and its music band
Magenta is a Google Brain team project that tackles exactly that question: can we use artificial intelligence and machine learning to make music?
Google developed this project with two goals. The first is to explore machine learning even more deeply and push the concept further. This type of artificial intelligence has already been used to recognize pictures and speech and to translate content. Facebook, too, has a similar algorithm, used to help blind people hear their newsfeed; this feature is called Facebook Read. For artificial intelligence researchers, the sky is the limit: they are always looking for new features to develop and new ways of teaching machines. So why not write algorithms that teach a machine to play the piano, for example? Robots are good students. Indeed, blind tests have shown that people can be fooled by machines: the musicologist Peter Russel listened to a piece composed by Iamus, a classical-music computer, and, surprisingly enough, did not realize it had been created by a machine.
The second objective of the Magenta project is to build a community of people interested in both music and technology: artists, coders, and researchers. Google is inviting anyone intrigued by the project to join the community and follow its progress. In fact, part of the project is already open to the general public and waiting for people’s input.
Artificial intelligence goes beyond limits
A lot of people are scared of such rapid technological growth, and they might have a point. When machines start recognizing pictures and videos and describing them to us, technology is clearly pushing these systems beyond what we thought were their limits.
The good news is that this technology has real uses. It is not developed only to win a challenge or to defy the limits of research and technology. As I mentioned earlier, Facebook uses artificial intelligence to expand its community without excluding anyone.
When it comes to artificial music, some applications have identified this new trend and are building around it. A mobile app called @life plays music according to your state of mind and your mood. You might ask yourself: how does a machine know what someone is feeling in real time? It gathers information about the person’s behavior or location and infers their mood from it. Some data analysts use Instagram filters, for example, to identify a user’s mood: dark colors suggest sadness, whereas bright colors suggest happiness. This music app is said to help people in pain by distracting them, drawing on the well-known benefits and virtues of music.
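To make the color-to-mood idea concrete, here is a toy sketch of how brightness could be mapped to a mood label. The luminance formula is standard, but the threshold, labels, and pixel data are illustrative assumptions, not the actual algorithm used by any of these apps.

```python
def average_brightness(pixels):
    """Mean luminance of a list of (r, g, b) tuples, on a 0-255 scale."""
    luma = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return sum(luma) / len(luma)

def guess_mood(pixels, threshold=128):
    """Toy heuristic: bright images read as 'happy', dark ones as 'sad'."""
    return "happy" if average_brightness(pixels) >= threshold else "sad"

# Hypothetical sample images, reduced to a handful of pixels each.
dark_photo = [(30, 20, 40), (10, 10, 15), (50, 40, 60)]
bright_photo = [(220, 210, 200), (240, 235, 230), (200, 190, 180)]

print(guess_mood(dark_photo))    # -> sad
print(guess_mood(bright_photo))  # -> happy
```

A real system would of course combine many more signals (location, activity, listening history) rather than a single brightness cutoff, but the principle is the same: turn observable data into a mood estimate, then pick music to match.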
Maybe machines will help us create new music genres by combining different algorithms. Or maybe this invention will help people manage their stress or ease their pain, music being the cure for everything! We can already see the first buds of that technology today in Spotify, which can detect your running speed and adapt the music’s style and tempo to match.