If AI and robots are going to take our jobs, at least they can do it in the most relaxing way possible. Just like Shimon here — a four-armed marimba-playing robot designed by Georgia Tech’s music technology center. Sure, Shimon is just the tip of the iceberg when it comes to AI making music, but just listen to those jazz-fusion vibes.
Like many AI music experiments, the music Shimon is playing is generated using a method called deep learning. This essentially means mining a large amount of information (in this case, a dataset of some 5,000 songs) and looking for common patterns in the music. For example, if you have the sequence of notes F, G, A, which note is likely to come next? Deep learning will give you a good answer.
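To get a feel for the idea, here's a minimal sketch of next-note prediction. This is not Shimon's actual deep-learning system — it's a simple frequency-count (Markov-style) model over a tiny made-up corpus standing in for the 5,000-song dataset — but it illustrates the same core question: given the notes so far, what usually comes next?

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for the ~5,000-song dataset.
melodies = [
    ["F", "G", "A", "G", "F", "G", "A", "B"],
    ["F", "G", "A", "B", "A", "G", "F", "G"],
    ["C", "F", "G", "A", "G", "F", "E", "F"],
]

# Count which note tends to follow each two-note context.
transitions = defaultdict(Counter)
for melody in melodies:
    for i in range(len(melody) - 2):
        context = (melody[i], melody[i + 1])
        transitions[context][melody[i + 2]] += 1

def predict_next(a, b):
    """Return the note most often seen after the pair (a, b), or None."""
    followers = transitions[(a, b)]
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("F", "G"))  # → A, the most common continuation here
```

A real deep-learning system replaces these raw counts with a neural network trained on thousands of songs, letting it generalize to contexts it has never seen verbatim — but the prediction task is the same.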
Shimon the robot has been around for a while now, playing alongside human musicians using pre-programmed songs. But now, it’s being used to play original compositions. The video above shows the first melody Shimon ever created, while the one below is melody number two — a slightly faster number:
As we’ve seen with previous experiments, the actual musical output is a bit avant-garde. Researchers working in this field say this is because the deep learning systems we use to analyze music tend not to be good at reasoning about long-term structure. They analyze the music in short bursts, and the resulting melodies sound quite abstract. It’s possible to program in artificial constraints so the systems produce songs with traditional verse-chorus structures, but at that point it’s not really AI-created music — it’s AI-human collaboration.
So, although the marimba is a particularly non-threatening instrument, the melody itself is proof that the machines still have a way to go. For more information on Shimon, check out this interview with its creators, Gil Weinberg and Mason Bretan, over at IEEE Spectrum.