Friday, May 17, 2019

I listened to a Massive Attack record remixed by a neural network




Mezzanine, by trip-hop group Massive Attack, is a huge record that many, including Rolling Stone, consider one of the best mankind has ever produced. You’ve probably heard at least a few of its tracks, like Teardrop, Angel and Inertia Creeps. I can say with absolute certainty, though, that you’ve never listened to a version remixed by a custom neural network.


The mashup is part of a new exhibition at London’s art-focused Barbican Centre. It’s easy to miss — the moody, atmospheric tones are easily suffocated by the patter and chatter of nearby attendees. A giant projection with flashing letters, though, alludes to the AI-powered creativity happening out of sight.


Those characters are, in fact, from DNA sequences that were used to encode the album in a spray can. That unusual form of musical distribution is also being exhibited at the show, but has no real connection to the neural network project. Still, the letters that comprise DNA — A, C, G and T — flash across the wall, creating a simple but effective music video for the Mezzanine remix that’s pumped out of two Sonos speakers.
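
As an aside, the core idea behind storing digital audio in DNA is easy to sketch: map every two bits of the file to one of the four bases. The Python snippet below is a toy illustration of that mapping only; the album's actual encoding relied on far more robust, error-corrected schemes.

```python
# Toy sketch of DNA data storage: two bits per nucleotide. The real
# Mezzanine encoding used heavy error correction; this only shows the
# basic A/C/G/T mapping concept.

BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}

def bytes_to_dna(data: bytes) -> str:
    """Encode raw bytes as a nucleotide string, two bits per base."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # walk each byte two bits at a time
            bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
    return "".join(bases)

# One second of CD-quality stereo (~176,400 bytes) needs ~705,600 bases.
print(bytes_to_dna(b"Mezz"))  # -> CATCCGCCCTGGCTGG
```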


The neural-network rendition never stops. It’s a machine that constantly spits out new sequences derived from every track on Mezzanine. The remix sounds like a singular piece that is continuously being updated and extended, like a band that doesn’t know when to stop riffing on the last track of a live set. The lack of a clear-cut start or end point was maddening, at first. But I soon appreciated its rolling and ever-evolving nature, which felt strangely similar to a 10-hour ‘white noise’ rain track or study playlist on YouTube.



The original version of Teardrop, the second single from Mezzanine.

Massive Attack collaborated with Mick Grierson, a professor and research leader at the UAL (University of the Arts London) Creative Computing Institute, and various students on the installation. Prior to the project, Grierson had spent several years working on ways to produce music with neural networks. Unlike most artists, though, he didn’t use notes and chord progressions as the fundamental building blocks. Instead, he focused on systems that could understand the “texture” of bands like Massive Attack, who use sweeping, continuous sounds to transport the listener. “You don’t really notate that stuff,” he explained. “It’s about the style, and the quality of the sample, particularly with electronic music.”
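
In practice, modeling “texture” usually means working from spectral representations of the raw audio rather than from symbolic notes. Here is a minimal sketch of that idea using the open-source librosa library; the stem path is hypothetical, and the actual pipeline behind the installation hasn’t been published.

```python
# Represent a stem as a mel spectrogram: a frame-by-frame picture of its
# spectral character (its "texture") that a generative model can learn
# from, with no notes or chords involved.
import librosa
import numpy as np

# Hypothetical stem file; any mono or stereo audio file would do.
y, sr = librosa.load("stems/angel_bass.wav", sr=22050)

# 128 mel bands per ~23 ms frame of audio.
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=2048,
                                     hop_length=512, n_mels=128)
mel_db = librosa.power_to_db(mel, ref=np.max)  # convert power to decibels

print(mel_db.shape)  # (128, n_frames): the "texture" a model trains on
```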


A few years back, Grierson and his research team attended Moogfest, a music and technology festival in North Carolina, to run a workshop on machine learning and share some of the toolkits they had been developing. For Grierson, it wasn’t enough for the systems to work and produce something pleasant — they had to be packaged in a way that artists understood and could actually use.


The researcher bumped into some folks from Google who had been working on a similar project. They were impressed with Grierson’s work and suggested he talk with Massive Attack, who were keen to use machine learning in their music. After the show, the band called Grierson and scheduled an in-person meeting.


“Then [Massive Attack] just came into the office and we had a talk, and we played them what we had, and they said that they wanted to make a new version of Mezzanine,” Grierson said. “And that was how it started really.”






The British group, which was once a trio but now consists of just Robert “3D” Del Naja and Grant “Daddy G” Marshall, has long used technology to create, perform and distribute music in unique and unexpected ways. As Wired reports, the band met Andrew Melchior, part of creative technology specialist 3rd Space Agency, after a concert in 2013. The musicians wanted visuals that were unique to each venue and could respond to the audience’s reactions. Melchior suggested they talk to Magic Leap, the company behind the One AR headset, and Will Wright, the legendary game designer behind SimCity and The Sims, who was working on AI tools with his startup Stupid Fun Club.


Barbican


Later, Massive Attack worked with developer RjDj on Fantom, a music player app that offered four new tracks and the ability to remix songs based on the user’s location, local time, movement and, if you had a paired smartwatch, heartbeat.


At first, Grierson and his team planned to hand over their technology and let Massive Attack spearhead the Mezzanine remix. “We wanted to help them do things their way,” he said. “That was very much the approach we wanted to take.” After a few months, however, it was clear the systems weren’t ready for even tech-savvy artists like Massive Attack to use. Grierson was reluctant to take creative ownership of the project — his goal was to empower other people — but soon realized that he needed to be more hands-on.


“Eventually I just said, ‘Look, if you still really want to do this, I will devote time to making it happen, and I’ll put my own expertise in,’” he said.


Grierson, working alongside students from UAL and Goldsmiths, developed a bunch of machine learning systems that could continuously analyze and ‘train’ themselves on the record. For the best part of a year, Grierson would wake up and load stems — the individual instrument and vocal tracks that make up a song — into the systems, go about his day and then check to see what they had produced in the evening. It was easy, he discovered, to generate music that was awful or vastly different from the source material. Creating a subtle variation of Mezzanine, something that could pass for a bonus track or extended prelude, was tougher.
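
For flavor, here is a rough sketch of that kind of unattended run, written in PyTorch with a stand-in autoencoder; the architectures the team actually used haven’t been published, so treat everything below as an illustrative assumption.

```python
# Sketch of an overnight run: train a model on spectrogram frames from
# the stems, checkpointing periodically so the day's output can be
# auditioned in the evening. The model is a placeholder autoencoder.
import torch
import torch.nn as nn

frames = torch.randn(10_000, 128)  # placeholder: mel frames from the stems

model = nn.Sequential(              # stand-in generative model
    nn.Linear(128, 32), nn.ReLU(),  # compress each frame...
    nn.Linear(32, 128),             # ...and learn to reconstruct it
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(1_000):          # left to churn through the day
    recon = model(frames)
    loss = nn.functional.mse_loss(recon, frames)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if epoch % 100 == 0:
        torch.save(model.state_dict(), f"ckpt_{epoch}.pt")  # evening check
```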


“Sometimes it takes hours to generate something, and then you come back and listen to it and it’s just not worked the way you need, so you have to start again,” Grierson said. “There’s no real-time feedback. So we were constantly trying to reach towards a real-time system that would allow us to do this quickly, and that ended up being much more challenging than I originally hoped. It was just really hard. I don’t know what else to say. It was really, really hard.”



The original version of Angel, the first track on Mezzanine.

He decided to focus on a single track, Angel, and nail Massive Attack’s trademark sense of space, or emptiness. “And when I finished that, and I played it to everybody, everyone just went, ‘This is it,’” he said. Elated, Grierson moved on to the other tracks. Though he tackled them separately, the system was allowed to subtly blur in elements from the rest of the album. “If it’s playing back Teardrop,” he explained, “sometimes you’ll get the guitar part from Inertia Creeps, just because, well, I allowed it to have that space, so it could creep in.”
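
That “creeping in” can be pictured as a simple sampling rule: mostly condition the generator on the current track, but leave a small probability of pulling material from elsewhere on the album. The stem names and the 10 percent figure below are illustrative assumptions, not details from the installation.

```python
# Sketch of cross-track "creep": when generating a passage for one song,
# occasionally condition on a stem from another track on the album.
import random

STEM_POOL = {
    "Teardrop": ["harpsichord", "beat", "vocal"],
    "Inertia Creeps": ["guitar", "beat", "vocal"],
}

def pick_conditioning(current_track: str, leak: float = 0.10) -> str:
    """Mostly stay on the current track, but leave space for neighbours."""
    if random.random() < leak:  # the "space" another track can creep into
        other = random.choice([t for t in STEM_POOL if t != current_track])
        return random.choice(STEM_POOL[other])
    return random.choice(STEM_POOL[current_track])

print(pick_conditioning("Teardrop"))  # usually Teardrop material,
                                      # occasionally Inertia Creeps' guitar
```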


Grierson also built a machine learning system that could produce a strange, but not ear-achingly bad, vocal track. The system doesn’t know anything about the actual words, of course. “It doesn’t understand the words, it’s just trying to make up something that sounds a bit like words, but it does sound like 3D [Del Naja],” he said. “It sounds like 3D talking in an alien language.”
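
That form-without-meaning effect is easy to demonstrate in miniature. A character-level Markov chain over text produces strings with the shape of language but no semantics; the real system worked on audio rather than text, so the snippet below is only an analogy.

```python
# Word-like but meaningless: a character-level Markov chain trained on a
# scrap of text generates strings with the texture of language and none
# of the meaning. Purely an analogy for the audio-domain vocal model.
import random
from collections import defaultdict

corpus = "love you love you stay with me in the dark angel love"
chain = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    chain[a].append(b)  # record which character tends to follow which

ch, out = random.choice(corpus), []
for _ in range(40):
    out.append(ch)
    ch = random.choice(chain[ch]) if chain[ch] else random.choice(corpus)

print("".join(out))  # gibberish with the cadence of words
```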


By September, both Grierson and Massive Attack were happy with the overall sound. Before they could wrap production, though, the group had to make the AI-powered remix interactive. The Barbican wanted something that was simple and approachable enough to tempt visitors into walking over. In the end, Grierson built a small camera into the podium that holds the glass cabinet and Mezzanine spray can. The hidden hardware tracks the number of people near the exhibit and their proximity to the projection. If a throng of people inches closer, the drums and bass will rise in response.
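
The interaction logic, as described, boils down to mapping crowd size and proximity onto per-stem volume. Here is a rough sketch with the camera input stubbed out; the actual detection pipeline and mixing curves used at the Barbican aren’t documented.

```python
# Sketch of the crowd-reactive mix: the closer and larger the audience,
# the louder the drum and bass stems. Numbers are illustrative guesses.

def stem_gains(people_count: int, avg_distance_m: float) -> dict:
    """Map crowd size and proximity to per-stem gain between 0.0 and 1.0."""
    closeness = max(0.0, min(1.0, 1.0 - avg_distance_m / 5.0))  # 5 m falloff
    crowd = min(people_count, 10) / 10.0                        # saturates at 10
    energy = 0.5 * closeness + 0.5 * crowd
    return {
        "drums": 0.3 + 0.7 * energy,  # rise as people approach
        "bass":  0.3 + 0.7 * energy,
        "pads":  1.0,                 # the atmosphere stays constant
    }

print(stem_gains(people_count=6, avg_distance_m=1.5))
# roughly {'drums': 0.755, 'bass': 0.755, 'pads': 1.0}
```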






“So as you approach it, it gives back to you,” Grierson said. “Then if you move around a lot you get a few surprises, and then as people drift away, it calms itself down.” Otherwise, the system moves into an idle state, slowly generating new segments and phasing out old ones. It’s a piece that will change and evolve over the exhibition’s three-month runtime. Every new sequence is, however, confined to some basic parameters — a necessary failsafe so Massive Attack and the rest of the team don’t have to stand watch.
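
Those “basic parameters” presumably act as guardrails: whatever the system generates, each value gets clamped back into a safe range before playback. A sketch of that failsafe, with invented parameter names and bounds:

```python
# Sketch of the unattended-operation failsafe: clamp every generated
# parameter into a safe range so the piece can run for three months
# without anyone standing watch. Names and bounds are invented.

SAFE_BOUNDS = {
    "tempo_bpm":   (60.0, 110.0),  # stay near Mezzanine's pacing
    "master_gain": (0.0, 0.9),     # never clip the speakers
    "segment_s":   (8.0, 120.0),   # no endless or zero-length segments
}

def clamp_segment(params: dict) -> dict:
    """Force each generated parameter back inside its allowed range."""
    return {
        key: min(max(value, SAFE_BOUNDS[key][0]), SAFE_BOUNDS[key][1])
        for key, value in params.items()
    }

print(clamp_segment({"tempo_bpm": 145.0, "master_gain": 0.5, "segment_s": 30.0}))
# -> {'tempo_bpm': 110.0, 'master_gain': 0.5, 'segment_s': 30.0}
```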


Grierson hopes the installation will make people pause and look at Mezzanine, an album that was released more than two decades ago, with a fresh perspective. It could, he says, also trigger a discussion about the way art will be influenced by artificial intelligence in the future.


Barbican


“I hope it reminds them of the past,” he said, “and makes them think about the future in new ways.” The project has an ephemeral, ever-changing form that’s unlike any traditional record. It echoes live performances, which are slightly different every time, and the way many artists are now choosing to share and update their work online. Kanye West tweaked and re-uploaded The Life of Pablo post-release; HBO, meanwhile, removed a Starbucks cup from Game of Thrones.


You can have a favorite version, of course, but in an era of streaming, remixes and remasters, media is no longer permanent. It’s possible that, in the future, artists will continuously reinterpret their work, and AI may be a part of that process.


“From an artist’s point of view there’s nothing unusual about that,” Grierson said. “It’s just that from a listener’s point of view, they’re used to something being a certain way.”


To coincide with the exhibition, Grierson is launching a site called Mimic that teaches artists how to use similar machine learning techniques. If it catches on, Mezzanine will be remembered not only as a phenomenal record, but as one that helped democratize a potentially groundbreaking technology in the music industry. And if not, well — we got an interesting Massive Attack remix out of it.


Images: Barbican















Nick Summers is a senior reporter, editor and photographer at Engadget. He studied multimedia journalism at Bournemouth University and holds an NCTJ certificate. Nick previously worked at The Next Web and FE Week, an education-focused newspaper in the UK.




