The tune, titled "Drowned in the Sun," is part of Lost Tapes of the 27 Club, a project featuring songs written and mostly performed by machines in the styles of musicians who died at 27: Jimi Hendrix, Jim Morrison, and Amy Winehouse. Each track is the result of AI programs analyzing up to 30 songs by each artist and granularly studying the tracks' vocal melodies, chord changes, guitar riffs and solos, drum patterns, and lyrics to guess what their "new" compositions would sound like.

The project is the work of Over the Bridge, a Toronto organization that helps members of the music industry struggling with mental illness.

"Drowned in the Sun" (In the style of Nirvana)

"What if all these musicians that we love had mental health support?" says Sean O'Connor, who is on the board of directors for Over the Bridge and also works as creative director for the advertising agency Rethink. "Somehow in the music industry, [it] is normalized and romanticized … Their music is seen as authentic suffering."

To create the songs, O'Connor and his staff enlisted Google's AI program Magenta, which learns how to compose in the style of given artists by analyzing their works. Previously, Sony has used the software to make a "new" Beatles song, and the electropop group Yacht used it to write their 2019 album Chain Tripping.

For the Lost Tapes project, Magenta analyzed the artists' songs as MIDI files, which work similarly to a player-piano scroll, translating pitch and rhythm into a digital code that can be fed through a synthesizer to recreate a song. After examining each artist's note choices, rhythmic quirks, and preferences for harmony in the MIDI files, the computer created new music that the staff could pore over to pick the best moments.

"The more MIDI files you input, the better," O'Connor says. "So we took 20 to 30 songs from each of our artists as MIDI files and broke them down to just the hook, solo, vocal melody, or rhythm guitar and put those through one at a time. If you put whole songs through, [it] starts to get really confused on what [it's] supposed to sound like. But if you just have a bunch of riffs, it'll put out about five minutes of new AI-written riffs, 90 percent of which is really bad and unlistenable. So you start listening through and just finding little moments that are interesting."

O'Connor and his team used a similar process for lyrics, using a generic AI program, an artificial neural network. They were able to input an artist's lyrics, start it off with a few words, and the program would guess at the cadence and tone of the poetry to complete it. "It was a lot of trial and error," O'Connor says, adding that the team examined "pages and pages" of lyrics for turns of phrase that syllabically fit the vocal melodies Magenta produced.

"Man, I Know" (In the style of Amy Winehouse)

Once the compositions were in place, an audio house arranged all the different parts to evoke each musician. "A lot of the instrumentation was MIDI with different effects added to it," O'Connor says of the finished recordings.

Eric Hogan has been fronting Atlanta's Nevermind: The Ultimate Tribute to Nirvana for the past six years. The band started out as a one-off lark for Halloween, an excuse for Hogan and his friends to perform Foo Fighters, Stone Temple Pilots, and Nirvana tribute sets. But when they saw the huge reaction their Nirvana set got, they went full grunge. "Everyone that we brought in, for the most part, were working tribute artists for these bands, so they could kind of do the inflections and make it sound as realistic as possible," O'Connor says.
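The player-piano comparison above can be made concrete: a MIDI file stores no audio at all, only numeric pitch and timing events that a synthesizer turns back into sound. A minimal sketch in plain Python (illustrative only, not the project's actual pipeline; the note names and riff are invented for the example):

```python
# MIDI encodes each note as a pitch number (0-127, middle C = 60) plus
# timing -- a digital "scroll" a synthesizer can replay, like a player
# piano reading holes in paper.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pitch_to_name(pitch: int) -> str:
    """Convert a MIDI pitch number to a note name with octave (60 -> C4)."""
    octave = pitch // 12 - 1
    return f"{NOTE_NAMES[pitch % 12]}{octave}"

def pitch_to_hz(pitch: int) -> float:
    """Equal-temperament frequency, with A4 (MIDI pitch 69) = 440 Hz."""
    return 440.0 * 2 ** ((pitch - 69) / 12)

# A short riff as (pitch, duration-in-beats) pairs -- the kind of
# stripped-down fragment (hook, solo, vocal melody) the team describes
# feeding to the model one at a time.
riff = [(64, 0.5), (67, 0.5), (69, 1.0)]

for pitch, beats in riff:
    print(pitch_to_name(pitch), round(pitch_to_hz(pitch), 1), beats)
```

Because the format is just numbers, a program can compare note choices and rhythms across 20 to 30 songs without ever "hearing" them.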
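The lyric step described above — seed the model with a few words and let it guess how the line continues — can be illustrated with a toy stand-in. The team's actual tool was a neural network; a Markov chain is far simpler but shows the same seed-then-continue idea (the corpus line here is invented for the example):

```python
import random
from collections import defaultdict

def build_model(text: str) -> dict:
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def continue_line(model: dict, seed: str, length: int = 8) -> str:
    """Start from a seed word and repeatedly sample a plausible next word."""
    out = [seed]
    for _ in range(length):
        choices = model.get(out[-1])
        if not choices:  # dead end: no word ever followed this one
            break
        out.append(random.choice(choices))
    return " ".join(out)

corpus = "the sun keeps the rain away and the rain keeps falling down"
model = build_model(corpus)
print(continue_line(model, "the"))
```

A real neural network learns much richer patterns than these word-pair counts, but the workflow matches the article: generate many candidate lines, then sift "pages and pages" of output for phrases that fit the melody.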