The story of FN Meka – a fictional character billed as the first partly AI-powered musical artist to be signed by a major record company – might seem strange. In August, Capitol Records dropped FN Meka, whose look, outlaw personality and suggestive lyrics were inspired by real-life music stars like Travis Scott, 6ix9ine and Lil Pump, amid criticism that the project trafficked in stereotypes.
But for seasoned observers of technology in pop music and the cultural appropriation debate, the rise and fall of this so-called robot rapper, whose songs were actually written and voiced by humans, has raised important questions that are not going away anytime soon.
Just last month, an AI-generated work won an award in Colorado and a computer program improvised a classical music solo in real time in New York. From DALL-E 2, the technology that creates visual art on command, to Hatsune Miku, Japanese software that does something similar for music, the art world may be on the verge of a sea change in how its products are created. And young people are increasingly comfortable consuming culture through digital avatars like FN Meka. It has happened before in hip-hop: a hologram of the rapper Tupac Shakur, who died in 1996, performed at a music festival in 2012; Travis Scott gave a concert through his avatar in the Fortnite video game in 2020; and Snoop Dogg and Eminem rapped as their digital selves and their Bored Ape avatars in a metaverse performance at the MTV Video Music Awards last month.
In this brave new world, do fake characters based on real people amount to unseemly borrowing, even stealing, or just the kind of homage that has always defined pop music? Even when artificial intelligence helps write music, should the humans behind it be responsible for machine-created lyrics? And when it comes to race, how do the rules of cultural appropriation work when the person appropriating is not a human being with a unique cultural background but a fictional identity backed by an anonymous, multiracial collective?
“A lot of our intuitions and moral codes as humans may have evolved for a context where we have discrete human actors,” said Ziv Epstein, a Ph.D. student at the MIT Media Lab who studies the intersection of humans and technology. “These emerging technologies require new legal frameworks and new research to understand how we reason about them.”
For critics of FN Meka, the presence of more Black people or people of color in the rooms where the character was conceived, designed and promoted might have helped prevent the negative stereotypes they say it fostered. Industry Blackout, a nonprofit advocacy group, said FN Meka had “insulted” Black culture and taken the sounds, looks and life experiences of real Black artists. Capitol seemed to agree when it apologized for its “insensitivity” in a statement.
For critics, FN Meka’s (exaggerated) debt to AI and digital-only existence had the effect of absolving the people who were really calling the shots. “There are humans behind the technology,” said Sinead Bovell, a futurist and the founder of WAYE, an organization that educates young people about technology. “When we disconnect the two, that’s where we could potentially risk harming different marginalized groups.”