
Someone has come up with a cloaking device to fight bogus AI music. It’s pretty cool.

There’s an unfathomable amount of music theft happening online. Pirates take recordings from other artists (including well-known ones), tweak them ever so slightly, and register the songs as their own, siphoning royalties away from the rightful composers. Others use AI to exploit other people’s work. What can be done?

I’ve just heard of something called HarmonyCloak, a weapon against unauthorized generative AI models. It works by inserting “imperceptible, error-minimizing noise into compositions.” That noise can’t be heard by the human ear, but AI models can detect it. It confuses the AI, making the song “unlearnable.” If the song can’t be learned, then it can’t be replicated or even mimicked. All the AI sees is a “disorganized, unlearnable dataset.”

Oh, the AI can try, but the output will be incoherent. The “cloaking” prevents the AI from “capturing the essence of the protected composition.” There are some audio samples here. I cannot hear the noise. Not one bit.
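For the technically curious: “error-minimizing noise” suggests an approach along the lines of the “unlearnable examples” trick from machine-learning research, where a tiny perturbation makes training data look as if the model has already learned it, so training on it produces almost no useful gradient. HarmonyCloak’s actual code isn’t reproduced here, so the sketch below is only a guess at the general idea; the surrogate model, the loss, and every parameter name are illustrative assumptions.

```python
# A rough sketch of "error-minimizing noise," in the spirit of
# unlearnable-examples research. This is NOT HarmonyCloak's published
# method; the surrogate model, loss, and parameters are assumptions.
import torch

def error_minimizing_noise(waveform, surrogate_model, loss_fn,
                           epsilon=1e-3, steps=50, lr=1e-3):
    """Craft a tiny perturbation, bounded by `epsilon`, that MINIMIZES
    a surrogate model's training loss on (waveform + delta).

    Because the cloaked audio already looks "perfectly learned," a
    generative model training on it gets almost no learning signal --
    the track contributes nothing it can capture or mimic.
    """
    delta = torch.zeros_like(waveform, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = waveform + delta
        # Stand-in self-supervised objective: predict the next sample.
        pred = surrogate_model(perturbed[..., :-1])
        loss = loss_fn(pred, perturbed[..., 1:])
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Keep the noise imperceptibly small (an L-infinity bound),
        # so human ears hear nothing while the model is misled.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)
    return delta.detach()

# Hypothetical usage:
# cloaked = song + error_minimizing_noise(song, model, torch.nn.MSELoss())
```

The real system is presumably far more sophisticated, for instance shaping the noise to hide beneath the music itself, but the core trick being described is this two-sided one: minimize the model’s loss while keeping the noise too small to hear.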

If this works as advertised, musicians and labels are going to be all over it. Read more and hear samples of cloaked music using HarmonyCloak’s system here.

Alan Cross is an internationally known broadcaster, interviewer, writer, consultant, blogger and speaker. In his 40+ years in the music business, Alan has interviewed the biggest names in rock, from David Bowie and U2 to Pearl Jam and the Foo Fighters. He’s also known as a musicologist and documentarian through programs like The Ongoing History of New Music.
