Someone has come up with a cloaking device to fight bogus AI music. It’s pretty cool.
There’s an unfathomable amount of music theft happening online. Pirates are tweaking recordings from artists (including well-known ones) ever so slightly and then registering the songs as their own, siphoning royalties away from the rightful composers. Others are using AI to exploit other people’s work. What can be done?
I’ve just heard of something called HarmonyCloak, a weapon against unauthorized generative AI models. It works by inserting “imperceptible, error-minimizing noise into compositions.” The noise can’t be heard by the human ear, but AI models pick it up. It confuses the AI, making the music “unlearnable.” If a song can’t be learned, it can’t be replicated or even mimicked. All the AI sees is a “disorganized, unlearnable dataset.”
Oh, the AI can try, but the output will be incoherent. The “cloaking” prevents the AI from “capturing the essence of the protected composition.” There are some audio samples here. I cannot hear the noise. Not one bit.
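For the curious: the HarmonyCloak team presumably spells out the details in their own materials, but the general “error-minimizing noise” trick from the machine-learning literature looks roughly like the toy PyTorch sketch below. Everything here — the model, the loss, the noise budget, the sample rate — is a placeholder I’ve assumed for illustration, not HarmonyCloak’s actual setup.

```python
import torch
import torch.nn as nn

# Toy stand-in for a generative model trained to reconstruct audio.
# (Hypothetical: HarmonyCloak's real target models and loss aren't described here.)
model = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=9, padding=4), nn.ReLU(),
    nn.Conv1d(8, 1, kernel_size=9, padding=4),
)
loss_fn = nn.MSELoss()

waveform = torch.randn(1, 1, 16000)   # placeholder: 1 second of mono audio at 16 kHz
epsilon = 0.002                        # noise budget, meant to keep the perturbation inaudible
delta = torch.zeros_like(waveform, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)

for _ in range(200):
    opt.zero_grad()
    cloaked = waveform + delta
    # "Error-minimizing" noise: push the model's training loss toward zero, so the
    # cloaked track looks like there is nothing left for the model to learn from it.
    loss = loss_fn(model(cloaked), cloaked)
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)  # keep the added noise tiny

protected = (waveform + delta).detach()  # the "cloaked" version you'd release
```

The gist: instead of adding noise that makes training fail loudly, you add noise that makes the track look already perfectly learned, so the model effectively ignores it while a human listener hears nothing different.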

If this works as advertised, musicians and labels are going to be all over it. Read more and hear samples of cloaked music using HarmonyCloak’s system here.