Rob Tracinski talks with Jess Miers of the Chamber of Progress about the controversy over training AI on copyrighted material. Is AI ripping off human creators, or is it no different from a human writer or artist being influenced by another person’s ideas?
We discuss this with reference to historical cases where new technology has raised copyright questions, from Betamax all the way back to piano rolls. Listen to the conversation above, or check it out on video here:
For the article we mention on compulsory licensing and how piano rolls shaped modern copyright law, see here:
See also a fascinating piece on the surprisingly long history of debates over AI and copyright.
And check out a rather odd sidelight on this issue: the use of AI doppelgängers by online “creators”—in one case, an “adult-content creator,” i.e., an aging camgirl who wants to use AI to extend her career and make it easier to provide quasi-personalized engagement with lonely men who want fake female companionship. It sounds pretty primitive so far, not to mention a little sad, but this is the sort of thing that helped build the modern internet, so don’t discount it.
As you can see in our discussion, Jess and I lay out the issues but are still unsure exactly what the right solution is. I expect the answers will flow, as they did with piano rolls and Betamax, out of the particulars of the cases that are moving through the courts. And they are definitely moving, with lawsuits against AI platforms filed by big media companies like the New York Times and Getty Images.
You can see the dilemma. To train programs on good-quality text and images, AI platforms rely on an abundance of human-generated material from existing media: not just from amateur bloggers or social media, but from well-funded, carefully edited publications. If AI-generated text and images were to completely flood the internet—as they may soon do—then an AI that goes out to “scrape” the material it is designed to emulate would be working off of text and images generated by other AI. This raises the prospect of a doom loop in which AI copies AI, drifting farther and farther from anything that is useful or meaningful to humans.
This implies to me that AI platforms will eventually find some kind of modus vivendi with the existing media industry. AI can offer a lot of value. But it is an imitation of human thinking, not a substitute for it. AI depends on human-generated material, and even if it is not legally required to do so, it will have to find a way to direct a portion of its revenues toward renewing the basic resource it relies on: human writers, artists, and other creators.