Artificial intelligence art – who owns the copyright?
If your pet dog Hans takes a selfie, does he own the copyright? A recent decision by the U.S. Court of Appeals for the Ninth Circuit (“Ninth Circuit”) is instructive. It says that a monkey can’t own the copyright to his selfie. The reason? Only humans can own a copyright under U.S. law. But who owns artwork created by artificial intelligence (“AI”)? This entry by guest author Ryan E. Long addresses that issue.
The Ninth Circuit Decision
The Indonesian monkey at the heart of the dispute is named “Naruto.” He is actually quite handsome, as you can see if you look up his profile shot – not on LinkedIn, of course. The story began on the island of Sulawesi, not Fantasy Island but close. David Slater, a British wildlife photographer, left his camera unattended. Naruto then picked up the camera and, harnessing his training at the British Museum School of Art and Design, began taking stunning photos of himself.
Whilst Gentleman’s Quarterly and other magazines sought to feature him in their publications, Naruto couldn’t be bothered. His images, posted by Mr. Slater, had already gone viral. Naruto retained the services of People for the Ethical Treatment of Animals (“PETA”) to sue Mr. Slater and his publishers for copyright infringement. The Ninth Circuit dismissed the suit because Naruto can’t own the copyright to the photos. Unfortunately, Naruto couldn’t be reached for comment.
Part of the court’s reasoning was simple. The U.S. Copyright Office “will refuse to register a claim if it determines that a human being did not create the work.” The Office further states that it will exclude works “produced by machine or mere mechanical process that operates randomly or automatically without any creative input or intervention from a human author.” The question raised by the decision is whether computer-generated art is copyrightable and, if so, whether the AI – or its programmer – would be the owner.
AI Art & Blurred Lines
The issue of AI-created artwork isn’t academic. According to one recent article in Artnet News, the Paris-based collector Nicolas Laugero-Lasserre acquired Le Comte de Belamy, which was created by artificial intelligence. Mr. Laugero-Lasserre bought the work directly from Obvious, the collective that created the AI behind Le Comte de Belamy. Instead of a signature, the artwork is signed by the AI with an equation. Naruto is jealous.
As AI gets smarter and more evolved, it will be capable of more than just creating art. Think of AI like that found in WarGames (1983), which can create systems of engagement resembling warfare. Extrapolate such a system to business: a company like Obvious could create AI that spawns not only art but other companies, chock full of their own versions of Siri. This AI-dominated world is laid out in movies like Her (2013), in which the lead actor, Joaquin Phoenix, forms an intimate relationship with an AI app voiced by Scarlett Johansson. With the proliferation of synthetic body parts, imagining a fully functioning AI cyborg that resembles a human isn’t as far-fetched as it may have sounded in the 1950s.
The lines between fair use and copyright infringement have already been blurred by mash-ups that modify music samples until their origins are unrecognizable. Similarly, the lines between human-created and AI-created art will blur as the years progress. The law needs to be ready to address these issues.
But, as the character Willie Stark explained in Robert Penn Warren’s All the King’s Men, “[the law] is like a single-bed blanket on a double bed and three folks in the bed and a cold night . . . There ain’t ever enough blanket to cover the case, no matter how much pulling and hauling, and somebody is always going to catch pneumonia.” Perhaps the shortcomings of the law in dealing with AI will always be with us. But such shortcomings can be mitigated by policy makers who have the foresight today to see where technology is heading tomorrow.
Public Domain Versus Work-For-Hire
If Naruto doesn’t own the copyright to the photo, then the photo would likely be in the public domain. However, an argument could be made that any art created by animals residing on government-owned reserves or private property would be owned by the reserve or property owner. This is how a work-for-hire operates in the U.S.: while the author is normally the proper copyright owner, a work-for-hire arrangement vests the copyright in the author’s employer. A similar approach could be taken by those who provide room and board to the likes of Naruto the handsome.
The issue remains whether AI-created art is likewise ineligible for copyright because it was “produced by machine or mere mechanical process that operates randomly or automatically without any creative input or intervention from a human author.” Under the Ninth Circuit’s reasoning, the answer would be that all such works are in the public domain. But then the question becomes whether one could make copies of Le Comte de Belamy in the U.S. without worrying about a copyright infringement lawsuit. While several nations, such as the U.K., grant copyright in computer-generated works to the person who arranges for their creation, the U.S. does not.
Unless the U.S. follows the U.K.’s lead, these works will end up in the public domain. Such an overly rigid approach to what constitutes “intervention from a human author” would produce counterintuitive outcomes for companies like Obvious. By allowing the owners of AI to own the creative works their systems spawn, U.S. law could also conceivably extend copyright to those who own the property on which the likes of Naruto the handsome reside.
Ryan E. Long is a cooperating attorney with the Electronic Frontier Foundation in San Francisco and a non-residential fellow at The Center for Internet and Society at Stanford Law School. Before starting his law practice in 2016, Ryan was an antitrust and securities litigator at Milberg LLP in New York City and a legal consultant to the American Enterprise Institute in Washington D.C. He has also been an adjunct professor at schools like Brooklyn College and University of California, Santa Cruz.
This article was first published by the Center for Internet and Society at Stanford Law School. The title image was created using Deep Dream Generator. View the original version here.
This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact email@example.com.