Generative AI in Games Will Create a Copyright Crisis | WIRED

Historically, claims of ownership to in-game creations or user-generated creations (IGCs or UGCs) have been rendered moot by “take it or leave it” end-user license agreements—the dreaded EULAs that nobody reads. Generally, this means players surrender any ownership of their creations by switching on the game. (Minecraft is a rare exception here. Its EULA has long afforded players ownership of their IGCs, with relatively few community freakouts.)

AI adds new complexities. Laws in both the US and the UK stipulate that, when it comes to copyright, only humans can claim authorship. So for a game like AI Dungeon, where the platform allows a player to, essentially, “write” a narrative with the help of a chatbot, claims of ownership can get murky: Who owns the output? The company that developed the AI, or the user?

“There’s a big discussion nowadays, with prompt engineering in particular, about the extent to which you as a player imprint your personality and your free and creative choices,” says Alina Trapova, a law professor at University College London who specializes in AI and copyright and has authored several papers on AI Dungeon’s copyright problems. Right now, this gray area is circumvented with an EULA. AI Dungeon’s is particularly vague. It states that users can use content they create “pretty much however they want.” When I emailed Latitude to ask whether I could turn my Mr. Magoo nightmare into a play, book, or film, the support line quickly responded, “Yes, you have complete ownership of any content you created using AI Dungeon.”

Yet games like AI Dungeon (and games people have made with ChatGPT, such as Love in the Classroom) are built on models that have scraped human creativity in order to generate their own content. Fanfic writers are finding their ideas in writing tools like Sudowrite, which uses OpenAI’s GPT-3, the precursor to GPT-4.

Things get even more complicated if someone pays the $9.99 per month required to incorporate Stable Diffusion, the text-to-image generator, into AI Dungeon, where it can conjure pictures to accompany their stories. Stability AI, the company behind Stable Diffusion, has been hit with lawsuits from visual artists and the media company Getty Images.

As generative AI systems grow, the term “plagiarism machines” is beginning to catch on. It’s possible that players of a game using GPT-3 or Stable Diffusion could be making things, in-game, that pull from the work of other people. Latitude’s position appears to be much like Stability AI’s: What the tool produces does not infringe copyright, so the user is the owner of what comes out of it. (Latitude did not respond to questions about these concerns.)

People can’t currently share image-driven stories with AI Dungeon’s story-sharing feature, but the feature offers a window into a future where game developers may start using, or allowing players to use, third-party AI tools to generate in-game maps or NPC dialog. One consequence not being considered, says Trapova, is that these tools’ training data may be drawn from across the creative industries. This “raises the stakes,” she argues, multiplying the number of possible infringements and litigious parties. (Stability AI and OpenAI did not respond to queries about this point.)
