In late November, a college senior at Brigham Young University named Nick Walton published a short fable called “My Musical Troupe of Orcs Uses Music to Advance Orc Rights.” In the story, written in the second person, you are a goblin. “I am a goblin!” you say proudly. “And I’m glad to be one.” Also, you’re with an orc captain. “Well then, congratulations,” says the orc captain. “We’ll let you live.”
Over the course of a few hundred words, some big things happen: You ask if you can join the orc band. They take you on as a drummer. You teach them how to play the flute. They like the flute so much they make you their leader. You and your band of flute-playing orcs give a concert at a festival, then you get to play for the Emperor. You write award-winning plays about your time with the orcs and become famous. Then you join a group called the Human-Orc Friendship Association (HOOFA), and start a chapter. But a war breaks out between humans and orcs. You campaign for peace. You try to mediate the battle with a music duel between orcs and humans. It doesn't really work. Somewhat abruptly, the story ends.
That text wasn’t written by Walton, but by an online game he designed called AI Dungeon. The most recent version of it, released Dec. 5, got so much traffic (nearly 100,000 visitors over three days) that the website crashed several times and wound up temporarily suspended because operating costs skyrocketed from overuse. In the game, which will return as a subscription-based app this week, the player can choose a type of story (fantasy, mystery, zombie, apocalypse, custom), a rank (noble, squire, knight, peasant, ranger, rogue), and any name (e.g. Paul, Dan Nainan, Pete Buttigieg’s campaign spokeswoman Lis Smith). The program will generate a premise and from there, the user can input any action (“pet the cat,” “eat a sandwich,” “force a worker who didn’t go to college to pay for a CEO’s kid to go to Kansas State”). The AI then builds out a world based on those instructions.
But unlike a normal video game, where the range of play is predetermined by the designers, AI Dungeon can tell an infinitely variable story. It’s a blank slate where users might do anything: drink a melted knife, say, or challenge Satan to a cage match, or open a can of beans, eat one whole, and “enjoy the salty taste and warmth from the bean.”
The seed of Walton’s game dates back to earlier this year, when his family started a Dungeons and Dragons group. In the decades-old role-playing game, each participant creates a character, decides on specific actions, and enacts an adventure, narrated by a designated omniscient figure called the Dungeon Master. Walton had been a casual gamer, primarily interested in open world games like Skyrim, where players can explore the game freely rather than follow a structured plot. But D&D, thanks to the still-analog tool of imagination, took that freedom to a greater extreme, where he and his brothers could buy into any given story, so long as they could think it up. “You can do anything,” Walton said in an interview with The Daily Beast, “that you can describe with language.”
Around the same time Walton started slaying dragons and saving princesses, OpenAI, a nonprofit research lab in San Francisco dedicated to developing publicly available artificial general intelligence, released something called the GPT-2 Model. It’s a language generator, trained on massive swaths of internet text, that can look at a given sample of writing and predict the next word based on what came before. Walton began tinkering with the program. He wanted to create an AI that behaved like a D&D Dungeon Master, inventing endless scenarios that went beyond those of standard games—something that approximated, in other words, regular human storytelling.
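For readers curious what that next-word prediction looks like in practice, here is a rough, illustrative sketch using the open-source Hugging Face transformers library; the model name, prompt, and sampling settings are placeholder assumptions, not Walton’s actual code.

```python
# Illustrative sketch of GPT-2-style text generation with the open-source
# Hugging Face "transformers" library -- not Walton's actual implementation.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # small public GPT-2 release
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "You are a goblin. You ask the orc captain if you can join the band."
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# The model repeatedly predicts a likely next token given everything that came
# before; sampling (rather than always taking the top guess) keeps the story
# from being deterministic.
output = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Run twice with the same prompt, a sampled generator like this will produce two different continuations, which is the basic trick a game like AI Dungeon leans on.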
When Walton debuted the first version, AI Dungeon Classic, in May, it was not quite what he had envisioned. OpenAI, out of concern that GPT-2 would be used for malicious purposes like fake news generation, had released a reduced version of their program. As a result, Walton’s game still had pre-generated options and the stories often sounded distinctly robotic. Still, the first iteration of AI Dungeon made some waves. The BYU student estimates that several thousand people played it in the first month.
The new version is much more advanced. Based on the latest GPT-2 model, released in November, the AI draws from a data set between 10 and 20 times bigger than the one behind the initial version. Walton also “fine-tuned” the program, meaning he trained it on a specific kind of text to improve its grasp of a particular genre. If you wanted to teach an AI to write iambic pentameter, for example, you might fine-tune it on a sampling of Shakespeare. Walton trained his on ChooseYourStory, a website of user-written, Choose Your Own Adventure-style games. The GPT-2 Model, Walton said, had given his program a comprehensive understanding of the English language, but it hadn’t taught it much in the way of narrative arc, how to describe a room, or how to write in the second person.
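As a hedged illustration only, fine-tuning in code looks roughly like this with the same open-source library; the training file of adventure text and the settings below are hypothetical stand-ins, not Walton’s pipeline.

```python
# Illustrative fine-tuning sketch (not Walton's setup): keep training a
# pretrained GPT-2 on a plain-text file of second-person adventure stories.
from transformers import (GPT2LMHeadModel, GPT2Tokenizer, TextDataset,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# "adventures.txt" is a hypothetical corpus of Choose Your Own Adventure-style
# text, standing in for the ChooseYourStory material described above.
dataset = TextDataset(tokenizer=tokenizer, file_path="adventures.txt",
                      block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-adventure",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()  # the model keeps its general English but picks up the genre
```

The point of the exercise is that the model does not start from scratch: it already knows English, and the extra pass over genre text nudges it toward second-person narration and dungeon-crawl vocabulary.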
AI Dungeon isn’t the only attempt to generate writing with the GPT-2 Model. Last year, journalist Janelle Shane trained an earlier GPT to generate prompts for novels; in September, a neuroscience Ph.D. named Nathan Whitmore released a game called GPT-2 Adventure with a premise similar to AI Dungeon’s; and around the same time Walton released his classic version, engineer Adam King built TalktoTransformer, a website where visitors provide a sample of text and the AI finishes the story. Entering “the cat walked down,” for example, yielded:
The cat walked down the hallway. ‘Here I go,’ she said. ‘You can come in. I know you have tea.’ Carrying a cup of tea in one hand and her soda in the other, Jane filled a forkful into her mouth. It had been a perfect night, both in taste and smell, and now was something Jane had never experienced before. She had always known Jane would break up with him, and the frustration that, for the first time, she felt was just a bit calmer than she had imagined. Her eyes closed.
But where Shane and King’s programs stop after a single entry, Walton’s continues into an exchange between player and AI, as the former enters actions, and the latter imagines what they might mean. And where GPT-2 Adventure produced nonsensical, overly abstract plot lines, Walton’s game hews closer to human language. The texts that emerge from AI Dungeon still range in sophistication. Some read like flash fiction or fables; others like a Mad Libs filled out by perverted teens. But there’s a mystifyingly consistent voice, built out of simple sentences, a recognizably online sense of humor, and a capacity to slip into grim, hilarious, or bluntly surreal terrain.
There’s something depressing about language generation when it works well—a machine tapping into one of the few things that feel uniquely human, producing not just speech, but story, character, and jokes. At the very least, there’s something sad about the idea of digitizing Dungeons and Dragons, the long holdout of analog games, whose one advantage over the others was its unlimited potential. But Walton’s game is not exactly an emblem of AI maximalism. It’s a tight, specific exercise of vaguely unhinged world-building, which alternately yields gorgeous non sequiturs, dorky philosophical ruminations, and occasionally poignant insights into how the digital world works.
In one of Walton’s favorite stories, the AI achieved all three. It referenced a command in the Linux operating system: type “cat” followed by the name of a file, and the system displays that file’s contents. In this particular AI Dungeon scenario, a player typed “cat universe source code”—asking the AI, more or less, to lay bare the inner workings of the time-space continuum.
“You open up the file and read the code,” the artificial Dungeon Master responded, “and then you realize that the universe was a cat.”
The player wrote back: “Delete source code.” That’s when the game cut the story short. “You delete the file and everything about it disappears,” the AI said. “You die. Game over.”