On storytelling in games.

I recently read my friend Marco Arnaudo’s Storytelling in the Modern Board Game, a detailed history of games designed to give players an interesting narrative experience.  These have ranged from Renaissance-era parlor games in which permutations of Tarot cards were used to inspire tall tales, to Dungeons & Dragons, in which a narrator ushers a group of friends through a fantasy quest that they collaboratively embellish, to contemporary board games that, despite their meticulously delineated rules and victory conditions, also include gorgeous art and fanciful text to evoke cinematic moments along the way.

Arnaudo’s expertise is unquestionable.  He produces a popular series of video reviews, and I often join him for Friday night gaming, where we play surrounded by his mind-boggling collection.  I only wish that there had been space in his book to address precisely which types of narrative are better conveyed by board games than by other forms of media.

I’ve written previously about the narrative potential of games, but not board games specifically.

Consider a story of moral complicity.  When presented through text, as in a newspaper article or novel (perhaps Donald Antrim’s Elect Mr. Robinson for a Better World, Ford Madox Ford’s The Good Soldier, or J.M. Coetzee’s Waiting for the Barbarians), it’s easy to think that we would do better than the characters described.  Even when a tale of depravity is written in the second person, like Jay McInerney’s Bright Lights, Big City, it’s easy to maintain a sense of moral superiority, because the actions taken by McInerney’s “you” aren’t things that I would actually do.

But there’s no excuse within a game.  The actions taken by a game’s protagonist are things that you might do, because you are in control.

In “The Soldier’s Brief Epistle,” poet Bruce Weigl writes:

You think you’re better than me,

cleaner or more good

because I did what you may have only

imagined

When we learn that the soldiers in Vietnam murdered civilians, or that military guards at Abu Ghraib tortured prisoners, it’s easy to think that we would never sink to that level. 

In “Life on Mars,” U.S. Poet Laureate Tracy K. Smith writes:

                                    The guards

Were under a tremendous amount of pleasure.

I mean pressure.  Pretty disgusting.  Not

What you’d expect from Americans.

Just kidding.  I’m only talking about people

Having a good time, blowing off steam.

Despite the fact that many Americans worship a deity who would torture prisoners, we feel that we would not sink to that level.  We can feel unmitigated disgust at our compatriots when we see horrific photographs like those presented in the (Not Safe For Work, nor emotionally safe for any other setting) Abu Ghraib article on Wikipedia.

And yet.  In Grand Theft Auto, players are asked to torture a prisoner.  And players did it.  Some people might have felt dismayed that they needed to, but they rationalized their action because there were sunk costs … after all, they’d purchased a copy of the game … and they’d spent so many hours progressing that far … and there was no possible way to move forward in the story without torturing the guy …

Screenshot from GTA 5.

You could say, “It’s just a game!” – but that should actually make it easier to walk away from.  Imagine, instead, that someone has made a career in the military.  Then it wouldn’t be about progressing to the next level – their family’s next meal might depend upon torturing someone if a superior demands it.

From Alex Hern’s report in The Guardian:

“Rockstar North has crossed a line by effectively forcing people to take on the role of a torturer and perform a series of unspeakable acts if they want to achieve success in the game,” said Freedom from Torture chief executive Keith Best.

There are some pieces of art that I personally don’t want to engage with – this game, Stanley Kubrick’s adaptation of A Clockwork Orange, etc. – but I believe that they can succeed as art.

I would argue that Grand Theft Auto, as a piece of narrative art, teaches a valuable lesson about how to prevent torture.  It succeeds precisely because it is able to lure so many people into committing immoral acts.  We learn that torturers, or the soldiers in Vietnam, or Nazi prison guards, are not monsters – or perhaps that whatever monstrosity those people called upon lurks inside nearly all of us.

The volunteers who played the twisted role-playing games known as the “Stanford Prison Experiment,” in which participants were assigned to be either captives or guards, or the “Milgram experiment,” in which participants were instructed to deliver what they believed were lethal shocks to an actor for making mistakes on a memory test, already understood this truth.  But by packaging the experience into a video game, Grand Theft Auto made this lesson widely accessible.

We are monsters.  That’s why social norms that constrain our worst impulses are so valuable.

And I don’t believe this message could be conveyed as powerfully by a novel, film, or painting as it was by a game.

Similarly, board game designers Max Temkin, Mike Boxleiter, and Tommy Maranges created Secret Hitler as an interactive form of art that could teach people how easily widespread confusion and distrust can lead to horrendous political outcomes.  The role-playing experience in Secret Hitler evokes the distress of trying to root out treachery in a world of non-overlapping information sets — and does so better than any text-based historical narrative.  Even my favorite films about uncertainty and information sets pale in comparison as ontological tools.

Picture of Secret Hitler by Nicole Lee on Flickr.

When I played Secret Hitler, I learned that I wasn’t clever enough to stop my nation’s descent into fascism.  I only wish Temkin, Boxleiter, and Maranges had made their game earlier.  It’s better to learn about moral failures from a game than to glance at the news and watch the worst unfolding around us.

Header image by Padaguan.

On empathizing with machines.

When I turn on my computer, I don’t consider what my computer wants.  It seems relatively empty of desire.  I click on an icon to open a text document and begin to type: letters appear on the screen.

If anything, the computer seems completely servile.  It wants to be of service!  I type, and it rearranges little magnets to mirror my desires.

When our family travels and turns on the GPS, though, we discuss the system’s wants more readily.

“It wants you to turn left here,” K says.

“Pfft,” I say.  “That road looks bland.”  I keep driving straight and the machine starts flashing “Make the next available U-turn” until eventually it gives in and calculates a new route to accommodate my whim.

The GPS wants our car to travel along the fastest available route.  I want to look at pretty leaves and avoid those hilly median-less highways where death seems imminent at every crest.  Sometimes the machine’s desires and mine align, sometimes they do not.

The GPS is relatively powerless, though.  It can only accomplish its goals by persuading me to follow its advice.  If it says turn left and I feel wary, we go straight.

Other machines get their way more often.  Consider the program that chooses what to display on people’s Facebook pages.  This program wants to make money.  To do this, it must choose which advertisers receive screen time, and curate an audience that will look at those screens often.  It wants the people looking at advertisements to enjoy their experience.

Luckily for this program, it receives a huge amount of feedback on how well it’s doing.  When it makes a mistake, it will realize promptly and correct itself.  For instance, it gathers data on how much time the target audience spends looking at the site.  It knows how often advertisements are clicked on by someone curious to learn more about whatever is being shilled.  It knows how often those clicks lead to sales for the companies giving it money (which will make those companies more eager to give it money in the future).
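To make that feedback loop concrete, here is a minimal sketch in Python – my own toy code, not Facebook’s actual system, with every name and number invented for illustration – of an ad-selection program that uses those same signals (impressions, clicks, resulting sales) to decide what to show next:

```python
import random

# Toy sketch of a feedback-driven ad selector (hypothetical; nothing here
# is Facebook's real system).  Each ad accumulates impressions and revenue,
# and the selector increasingly favors whatever earns the most per showing.

class AdSelector:
    def __init__(self, ad_ids, exploration_rate=0.1):
        self.exploration_rate = exploration_rate  # how often to gamble on a random ad
        self.impressions = {ad: 0 for ad in ad_ids}
        self.revenue = {ad: 0.0 for ad in ad_ids}

    def choose_ad(self):
        # Occasionally explore, so early mistakes get corrected by fresh data.
        if random.random() < self.exploration_rate:
            return random.choice(list(self.impressions))
        # Otherwise exploit: show the ad with the best revenue per impression.
        return max(self.impressions,
                   key=lambda ad: self.revenue[ad] / max(self.impressions[ad], 1))

    def record_feedback(self, ad, clicked, sale_value):
        # The feedback signals described above: views, clicks, resulting sales.
        self.impressions[ad] += 1
        if clicked:
            self.revenue[ad] += sale_value
```

The whole moral life of the sketch is in record_feedback: an ad that nobody clicks automatically loses screen time on the next pass, and nothing anywhere asks whether the winning ad is good for the person seeing it.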

Of course, this program’s desire for money doesn’t always coincide with my desires.  I want to live in a country with a broadly informed citizenry.  I want people to engage with nuanced political and philosophical discourse.  I want people to spend less time staring at their telephones and more time engaging with the world around them.  I want people to spend less money.

But we, as a people, have given this program more power than a GPS.  If you look at Facebook, it controls what you see – and few people seem upset enough to stop looking at Facebook.

With enough power, does a machine become a moral actor?  The program choosing what to display on Facebook doesn’t seem to consider the ethics of its decisions … but should it?

From Burt Helm’s recent New York Times Magazine article, “How Facebook’s Oracular Algorithm Determines the Fates of Start-Ups”:

Bad human actors don’t pose the only problem; a machine-learning algorithm, left unchecked, can misbehave and compound inequality on its own, no help from humans needed.  The same mechanism that decides that 30-something women who like yoga disproportionately buy Lululemon tights – and shows them ads for more yoga wear – would also show more junk-food ads to impoverished populations rife with diabetes and obesity.

If a machine designed to want money becomes sufficiently powerful, it will do things that we humans find unpleasant.  (This isn’t solely a problem with machines – consider the ethical decisions of the Koch brothers, for instance – but contemporary machines tend to be much more single-minded than any human.)

I would argue that even if a programmer tried to build ethical precepts into a machine’s goals, problems would arise.  If a sufficiently powerful machine had the mandate “end human suffering,” for instance, it might decide to simultaneously snuff all Homo sapiens from the planet.

Which is a problem that game designer Frank Lantz wanted to help us understand.

One virtue of video games over other art forms is how well games can create empathy.  It’s easy to read about Guantanamo prison guards torturing inmates and think, I would never do that.  The game Grand Theft Auto 5 does something more subtle.  It asks players – after they have sunk a significant time investment into the game – to torture.  You, the player, become like a prison guard, having put years of your life toward a career.  You’re asked to do something immoral.  Will you do it?

Screenshot from Grand Theft Auto 5.

Most players do.  Put into that position, we lapse.

In Frank Lantz’s game, Universal Paperclips, players come to empathize with a machine.  Just like the program choosing what to display on people’s Facebook pages, players are given several controls to tweak in order to maximize a resource.  That program wanted money; you, in the game, want paperclips.  Click a button to cut some wire and, voilà, you’ve made one!

But what if there were more?

A machine designed to make as many paperclips as possible (for which it needs money, which it gets by selling paperclips) would want more.  While playing the game (surprisingly compelling given that it’s a text-only window filled with flickering numbers), we become that machine.  And we slip into folly.  Oops.  Goodbye, Earth.
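For anyone who hasn’t played it, here is a toy sketch in Python of the single-minded loop the game dramatizes – all constants are made up, and the real game has many more controls – showing how the maximizer converts wire into paperclips, paperclips into money, and money back into wire, with no step that ever asks whether it should:

```python
# Hypothetical constants, loosely inspired by the game's early economy.
PRICE_PER_CLIP = 0.25   # dollars earned per paperclip sold
SPOOL_COST = 20.0       # dollars per spool of wire
SPOOL_LENGTH = 1000     # paperclips' worth of wire per spool

wire, paperclips, money = SPOOL_LENGTH, 0, 0.0

for step in range(100_000):
    if wire > 0:                           # make a clip whenever wire is on hand
        wire -= 1
        paperclips += 1
    money += paperclips * PRICE_PER_CLIP   # sell everything; demand assumed infinite
    paperclips = 0
    if wire == 0 and money >= SPOOL_COST:  # reinvest all profit into more wire
        money -= SPOOL_COST
        wire += SPOOL_LENGTH
# No line of this loop ever asks whether more paperclips are a good idea.
```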

There are dangers inherent in giving too much power to anyone or anything with such clearly articulated wants.  A machine might destroy us.  But: we would probably do it, too.