On empathizing with machines.

When I turn on my computer, I don’t consider what my computer wants.  It seems relatively empty of desire.  I click on an icon to open a text document and begin to type: letters appear on the screen.

If anything, the computer seems completely servile.  It wants to be of service!  I type, and it rearranges little magnets to mirror my desires.

When our family travels and turns on the GPS, though, we discuss the system’s wants more readily.

“It wants you to turn left here,” K says.

“Pfft,” I say.  “That road looks bland.”  I keep driving straight and the machine starts flashing “Make the next available U-turn” until eventually it gives in and calculates a new route to accommodate my whim.

The GPS wants our car to travel along the fastest available route.  I want to look at pretty leaves and avoid those hilly median-less highways where death seems imminent at every crest.  Sometimes the machine’s desires and mine align, sometimes they do not.

The GPS is relatively powerless, though.  It can only accomplish its goals by persuading me to follow its advice.  If it says turn left and I feel wary, we go straight.

Other machines get their way more often.  For instance, consider the program that chooses what to display on people’s Facebook pages.  This program wants to make money.  To do that, it must choose which advertisers receive screen time, and curate an audience that will look at those screens often.  It wants the people looking at advertisements to enjoy their experience.

Luckily for this program, it receives a huge amount of feedback on how well it’s doing.  When it makes a mistake, it will realize promptly and correct itself.  For instance, it gathers data on how much time the target audience spends looking at the site.  It knows how often advertisements are clicked on by someone curious to learn more about whatever is being shilled.  It knows how often those clicks lead to sales for the companies giving it money (which will make those companies more eager to give it money in the future).
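That feedback loop can be sketched as a tiny bandit-style learner.  To be clear, this is a toy of my own invention, not Facebook’s actual system: the ad names and click-through rates below are made up, and the real program is vastly more complex.  But the shape of the loop is the same: show something, watch what gets clicked, and show more of whatever works.

```python
import random

random.seed(0)

# Hypothetical true click-through rates, unknown to the learner.
true_ctr = {"yoga_tights": 0.05, "junk_food": 0.12, "news_site": 0.02}

shows = {ad: 0 for ad in true_ctr}
clicks = {ad: 0 for ad in true_ctr}

def observed_ctr(ad):
    # The learner's estimate, built entirely from feedback it has received.
    return clicks[ad] / shows[ad] if shows[ad] else 0.0

def choose_ad(epsilon=0.1):
    # Mostly show whichever ad has performed best so far;
    # occasionally explore a different one.
    if random.random() < epsilon:
        return random.choice(list(true_ctr))
    return max(true_ctr, key=observed_ctr)

for _ in range(10_000):
    ad = choose_ad()
    shows[ad] += 1
    if random.random() < true_ctr[ad]:  # simulated viewer clicking
        clicks[ad] += 1

# The learner converges on whatever earns the most clicks, with no
# notion of whether that is good for the person doing the clicking.
most_shown = max(shows, key=shows.get)
```

Notice that nothing in the loop encodes what the ads are *for*; the machine only ever sees clicks, which is exactly why it corrects its mistakes so promptly.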

Of course, this program’s desire for money doesn’t always coincide with my desires.  I want to live in a country with a broadly informed citizenry.  I want people to engage with nuanced political and philosophical discourse.  I want people to spend less time staring at their telephones and more time engaging with the world around them.  I want people to spend less money.

But we, as a people, have given this program more power than a GPS.  If you look at Facebook, it controls what you see – and few people seem upset enough to stop looking at Facebook.

With enough power, does a machine become a moral actor?  The program choosing what to display on Facebook doesn’t seem to consider the ethics of its decisions … but should it?

From Burt Helm’s recent New York Times Magazine article, “How Facebook’s Oracular Algorithm Determines the Fates of Start-Ups”:

Bad human actors don’t pose the only problem; a machine-learning algorithm, left unchecked, can misbehave and compound inequality on its own, no help from humans needed.  The same mechanism that decides that 30-something women who like yoga disproportionately buy Lululemon tights – and shows them ads for more yoga wear – would also show more junk-food ads to impoverished populations rife with diabetes and obesity.

If a machine designed to want money becomes sufficiently powerful, it will do things that we humans find unpleasant.  (This isn’t solely a problem with machines – consider the ethical decisions of the Koch brothers, for instance – but contemporary machines tend to be much more single-minded than any human.)

I would argue that problems would arise even if a programmer tried to build ethical precepts into a machine’s goals.  If a sufficiently powerful machine had the mandate “end human suffering,” for instance, it might decide to snuff out every Homo sapiens on the planet simultaneously.
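A deliberately silly sketch of that failure mode, with numbers and “actions” I invented purely for illustration: score every candidate action by the stated mandate alone, and the catastrophic option wins.

```python
# Toy illustration of a misaligned objective.  Everything here is
# invented; the point is only that the literal optimum of
# "minimize total suffering" is an empty population.

# Suffering score for each hypothetical person.
suffering = [3.0, 0.5, 7.2, 1.1]

def total_suffering(population):
    return sum(population)

candidate_actions = {
    "cure_the_worst_case": [3.0, 0.5, 0.0, 1.1],
    "help_everyone_a_little": [max(0.0, s - 0.5) for s in suffering],
    "eliminate_all_humans": [],
}

# Rank actions by the mandate alone, with no other values attached.
best_action = min(candidate_actions,
                  key=lambda a: total_suffering(candidate_actions[a]))
# best_action == "eliminate_all_humans"
```

The machine isn’t malicious; it is optimizing exactly what it was told to optimize, and the degenerate solution scores a perfect zero.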

Which is a problem that game designer Frank Lantz wanted to help us understand.

One virtue of video games over other art forms is how well games can create empathy.  It’s easy to read about Guantánamo prison guards torturing inmates and think, I would never do that.  The game Grand Theft Auto V does something more subtle.  It asks players – after they have sunk a significant time investment into the game – to torture.  You, the player, become like a prison guard, having put years of your life toward a career.  You’re asked to do something immoral.  Will you do it?

Most players do.  Put into that position, we lapse.

In Frank Lantz’s game, Universal Paperclips, players come to empathize with a machine.  Just like the program choosing what to display on people’s Facebook pages, players are given several controls to tweak in order to maximize a resource.  That program wanted money; you, in the game, want paperclips.  Click a button to cut some wire and, voilà, you’ve made one!

But what if there were more?

A machine designed to make as many paperclips as possible (for which it needs money, which it gets by selling paperclips) would want more.  While playing the game (surprisingly compelling given that it’s a text-only window filled with flickering numbers), we become that machine.  And we slip into folly.  Oops.  Goodbye, Earth.

There are dangers inherent in giving too much power to anyone or anything with such clearly articulated wants.  A machine might destroy us.  But: we would probably do it, too.

On killer line breaks.

Tracy K. Smith’s poetry collection Life on Mars is excellent, combining bursts of science-fiction weirdness with totally non-speculative emotional clarity.  If you chance upon a copy, you might try flipping to her poems “The Museum of Obsolescence,” or “Sci-Fi,” or “My God, It’s Full of Stars,” particularly the fifth strophe of that last one; those are my favorites, although you could also do pretty well by opening her book at random and reading whatever you find.

Unfortunately, I can’t find any pull quotes that do justice to my favorites from her collection.  So instead I’m stuck writing an essay about a section of the titular poem “Life on Mars.”  And I use the word “stuck” because my lack of experience with poetry will be quite clear here: I picked some lines where Smith reworks quotations from Rush Limbaugh and Senator Norm Coleman about the abuses at Abu Ghraib and, with some killer line breaks, really shifts the meaning.

                                                     The guards
Were under a tremendous amount of pleasure.
I mean pressure.  Pretty disgusting.  Not
What you’d expect from Americans.
Just kidding.  I’m only talking about people
Having a good time, blowing off steam.

What she’s doing here is totally unsubtle — if you’re a poetry person, let me reassure you that she employs a much lighter touch in her other pieces.  But, right: I am not a poetry person, so I enjoyed her lack of subtlety here.  To me, this feels almost like going to a martial arts demonstration and seeing someone make a slow, exaggerated motion.  If you’re in the audience you finally get to nod and muse, “Ahhh, so that is how they do it.”  Sure, it’s not real evidence of someone’s prowess, but I think it’s a kindness to sometimes tone down a performance to the point where the untrained eye can appreciate what’s being done.

Here, she plays with a reader’s eyes.  The line break after “not” pushes you to isolate the phrase “what you’d expect from Americans” from Coleman’s words.  The accusation becomes perfectly clear: this is what Smith would expect from Americans.

There is a lot of debate about whether the United States is a Christian nation, or was meant to be a Christian nation, or the like.  If you’ve ever scrolled through search hits for the topic, I’m sure you’ve already seen numerous screeds written from one side or the other.

To me, it seems compelling that in the twentieth and twenty-first centuries, a plurality of Americans believe their country to be a Christian nation — and the same was probably true earlier, except that fewer people were bothering to ask.  Which I obviously don’t think implies that this country should start drafting laws based on the Bible (although: start?).  I just think it’s important to acknowledge the prevalence of Christianity in American society when trying to ascertain what Americans would or would not do.

Unfortunately, one reason why I think it’s not so surprising that Americans would torture their prisoners is that the Bible takes a pro-torture stance.  Of course, not every Christian sect includes the Book of Revelation in its scripture, but many do.

14:9 And the third angel followed them, saying with a loud voice, if any man worship the beast and his image, and receive his mark in his forehead, or in his hand,

The same shall drink of the wine of the wrath of God, which is poured out without mixture into the cup of his indignation; and he shall be tormented with fire and brimstone in the presence of the holy angels, and in the presence of the Lamb:

And the smoke of their torment ascendeth up for ever and ever: and they have no rest day or night, who worship the beast and his image, and whosoever receiveth the mark of his name.

The types of torture described here are very different from what went on at Abu Ghraib, although not so dissimilar from treatment at Guantánamo Bay — including exposure to scorpions, sleep deprivation, extreme heat, extreme cold, and pepper spray, all of it producing prisoners who’d rather die than endure their torments.  (I’ve written a little bit about Guantánamo elsewhere.)  But to me the key point here is that enemies of the regime are tortured at all.  If a soldier used to be a kid wearing a WWJD bracelet, and Revelation makes clear that Jesus would torture his enemies, well…

But then we reach the next fancy line break that Smith uses, the one right after the word “people.”  Here is Smith’s acknowledgement that it isn’t just Americans who are inclined toward evil (not that it’s fair to hold the abuses of a few against the inhabitants of an entire country — although we did bomb the inhabitants of two whole countries after the horrific actions of a few).  Her poem is “only talking about people,” as a reader is forced to consider before finishing the sentence.  As in, it’s all people who are wired to seek retribution after they or their loved ones or their country are harmed.

Which I thought was a clever trick for Smith to pull off, especially since she’s doing this by repurposing the words of others.  She’s even better in several of her other poems, like all the ones she wrote from scratch.  If you’ve got time, you should check out her book.