On auctions, politics, quantum computing, and waste.

I recently played the board game Fists of Dragonstone.  It was fun – the premise is that each turn a spell is revealed and players make simultaneous, secret bids to acquire its effect.  The spells might earn victory points, increase your future income, or help you thwart other players’ plans.

Each turn felt tense because Fists of Dragonstone uses “all pay” auctions.  If you bid two dollars, you’ll lose this money whether or not you get the prize you wanted.  This type of auction is a slippery beast – inherently stressful in the real world, but psychologically compelling within the safe confines of a game.

Fists of Dragonstone. Image by hal_99 on Flickr.

When most people think of auctions, they imagine the type that eBay uses – only the winner pays, and the amount paid is equal to the second-highest bid.  In this type of auction, you ought to state your intentions honestly.  If you would get $15 worth of joy from owning an item, you should bid $15 – you’ll either get to have it for that amount of money (or less), or else learn that someone else values the item more.
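This “winner pays the second-highest bid” design is known as a second-price (Vickrey) auction, and the honesty argument can be checked with a toy simulation.  The specifics below – three opponents bidding uniformly between $0 and $30, an item worth $15 to you – are arbitrary assumptions, but the pattern holds regardless: bidding your true value never does worse than shading up or down.

```python
import random

def payoff(my_bid, my_value, other_bids):
    """Second-price auction: the highest bid wins, the winner pays the
    second-highest bid, and losers pay nothing."""
    top_other = max(other_bids)
    if my_bid > top_other:
        return my_value - top_other  # win, pay the runner-up's bid
    return 0.0                       # lose, pay nothing

random.seed(0)
# 10,000 auctions against three opponents bidding uniformly in [0, 30]
auctions = [[random.uniform(0, 30) for _ in range(3)] for _ in range(10_000)]

value = 15  # the item is worth $15 of joy to you
results = {}
for bid in (10, 15, 20):
    results[bid] = sum(payoff(bid, value, a) for a in auctions) / len(auctions)
    print(f"bid ${bid}: average payoff ${results[bid]:.2f}")
```

Underbidding forfeits some profitable wins; overbidding adds only wins at prices above your value, which lose money on average.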

If we didn’t have such rampant wealth & income inequality, this type of auction would arguably improve the world.  Objects would wind up in the hands of whoever valued them most, boosting overall happiness.

In practice, of course, things don’t work out so well.  Some people have access to far more money than others.  Even if a wealthy person estimates that a blanket would provide $60 of happiness, and a poor person estimates that the same blanket would provide $10 of happiness, it might be that the poor person would actually get more happiness from the blanket.  Inequality means that there’s no universal way to convert between money and joy, but the marketplace treats all our dollars the same.

Image by Todd Huffman on Flickr.

In a board game, you can address inequality by doling out the same set of initial resources to each player.  But the standard auction type – which rewards honest valuation – wouldn’t be much fun.  Everyone should value each item equivalently, and so the game is reduced to a puzzle.  It might be fun to solve once, but there wouldn’t be a reason to play again.

In an “all pay” auction, though, you benefit by being unpredictable.  Because you lose your bid whether or not you win the auction, you should often bid zero even if there’s an item you’d like.  You’re throwing away money if you make a non-zero bid but someone else bids higher.

You could still attempt to “solve” this sort of game, but the optimal solution involves random behavior.  You should make a bid somewhere between zero and your true valuation, with a certain probability assigned to each.  That’s what a robot would do.
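For the simplest version – two players who both value the prize at the same amount V, with complete information – the textbook equilibrium is to draw your bid uniformly at random from [0, V].  A quick Monte Carlo sketch (the $10 value and trial count are arbitrary choices) shows why that randomization is stable: against an opponent mixing this way, every bid from $0 up to the full valuation earns the same expected payoff of zero, so there’s nothing to exploit.

```python
import random

random.seed(1)
V = 10.0        # both players value the prize at $10
N = 200_000     # simulated auctions per candidate bid

def expected_payoff(my_bid):
    """All-pay auction against an opponent mixing uniformly on [0, V]:
    you always forfeit your bid, and gain V only when you outbid them."""
    total = 0.0
    for _ in range(N):
        other = random.uniform(0, V)
        total += (V if my_bid > other else 0.0) - my_bid
    return total / N

payoffs = {b: expected_payoff(b) for b in (0, 2.5, 5.0, 7.5)}
for b, p in payoffs.items():
    print(f"bid ${b}: expected payoff ${p:+.3f}")
```

Every candidate bid comes out near zero – the indifference that makes the mixed strategy an equilibrium, and the reason no predictable human habit can beat it.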

Most humans are pretty terrible at doing things that are actually random, though.  When we try to create a fake list of outcomes from a set of coin flips, for instance, we usually hew to an alternating pattern of heads and tails.
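The baseline is easy to quantify: in a genuinely random sequence of fair flips, each flip differs from the previous one only half the time, so 20 flips should contain about 9.5 switches between heads and tails.  Hand-faked sequences tend to switch far more often.  A quick simulation of the true baseline (the sequence length and trial count here are arbitrary):

```python
import random

random.seed(2)

def switches(flips):
    """Count how often consecutive flips differ (H->T or T->H)."""
    return sum(a != b for a, b in zip(flips, flips[1:]))

n, trials = 20, 100_000
avg = sum(switches([random.random() < 0.5 for _ in range(n)])
          for _ in range(trials)) / trials
# a fair sequence switches about (n - 1) / 2 = 9.5 times on average
print(f"average switches in {n} fair flips: {avg:.2f}")
```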

Since we’re bad at making random choices – and we know that other players are bad at it too – we fall back on misguided psychological reasoning.  She bid nothing the last two rounds, so maybe I can sneakily win this next auction with a $1 bid!  We get to feel clever when our stratagems succeed.  We get to curse when they fail.  All much more fun than the honest appraisal encouraged by auctions in which only the winner pays!

In the real world, though, an “all pay” auction is a recipe for waste.

This type of auction is a good proxy for many types of adversarial encounters.  Political contests, computer security, sporting events.  Even restaurant management, if people have a fixed budget set aside for eating out and are simply choosing which establishment to frequent.

In each of these situations, every player has to pay – to run for political office, you invest years of your life and spend a whole bunch of money on advertisements.  It’s not as though you get that time or money back when you lose.  All players spend their total bids, but only one gets the prize of elected office.

Contemporary political campaigns are incredibly expensive.  So many people have already devoted years of their lives to the 2020 presidential campaign.  The efforts of the losing side will have been wasted.  Because major platforms are willing to air totally fraudulent advertisements, candidates have little chance of victory if they spend much less than their opponents.

Sure, sometimes people will console themselves with the thought that “We may not have won the election, but we changed the tenor of political discourse!”   In our country, this is a fantasy.  U.S. politics is sufficiently polarized that the winners rarely concern themselves with the expressed desires of the losing side.  Two of our past three presidents lost the popular vote and still proceeded with their agendas as though they’d received an overwhelming mandate.

Security is another form of “all pay” auction.  This is an asymmetrical game – your initial resources and victory conditions are clearly different if you happen to be playing as a homeowner or a thief – but the basic principle remains the same.  One player bids an amount on security; the other player bids time and money to undermine it; depending on who bids more, a break-in succeeds or it doesn’t.

As in Fists of Dragonstone, players have an incentive to randomize their behavior.  Sometimes a homeowner should display signs for a security system that hasn’t actually been installed.  Sometimes a thief should pass by a house even if it looks like a juicy target.  If players are too predictable, they can be narrowly outbid.

Computer encryption is an auction like this.  Equifax bid less than the people trying to hack its servers; a huge amount of personal data was stolen.  Mine too.  As an apology for low-balling their security bid, Equifax will send me a settlement check for some amount between $125 and $0.03, depending on how many of the other victims they choose to compensate.

What could I do with three pennies?

I glued pennies together to make little legs for my laptop computer – three cents for the back legs, two for the front – hoping to improve air flow for the exhaust fan.  When a computer overheats, programs malfunction.  The operating system might freeze, the same way I do when I’m typing and somebody says “Hi” to me.  My brain stutters – processing, processing – unable to determine whether I know this person, and, if so, from where.

Shut down, reboot.

Anyway, building these laptop stilts out of pennies seemed cheaper than any other materials.  I’ve already built them, though.  I don’t really need another $0.03 check from Equifax.

But this situation must feel frustrating for the people at Equifax, too.  Improved encryption isn’t valuable in and of itself.  This is an adversarial contest that produces only waste.  A world in which companies spent little or nothing on computer security and other people simply chose not to breach their nonexistent defenses would be better than our world, in which data needs to be scrupulously guarded.

A world in which politicians didn’t advertise, trusting voters to learn about their platforms from impartial sources, would be better than our world.

That’s not where we live, though.  Instead, scientists are working to create quantum computers.  These are marvels of engineering.  In contrast to the behavior of macroscopic objects, certain properties of a quantum transistor can remain undefined during a calculation, collapsing into a discrete binary value only at the end.  To accomplish this, the transistor must be guarded from its environs – you may have heard that “measurement” collapses wavefunctions, but measurement doesn’t mean that a human is looking at something.  Measurement simply means that the state of an object becomes coupled with the state of its environment.

If a photon approaches, the state of the object becomes linked with the state of the photon.  They might’ve collided or not, which narrows the range of space in which the object might exist, which narrows the set of wavefunctions that could be summed to give its momentum.  A collision-less encounter restricts us to a different set of futures than if the photon hit the thing.

In practice, that means a quantum computer needs to be kept dark, and atmosphere-less, and very, very cold.  For a long time – the transistors have to stay unmolested for the entire duration of a calculation.

IBM’s Quantum Q. Photo by IBM research on Flickr.

Obviously, these devices are very expensive to build and run.

And why might we want them?  Well, they’d be better than conventional computers at … um … at factoring the large numbers that are used for computer encryption! 

Quantum computers are fascinating.  Our attempts to build them have helped us learn more about the workings of our world.  But the actual existence of quantum computers – at least until we think of an application other than cracking computer security – will make the world worse.

Worried that people might copy data and then use quantum computers to decode it later — you know, after these computers have been invented — security experts say that we need to start spending more money on encryption now.

While playing Fists of Dragonstone, my friends would curse and shout after making an exorbitantly high bid and then seeing that every other player bid zero.  I could have won with $1! 

That’s basically what security experts are encouraging us to do. Not curse — overbid. They say that we should make extremely high bids on encryption now, to protect ourselves from a technology that might never exist.  Otherwise, undesirables might gain access to the password-protected folder of risqué photographs that you and your partner(s) took.  Or break into your bank account.

Occasionally, adversarial work improves the world.  When restaurants compete, service might get better. The food, tastier.

But most adversarial contests are engines for waste.  High-speed stock trading makes the market more liquid – you can log on and purchase a few dozen shares of whatever you’d like, since AI algorithms are ready to facilitate transactions between buyers and sellers.

That’s a small service, though.  High-speed trading firms shouldn’t be extracting as much wealth as they are in this country.  Mostly they eavesdrop on others’ conversations, sneak in front of people who’re trying to buy something, then scalp it back at higher prices.  Trading firms pay exorbitant rent on shelf space that’s as close as possible to the stock exchange mainframes – if one scalper is microseconds faster than another, that’s the one who gets to shake you down.

In a board game, cooperation is generally less fun than adversarial play.  For the former, players are trying to solve a puzzle created by the designer.  With adversarial rules, players are using their intelligence to create puzzles for each other in real time.

In a game, the waste is the entire point.  Nothing tangible is produced, but the expended time leads to social camaraderie.  The expended brainpower can give you a sense of satisfaction from having worked through intellectual puzzles.  And, hopefully, you’ll have fun.

But – whoops – we’ve used the principles of good game design and mistakenly applied them to the real world.  Fists of Dragonstone was fun; our political system shouldn’t be based on all-pay auctions.  With major politicians poised to ravage the Amazon, cull the world’s few remaining old-growth forests, and dredge up Arctic oil fields, the people wealthy enough to make high bids on upcoming elections might well destroy us.

NASA image revealing the ongoing deforestation of the Amazon rainforest.  Just f.y.i., the forest is being cleared to make space for cows.  Each time you choose to eat beef or dairy cheese, you’re contributing to the destruction of the “Lungs of our Planet.”

Featured image for this post: “Auction Today” by Dave McLean on Flickr.

On empathizing with machines.

When I turn on my computer, I don’t consider what my computer wants.  It seems relatively empty of desire.  I click on an icon to open a text document and begin to type: letters appear on the screen.

If anything, the computer seems completely servile.  It wants to be of service!  I type, and it rearranges little magnets to mirror my desires.

When our family travels and turns on the GPS, though, we discuss the system’s wants more readily.

“It wants you to turn left here,” K says.

“Pfft,” I say.  “That road looks bland.”  I keep driving straight and the machine starts flashing make the next available u-turn until eventually it gives in and calculates a new route to accommodate my whim.

The GPS wants our car to travel along the fastest available route.  I want to look at pretty leaves and avoid those hilly median-less highways where death seems imminent at every crest.  Sometimes the machine’s desires and mine align, sometimes they do not.

The GPS is relatively powerless, though.  It can only accomplish its goals by persuading me to follow its advice.  If it says turn left and I feel wary, we go straight.

Other machines get their way more often.  For instance, the program that chooses what to display on people’s Facebook pages.  This program wants to make money.  To do this, it must choose which advertisers receive screen time, and to curate an audience that will look at those screens often.  It wants the people looking at advertisements to enjoy their experience.

Luckily for this program, it receives a huge amount of feedback on how well it’s doing.  When it makes a mistake, it will realize promptly and correct itself.  For instance, it gathers data on how much time the target audience spends looking at the site.  It knows how often advertisements are clicked on by someone curious to learn more about whatever is being shilled.  It knows how often those clicks lead to sales for the companies giving it money (which will make those companies more eager to give it money in the future).

Of course, this program’s desire for money doesn’t always coincide with my desires.  I want to live in a country with a broadly informed citizenry.  I want people to engage with nuanced political and philosophical discourse.  I want people to spend less time staring at their telephones and more time engaging with the world around them.  I want people to spend less money.

But we, as a people, have given this program more power than a GPS.  If you look at Facebook, it controls what you see – and few people seem upset enough to stop looking at Facebook.

With enough power, does a machine become a moral actor?  The program choosing what to display on Facebook doesn’t seem to consider the ethics of its decisions … but should it?

From Burt Helm’s recent New York Times Magazine article, “How Facebook’s Oracular Algorithm Determines the Fates of Start-Ups”:

Bad human actors don’t pose the only problem; a machine-learning algorithm, left unchecked, can misbehave and compound inequality on its own, no help from humans needed.  The same mechanism that decides that 30-something women who like yoga disproportionately buy Lululemon tights – and shows them ads for more yoga wear – would also show more junk-food ads to impoverished populations rife with diabetes and obesity.

If a machine designed to want money becomes sufficiently powerful, it will do things that we humans find unpleasant.  (This isn’t solely a problem with machines – consider the ethical decisions of the Koch brothers, for instance – but contemporary machines tend to be much more single-minded than any human.)

I would argue that even if a programmer tried to include ethical precepts into a machine’s goals, problems would arise.  If a sufficiently powerful machine had the mandate “end human suffering,” for instance, it might decide to simultaneously snuff all Homo sapiens from the planet.

Which is a problem that game designer Frank Lantz wanted to help us understand.

One virtue of video games over other art forms is how well games can create empathy.  It’s easy to read about Guantanamo prison guards torturing inmates and think, I would never do that.  The game Grand Theft Auto 5 does something more subtle.  It asks players – after they have sunk a significant time investment into the game – to torture.  You, the player, become like a prison guard, having put years of your life toward a career.  You’re asked to do something immoral.  Will you do it?

Most players do.  Put into that position, we lapse.

Frank Lantz’s game Paperclips helps players empathize with a machine.  Just like the program choosing what to display on people’s Facebook pages, players are given several controls to tweak in order to maximize a resource.  That program wanted money; you, in the game, want paperclips.  Click a button to cut some wire and, voila, you’ve made one!

But what if there were more?

A machine designed to make as many paperclips as possible (for which it needs money, which it gets by selling paperclips) would want more.  While playing the game (surprisingly compelling given that it’s a text-only window filled with flickering numbers), we become that machine.  And we slip into folly.  Oops.  Goodbye, Earth.

There are dangers inherent in giving too much power to anyone or anything with such clearly articulated wants.  A machine might destroy us.  But: we would probably do it, too.