On scientific misconceptions, Eurocentrism, and the evolution of skin color.

There’s a story that many scientists tell about the evolution of human skin color.

The story goes roughly like this:

In the beginning, our ancestors had dark fur and lightly pigmented skin. This was perhaps six million years ago? Over time, our ancestors lost their fur and needed darkly pigmented skin to protect themselves from the harsh light of the sun.

Later, some people left their ancestral homeland. Migratory humans covered the globe. As humans traveled farther from the equator, they evolved light skin again – otherwise they’d have too little vitamin D.


In Joanna Cole (author) & Bruce Degen (illustrator)’s The Magic School Bus Explores Human Evolution (which is surprisingly good! You can read my review here), this story is told in a single panel.

Variants on this story percolated through the scientific literature for years, but the version above is derived largely from the work of anthropologists Nina Jablonski & George Chaplin. In their article “The Evolution of Skin Coloration,” they write that “As hominins migrated outside of the tropics, varying degrees of depigmentation evolved to permit ultraviolet-light-induced synthesis of vitamin D.”

This story is often treated as accepted science, even by researchers who describe human evolution from an explicitly anti-racist perspective. For example, in A Brief History of Everyone Who Ever Lived, Adam Rutherford writes that “The unglamorous truth is that there are but a handful of uniquely human traits that we have clearly demonstrated are adaptations evolved to thrive in specific geographical regions. Skin color is one. The ability to digest milk is another, which fits perfectly with the emergence of dairy farming.”

However, this story about the evolution of human skin color isn’t supported by the actual data. Instead, it’s based on Eurocentric misconceptions about what sort of environment and lifestyle are “normal” for human beings.


Unquestionably, darkly pigmented skin can protect humans from sunlight. And sunlight is dangerous! You should wear sunscreen. (I’m sure that somebody has told you this already.)

But the benefits of light skin have been vastly overstated by (light-skinned) researchers. And a quick glance at the data is enough to demonstrate the major flaws in the evolutionary story I described above.

That same page of The Magic School Bus Explores Human Evolution includes a world map with a (again, surprisingly accurate!) depiction of the paths that ancient humans took to populate the planet.

Looking at those red arrows, you’ll see several occasions when groups of humans migrated farther from the equator. The people who settled in France, Korea, and Patagonia had all reached similar latitudes. (As did the humans who settled in New Zealand, but they only arrived about 800 years ago, which probably isn’t enough time to expect dramatic shifts in skin color. Especially given the likelihood of continued gene flow across latitudes – by the time anyone reached New Zealand, people were probably traveling to and fro by boat often, rather than forming an isolated community.)

If the above story about the evolution of human skin color were correct, we’d expect that indigenous people from France, Korea, and Patagonia would all have similar skin color. Indeed, artist Gail McCormick worked closely with Jablonski & Chaplin to create a cut-paper map depicting the indigenous skin color that their story predicts for various regions.

But this map doesn’t match the skin color we actually see from humans across the globe. The indigenous people of France evolved lightly pigmented skin. The indigenous people of Korea, Patagonia, and North America did not.

Jablonski & Chaplin arrived at their conclusion because they considered very few human populations; Figure 4 from their paper, which I’ve included below, depicts in white all the regions of the globe that they left out of their data set.

Each human migration was another natural experiment: Does migration away from the equator result in lighter skin?

For the people migrating into Europe, the answer is pretty unambiguously “yes.” We have evidence of dramatic, rapid selection for genes that result in lighter skin among these people. Many of the gene variants responsible for lightly pigmented skin in Europeans had been long present among ancient humans living in Africa (as documented by Crawford & colleagues in “Loci Associated with Skin Pigmentation Identified in African Populations”), but then spread rapidly among Europeans approximately 4,000 years ago (as documented by Mathieson & colleagues in “Genome-Wide Patterns of Selection in 230 Ancient Eurasians”).

The dramatic selection for genes associated with lightly pigmented skin in Europe occurred within the span of about a thousand years, and occurred about 30,000 or 40,000 years after Homo sapiens first populated that region.


Among the various groups of ancient humans who migrated toward similar latitudes, only the indigenous people of Northern Europe evolved lightly pigmented skin. This trait spread rapidly (by evolutionary standards) about 4,000 years ago.

This timing is similar to the spread of lactose tolerance genes among the people of Northern Europe. Most animals, including most humans, can’t digest milk in adulthood. Even among humans who live in cultures where cows’ milk is a major component of the diet, many people can’t digest it and will experience routine gastrointestinal distress and diarrhea. (Which is serious! Although a few bottles of Gatorade would save their lives, diarrhea still kills about 2 million people per year. Among ancient humans, diarrhea could easily cause deaths by malnutrition, dehydration, or increased susceptibility to disease.)

For their 2022 study “Dairying, Diseases, and the Evolution of Lactase Persistence in Europe,” Evershed & colleagues looked at food residues stuck to ancient pottery and found that cows’ milk has been a major part of European diets for approximately 9,000 years. But these people couldn’t digest milk well. For their 2020 study “Low Prevalence of Lactase Persistence in Bronze Age Europe Indicates Ongoing Strong Selection over the Last 3,000 Years,” Burger & colleagues found that most of the dead warriors from an ancient European battleground did not have the genes for lactose tolerance.

And yet, just before the Europeans’ vast spree of kidnapping, abduction, and resource extraction led to massive amounts of human migration (which began approximately 500 years ago), nearly 95% of the people living in Europe had the genes for lactose tolerance.

That’s a huge change, and really fast! Which should make us realize that something strange might be going on with this group of people – they must’ve had particularly atrocious diets. Which helps explain why they’d need lighter skin.

After all, vitamin D is a dietary nutrient. If you get enough vitamin D from your food, there’s no downside to darkly pigmented skin. And, as David Graeber & David Wengrow describe comically in The Dawn of Everything (“We might call this the ‘all the bad spots are taken!’ argument”), most ancient humans chose to live in places where they could find food, water, and shelter. Otherwise they’d migrate.

Yet, in a savage twist of fate, the same culture that generally resulted in low-quality diets – farming – also made migration more difficult. People stayed near their farms, with their insufficient amounts of low-quality food, because that way they’d at least have something.

I’ve written previously about the social and environmental repercussions of ancient farming – a lovely essay, in my opinion! – but in order to understand the evolution of skin color, all we really need to know is the impact of farming on human health. As James Scott writes in Against the Grain,

Evidence for the relative restriction and impoverishment of early farmers’ diets comes largely from comparisons of skeletal remains of farmers with those of hunter-gatherers living nearby at the same time. The hunter-gatherers were several inches taller on average. This presumably reflected their more varied and abundant diet. It would be hard, as we have explained, to exaggerate that variety. Not only might it span several food webs – marine, wetland, forest, savanna, arid – each with its seasonal variation, but even when it came to plant foods, the diversity was, by agricultural standards, staggering. The archaeological site of Abu Hureyra, for example, in its hunter-gatherer phase, yielded remains from 192 different plants, of which 142 could be identified, and of which 118 are known to be consumed by contemporary hunter-gatherers.

The crops and livestock raised by farmers in Northern Europe provide very little vitamin D. But ancient humans often settled in areas where they could catch fish, which provides plenty of dietary vitamin D (as measured by Schmid & colleagues for their study “Natural Vitamin D Content in Animal Products”).

As it happens, if the picture from The Magic School Bus Explores Human Evolution were an accurate depiction of those people’s diet (not to mention their clothes, exposing quite a bit of skin!), they’d probably experience very little selective pressure for lighter skin.


Whenever we discuss evolution, it’s important to remember that natural selection doesn’t enrich for traits that are “better.” There’s rarely any such thing as “better.” Consider: the ancestors of starfish had brains! But – given their particular environment – their lineage was more successful after evolving to be brainless. Or: the ancestors of penguins could fly! But – given their particular environment – their lineage was more successful after evolving to be flightless.

We humans have long legs and arched feet that are great for running, but these same long legs and stubby toes make us so much worse at climbing trees than a chimpanzee. It’s a trade-off. (And a trade-off that I’m pretty happy with, given that I love to run and am afraid of heights.)

Lightly pigmented skin carries a very clear cost – UV penetration with its attendant folate degradation, skin cancers, and discomfort – and only carries a compensatory benefit at extreme northern or southern latitudes among ancestral populations with diets low in vitamin D.

We do ourselves a major disservice – and perpetuate Eurocentric racism – if we consider the selective pressures encountered by one particular group of Homo sapiens to be the default against which all others are measured.

On the case against God.

Sometimes people discuss the case for or against God, hoping to prove or disprove His existence.

That’s not my goal. Deities – and magic of all kinds – are often defined as being beyond the realm of evidence or proof. You either believe or you don’t.

As far as our scientific discoveries are concerned, there’s no reason to believe in God. We’ve never encountered data that would require the presence of a deity to be explained.

But then again, as far as our scientific discoveries are concerned, there’s no reason to believe in free will. We’ve never encountered data that would suggest that the workings of our brains are caused by anything other than the predictable movement of salt ions inside of us. And, personally? I’m totally willing to believe in free will, based solely on how my existence feels.

So I can’t fault anyone for believing in God. Or gods. Witchcraft, ghosts, or aliens – sure, I do think some of these beliefs are a bit more outlandish than my belief in free will, but it’s all a matter of degree.

Instead, I’d like to discuss the legal case against God.


I’m pro-life.

That’s why I’m vegan – I don’t believe animals should be killed or caged just for me to have a tastier meal. As a heterotroph, I obviously have to hurt somebody every time I eat, but I’d rather hurt a carrot than a cow.

And it’s why I’m an environmentalist. Although climate change would open up a variety of new ecological niches, presumably benefiting many lifeforms (including some that don’t even exist yet!), many of our world’s current denizens would suffer. Many current species would go extinct.

And, because I’m pro-life, I’m also pro-choice. I believe that parents can do best when they’re allowed to choose when & with whom they’ll have children. I believe that fooling around with people is often fun, and can be deeply emotionally fulfilling, and that people should be able to partake in consensual pleasure without the fear of lifelong repercussions. I believe that human women are living creatures and should have autonomy over their bodies.

I vastly prefer contraception to abortion. It would be marvelous to live in a world where safe, effective contraception was freely available to everyone who wanted it!

When my spouse and I were hoping to have children, we declined genetic testing during each pregnancy. Given our immense privilege, we could afford to love and raise whomever arrived in our family. But not everyone believes that they can. Some people feel that they’ll be unable to care for children with dramatic healthcare needs. (Inevitably, when we allow people choice, some people will base their choices on rationales that I don’t agree with.)


Following the Supreme Court’s misguided decision in Dobbs v. Jackson Women’s Health Organization, many states have criminalized abortion. In Georgia, legislation provides “to unborn children the equal protection of the laws of this state,” and in Iowa, legal personhood begins “from the moment of conception.” Under such laws, abortion constitutes murder.

And worse. As Madeleine Schwartz documents in her excellent 2020 essay “Criminalizing a Constitutional Right,” even before the Dobbs decision, many women were already being charged with murder or neglect if they happened to have a miscarriage or stillbirth.

In the vast majority of cases, though, a miscarriage is not the mother’s fault.

Most often, the culprit is God.

Under these laws, state prosecutors ought to bring their murder charges against God.


After conception, each embryo passes through several developmental checkpoints. A wide range of genetic or chromosomal abnormalities could cause a fetus or embryo to fail to pass these checkpoints. At that point, the pregnancy is terminated. The unborn child is aborted by – or, if you agree with the sort of legal language that the Dobbs decision unleashed, murdered by – God.

A miscarriage is often an emotionally wrenching experience for aspiring mothers. The emotional aftermath of miscarriage is typically much worse than that of abortion. The outcome is the same – the pregnancy is terminated – but when God aborts a pregnancy with miscarriage, a perhaps desperately wanted unborn child is lost.

Miscarriage is frequent, too.

It’s hard to know the exact frequencies, because in addition to the general culture of shame and disparagement with which the medical community has long regarded women’s bodies, miscarriage is particularly hidden. Miscarriage is so common that women are advised not to announce their pregnancies until their second or third trimesters, but this means that their support networks of friends, family, and colleagues might not even know why a person feels devastated.

But a good estimate is that about fifty percent of conceptions will fail to pass all the necessary genetic and chromosomal checkpoints.

Which means that – insofar as we believe that legal personhood begins at conception – about fifty percent of all people are murdered by God before they are born. God is a ruthless eugenicist, dispassionately evaluating the DNA of each unborn child and quelling the development of half.


From Schwartz’s essay, you’ll learn of numerous women who were imprisoned – and lost their jobs, their homes, their families – because they were suspected of harming their own unborn children. (And this was all before the Dobbs decision.)

For the cases that Schwartz chooses to discuss, most of the women were very poor. If we as a nation had chosen to spend money to give all women access to high-quality nutrition and prenatal medical care, some of these fetuses might have survived their pregnancies and had the opportunity to become living, breathing, impoverished babies. In which case I’d argue that the people who intentionally withhold free access to nutrition and prenatal care – the Republican governors and legislators – are accessories to murder.

But before we punish any of them, we should start with God.

On medical spending.

Back when doctors were curing headaches by drilling holes through people’s skulls, or slapping on a few leeches to drain out the bad blood when sick patients came stumbling through the door, medical spending wasn’t a big deal.  There weren’t any serious political considerations attached.  If you were wealthy, you might visit a doctor and get yourself killed.  If you were poor, you’d probably go without medical care.  You’d live or die according to the virulence of your disease and the quality of your diet.  Maybe you’d buy a small amulet representing one of the healing saints, or pay a witch to bury herbs in an auspicious location near your house.

I haven’t done an extensive review of the historical data, but to the best of my knowledge no ancient kingdoms were bankrupted trying to provide leeches to all their sick citizens.

Now, though, the situation is different.  Medical care is better.  Doctors know enough that patients receiving care fare significantly better than those left untreated.

There are dramatic economic consequences of improved medical care, though.  Leeches and bloodletting and trepanation were ineffectual, but they were cheap.  Modern medical care actually saves people’s lives, but it comes at a huge cost.  In the United States, health care spending is about a fifth of the total economy, and rising.


Death is scary.  For people who started learning philosophy with Camus (which is not something I’d recommend — this can result in an excessively bleak world view and is probably appropriate only for incurable depressives), inescapable death seems to be the major quandary in our attempt to ascribe meaning to life.

The fear of death fuels medical spending.  Also our spending on biomedical research.  Medical care is pretty great currently, especially if you’re comparing statins and anti-retrovirals and insulin to leeches.  But people still die.  We haven’t reached the singularity yet (thank goodness).

Biomedical research spending makes the population as a whole sicker, though.  Most research innovations — and certainly the most lucrative ones — are for managing chronic conditions, not curing them.  People who would’ve died — how many leeches do we prescribe for atrial fibrillation? — survive instead, lowering our population’s average health and raising its average age, since those first few maladies aren’t killing people as often.

It’s not so difficult to imagine that, if these biomedical research trends continue, people might survive until a hundred and fifty, maybe two hundred years old … and health care spending will rise until it’s a third of the U.S. economy, or fifty percent, or more.

That could doom the country.

But the real tragedy, to my mind, is the way that health care money is being spent.

I think a passage from Damon Tweedy’s Black Man in a White Coat gives an elegant summary of the problem.  The whole book is great — I’d highly recommend it to anyone who cares about either racial inequality or the U.S. medical industry.  Tweedy’s writing is so compassionate, always looking to describe the best in people even when his narrative compels him to show them at their worst.

The passage I want to quote appears just after Tweedy describes a preventable medical tragedy brought on by poor lifestyle choices.  Tweedy grabs a hasty meal with some of his colleagues and is still mulling over what more could’ve been done to help the patient.  Ironically, this leads to a conversation about counseling patients to eat better, but Tweedy and the other doctors are scarfing extremely unhealthful meals.

It really is a great book — big-hearted and earnest, with Tweedy always clear-eyed about his own failings.  His descriptions of his own struggles with poor lifestyle choices really dramatize his efforts to address other black men’s unhealthy lifestyles.

(Oh, and, I fixed a minor typographical error in the following quote without marking it — I always think sic erat scriptum sounds snarky, and Tweedy’s book was good enough that I’d feel like a total jerk if I made him look bad for what was probably someone else’s mistake.)

Medical doctors should know better than to eat hospiteria (hospital cafeteria) pizza.

I asked them their thoughts on counseling patients about nutrition and exercise.

“That’s the responsibility of his outpatient primary care doctor,” he said.  “We’re here to deal with the life-and-death stuff.”

This focus on biomedical treatment over preventative care is not limited to Duke or similar schools.  Indeed, outpatient primary care physicians — the doctors that Mike felt bore the responsibility for counseling patients on diet and exercise — are often no more inclined than other doctors to have this discussion, even for diseases where these interventions are vital.  There are many barriers, among them money (dietary counseling is reimbursed poorly compared to medical procedures), time (physicians often see patients every ten or fifteen minutes), and the sense that nutrition talk is better left to dieticians, and that doctors should focus on their expertise (prescribing medications, interpreting tests, and performing procedures).  In addition, experience has made many doctors cynical about patient behavior and the likelihood for change.

The tragedy of U.S. health care spending isn’t just that we shovel too much money into it, which limits what we can spend on other, more important causes, but also that we pour huge sums of money into end-stage therapies that don’t increase quality of life nearly as much as cheaper, earlier interventions.

My father-in-law’s treatment is a great example.  By the end of his life, the federal government was spending hundreds of thousands on his care.  Medication for cholesterol and diabetes, high-tech surgery to replace arteries & restore nervous function in his hands after they’d been numbed by diabetic neuropathy, installing an internal defibrillator once his heart began to fail…

Those treatments helped.  Sure.  They kept him alive longer.  He was incredibly happy after the hand surgery — for months he’d been unable to play guitar because he couldn’t feel anything and could barely exert enough pressure to fret the strings, and after that surgery he could play again, invited everyone he knew for another potluck & jam session.

When K dropped her father off after that surgery, she realized our government’s medical spending on him was actually helping dozens of people — all his neighbors were outside waiting to greet him, and once he could use his hands well enough to cook again he resumed baking loaf after loaf of sourdough bread to give to them.


At the same time, our government could’ve brought K’s father — and everyone he helped — more joy by helping him earlier.  They spent nothing on him until his untreated conditions left him too disabled to work.  Only then, and even more so after he reached sixty-five, could he get help.

It’s crummy knowing that he would’ve been happier, and would’ve been able to give more back to his community, if he’d been helped earlier.  His childhood was rotten, but nothing was spent to overcome the scars left from hostile parenting.  Our government didn’t help him get counseling after a traumatic event in his early adulthood, either, and that was the root of so many of his later problems.  A few thousand dollars spent on psychiatric counseling then could’ve kept him from becoming indigent and staved off the need for the hundreds of thousands in medical care provided later.

This bizarre state of spending priorities is reflected very clearly in our federal budget.  For instance, there’s no money set aside for universal pre-K education.  This would only cost on the order of $10 billion, though, whereas we spend something like $500 billion on health care for the elderly.  But if our goal is to produce good health, childhood education accomplishes much more than surgery and pharmaceuticals for the elderly.

As Tweedy wrote, simply teaching people to eat better would obviate the need for a significant percentage of our medical spending.  Maybe we’d need to spend some money subsidizing real food so that a better diet was within more people’s reach, but, still… that’s much cheaper than the life-and-death medical care that Tweedy was trained to provide.

After an education worth some hundred thousand dollars, after two decades of hard work & studying on his part, Tweedy served as part of a care team working arduous thirty-hour shifts … all to save people who might’ve stayed away from the hospital entirely if they’d been eating vegetables.