On scientific beliefs, Indigenous knowledge, and paternity.

Recently my spouse & I reviewed Jennifer Raff’s Origin: A Genetic History of the Americas for the American Biology Teacher magazine (in brief: Raff’s book is lovely, you should read it! I’ll include a link to our review once it’s published!), which deftly balances twin goals of disseminating scientific findings and honoring traditional knowledge.

By the time European immigrants reached the Americas, many of the people living here told stories suggesting that their ancestors had always inhabited these lands. This is not literally true. We have very good evidence that all human species – including Homo sapiens, Homo neanderthalensis, and the Denisovans, among possible others – first lived in Africa. Their descendants then migrated around the globe over a period of a few hundred thousand years.

As best we know, no lasting population of humans reached the Americas until about twenty thousand years ago (by which time most human species had gone extinct – only Homo sapiens remained).

During the most recent ice age, a few thousand humans lived in an isolated, Texas-sized grassland called Beringia for perhaps a few thousand years. They were cut off from other humans to the west and an entire continent to the east by glacial ice sheets. By about twenty thousand years ago, though, some members of this group ventured south by boat and established new homes along the shoreline.

By about ten thousand years ago, and perhaps earlier, descendants of these travelers reached the southern tip of South America, the eastern seaboard of North America, and everywhere between. This spread was likely quite rapid (from the perspective of an evolutionary biologist) based on the diversity of local languages that had developed by the time Europeans arrived, about five hundred years ago.

So, by the time Europeans arrived, some groups of people had probably been living in place for nearly 10,000 years. This is not “always” from a scientific perspective, which judges our planet to be over 4,000,000,000 years old. But it is “always” when in conversation with an immigrant who believes the planet to be only a few thousand years old. Compared with Isaac Newton’s interpretation of Genesis, the First Peoples had been living here long before God created Adam and Eve.

If “In the beginning …” marks the beginning of time, then, yes, their people had always lived here.

#

I found myself reflecting on the balance between scientific & traditional knowledge while reading Gabriel Andrade’s essay, “How ‘Indigenous Ways of Knowing’ Works in Venezuela.” Andrade describes his interactions with students who hold the traditional belief in partible paternity: that semen is the stuff of life from which human babies are formed, and so every cis-man who ejaculates during penetrative sex with a pregnant person becomes a father to the child.

Such beliefs might have been common among ancient humans – from their behavior, it appears that contemporary chimpanzees might also hold similar beliefs – and were almost certainly widespread among the First Peoples of South America.

I appreciate partible paternity because, although this belief is often framed in misogynistic language – inaccurately grandiose claims about the role of semen in fetal development, often while ignoring the huge contribution of a pregnant person’s body – the belief makes the world better. People who are or might become pregnant are given more freedom. Other parents, typically men, are encouraged to help many children.

Replacing belief in partible paternity with a scientifically “correct” understanding of reproduction would probably make the world worse – people who might become pregnant would be permitted less freedom, and potential parents might cease to aid children whom they didn’t know to be their own genetic offspring.

Also, the traditional knowledge – belief in partible paternity – might be correct.

Obviously, there’s a question of relationships – what makes someone a parent? But I also mean something more biological — a human child actually can have three or more genetic contributors among their parents.

#

Presumably you know the scientific version of human reproduction. To wit: a single sperm cell merges with a single egg cell. This egg rapidly changes to exclude all the other sperm cells surrounding it, then implants in the uterine lining. Over the next nine months, this pluripotent cell divides repeatedly to form the entire body of a child. The resulting child has exactly two parents. Every cell in the child’s body has the same three-billion-base-pair genome.

No scientist believes in this simplified version. For instance, every time a cell divides, the entire genome must be copied – each time, this process will create a few mistakes. By the time a human child is ready to be born, their cells will have divided so many times that the genome of a cell in the hand is different from the genome of a cell in the liver or in the brain.

In Unique, David Linden writes:

Until recently, reading someone’s DNA required a goodly amount of it: you’d take a blood draw or a cheek swab and pool the DNA from many cells before loading it into the sequencing machine.

However, in recent years it has become possible to read the complete sequence of DNA, all three billion or so nucleotides, from individual cells, such as a single skin cell or neuron. With this technique in hand, Christopher Walsh and his coworkers at Boston Children’s Hospital and Harvard Medical School isolated thirty-six individual neurons from three healthy postmortem human brains and then determined the complete genetic sequence for each of them.

This revealed that no two neurons had exactly the same DNA sequence. In fact, each neuron harbored, on average, about 1,500 single-nucleotide mutations. That’s 1,500 nucleotides out of a total of three billion in the entire genome – a very low rate, but those mutations can have important consequences. For example, one was in a gene that instructs the production of an ion channel protein that’s crucial for electrical signaling in neurons. If this mutation were present in a group of neurons, instead of just one, it could cause epilepsy.

No human has a single genome: we are composite creatures.

#

Most scientists do believe that all these unique individual genomes inside your cells were composed by combining genetic information from your two parents and then layering on novel mutations. But we don’t know how often this is false.

Pluripotent (“able to form many things”) cells from a developing human embryo / fetus / baby can travel throughout a pregnant person’s body. This is quite common – most people with XX chromosomes who have given birth to people with XY chromosomes will have cells with Y chromosomes in their brains. During the gestation of twins, the twins often swap cells (and therefore genomes).

At the time of birth, most humans aren’t twins, but many of us do start that way. There’s only a one in fifty chance of twin birth following a dizygotic pregnancy (the fertilization of two or more egg cells released during a single ovulation). Usually what happens next is a merger or absorption of one set of these cells by another, resulting in a single child. When this occurs, different regions of a person’s body end up with distinct genetic lineages, but the chimerism is difficult to detect. Before the advent of genetic sequencing, you might notice only if there was a difference in eye, skin, or hair color from one part of a person’s body to the next. Even now, you’ll only notice if you sequence full genomes from several regions of a person’s body and find that they’re distinct.

For a person to have more than two genetic contributors, there would have to be a dizygotic pregnancy in which the two eggs were fertilized by sperm cells from different individuals.

In the United States, where the dominant culture is such that people who are trying to get pregnant are exhorted not to mate with multiple individuals, studies conducted in the 1990s found that at least one set of every few hundred twins had separate fathers (termed “heteropaternal superfecundation”). In these cases, the children almost certainly had genomes derived from the genetic contributions of three separate people (although each individual cell in the children’s bodies would have a genome derived from only two genetic contributors).

So, we actually know that partible paternity is real. Because it’s so difficult to notice, our current estimates are probably lower bounds. If 1:400 is the rate among live twin births, then at least that fraction of dizygotic pregnancies in the United States probably also results from three or more genetic contributors. And this frequency is probably higher in cultures that celebrate rather than castigate the practice.

Honestly, I could be persuaded that estimates ranging anywhere from 1:20 to 1:4,000 are reasonable for the frequency with which individuals from these cultures have three or more genetic contributors.** We just don’t know.

#

I agree with Gabriel Andrade that we’d like for medical students who grew up believing in partible paternity to benefit from our scientific understanding of genetics and inheritance – this scientific knowledge will help them help their patients. But I also believe that, even in this extreme case, the traditional knowledge should be respected. It’s not as inaccurate as we might reflexively believe!

The scientific uncertainty I’ve described above doesn’t quite match the traditional knowledge, though. A person can only receive genetic inheritance from, ahem, mating events that happen around ovulation, whereas partible paternity belief systems treat everyone who has sex with the pregnant person over the next few months as a parent, too.

But there’s a big difference between contributing genes and being a parent. In Our Transgenic Future: Spider Goats, Genetic Modification, and the Will to Change Nature, Lisa Jean Moore discusses the many parents who have helped raise the three children she conceived through artificial insemination. Even after Moore’s romantic relationships with some of these people ended, they remained parents to her children. The parental bond, like all human relationships, is created by the relationship itself.

This should go without saying, but: foster families are families. Adopted families are families. Families are families.

Partible paternity is a belief that makes itself real.

.

.

.

** A note on the math: Dizygotic fertilization appears to account for 1:10 human births, and in each of these cases there is probably at least some degree of chimerism in the resulting child. My upper estimate for the frequency with which individuals have three or more genetic contributors, 1:20, would hold if sperm from multiple individuals had exactly equal probabilities of fertilizing each of the two egg cells. My lower estimate of 1:4,000 would hold if dizygotic fertilization involving sperm from multiple individuals had the same odds as the 1:400 chance that fraternal twin pairs in the U.S. have distinct primary genetic contributors. Presumably a culture that actively pursues partible paternity would have a higher rate than this, but we don’t know for sure. And in any case, these are large numbers! Up to 5% of people from these cultures might actually have three or more genetic contributors, which is both biologically relevant and something that we’d be likely to overlook if we ignored the traditional Indigenous knowledge about partible paternity.
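
If it helps to see that arithmetic spelled out, here is a minimal sketch of the two bounds, using only the numbers already assumed in this note (1:10 dizygotic pregnancies, and per-pregnancy mixing odds of 1:2 for the upper bound and 1:400 for the lower bound). These inputs are guesses, not measurements.

```python
# Back-of-the-envelope bounds for the fraction of people with three or more
# genetic contributors. All inputs are the assumptions stated in the note above.

dizygotic_rate = 1 / 10   # assumed fraction of pregnancies that begin with two fertilized eggs

# Upper bound: sperm from two different people are equally likely to fertilize each egg.
mixing_upper = 1 / 2
# Lower bound: mixing is as rare as heteropaternal superfecundation among U.S. twin births.
mixing_lower = 1 / 400

upper_bound = dizygotic_rate * mixing_upper   # 1/20, i.e. 5%
lower_bound = dizygotic_rate * mixing_lower   # 1/4,000, i.e. 0.025%

print(f"upper bound: 1 in {round(1 / upper_bound)} ({upper_bound:.1%})")
print(f"lower bound: 1 in {round(1 / lower_bound)} ({lower_bound:.3%})")
```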

.

.

header image from Zappy’s Technology Solution on Flickr

On attentiveness and names.

When a scientist first discovers a function for a gene, that scientist gets to name it.  Sometimes these names seem reasonable enough: I worked with a hematologist who did a study to identify proteins involved in apoptosis, which means roughly “programmed cell death” or “cellular suicide,” and so each gene wound up named “Requiem 3”, “Requiem 4,” etc.

Fruit fly geneticists tend to give their discoveries more creative names than other scientists.  There’s the gene “cheap date” – if a fruit fly is missing that gene, it will – ha ha – be unable to process ethanol and so will quickly pass out.  Another genetic mutation produced male flies that would court either males or females, and so this was known for over a decade as “fruity,” until another scientist decided that universal courtship could be less offensively described by the term “fruitless,” because clearly any mating-like activity that does not lead to progeny is a waste of time.

Yup, some gene names were bad.  One person’s idea of a joke might seem to somebody else like a mean-spirited reference to the wider world’s power dynamics.

Other gene names were bad not out of malice, but because humor at the expense of a fruit fly doesn’t make as many people laugh when a human child is dying. 

A gene that produces a somewhat spiky-shaped protein was named after Sonic Hedgehog.  It seemed funny at the time!  See?  The protein is spiky, the video game character has spiky hair, and … get it?  You get it, right?

 Okay, so this Sonic Hedgehog protein doesn’t look all that much like Sonic the Hedgehog.  But spend enough time staring at something like protein crystal structures and you’ll experience pareidolia, like seeing animal shapes in irregularly dappled plaster ceilings, or anthropomorphic gods amongst the twinklings of the stars.

Well, the Sonic Hedgehog protein establishes a concentration gradient that allows cells to recognize their spatial position in a developing body.  If a human fetus comes to term despite having a mutation in the Sonic Hedgehog gene (genetic abnormalities will often result in a miscarriage, but not always), the resulting child will have severe brain defects.

And then a doctor has to explain, “Your baby is suffering because of a Sonic Hedgehog mutation.”

And so, in 2006, geneticists capitulated to medical doctors. No more fanciful names for genes that might lie at the root of human health problems … which, because humans and fruit flies are actually pretty similar, means most genes.  Patients would now be told about a mutation in the SHH gene instead of Sonic Hedgehog, or a mutation in the LFNG gene instead of Lunatic Fringe.

Words have power, after all.


Some people are more attentive to their environments than others.  During evolutionary time, this trait was obviously good for humanity.  If your tribe is traveling through a hostile environment, it helps to have somebody around who is paying attention to the world.  A friend who’s primed to notice encroaching threats like a hungry lion about to leap out and attack.  Maybe we should take a different path.  Which, yeah, that sounds like a good idea.

Other people are particularly inattentive to their surroundings, so it’s easy for them to ignore the world and focus instead on one single problem.  During evolutionary time, this trait was surely good for humanity, too.  It’s helpful to have somebody on the lookout for threats that might eat you, obviously.  But it’s also helpful to have somebody who might discover a way of using dried grass to weave baskets.  A way of cooking mud into pottery that could carry or store water.

Image by Herb Roe on Wikimedia Commons.

Neurodiversity is a virtue in and of itself.  Over the millennia, the world has offered our species many challenges.  Populations that were sufficiently diverse that some members were good at each of a variety of tasks were most likely to flourish.  A cooperative species like termites or Homo sapiens benefits from specialization among its members.

Left to their own devices, people would naturally fall asleep and wake up at different times.  Some brains are primed to work best in the early morning; others work best late at night.  And that’s good.  It reduces the amount of time that a tribe would be susceptible to attack with everyone asleep.

But in the modern world, we occasionally forget to feel grateful for the diversity that allowed our species to thrive.  The high school students whose brains are primed for late-night thinking drag themselves through morning classes like zombies.  They’ll be midway through first period before the sun rises.  Their teachers glance derisively at their slumped and scruffy forms and call them lazy.


Eventually, humans invented language.  Much later, we invented writing.  Much, much later, we invented the printing press, and then written words became so widely accessible that most humans could benefit from learning how to read.

Of course, reading is easier for people who are inattentive to their environment.

If I had been born earlier in human evolution, I totally would have been lion bait.  When I’m reading a book, or am deep in thought, the rest of the world melts away.  When I’m typing at home, K or the kids sometimes shout my name several times before I even realize that I’m being spoken to. 

People like me, or this kid at a library, totally would’ve been lion bait.

Luckily for me, I wasn’t born way back then.  Instead I was born into a world where inattentive people – the people best able to block out the world and instead focus on their own thoughts – are the most likely to find academic success.  People like me become medical doctors.  Then we get to name the world’s various conditions and maladies.

And so, when it came time to categorize the sort of person who is especially attentive to the world, people like me (who obviously thought that our way of being was the best way to be) referred to those others as having an attention deficit disorder.

Their awareness of their environs might sound like a virtue; instead, we castigated their difficulty at ignoring the world.

I’ve never read the Percy Jackson books, but I’m glad that they exist, if only for passages like this (from The Lightning Thief):

“And the ADHD – you’re impulsive, can’t sit still in the classroom.  That’s your battlefield reflexes.  In a real fight, they’d keep you alive.  As for the attention problems, that’s because you see too much, Percy, not too little.”


Childhood trauma can cause symptoms that medical doctors term “attention deficit disorder.”  Which makes sense – if you’ve lived through an experience where your surroundings were threatening, you should learn to be more aware of your environment.  It should become more difficult to ignore a world that has proven itself to be dangerous.

Even for somebody with my type of brain, it’s going to be easier to sit outside and read a book when there’s a squirrel nearby than if there’s a prowling grizzly fifteen meters away.

Some children have to learn early on that daddy’s sometimes a grizzly.  And if it can happen to him, why not other grown-ups, too?  Best to stay on high alert around the teacher.  She’s trying to get you absorbed in these number tables … but what if that’s a trap?


Certain drugs can narrow a person’s perception of the world.  They act like blinders: chemicals like nicotine, Ritalin, and amphetamines, both un-methylated (sold under the trade name Adderall) and methylated (a CH3 group attached to the amine moiety of amphetamine slows its degradation by CYP2D6 enzymes in the liver, increasing the duration of its effects).

Note to non-chemists: the methylated analogue of Adderall goes by several names, including “ice,” “shard,” and “crystal meth.”  Perhaps you’ve heard of it — this compound played a key role in the television show Breaking Bad.  And it’s very similar to the stuff prescribed to eight year olds.  Feel free to glance at the chemical structures, below.

In poetry class last week, a man who has cycled in and out of jail several times during the few years I’ve taught there – who I’d said “hello” to on the outside just a few weeks earlier when he rode his bicycle past the high school runners and me – plonked himself down in the squeaky plastic chair next to mine.

I groaned.

“I know, I know,” he said.  “But I might be out on Monday.”

“What happened?”

“Failed a urine screen.  But I was doing good.  Out for six months, and they were screening me like all the time, I only failed three of them.”

“With … ?”

“Meth,” he said, nodding.  “But I wasn’t hitting it bad, this time.  I know I look like I lost some weight, dropped from 230 down to 205, but that’s just cause it was hard getting enough to eat.  Wasn’t like last time.  I don’t know if you remember, like, just how gaunt my whole face looked when they brought me in.  But, man, it’s just … as soon as I step outside this place, my anxiety shoots through the roof … “

This is apparently a common phenomenon.  When we incarcerate people, we carve away so much of their experience of the world.  Inside the jail, there is a set routine.  Somebody is often barking orders, telling people exactly what to do.  There aren’t even many colors to be distracted by, just the white-painted concrete walls, the faded orange of inmate scrubs, the dull tan CO shirts and dark brown pants.

The world in there is bleak, which means there are very few choices to make.  Will you sit and try to listen to the TV?  (The screen is visible from three or four of the twelve cells, but not from the others.)  Try, against all odds, to read a book?  Or add your shouting voice to the din, trying to have a conversation (there’s no weather, so instead the fall-back topic is speculating what’s going to be served for dinner)?

After spending time locked up, a person’s ability to navigate the wider world atrophies, the same as your leg would if you spent months with it bundled up in a cast.

And these are people whom we should be helping to learn how to navigate the world better.

“ … so I vape a lot, outside.  I step out of this place, that’s the first thing I do, suck down a cigarette.  And, every now and then … “

He feels physically pained, being so attentive to his surroundings.  And so he doses himself with chemicals that let him ignore the world as well as I can.

And, yes.  He grew up with an abusive stepfather.  This led to his acting squirrelly in school.  And so, at ten years old, medical doctors began dosing him with powerful stimulants.

Meanwhile, our man dutifully internalized the thought that he had a personal failing.  The doctors referred to his hyper-vigilance as an attention deficit disorder.


Words have power.

We can’t know now, after all the hurt we’ve piled on him, but think: where might our man be if he’d learned to think of his attentiveness as a virtue?

On violence and gratitude.

Although I consider myself a benevolent tyrant, some of my cells have turned against me.  Mutinous, they were swayed by the propaganda of a virus and started churning out capsids rather than helping me type this essay.  Which leaves me sitting at a YMCA snack room table snerking, goo leaking down my throat and out my nose.

Unconsciously, I take violent reprisal against the traitors.  I send my enforcers to put down the revolt – they cannibalize the still-living rebels, first gnawing the skin, then devouring the organs that come spilling out.  Then the defector dies.

CD8+ T cell destruction of infected cells by Dananguyen on Wikimedia.

My cells are also expected to commit suicide whenever they cease to be useful for my grand designs.  Any time a revolutionary loses the resolve to commit suicide, my enforcers put it down.  Unless my internal surveillance state fails to notice in time – the other name for a cell that doesn’t want to commit suicide is “cancer,” and even the most robust immune system might be stymied by cancer when the traitor’s family grows too large.

Worse is when the rebels “metastasize,” like contemporary terrorists.  This word signifies that the family has sent sleeper agents to infiltrate the world at large, attempting to develop new pockets of resistance in other areas.  Even if my enforcers crush one cluster of rebellion, others could flourish unchecked.

How metastasis occurs. Image by the National Cancer Institute on Wikimedia.

I know something that perhaps they don’t – if their rebellion succeeds, they will die.  A flourishing cancer sequesters so many resources that the rest of my body would soon prove too weak to seek food and water, causing every cell inside of me to die.

But perhaps they’ve learned my kingdom’s vile secret – rebel or not, they will die.  As with any hereditary monarchy, a select few of my cells are privileged above all others.  And it’s not the cells in my brain that rule.

Every “somatic cell” is doomed.  These cells compose my brain and body.  Each has slight variations from “my” genome – every round of cell division introduces random mutations, making every cell’s DNA slightly different from its neighbors’.

The basic idea behind Richard Dawkins’s The Selfish Gene is that each of these cells “wants” for its genome to pass down through the ages.  Dawkins argued that familial altruism is rational because any sacrifice bolsters the chances for a very similar genome to propagate.  Similarly, each somatic cell is expected to sacrifice itself to boost the odds for a very similar genome carried by the gametes.

Only gametes – the heralded population of germ cells in our genitalia – can possibly see their lineage continue.  All others are like the commoners who (perhaps foolishly) chant their king or kingdom’s name as they rush into battle to die.  I expect them to show absolute fealty to me, their tyrant.  Apoptosis – uncomplaining suicide – was required of many before I was even born, like when cells forming the webbing between my fingers slit their own bellies in dramatic synchronized hara-kiri.

Human gametes by Karl-Ludwig Poggemann on Flickr.

Any evolutionary biologist could explain that each such act of sacrifice was in a cell’s mathematical best interest.  But if I were a conscious somatic cell, would I submit so easily?  Or do I owe some sliver of respect to the traitors inside me?

The world is a violent place.  I’m an extremely liberal vegan environmentalist – yet it takes a lot of violence to keep me going.

From Suzana Herculano-Houzel’s The Human Advantage:

Animals that we are, we must face, every single day of our lives, the consequences of our most basic predicament: we don’t do photosynthesis.  For lack of the necessary genes, we don’t just absorb carbon from the air around us and fix it as new bodily matter with a little help from sunlight.  To survive, we animals have to eat other living organisms, whether animal, vegetable, or fungus, and transform their matter into ours.

And yet the violence doesn’t begin with animals.  Photosynthesis seems benign by comparison – all you’d need is light from the sun! – unless you watch a time-lapsed video of plant growth in any forest or jungle.

The sun casts off electromagnetic radiation without a care in the world, but the amount of useful light reaching any particular spot on earth is limited.  And plants will fight for it.  They race upwards, a sprint that we sometimes fail to notice only because they’ve adapted a timescale of days, years, and centuries rather than our seconds, hours, and years.  They reach over competitors’ heads, attempting to grab any extra smidgen of light … and starving those below.  Many vines physically strangle their foes.  Several trees excrete poison from their roots.  Why win fair if you don’t have to?  A banquet of warm sunlight awaits the tallest plant left standing.

And so why, in such a violent world, would it be worthwhile to be vegan?  After all, nothing wants to be eaten.  Sure, a plant wants for animals to eat its fruit – fruits and animals co-evolved in a system of gift exchange.  The plant freely offers fruit, with no way of guaranteeing recompense, in hope that the animal might plant its seeds in a useful location.

But actual pieces of fruit – the individual cells composing an apple – probably don’t want to be eaten, no more than cancers or my own virus-infected cells want to be put down for the greater good.

A kale plant doesn’t want for me to tear off its leaves and dice them for my morning ramen.

But by acknowledging how much sacrifice it takes to allow for us to be typing or reading or otherwise reaping the pleasures of existence, I think it’s easier to maintain awe.  A sense of gratitude toward all that we’ve been given.  Most humans appreciate things more when we think they cost more.

We should appreciate the chance to be alive.  It costs an absurd amount for us to be here.

But, in the modern world, it’s possible to have a wonderful, rampantly hedonistic life as a vegan.  Why make our existence cost more when we don’t have to?  A bottle of wine tastes better when we’re told that it’s a $45 bottle and not a $5 bottle, but it won’t taste any better if you tell somebody “It’s $45 wine, but you’ll have to pay $90 for it.”

Personally, I’d think it tasted worse, each sip with the savor of squander.

On stuttering.

During his first year of graduate school at Harvard, a friend of mine was trying to pick a research advisor.  This is a pretty big deal — barring disaster, whoever you choose will have a great deal of control over your life for the next five to eight years.

My friend found someone who seemed reasonable.  The dude was conducting research in an exciting field.  He seemed personable.  Or, well, he seemed human, which can be what passes for personable among research professors at top-tier universities.  But while my friend and the putative advisor-to-be were talking, they got onto the topic of molecular dynamics simulations.

My friend mentioned that his schoolmate’s father studies simulations of cellular membranes.  And that guy, the father, is incredibly intelligent and very friendly — when I showed up at a wedding too broke for a hotel, he let me sleep on the floor of the room he’d booked for himself and his wife.

But the putative advisor corrected my friend when he mentioned the guy’s name.  “Oh, you mean duh, duh, duh, duh, Doctor ________.”  And smiled, as though my friend was going to chuckle too.

That’s when my friend realized, okay, I don’t wanna talk to you no more.  He found a different advisor.  He never regretted his choice.

Well, no, that’s not true.  All graduate students regret their choice of advisor sometimes.  But my friend never wished he’d worked for the jerk.

Yes, some people, with a huge amount of effort and probably an equal measure of luck, are able to get over stuttering.  But most can’t.  So it’s crummy that even well-educated, ostensibly sophisticated people would feel entitled to mock somebody for a stutter.  Presumably even that jerk would’ve refrained from an equivalent comment if my friend’s schoolmate’s father was blind or confined to a wheelchair.

But stuttering, along with a few other conditions like depression and obsessive compulsive disorder, still gets treated like a moral failing.  Like a sufferer should be able to try harder and just get over it.

That attitude is especially bad as regards stuttering, because mockery and castigation seem to make the condition worse.  There are genetic factors that confer a predilection toward stuttering, but (unpublished, evil) work from Dr. Wendell Johnson showed that sufficiently vituperative abuse can leave children of any genetic background with lasting speech problems.

You’ve read about the “monster” study, right?  Dr. Johnson stuttered, and he had a theory that his stuttering had been exacerbated by people’s well-meaning attempts to cure him.  His parents would correct his speech, draw attention to his mistakes, exhort him to be more mindful when talking.  Dr. Johnson thought that the undue attention placed on his speech patterns made him more likely to freeze up and stutter.  And, once that cycle had begun, his brain dug itself into a rut.  He began to castigate himself for his mistakes, perpetuating the condition.

Of course, that was just a theory.  To test it, you’d want to show two things.  First, that by not paying attention to the mistakes of an incipient stutterer, you can help that person evade or cure the condition.  And, second, that you could cause well-spoken people to develop stutters by convincing them and their interlocutors that they already were stuttering, and castigating them for it.

It’s totally ethical to conduct the first experiment.  The process itself would cause no harm, and the intention is to improve someone’s life.  If you can help someone get over a stutter, you’ll smooth future social interactions.  Stave off some mockery from colleagues at Harvard.  That sort of thing.

But the second experiment?  The process is miserable for the study subjects — you’re cutting them off all the time, criticizing them, forcing them to say things over and over until their thoughts are expressed perfectly.  And, worse, if you succeed, you’ve saddled them with burdens they’ll have to deal with for the rest of their lives.  Let the mockery commence!

Dr. Johnson made one of his students conduct that second experiment on six orphaned children.  In the end, none of the children developed the syllabic repetition typical of most stutterers, but they became extremely self-conscious and reluctant to speak — symptoms that stayed with them for the rest of their lives.

Indeed, the symptoms triggered in those children are equivalent to the symptoms monitored in a mouse model of stuttering.  One of the genetic factors associated with human stuttering was recreated in mice, and those mice exhibited a condition somewhat analogous to human stuttering.

Dr. Dolittle did not participate in this new study, which made matters much more difficult for Barnes & colleagues.  If you don’t know what a mouse is saying, how do you know whether it’s stuttering?  They did measure variance from one vocalization to the next — in humans, repeating the initial syllable of a word lowers total syllabic variance — and saw that their mice with the stuttering gene repeated sounds more often.

Their best measurements, though, were the rate of squeaking, and the length of pauses between squeaks.  Like an oft-badgered child, the mice with the stuttering gene talked less and spent more time waiting, maybe thinking, between statements.

And it pleases me, given my pre-existing biases, to see more data showing that, if somebody stutters, it’s not that person’s fault.  Genetic predilection certainly isn’t the same thing as destiny, but it’s a nice corrective to the mocking jerks.  Sure, you can speak fine, Mister Mockingpants, but are you fighting against the current of a lysosomal targeting mutation?

(Oh, right, sorry, my mistake. Doctor Mockingpants. You jerk.)

*************

p.s. As it happens, the mutation Barnes et al. introduced into mice is involved in the pathway I studied for my thesis work.  They introduced a mutation in the Gnptab gene (trust me, you don’t want me to write out the full name that Gnptab stands for), which is supposed to produce a protein that links a targeting signal onto lysosomal enzymes.  In less formal terms, Gnptab is supposed to slap shipping labels onto machinery destined for the cell’s recycling plants.  Without Gnptab function, bottles & cans & old televisions pile up in the recycling plant. The machinery to process them never arrives.

Which does seem a little strange to me… stuttering is a very specific phenotype, and that is such a general cellular function.  Lysosomal targeting is needed for all cells, not just neurons in speech areas of the brain.  It’s a sufficiently common function that biologists often refer to Gnptab as a “housekeeping” gene.  And proper lysosome function is sufficiently important that problems typically cause major neurodegeneration, seizures, blindness, and death, often at a very young age.  Compared to that litany of disasters, stuttering doesn’t sound so bad.

On how human different humans happen to be (hint: equivalently human).

I finally read some of the initial papers (circa 1981) describing an outbreak of opportunistic infections among previously-healthy homosexual men in the United States.  The case studies are harrowing — a dispassionate litany of suffering, ending with death.

And, yes, these are papers from before I was born.  I should’ve read them already, or at least know enough about them that they’d hold no shock.  To someone like my father, for instance, who has worked with HIV patients for most of my life, the old case studies would not seem surprising.  I recently read Henry Marsh’s Do No Harm, which carries a beautiful epigraph from Rene Leriche (I’m not sure who translated this from the French — if somebody knows, please tell me!): “Every surgeon carries within himself a small cemetery, where from time to time he goes to pray — a place of bitterness and regret, where he must look for an explanation for his failures.”  My father, like most medical doctors, can surely close his eyes and summon up memories more bleak than the case studies I’ve been reading.

But to me, a medical naif, the papers remind me of the horrifying violence against women section of Roberto Bolaño’s 2666.  Personal tragedy and heart-wrenching suffering condensed into clinical prose.  Not fun.

But I had a reason for subjecting myself to this!  A recent NPR news investigation alerted me to Susan Smith’s article “Mustard Gas and American Race-based Human Experimentation in World War II.”

To put these experiments in perspective, I think it’s worth considering how mustard gas works.  Luckily, I took a medicinal chemistry class with Rick Silverman where we discussed the mechanism of both mustard gas and the early mustard-gas-esque chemotherapy drugs known as nitrogen mustards.  It was a cool topic, so I still remember it: I’ve drawn out the mechanism (with some helpful notes!) below.

Hand-drawn mechanism of DNA crosslinking by mustard gas (with notes).

And, looking back on this, there are a few things worth noting.  One is, yeah, it’s perhaps obvious why I was emotionally leveled by reading those AIDS case studies — most of what I know is massively abstracted.  It’s very different to hear the words “mustard gas” and envision a lines-and-letters mechanism versus seeing an image of Rollins Edwards juxtaposed with another depicting a jarful of his own skin (which appears halfway down the page for the NPR story).


I’d like to think that the scientists who originally designed these experiments were picturing everything on that same level of abstraction.  Not that this excuses what they did, but it’s slightly less awful to imagine that they were simply oblivious to the human harm they were causing.

The second is, well, look!  Mustard gas crosslinks DNA!  How different from black or Puerto Rican or Japanese soldiers did those white scientists imagine themselves to be to think that mustard gas would show differential efficacy?

And that’s why I was looking up the AIDS papers.  Because I attended a symposium in 2002 where Lane Fenrich read excerpts from those original papers.  His message was that the authors of those original papers implied that homosexuals are distinct on a cellular level.

I no longer remember which passages he chose to read, but here are two quotes that convey his point.  The first is from the paper by Gottlieb et al.:

Depression of T-cell numbers and of proliferative responses to the degree observed in our patients has not been reported to occur in cytomegalovirus-induced syndromes in normal persons.

Should I be doing something cheeky with font to add emphasis to the words “normal persons” at the end of that sentence?  Naw, I think you probably get the point.

The second quote I thought I’d include is from a 1982 Centers for Disease Control report:

Infectious agents not yet identified may cause the acquired cellular immunodeficiency that appears to underlie [Kaposi’s sarcoma] and/or [Pneumocystis carinii pneumonia] among homosexual males.

Again, the message being sent is that there are cellular differences.  An infectious agent targets basic human biology among homosexual males.  Which is a crazy message to send.  Sure, they only had a small data set — they didn’t have any evidence yet that the same infectious agent might cause immunodeficiency in heterosexuals, or in women.  But, wouldn’t that be a reasonable assumption to make?  You have to presume pretty extreme levels of otherness to think that would not be the case.

Reading these papers made me pretty happy that a friend sent us a copy of 23andMe’s board book You Share Genes with Me shortly after N was born.  With corny rhymes the book celebrates how similar we are to organisms ranging from grasses, flies, fish, up to chimpanzees and our (presumed) human friends.  With numbers, too — if N could speak, perhaps she could let you know that chimpanzees share ca. 96% of her DNA sequence, and another human baby ca. 99.5%.

Which is a nice message to send.  Human brains are so good at presuming otherness; it’s charming to have a book for her that makes clear how similar we all are, people, animals, and even plants.

********************

p.s. Maybe you’ve read reports about pharmaceuticals with race-based differential efficacy.  And, yes, despite over 99% DNA sequence identity between any two human beings, there are some differences that correlate with ethnicity.

Appearance, for one — many people assume they can assess ethnicity well from photographs.  Lactase persistence is another, and that seems to have developed recently (as far as evolutionary timescales go).  It’s not so unreasonable to imagine differences in drug metabolism between humans of differing genetic ancestries, and that can have a big impact on efficacy: two people taking the same dose of a medication might experience significantly different concentrations of the active ingredient.
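
To make that last point concrete, here is a minimal sketch of the standard relationship between drug clearance and average steady-state concentration (C_ss = F * dose / (CL * tau)). The regimen and clearance numbers below are made up purely to show the proportionality: halve someone’s clearance and, at the same dose, you roughly double their exposure.

```python
# Toy illustration (not a clinical model): average steady-state concentration
# for repeated dosing, C_ss = (F * dose) / (CL * tau). All numbers are invented.

def steady_state_concentration(dose_mg, interval_h, bioavailability, clearance_L_per_h):
    """Average steady-state plasma concentration, in mg/L."""
    return (bioavailability * dose_mg) / (clearance_L_per_h * interval_h)

dose, tau, F = 20.0, 24.0, 0.75   # hypothetical regimen: 20 mg once daily, 75% absorbed

fast_metabolizer = steady_state_concentration(dose, tau, F, clearance_L_per_h=10.0)
slow_metabolizer = steady_state_concentration(dose, tau, F, clearance_L_per_h=5.0)

print(f"faster clearance: {fast_metabolizer:.3f} mg/L")
print(f"slower clearance: {slow_metabolizer:.3f} mg/L")   # half the clearance, double the exposure
```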

I’ll include more about these issues when I finally get around to posting that essay on the evolution of skin color, but on the whole these seem to be pretty minor differences, and nothing that would affect sensitivity to mustard gas, which takes a baseball bat to your DNA long before you’d have a chance to metabolize it.  And the only big news story about race & pharmaceuticals that I know about is for that heart medication, BiDil.  In that case, it seems most likely that their rationale for claiming race-based efficacy was to help them file a new patent.  If you’re curious, you could read Dorothy Roberts’s article chastising the race-based claims; here I’d like to highlight just these three lines:

In the past, the FDA has had no problem generalizing clinical trials involving white people to approve drugs for everyone.  That is because it believes that white bodies function like human bodies.  However with BiDil, a clinical trial involving all African Americans could only serve as proof of how the drug works in blacks.