Here are various discursions and distractions that didn’t find their way into The Tale of the Dueling Neurosurgeons—not because they’re un-fun or un-illuminating, but because paper is more expensive than electrons, and I had to cut something.


Chapter One:

One way to do it

Getting a master surgeon’s degree in Ambroise Paré’s time required the candidate to deliver an address in Latin. Paré didn’t speak Latin, so when he decided to pursue the degree, he faced a dilemma. He finally convinced a learned friend and ally to write out his speech for him months ahead of time, and he memorized it. This obvious sham of a lecture stirred up scandal among the officials granting the degree, and Paré barely mustered enough support.

Leonardo’s (non-)influence

Some scholars have argued that Leonardo da Vinci dissected bodies long before Vesalius did and therefore deserves credit as the true pioneer in anatomy. It’s true that Leonardo had dissected a number of bodies by the later 1400s. But he did so in secret, and his work did not influence anatomical science, since no one knew of it. Leonardo also dissected the bodies haphazardly and superficially, nothing like the systematic work of Vesalius.

Body press

In addition to producing Fabrica, Vesalius’s printer was notorious for printing the first Latin translation of the Koran. He was jailed for this stunt, and it took Martin Luther, of all people, to get him sprung.

By the by, while waiting for his book to print in Basel, Vesalius got restless, and decided to dissect a body that more or less fell into his lap there. A local bigamist named Jakob Karrer had decided to end his legal troubles by murdering his first wife, and got caught and hanged. Vesalius sweet-talked the authorities into letting him take possession of Karrer, and he held a public dissection for the masses. No one had dissected a body in Basel for a dozen years before that, and the university decided to preserve the specimen. In fact, it still exists today, making it the oldest known anatomical preparation of a body.

Theory, not practice

It’s easy to be smug about the superficial anatomy lectures in Vesalius’s day, and to deride the students and professors for lacking any imagination or gumption: they were obviously more interested in parroting the ancients than in striking out and discovering something on their own. But how much better are things today? At least one historian has pointed out that these lectures were probably about on par, in terms of intellectual stimulation, with the experiments that high school and college students do in classes today, since any halfway bright lad or lass knows the “right” answer and will manipulate his or her findings or believe whatever it takes to get a good grade. Makes you think.

Superstitious folk cures

Regarding folk cures, Paré also exploded the myth of the bezoar stone. Bezoar stones are indigestible lumps of hair and other organic gunk found in the intestines of many animals, and for some reason, doctors claimed that bezoar stones could cure any poison. So when a stone came for sale in Paris in the 1560s, Charles IX asked Paré for his advice. Paré, being an experimentalist, devised a test. A local duke’s cook had been caught stealing silver plates, and was scheduled to hang. Paré offered him the chance to die by poisoning instead—provided he swallowed the bezoar stone immediately afterward. The cook thought he had nothing to lose, and agreed.

An apothecary prepared a potent dose of mercury chloride, and both it and the stone went down the cook’s hatch. Within moments, his insides felt like they were being roasted alive. Within an hour he was crawling on all fours, coughing and gagging and bleeding from every orifice. The bezoar stone of course didn’t help at all, and the cook died after seven hours of agony. (Near the end, he said he wished he’d been hanged.) Charles thanked Paré heartily for saving him the cash.

Chapter Two:

An insane plea

Actually, Scoville did try another defense, but it didn’t go anywhere. Because Garfield had been shot in one place (D.C.) and died in another (New Jersey), Guiteau technically—according to a strict legalistic reading of the law—may not have committed murder, which required the murderous act and the death to take place in the same jurisdiction. (This was a legacy of the days when individual states had greater autonomy.) This appeal was known as the Burr defense, after Aaron Burr, who—while Vice President of the United States, mind you—had shot Alexander Hamilton in a duel in New Jersey, but let Hamilton crawl to New York to die. As a result, Burr never stood trial. The judge in Guiteau’s case nixed the idea.

Lack of guards

Reading over the details of Garfield’s shooting, we today can’t help but wonder where the heck his guards were. He didn’t have any, really. The Secret Service, created after the Civil War to combat counterfeiters, had occasionally guarded presidents in the years before Garfield, but the agency got in trouble when it tried to take on the duty full-time: people didn’t like the idea of a permanent praetorian guard, which seemed more fitting for decadent European monarchs than leaders of a free country. Garfield enjoyed the freedom to walk around alone at night anyway. Reportedly, one Secret Service agent was posted outside the D.C. train depot on July 2, and heard a shot, but didn’t investigate, even after he saw a man being hauled away in cuffs. He reported the incident to his boss after his shift, and didn’t think anything more about it until he saw the headlines the next day. Incidentally, the train depot where Garfield was shot was not today’s Union Station, but an old station that stood near where the National Gallery of Art stands today.

McKinley’s speech

Elected in 1896, McKinley had dragged the country out of depression and won the Spanish-American War, both popular moves. Now he wanted to expand the American economic empire, through treaties, alliances, and trade. In a speech at the Pan-American Exposition on September 5, he declared, “this country is in a state of unexampled prosperity,” and he promised still brighter days ahead. Newspapers worldwide praised the talk as a welcome departure from the usual American isolationism and xenophobia. The irony of the speech was not lost on Czolgosz, who certainly didn’t have much prosperity to look forward to.

Guiteau’s chutzpah

During his trial, Guiteau had the cheek to cite the nation’s sudden unity—in hatred, against him—as evidence of God’s hand in preordaining the assassination of Garfield. Guiteau also wrote to now-President Chester Arthur at one point to ask for funds for his defense: after all, Guiteau pointed out, he’d secured Arthur his promotion, and sextupled his salary to $50,000.

Civil service reform

The one positive thing to come out of Guiteau’s murderous act was the Pendleton Act, a congressional law that reformed the civil service and ensured that most government jobs would not change hands even after a change of administrations. This killed the spoils system for good.

$1,800

As another measure of what a sham the trial of Czolgosz was, the whole trial cost $1,800, much of which went to pay the alienists who ended up damning Czolgosz anyway.

Exceptions

There are some exceptions in the natural world to the general soup/spark mechanism outlined in the chapter. The big one is that some neurons actually are fused together at their ends, much like Golgi envisioned, and can therefore pass signals between each other electrically, without the need for intermediary chemicals. Mammal eyes, for instance, often contain some neurons like this. Similarly, neurons wired up to our sense organs (those in the ear, the nose, etc.) generally don’t fire in response to chemical input from other neurons; some outside stimulus (a sound, an odor) triggers them instead. After that first step, though, they send information along with sparks and soups, and virtually all neurons inside the brain do use the soup-spark mechanism outlined here.

Parochialism defined

No sitting president would leave the country until 1906, when Teddy Roosevelt, McKinley’s successor, visited the construction site of his pet project, the Panama Canal.

Our virus forefathers

To release neurotransmitters into the synapse, the axon bubbles that carry the chemicals first have to fuse with the axon’s outer membrane. And believe it or not, those bubbles may well have stolen their fusion equipment from viruses. The thinking is that viruses are very good at fusing with the membranes of other cells—that’s how they infect cells. And it turns out that viruses end up exchanging tons of DNA with their hosts—up to 8 percent of human DNA is old, broken-down virus DNA, after all. Somewhere during our evolution, then, we may have appropriated the viral fusion DNA for our own purposes, and used it to make our neurons more efficient. Perhaps we owe our very genius to viruses!

Admittedly, this is all a theory. But however far-fetched it sounds, we already know that something similar has happened in our past: the DNA that helps mammalian embryos form a placenta inside their mothers’ wombs was also ripped off from viruses. I talk about this connection more in my book The Violinist’s Thumb.

Chapter Three:

Nabokov on synesthesia

The full Nabokov quote on synesthesia reads: “I present a fine case of colored hearing. Perhaps ‘hearing’ is not quite accurate, since the color sensation seems to be produced by the very act of my orally forming a given letter while I imagine its outline. The long a of the English alphabet (and it is this alphabet I have in mind farther on unless otherwise stated) has for me the tint of weathered wood, but a French a evokes polished ebony. This black group also includes hard g (vulcanized rubber) and r (a sooty rag bag being ripped). Oatmeal n, noodle-limp l, and the ivory-backed hand mirror of o take care of the whites. I am puzzled by my French on which I see as the brimming tension-surface of alcohol in a small glass. Passing on to the blue group, there is steely x, thundercloud z, and huckleberry k. Since a subtle interaction exists between sound and shape, I see q as browner than k, while s is not the light blue of c, but a curious mixture of azure and mother-of-pearl. Adjacent tints do not merge, and diphthongs do not have special colors of their own, unless represented by a single character in some other language…

“In the green group, there are alder-leaf f, the unripe apple of p, and pistachio t. Dull green, combined somehow with violet, is the best I can do for w. The yellows comprise various e’s and i’s, creamy d, bright-golden y, and u, whose alphabetical value I can express only by ‘brassy with an olive sheen.’ In the brown group, there are the rich rubbery tone of soft g, paler j, and the drab shoelace of h. Finally, among the reds, b has the tone called burnt sienna by painters, m is a fold of pink flannel, and today I have at last perfectly matched v with ‘Rose Quartz’ in Maerz and Paul’s Dictionary of Color. The word for rainbow, a primary, but decidedly muddy, rainbow, is in my private language the hardly pronounceable: kzspygv.”

Holman in Rome

In some ways Rome really was the perfect city for Holman: The Colosseum and every other ancient structure had fallen into disrepair, and were therefore best evoked through the imagination anyway. He also had a network of friends to help him there. When visiting the Vatican’s art museums, they would alert Holman when the guards’ backs were turned, so he could feel up the statues.

Stroop test

Anyone who took Psych 101 probably took the Stroop test, in which color names like “green” or “orange” are written in colored fonts. When the font and word correspond (“yellow” written in yellow), it’s easy to name the color. When they conflict (“yellow” written in blue), people struggle to name the color. Similarly, if you show a synesthete a number or letter that’s the “wrong” color, she’ll take much longer to identify it. Especially sensitive synesthetes find the wrong color actively repellent.

LSD hangover

Albert Hofmann found that his brain continued to work differently even the day after he’d taken the first deliberate hit of LSD, long after the high had faded: “Breakfast tasted delicious and gave me extraordinary pleasure,” he reported. “When I later walked out into the garden, in which the sun shone now after a spring rain, everything glistened and sparkled in a fresh light … All my senses vibrated in a condition of highest sensitivity.”

Nintendo power!

When Paul Bach-y-Rita’s team developed the glove to help the man with leprosy regain a sense of touch, they actually adapted an old Nintendo Power Glove.

Forced synesthesia

You might be wondering how on earth synesthetes can even read, what with colored letters popping off the page. Most synesthetes, though, don’t find the experience distracting—they like it.

For what it’s worth, I did get a hint of what reading with this condition must be like when I ordered a used book online recently. It ran about four hundred pages, and pretty much every single line, from intro to index, had been highlighted in different colors of pencil—red, blue, green, peach, yellow. Sometimes a sentence would start in one color and end in another, and there seemed no rhyme or reason to the madness. (The online vendor and I clearly had different definitions of “used—good condition!”) I thought I’d never get through a page of this, but in the end it really didn’t matter. As distracting as the colors were whenever I focused on them, the story kept breaking through and seizing my attention.

In contrast, people do generally find miscolored foods unsettling, even disgusting. Psychologists who drained the color from bacon and cheese and juice found that people judged them as having weaker tastes. I also remember reading while in college—although I cannot find the citation, alas!—about an experiment where psychologists served people green steaks and blue mashed potatoes and similar food in a setting with strangely colored lights, to imitate a trendy restaurant. When people couldn’t see the true color of their food, they scarfed it down. When the psychologists turned on normal white light, however, people immediately lost their appetites, and some even darted for the bathroom. And again, I believe this. I remember being excited when Heinz introduced green catsup in 2000, but I simply couldn’t stomach the stuff. Even the packaging of food can affect its taste: when Coca-Cola introduced white holiday cans in 2011, many consumers complained that the pop inside tasted different. Although anecdotal, these cases hint that we all have a trace of synesthesia working inside us, because the color of the food clearly affects the taste.

Sensory substitution mechanisms

It’s kind of an ill-formed question, but you might also be wondering how the brain “knows” to send tongue or ear data to the visual center for processing. That is, even if you assume that pathways exist between the tongue and vision centers, why does the brain allow the signal to reach the vision center in the first place?

At times during interviews, Bach-y-Rita suggested that perhaps the different senses encode their information with different patterns of pulses, and perhaps the brain recognizes those patterns. So if visual data go badda-bah-bah-bah, maybe tactile data go badda-bah-badda-bah. When the brain sees the first pattern, then, it knows to route the information to the visual centers, even if it’s coming from the “wrong” source.

Other scientists explain how the brain “knows” in a different way. We normally use vision to parse details about space—where objects are around us, and how to move around them. But hearing also contains a residue of spatial information, since we can determine where sounds come from. Most sighted people more or less disregard this information, but blind people can exploit it more fully, and recruit the unemployed visual processing centers to do extra work. In this view, the so-called vision-processing centers of the brain aren’t vision processors per se; instead, they’re more focused on analyzing and discriminating spatial relations, regardless of which specific sense supplies the input. Touch also contains some spatial information, which probably explains why blind people who read Braille show activity in their “visual” cortex.

By extension, then, perhaps all our sensory centers really, deep down, don’t process individual senses as much as they process more abstract qualities, like movement, shape, or spatial relations. This means that senses that contain residues of more than one type of information, like touch, vision, or hearing, can indeed be processed in multiple places in the brain. For more on this type of brain plasticity, see the excellent book The Brain That Changes Itself, by Norman Doidge.

Chapter Four:

More on Ladd

About her unconventional marriage, there’s an anecdote that, one morning, Ladd dropped her wedding ring into her empty coffee cup at breakfast and summoned her maid. She told the maid to remove both the cup and the ring, as she’d finished with both.

Ladd was inspired to open her mask studio by Francis Derwent Wood, a sculptor based at the Third General Hospital in London. Too old to enlist in 1915, Wood had joined the medical corps as an orderly; feeling he was wasting his talent, he began crafting metal faces for Tommies in 1916. Ladd heard of Wood’s outfit—nicknamed the “Tin Noses Shop”—late the next year and wrote to him for instructions, which Wood happily supplied.

Incidentally, Ladd’s idea to add metal facial hair was not new. To make their work look more realistic, ancient Greek sculptors sometimes adorned their statues with silver foil eyebrows and mustaches. The Greeks also often left the eyes of statues blank and later painted them in, to heighten the realism even more. Funnily enough, Renaissance sculptors didn’t realize this about the eyes, so when they began copying everything the Greeks did, they often left the eyes blank, too, which made their work look not realistic but simply spooky.

Sentinel tissue

In modern face transplants, doctors often transplant a small flap of the donor’s skin onto the body of the transplant recipient as well, a graft called a “sentinel flap.” The idea is that if the body starts to reject the donated skin, the rejection is easier to spot in a sentinel flap on the torso than in the face, whose skin blushes, flushes, and in general changes its appearance much more readily than body skin.

Unfair warning

In addition to the cornflower blue jackets that soldiers at the facial-injury hospital in Kent wore, villages near the hospital erected blue bus benches for the patients to wait on, benches far removed from where the regular folks waited. At least one man who’d lost his face in the war got sick of such treatment and enacted a plan for revenge. He had to wear a leather mask between operations, which only piqued people’s curiosity. Busybodies would even edge near him on the train or bus, ostentatiously nonchalant, and take peeks. When enough ladies and gentlemen had gathered around, he’d whip his mask off and snarl. The next day, at his doctor’s office, he would walk in holding up a certain number of fingers—the number of gawkers he’d sent away screaming.

W-O-M-E-N

Just thought I’d throw this in there. Again, people with primary-visual-cortex damage often have sharp and distinct blind spots. V.S. Ramachandran, in his book Phantoms in the Brain, recounts an amusing story about a guy who repeatedly walked into the wrong restroom in public, because the “W-O” in the word “women” on the door often fell within his blind spot.

More kitty torture

Hubel and Wiesel did other diabolical experiments on cats as well. The oddest work involved raising kittens, from birth, inside plastic cylinders with black stripes inside them—either all horizontal or all vertical. They then took care not to wear striped shirts, or in any way expose the kittens to lines running the other way. The net result was cats that couldn’t see lines of certain orientations, because their neurons had never been trained to see those lines in childhood.

As you can imagine, both the vertical-world cats and the horizontal-world cats struggled when finally let out into the real world, but they struggled with different tasks. The horizontal-world cats could see the (horizontal) seat of a chair just fine and would hop up onto it to nap. But they couldn’t see the chair legs, since their neurons had never encountered verticals, and they would knock into the legs over and over. Vertical-world cats had the opposite problem: they could avoid the chair legs, but could never find a cozy place to snooze. Hubel and Wiesel concluded that all animals have a “critical period” in childhood where they need exposure to lines of all sorts, or else they lose the ability to see them.

No human beings have ever been raised this way, of course, but scientists have tested our predilections for horizontals and verticals in other ways. It turns out that the neurons of people raised in first-world countries, surrounded by rectilinear walls and buildings, respond most strongly to horizontal and vertical lines. People who grew up outside the first-world visual hegemony, especially in nomadic tribes, have brains much more open to lines of different orientations.

Seeing in space

Seeing a shape usually requires something to ping a nerve cell within the retina, but perhaps not always. Astronauts in space regularly report seeing geometric hallucinations or flashes of light. These are usually white, elongated streaks, and they often move side to side. One astronaut compared them to “luminous dancing fairies,” and they reportedly disturb some astronauts’ sleep. They are almost certainly caused by cosmic rays, which we’re shielded from on the ground but which astronauts get exposed to (at least a little bit) in space. Most of these cosmic rays probably hit the retina and excite cells there, but some scientists have suggested that the rays might penetrate the skull and tickle the visual cortex directly.

While we’re talking about outer space and the brain, a study in late 2012 argued that exposure to cosmic rays in space might increase someone’s chances of developing Alzheimer’s disease, but critics said the risks outlined in the study were overblown. One ailment that astronauts do seem at higher risk of developing is cataracts.

Semi-upside-down faces

In case you’re curious, experiments have revealed that the literal tipping point between faces that we can process with our face-recognition circuits, and faces that have to be relegated to our object-recognition circuits, is around 45°. Past that point faces effectively become objects.

Chapter Five:

Lots of ice

As one measure of just how rapidly the U.S. medical corps expanded during the Civil War, consider this: by mid-war, the United States was spending more on ice than it had spent on the entire medical corps in the prewar years.

Not even three

I mentioned in the book that the U.S. military scoured its records and couldn’t find any evidence of anyone who’d had all four limbs amputated. Truth be told, the military couldn’t find any evidence of someone who’d lost even three limbs during the Civil War.

Military justice

Many grunts during the Civil War loathed army surgeons as butchers, and stories circulated throughout the war of gravely injured men who, with their last bit of strength, rose up, socked a surgeon’s jaw, then fell back dead and happy. These were probably apocryphal, but many soldiers indulged similar fantasies. Mitchell, however, defended the Gettysburg medical corps as heroes. As the battle roiled and the tides shifted those three days, new areas constantly came under fire, and quiet little surgical redoubts became contested ground. Sometimes the doctors evacuated their patients, but other times they continued operating under fire. Thirteen surgeons died or took serious wounds at Gettysburg.

Mitchell on stumps

Mitchell used a homey metaphor to describe why the brain was fooled by signals coming from the stumps. He compared the nervous system to a bell-wire, a sort of early doorbell. To use the bell-wire, guests pulled a lever near the front door. This tripped a long wire inside the house, which ran upstairs and rang a bell in the servants’ quarters. As Mitchell said, in normal circumstances, a ringing bell means that someone is waiting at the door. But you could also play a joke and tug the wire at a point between the door and the servants’ quarters, thereby fooling the servant into believing that someone was outside. Just so, a nerve impulse arising in a stump, partway between the brain and the foot, will fool the brain into thinking that the foot itself, where the nerve once originated, is still intact.

Refinement

The premotor cortex and supplementary motor area both help coordinate movement, but they’re not identical. The premotor cortex gets input from the where/how stream and helps us grasp things in space; in general, it’s responsible for movement that’s guided by our senses. The supplementary motor area takes charge of movement that’s internally generated—movement controlled by our volition and free will.

Does the mirror box really work?

Some scientists have speculated that the mirror box might be a placebo. That’s always a danger. Ramachandran, however, thinks he can prove that the relief his patients reported was not a placebo effect. Some amputees had burning sensations as well as cramping pains, and in these cases the mirror did nothing to quench the blaze. It would be a strange placebo indeed that cured some types of pain but not others.

Mirror unreality

In some clever follow-up experiments, Ramachandran got his patients’ phantoms to defy laws of physics. Again, touching the cheeks of some amputees can produce an identical touch sensation in the phantom. So when Ramachandran dabbed water on one patient’s cheek, the patient felt the droplet running down his phantom as well. Simple enough. But when Ramachandran asked him to mentally raise the phantom over his head, and repeated the dribble down the cheek, the droplets now started running up the patient’s phantom arm, in defiance of gravity. Through other contrived scenarios, Ramachandran also found he could make people’s phantom fingers bend backward and “touch” the back of the phantom hand.

Chapter Six:

New Guinea’s languages

It’s fitting that the western world named the Fore, a tribe from New Guinea, after their language. Despite its modest size, the island contains something like one-sixth of the world’s known languages, primarily because its steep terrain isolates people in small pockets. Despite this linguistic wealth, kuru—Fore for “cold trembling”—remains the only word that these many languages have given the world at large.

Nit-picking

Because the Fore ate their dead for religious reasons, anthropologists sometimes referred to the act not as cannibalism, but as transumption, a sort of transcendent consumption.

Steps in the right direction

Beyond gaining independence, another symbolic step for Papua New Guinea—a sign that the country was joining the modern world, at least on paper—was the Sorcery Act of 1971, which made sorcery illegal. That said, the need to legislate against sorcery shows that many people still practiced it. As a side note, however silly sorcery seems to us, New Guinea doctors have pointed out that millions upon millions of Westerners read their horoscopes each day, which is about on the same level.

Prion nightmare

One clinical horror story from the 1980s revealed just how scarily tenacious prions are: A neurosurgeon in Japan had to operate on two different epileptic patients in one day. Unbeknownst to him, the first one had Creutzfeldt-Jakob disease. And although the surgeon sterilized his equipment between operations, he ended up passing the disease to his second patient later that day. Both patients, of course, died of CJD not long afterward.

Kuru genetics

For those interested in the genetic details of kuru, here goes. The stretch of DNA that creates the prion protein can have two different nucleotides in the 387th position—either A, which produces the amino acid methionine (abbreviated M); or G, which produces valine (V). Each person has two copies of this gene (one from Mom and one from Dad), and being homozygous for methionine (i.e., being MM) pretty much doomed you if you got exposed to kuru prions. Having a V, though, delayed onset and offered resistance. Why? The thinking is that the M and V versions of the gene produce proteins of somewhat different shapes, and the M version is more common. So, assuming that the rogue prions also used M in their structure, these M-prions would have a tougher time locking onto the rarer V-shaped proteins and would therefore have a harder time corrupting them.

Similarly, in other prion diseases—presumably caused by V-based prions—having VV is a death sentence. Overall, then, your best bet for survival is to be MV, and in fact the MV genotype is associated with lower risk in every known prion disease.

All this leads to some intriguing speculation. Apes and monkeys rarely have anything but two copies of M; as geneticists say, the M version is highly conserved in primates. In humans, though, the M/V mix is widespread: pretty much every ethnic group across the globe has some Vs. In tandem, those facts suggest that the V version might have offered ancient humans a survival advantage, and therefore spread for a reason. One possibility is that many other tribes besides the Fore indulged in cannibalism in their past, and they survived kuru-like epidemics only because of the M/V mix in that one single spot in their DNA.

Chapter Seven:

Midgets versus dwarfs

Roughly 10 percent of dwarfs can trace their troubles back to the pituitary gland. (Seventy percent have a genetic condition called achondroplasia, which stunts bone growth.) The term has fallen out of favor, but doctors historically referred to pituitary dwarfs—who, however small, are normally proportioned—as “midgets,” reserving the term “dwarfs” for people with stubby limbs and larger-than-normal heads. Because they looked more like shrunken adults, midgets proved more popular in circus sideshows. Oddly, many pituitary midgets, because the growth plates in their arms and legs never fuse, continue to grow their entire lives, albeit achingly slowly.

On a related note, postcards depicting giants and dwarfs were quite popular in this era. Most pictures tried to be cutesy, showing dwarfs or midgets standing next to giant magnums of champagne, or giants riding Shetland ponies. Cushing’s book scratched the same itch of fascination in people, but gave their gawking a scientific veneer.

Cushing’s overactive imagination

The famous photo of the giant John Turner (from Cushing’s book) also helped disprove a legend about Turner’s final height. Cushing had always publicized Turner as eight-foot-three, as if treating a taller giant was more prestigious. But by comparing Turner’s apparent height in the photo to the known height of Cushing’s assistant in the photo, later scientists showed that Cushing may have been channeling his inner P.T. Barnum here, and they lowered the estimate of Turner’s height by a full foot.

Another thing about Turner that Cushing obsessed over was the giant’s feet. Apparently, the skin there had cracked and broken into patches like “alligator skin,” and it appeared almost blue. Turner reported that he used the legs of long underwear as socks, because nothing else would fit over his feet.

Incidentally, Turner attributed his own growth spurt to alcohol, especially his habit of passing out and sleeping outside in wet clothes. Turner reportedly admitted drinking up to a half-pint of whiskey some mornings before breakfast.

And he still charged her afterward

Cushing was not the first surgeon to attack the pituitary gland through the nose; that honor goes to German rhinologist Oscar Hirsch. Oddly, Hirsch often performed this procedure on conscious patients, under merely local anesthesia. One surgeon remembers a poor girl, nearly blind, sitting upright and having to hold a basin for Hirsch, probably to catch blood. After the operation she also had to walk back to her hospital bed herself.

A note to a note

Regarding Mr. O., the Iowa man who shot himself in the head and ended up peeing like a racehorse: The medical name for this ailment is diabetes insipidus. This isn’t related to the disease associated with blood sugar (diabetes mellitus), but both ailments do share the symptom of excessive urination. That’s what the word “diabetes” refers to.

Mental zombies

Sex isn’t the only thing that seizure victims do unconsciously. Some people get in their cars and start driving during seizures, or sit down to play the piano. Some men cross-dress in their wives’ clothes, or punch through walls. One poor sap boarded a bus and woke up hundreds of miles away, which made for some embarrassment. In each case, the seizures basically turned them into zombies, creatures who went through the motions of daily life without any conscious intent.

Excruciating

For various reasons, Antonio Damasio emphasized Elliot as his most important case. But another of Damasio’s patients with frontal lobe damage demonstrated the folly of pure reason even more clearly. This man visited the clinic one day during a freezing-rain storm, and Damasio commented that the roads must have been scary. Not really, the man said. He then explained, in a disarmingly flat tone, how a woman driving right in front of him had careened into the ditch. He, though, coasted over the same patch of ice moments later without any worry. (In this case, a lack of emotions probably helped.) The appointment proceeded, and afterward Damasio asked the man to pick one of two dates for a follow-up. Here’s where the trouble started. The man spent a full half-hour flipping between pages in his appointment book, weighing the merits of each day: the probable weather, the proximity to other appointments, probably even what was on TV that night—every last damn thing he could think of. Although intrigued at first, Damasio’s staff started pulling their hair out pretty soon: this could have gone on for weeks, even past the dates of the appointments. Damasio finally interceded and just picked one day for him. The man shrugged—good enough. Like Elliot, he never realized how much time he’d wasted over a trivial task, nor how embarrassing this must have looked. Or rather, he may have realized all this, but he still didn’t act, because no emotion spurred him.

Finding the seizure

With regard to finding auras: Given the millions of memories we all have, you might think it impossible to locate one specific memory or sensation on the surface of the brain, especially with something as crude as an electric wire. And normally you’d be correct. But auras aren’t normal memories. The people Wilder Penfield operated on had had tons of seizures, which had conditioned their brains and lowered the threshold for activating the aura. In other words, their brains were much more prone to produce this memory than other memories. As a result, the chances of the surgeon locating and summoning the aura weren’t half bad.

Even then, of course, Penfield often couldn’t find the aura. In those cases, he relied on other clues to know what to cut out. Diseased brain tissue often looks jaundiced, for instance, and by the 1940s, surgeons also had electroencephalograph machines (EEGs), which helped detect abnormal electrical activity within the brain. (Tissue that causes seizures produces characteristic spikes.) All these tools proved valuable in his work.

Chapter Eight:

Top her off

After excavating large sections of the brain, surgeons before Penfield typically filled up the resulting cavities with fat or other tissue. Penfield pioneered the use of warm saline, which he argued was closer to the natural cerebrospinal fluid that the brain is bathed in.

Ida McKinley’s epilepsy

In 1873 Ida McKinley lost both her mother and grandmother in quick succession. Perhaps worse, Ida was pregnant with a daughter of her own at the time, and also lost the baby after a hard labor. To compound her grief, the McKinleys’ first daughter, Katie, died a few years later from typhoid fever. Heartbroken, Ida became an invalid, with a variety of medical problems, including migraines and epilepsy. (By all accounts McKinley nursed her devotedly, although his political career did take priority sometimes.) Despite some very public seizures, few people knew the true nature of Ida’s “delicate nerves,” even inside the family—epilepsy was just too shameful to admit to. A niece once heard a public rumor that her aunt had epilepsy, and lashed out at the Democratic party for stooping so low as to smear her like that. The nerve.

More on Dostoyevsky

Dostoyevsky wasn’t the only one permanently disturbed by the mock execution. Another comrade all but came unglued mentally, and ended his life blubbering in an asylum. Sadly, Dostoyevsky passed his epilepsy on to his son, Alexey, who died of complications from a massive seizure at age three in 1878.

Sacred and profane

In addition to losing their senses of humor, temporal-lobe epileptics often remain blasé when confronted with emotionally charged images. Normal people, when shown slides of erotic moments, four-letter words, or people getting eaten by alligators, show strong signs of arousal or distress on brain scans. Temporal epileptics barely react. Instead, only slides of religious icons or religiously charged words stir their souls. It’s as if nothing else in the world, good or bad, matters except god and the sacred.

The hobgoblin of little minds

Ironically, Andreas Vesalius himself believed that the brain could not be divided into smaller operational units. It’s not clear why—after opening the world’s eyes to all the anatomical divisions within the brain—Vesalius balked at the idea that the brain’s function could differ from part to part, but there you have it. Innovators in one arena can be stodgy in another. Einstein was a good example of this: he revolutionized physics with his theory of relativity but stubbornly refused to accept quantum mechanics.

Penfield’s not-so-novel novel

Penfield actually inherited the manuscript for the Abraham novel from his mother, who had labored over it for years without bringing it to fruition. He worked on it for a decade himself, making two trips to the Middle East to capture the ambiance of the story.

The benefits (seriously) of shock therapy

Some treatments for depression aim to reproduce, by overloading the brain with electricity, the serenity and well-being that temporal-lobe epileptics feel. Chief among these treatments is electroconvulsive therapy (ECT), which induces short seizures through electrodes on the scalp. The mere thought of ECT makes many people gasp—isn’t that torture? In truth the therapy has been unjustly maligned by a strange coalition of doctors, activists, and, reportedly, Scientology front groups. (Many Scientologists oppose all standard mental health treatments.) There’s no doubt that, in its early days, some doctors abused ECT. ECT can have serious side effects as well, including memory loss and broken legs or jaws from uncontrolled seizures. And the field got some bad PR when Ernest Hemingway shot himself after undergoing ECT.

But for some patients—especially people who react poorly to anti-depressants, or people so suicidal they cannot wait the months it takes for those drugs to kick in—ECT offers a viable alternative. In fact, something like 95 percent of chronically depressed patients get relief from ECT, far better than any drug can do. Modern safeguards, like muscle relaxants and the restriction of the electrodes to one side of the head, have greatly reduced thrashing, memory loss, and other side effects as well.

For an excellent explanation of ECT and the plot to discredit it, see “Shock and Disbelief,” by Daniel Smith, in the February 2001 issue of The Atlantic Monthly. It’s unsettling that we don’t know why ECT works. But that it does work, and does bring relief to thousands of people every year—sometimes the first relief they’ve ever known—is incontestable.

Chapter Nine:

Ivy rivalry

Woodrow Wilson was a Democratic president, Henry Cabot Lodge a Republican senator, but their mutual enmity ran even deeper than party politics. That’s because Wilson was a Princeton man, while Lodge went to Harvard. (Both had attended their respective schools as undergraduates; Wilson also served as president of his, and Lodge earned the first Ph.D. in political science from his.) It really boiled Wilson’s blood when Lodge, in mocking the League of Nations charter and its literary pretensions, chided, “It might get by at Princeton, but certainly not at Harvard.”

Wilson’s limerick

As the “turmoil in Central America” comment shows, Wilson often made light of his illnesses, but this levity seemed increasingly bizarre after his stroke and paralysis. One day his doctor, Cary Grayson, was struggling to feed Wilson when the president beckoned him near, as if to say something important. Wilson whispered:

A wonderful bird is the pelican.
His bill will hold more than his belly can.
He can take in his beak
enough food for a week.
I wonder how in the hell he can.

Grayson knew not what to make of this.

Prohibition

Three weeks after Wilson’s stroke, Congress passed the Volstead Act, which enforced Prohibition. Wilson vetoed it, but his Secretary of Labor—probably with Edith’s approval—actually wrote up Wilson’s justification for the veto, and some historians think Wilson had little idea what he was vetoing, and wouldn’t have vetoed it at all had he been in his right mind.

Telephone syndrome

In Capgras syndrome, vision overrides hearing within the brain. An even more extreme case of vision overriding hearing occurs in so-called telephone syndrome. In this case, a severely brain-damaged patient shows no sign of life when family members are in the room with him, no matter how much they plead. But the patient perks up (at least a bit) if family members call him or otherwise restrict their interactions to words. Again, it seems that the visual parts of the brain help stifle the auditory parts when both should be working together.

Alien hand nicknames

Some people refer to their alien hands in the third person, as “she” or “he.” Others name theirs. Neurologists have recorded alien hands called George, Toby, Silly Billy, My Buttinski, Floppy Joe, The Immovable One, Lazy Bones, and Pet Rock. These monikers appear in Todd Feinberg’s remarkable book Altered Egos, a great resource on delusions and on how the brain works in general.

A shocking truth

A few neuroscientists have discovered a temporary cure for the inability to recognize paralysis: pouring ice-cold water into people’s ear canals. Miraculously, as soon as this happens, people often fess up to their disabilities; some even admit how long they’ve been paralyzed. The scientists believe that the cold water shocks the vestibular system, which reports to the part of the parietal lobe that monitors limb position and movement; this shock forces the system to start paying attention to the paralyzed parts again, probably by using underground neural channels. The cold water might also somehow kick-start the ability of the parietal lobe to detect discrepancies. Funnily (or sadly) enough, as soon as the water warms up or drains away, the person slips right back into denial—even denying, unbelievably, that she ever admitted her paralysis at all.

This all implies that the victim somehow “knew” she was paralyzed, at least on some level. That might sound paradoxical—how can you know and not know something simultaneously?—but our brains are very good at hushing up information we don’t want to hear. How many times have any of us said something rude or done something stupid in the car?—and yet we still all consider ourselves kind people and good drivers. The human brain contains multitudes, and is complex enough to accommodate contradictions.

Chapter Ten:

Beer and lunacy

Speaking of alcohol and mental patients, there was actually a long-standing tradition in English asylums of providing patients with pretty much all the beer they wanted. Sometimes patients earned beer in lieu of wages for work performed, but many asylums just served their patients a pint or six as a matter of course. From a modern perspective, this seems like an awesomely terrible idea—not to mention sinister, since it probably induced addiction and dependency—but back in the days of questionable water and a lack of knowledge about what spread diseases, people drank lots more beer (and wine) than they do now.

More hardship

Beyond the train wreck, Lennox and de Wardener had other woes with tissue samples. Most discouragingly, while they were imprisoned in Changi, a Japanese doctor showed up one day and commandeered for his own research a cache of brain slices that they’d spent months acquiring. The duo later commented that the man’s “claims to being a pathologist appeared to rest on the possession of a magnificent set of necropsy instruments looted from the Singapore General Hospital.” No one knows what, if anything, ever came of the doctor’s theft.

Choking in sports

Procedural memories are part of a larger group of “nondeclarative memories.” This group is pretty disparate—according to some accounts, it includes things like neurotic phobias. But the common trait is that we retain all these “memories” in our unconscious and would have trouble articulating them if asked to. In fact, with motor memories in particular, performance often deteriorates with conscious reflection. Take choking in sports. If you’re struggling to perform, thinking really hard about your form while swinging a golf club or shooting a free throw will likely lead you to botch it. Among their other talents, one skill that great athletes master is the ability not to think when thinking would hurt. This probably also explains why sports interviews are so vapid. If you ask an athlete what he was thinking during a game, he likely wasn’t thinking at all, and has to fall back on spouting clichés to please reporters.

“Five-eight-four, five-eight-four, five-eight-four…”

When Brenda Milner asked H.M. to explain how he had remembered the number 584, he said, “You just remember 8. You see, 5, 8, and 4 add to 17. You remember 8, subtract it from 17, and it leaves 9. Divide 9 in half and you get 5 and 4, and there you are: 584. Easy.” That probably seems harder to most people than just remembering 5-8-4, but hey, whatever works. The next lines in the conversation are also interesting, as they reveal just how fleeting H.M.’s memory was:

“Well, that’s very good,” Milner said. “And do you remember my name?”
“No, I’m sorry. My trouble is my memory.”
“I’m Dr. Milner, and I come from Montreal.”
“Oh, Montreal, Canada. I was in Canada once—I went to Toronto.”
“Oh. Do you still remember the number?”
“Number?” H.M. asked. “Was there a number?”

Both episodic and semantic

Some information straddles the line between episodic and semantic memory. Colors, for instance. Invariable colors (e.g., the color of blood, snow, stop-signs) get stored in semantic memory, which makes sense, since each of these colors is intrinsic to the object’s nature. But the colors of people’s cars (which vary) get stored in episodic memory.

Another mnemonic champ

One person with a memory comparable to Shereshevsky’s was VP, a man who (in a cosmic coincidence) was born in Latvia, mere miles from where Shereshevsky was born. (VP later moved to the United States.) Like Shereshevsky, VP didn’t make much of himself in life: he ended up working as a store clerk. But he seemed to have led a much happier life overall. One source of pleasure for VP was chess. He could play seven games at once, blindfolded, and played up to sixty games by mail without consulting notes or written records. He was not a synesthete, and psychologists who examined him don’t think he had an eidetic (i.e., photographic) memory. Instead, he had off-the-charts verbal skills and relied heavily on linguistic associations to recall things.

Overall, VP and his examiners traced his prodigious memory to three factors: he began training his memory early, in Hebrew school, where he had to memorize long sections of scripture; he noticed details quite well; and he was a passive person, someone who was content to absorb information without much reflection on it, and without calculating how he could twist the information to his advantage. That last trait made him a good memorizer, but doomed him in the business world.

Incidentally, probably the greatest feats of mnemonic “muscle” involve memorizing digits of pi. The current world record holder, Akira Haraguchi, can recite the first 100,000 digits, a feat that took him sixteen hours.

That salty fence

As mentioned, “no real boundary” existed between the different senses within Shereshevsky’s mind, and that lack of a boundary led to some inadvertently comic moments. Once, when taking a stroll with Shereshevsky after an appointment, Luria was deep in thought and absent-mindedly asked his patient whether he could find his way back on his own. “Come now,” Shereshevsky said, “How could I possibly forget? After all, here’s this fence [that leads back]. It has such a salty taste, and feels so rough. Furthermore, it has such a sharp piercing sound…” and so on, describing every landmark in excruciating detail. “I was forgetting whom I was dealing with,” Luria commented.

Dennis the dentist

Beyond filling memory gaps, the mind also confabulates because we feel the need to tell a coherent story about ourselves and our motivations. And if we can’t find a plausible explanation, we make something up. For instance, statistically speaking, naming children Dennis or Denise increases their chances of becoming dentists, apparently because they like the “den” sound. And guess where people named Georgia and Virginia prefer to live? Influence can be negative, too. In lab tests, foul-smelling rooms trigger feelings of disgust—and push people to mete out harsher punishments than they would in perfumed or neutral rooms. And young women, perhaps unconsciously fearing incest, call their fathers less often while they ovulate. Now, if you ask Denise or the ovulating woman why she does such things, she’ll of course provide a reasonable answer. And it’s not that her answer is necessarily bogus—many a Denise would still like dentistry if named Cavatina. But such answers are incomplete. By definition, we cannot recognize our unconscious motivations, and they can be all the more insidious for that, because we assume we understand ourselves fully.

Whom to believe?

Regarding the giant John Turner’s funeral, it’s hard to know whom to believe. Sharpe’s tale is more dramatic, which makes it more suspect—but not necessarily wrong. What’s more, Cushing’s other assistant might well have minimized the crassness of the scene in his own mind, in order to salve his conscience and protect Cushing’s reputation.

Chapter Eleven:

Samuel Johnson’s aphasia

Samuel Johnson suffered a stroke at around 3 a.m. on June 17th, 1783. Perhaps worse, he woke up knowing exactly what was happening to him: he felt confused and couldn’t speak aloud, but he had no choice but to lie there in the dark until morning, unsure how much of his mental power was disintegrating second by second. Being Samuel Johnson, however, he devised a test for himself to pass the time: he composed in his mind a quatrain in Latin, in which he asked the Lord to spare his intellect and reason—intellect and reason being the things he cared about most in life. Johnson wasn’t happy with his quatrain: a mediocre effort, he judged. But he took the fact that he could still compose in Latin—and the fact that he knew it was a mediocre poem—as signs that God had spared his mind after all. Johnson later bragged in letters to friends of beating the stroke (which he called a palsy), but in reality he lived on only eighteen months afterward and struggled to speak for much of that time. Johnson’s case also reveals an important truth about aphasia: language deficits are relative. While Johnson’s eloquence was certainly diminished after his stroke, he remained far more eloquent than your average schmo, since he started on a much higher level. Strangely, too, Johnson stopped using semicolons in his writing after his stroke.

Our number sense

More and more, scientists believe that we do have a basic, innate “number sense” of some sort. As evidence for this, they note that even people who suffer brain damage and lose the ability to count can usually tell, instantly, which of two numbers is bigger. They can also read numbers accurately on a thermometer. And even when they screw up arithmetic problems, their answers are not wildly off: it’s not like you ask them to add 13 and 8 and they say 74,002; they retain a basic sense of quantity and magnitude. That said, there are cases where even the intrinsic number sense goes haywire. One woman who suffered brain damage told her doctors that you can drive from New York to LA in six hours, that horses run at 5 mph, and that 90 percent of Americans are male.

Split-brain pioneers

W.J.’s operation wasn’t the first time surgeons split someone’s corpus callosum in an effort to control epilepsy. Surgeons in Rochester, New York, also split the c.c.’s of a few dozen patients in the 1940s. They didn’t have much luck, as the seizures continued almost unabated. But after reading reports of the operation, the L.A. surgeons realized that the Rochester doctors probably hadn’t cut deeply enough, leaving a small bridge of corpus callosum intact. Hence their willingness to try again, and make a clean cut, with W.J.

By the by, children born without a corpus callosum often grow up just fine. How? Because children’s brains are far more plastic, they probably learn to route information through some of the secondary bundles of fibers that connect the two halves of the brain. In normal brains, these fibers are weak, but with practice from an early age, they presumably turn into information superhighways. Kim Peek, the real-life “Rain Man,” had no corpus callosum. He also had a shriveled, misshapen left hemisphere, which probably explained the origin of some of his amazing talents—including, when young, the ability to move each eye independently. I discuss Peek’s life in my book The Violinist’s Thumb.

Left brain/right brain

Today we take the existence of left-brain/right-brain differences for granted, but Sperry and Gazzaniga’s work was controversial in its time, especially their work on right-brain language. Numbers vary, but in something like 97 percent of right-handed people and 66 percent of left-handers, the left hemisphere controls language: it articulates words, strings them together, remembers the difference between zeppelins and dirigibles, and so on. But in many people, the right brain can at least grasp some language if given the chance. For example, split-brain patients could follow simple commands flashed to their right brains (smile, clap, rub, kick). The hot/dog experiment also provided evidence for right-brain language: the right brain alone saw “hot,” yet the person still sketched a sun. Overall the right brain demonstrated language skills about on par with a toddler’s. That may seem like an anodyne conclusion, but the Broca-Dax idea of a mute right brain had dominated neuroscience for so long that one of Sperry’s collaborators actually withdrew his name from a paper, for fear of ridicule, when Sperry suggested the right brain might not be mute after all.

Scientists long assumed that human beings alone had lateralized brains, and it’s true that the human hemispheres are far more specialized than any other animals’. But other species do show persistent left-right differences, too. Most animals strike more often at prey to their right; react more quickly to predators on their left; and suffer battle wounds more often on one side than the other. It’s also common for the left brain to control vocalization (cries, roars, twitters, croaks) even in non-mammalian species. Indeed, some scientists interpret the fact that chimpanzees gesture more often with the right hand as the probable precursor of our own left-brain language skills. To be scrupulous, though, most of the studies that revealed a right-hand bias among chimpanzees involved captive populations; wild populations seem more ambidextrous. For more on these findings, see “Origins of the Left and Right Brain,” in the July 2009 issue of Scientific American, and a post by anthropologist John Hawks.

Hurting himself

Regarding the possible influence of the mind on the brain: Unless you want to delve into souls and chis and things of that ilk, there is one conceivable and fully scientific way that a nonmaterial mind could bend back and influence the brain—through magnetic and electric fields. Unfortunately, Sperry’s own work putting mica and tantalum wires into the brains of cats seems to have ruled out the possibility that either type of field plays any sort of important role.

Good tip

While visiting London once, Gazzaniga went through customs, and an official asked if he was visiting for business or pleasure. Gazzaniga explained that he was giving a talk about the brain. The official asked for details, and Gazzaniga said that he worked on the brain’s hemispheres. The official asked if he meant how the left brain dominates for language and the right brain for spatial skills, things like that. Flattered, Gazzaniga explained that he played a key role in determining those talents in the first place, many years ago. They had a nice chat until the official asked him what he was speaking about this time. “Consciousness and the brain,” Gazzaniga answered. The official lowered his eyes and looked at him. “Have you ever thought of quitting while you’re ahead?”

A one, and a two…

Similar to Leonardo’s ambidextrous talents, President James Garfield could reportedly write Greek with one hand and Latin with the other at the same time!

Chapter Twelve:

For the neuroanatomy geeks…

Regarding the prefrontal parietal network: if you want to get more specific about the location of the patches of grey matter that help produce and stoke consciousness, the patch in the frontal cortex lies about halfway up the brain, while the patch in the parietal cortex lies toward its rear.

More on Clive Wearing

In addition to getting angry, Wearing engaged in a lot of conspiracy-mongering in the first few years after his amnesia set in. In most of his fantasies, doctors had destroyed his brain in order to hush him up about something. (What they wanted to hush him up about was never clear.) Sometimes the conspiracy had a global reach, and he regularly found inspiration in the newspaper: a random story about, say, King Hussein of Jordan, would spark all sorts of associations in his damaged neural circuits, and could lead him to conclude just about anything.

Another point: Not to trivialize Wearing’s suffering, but the way his consciousness pulses on and off reminded me more than anything of an old-school Nintendo on the blink. After years of (ab)use, the Nintendo would still start up a game when you pushed the power button: the title screen appeared on the television, and music started playing. Then, within two seconds, the power would cut out. But instead of remaining off and just crapping out for good—which would have been less excruciating in the end—the game would actually restart. The title page and the music would flash on again as if everything was fine. You’d get your hopes up—then watch it die a second time. At which point it restarted again. And again, and again, an endless loop. Similarly, Wearing’s consciousness seems to start up just fine, but it never quite “catches.”

Phineas Gage and androids

The “uncanny valley”—an idea from the robotics and animation industries—proposes that any human-like character that you build, draw, or create will fall along a continuum. On one end lie characters that look only vaguely human—like a car whose grille and headlights are distorted into a face, or a duck drawn anthropomorphically. We humans, narcissists that we are, tend to find such characters cute. On the other end of the spectrum are highly realistic depictions of humans, and we enjoy these renderings, too. It’s the middle ground, the almost-human, where things get interesting.

Imagine traveling from one end of the spectrum to the other: you start with cars with faces, graduate to animals and Simpsons characters, and so on. In each step so far, the characters look more and more human, and we find that we can relate better to them. But the funny thing is, this process doesn’t continue indefinitely: we don’t just keep falling more and more in love at each step. We actually start to resist at some point. In fact, we tend to find anything that’s pretty-close-to-human-but-still-noticeably-off—like animatronic characters in movies—downright creepy. Psychologically, it seems that with animals or cartoons or whatever, we focus on the ways in which they look like us and accept them. But with androids and the like, we ignore the 99 percent that’s similar and focus on the 1 percent that’s different, then magnify that difference until it’s all we can think about.

I think we do the same with the likes of Phineas Gage and H.M. Even though their brains can still do virtually every single thing that your brain and my brain can do, their deficits make them seem uncanny. They fall into a sort of uncanny valley of humanness, where the 99 percent of them that’s just like us gets overwhelmed by the 1 percent that’s creepily different, and that 1 percent assumes an outsized importance.

Memory fugues

A scrawny, sixty-year-old carpenter wearing a Derby hat left his home in Coventry, Rhode Island, on January 17, 1887, carrying $551 cash. Ansel Bourne was going to buy himself a farm. He first stopped at his nephew’s store in Providence, then stabled his horse and visited his sister. He parted with her and walked on. On the corner of Dorrance and Broad streets, a freight express wagon crossed his path. By the time it left the intersection, Ansel Bourne had vanished.

A missing-person notice in the local newspaper a few days later described Bourne as an ex-preacher with grey hair and a scraggly grey beard. The notice further described him as prone “to attacks of a peculiar kind, which rendered him temporarily insensible.” Basically, Bourne had amnesic fits: he would set out to walk a block or two but would wake up three miles distant, having no idea how he’d got there. He also suffered “fainting fits,” probably a euphemism for seizures. Still, the attacks never lasted long, and he’d certainly never vanished. His horse remained stabled three weeks while everyone searched, until his nephew and sister swallowed hard and admitted what must have happened. With all that money on him, no doubt some drifter had taken his life. They were right, in a topsy-turvy way.

The day Bourne disappeared a drifter calling himself Albert John Brown took a carriage from Providence to Pawtucket, then hopped a train to New York, where he stayed a few days in a hotel. On February 1 he wandered into Norristown, Pennsylvania, twenty miles outside Philadelphia, where he rented a room in the boardinghouse of one Pinkston Earle. “Brown” divided his room with curtains, installed furniture, and opened a store in the front half, stocking his shelves with candies, toys, nuts, and dime goods; his biggest seller was toffee. Some Norristownians found Brown peculiar but most ignored him. He kept to himself, usually cooking steak or ham in his back room for dinner.

One morning at 5 a.m., eight weeks after Bourne disappeared, Brown heard a bang! like a pistol firing, and started awake. The bed felt alien, with an unfamiliar ridge digging into his back. He then noticed electric lights outside, and realized he’d somehow fallen asleep in a stranger’s room. He huddled for two hours beneath the covers waiting for the occupant (or the police) to barge in. He finally crept down the hall and risked knocking at a door. After hearing a friendly greeting, he asked where he was and what day it was.

“The 14th,” Pinkston Earle replied.

“Does time run backward here? When I left home it was the 17th.”

“The 17th of what?”

“Seventeenth of January.”

“It is the 14th of March.”

Brown gasped. Earle, thinking Brown had gone soft in the head, summoned a local doctor. But under questioning, the man insisted Brown wasn’t his name. He was a preacher and carpenter named Ansel. Later that day the doctor sent a telegram to the man’s nephew, who confirmed that Ansel Bourne had indeed gone missing. Bourne and Brown were the same man—and yet not the same: they inhabited the same body, but had different identities.

The doctor soon mentioned Bourne/Brown to a medical buddy who handled queer cases, and Silas Weir Mitchell’s involvement helped make Bourne a medical celebrity. Eventually William James got involved, and Bourne became the H.M. of his day. James especially wanted to unravel Bourne’s eight lost weeks, so he started summoning “Brown” through hypnosis to interrogate him. It became obvious under hypnosis that Brown was little but an unimaginative doppelgänger of Bourne—like a bad alias. They had the same birthday (July 8th, 1826); Brown’s wife had died the same day as Bourne’s wife; and like Bourne the ex-preacher, Brown had studied theology. Brown even claimed to have heard of Ansel Bourne’s marvelous conversion story, though he hadn’t had the honor of meeting the man. Brown nevertheless insisted he was an independent, autonomous soul. After taking the biography down, James studied the case from all angles and even tried to unite Brown and Bourne through various gimmicks. (In one experiment he put Brown to bed and then detonated a blasting cap near his ear, to recreate the bang that had awakened Bourne. Brown kept sawing logs.) In the end James never did determine how the Brown zombie had wrested control of Bourne’s consciousness.

It would be easy to dismiss Bourne’s case as exaggerated or silly, a product of a more credulous era. But however rarely, people today do suffer from Bourne’s ailment, now known as “dissociative fugue.” In short, someone’s sense of self vanishes, and autopilot kicks in. Many victims feel a compulsion to roam, and some don’t realize they’ve misplaced their identities until someone asks them who they are. At this point a new and fully-formed identity emerges like Athena from Zeus’s brow. The most famous, albeit fictional, victim of dissociative fugue is Jason Bourne (The Bourne Identity, etc.), who’s likely named for Ansel, natch.

Psychologists know of three factors that can precipitate a dissociative fugue: severe stress, despair, and a history of transient amnesia. Bourne showed all three. He’d long suffered from amnesic absences, and just before his disappearance he’d been feeling stress and despair over his immortal soul. Of the many “fits” he’d suffered during his lifetime, one of the few that Bourne could remember happened at age thirty-one, when he had a St.-Paul-on-the-way-to-Damascus moment on a country road. An atheist until then—he even played cards on Sunday, to spite God—Bourne suddenly heard a disembodied voice asking him to attend chapel. He sneered that he’d rather be struck deaf, dumb, and blind. A minute later he got his wish, and was struck so senseless that his pupils stopped responding to light.

To hear Bourne tell it, God restored his sight the next day, and a thankful Bourne began writing repentant letters for a local preacher to read before the assembled each Sabbath. During one recitation six weeks later, Bourne broke into a drenching sweat onstage. A moment later, like the crashing of a wave, his speech and hearing came roaring back to him all at once. “Glory to God and the Lamb forever!” he hollered, and converted on the spot.

Not long afterward, while trying to decide what to do with his life now, Bourne saw another ex-carpenter in a vision: “Settle up your worldly business,” J.C. said, “and go to work for me.” Bourne did, becoming an itinerant preacher. But after Bourne spent a few decades roaming for Christ’s sake, his second wife demanded that he stop for health reasons. The guilt of stopping—of neglecting his Christly duty—tore at Bourne, and probably triggered his crisis.

Like with regular amnesia, fugues disable some types of memory (personal memories) more than others (motor memories). One case that illustrates this point involved a twenty-three-year-old New Yorker named Hannah Upp. Upp reportedly felt trapped in her job teaching Spanish to middle-schoolers in Harlem, and she disappeared the day before school started in 2008. Some friends assumed she’d joined a local group of freegans—environmentalists who live on free discards, often from dumpsters—while others feared her dead. Three weeks later, a Staten Island Ferry captain spotted brown hair bobbing in the water a quarter mile away. Five minutes later, his crew dredged Upp up. She had a gargantuan blister on her foot from wandering the city, as well as a wicked sunburn. While she convalesced, security footage emerged that showed her visiting her gym and even logging onto her email in an Apple store. She remembered none of this, but neuroscientists explained that she could still do these things because procedural muscle memories remain intact during a fugue. Most of us do indeed type our passwords without thinking.

Fugues also help illuminate another basic distinction in memory research, between familiarity and recollection. (Again, a feeling of recollection is I specifically remember doing this, while a feeling of familiarity is more like this sounds familiar, even if I don’t remember the details.) Oddly, and unlike most amnesiacs, fugue victims lose all familiarity for their former life. Not even their names trigger anything: while scanning her email account during the fugue, Upp says she vaguely remembers being unable to figure out who this Hannah person was. An acquaintance even spotted her once during her wanderings and inquired if she was the missing girl. Upp denied it.

As a final consideration, notice that Upp and Bourne snapped back quite quickly. Regular amnesiacs almost never recover lost memories. But fugue victims—usually after being spooked or almost dying—often do resume their old identities. This seems to imply that the old self, although suppressed, never quite vanishes. Overall, Bourne, Upp, and other fugue victims show that if the brain has specialized circuits for something—like providing a stable identity—those circuits can short out, even as the rest of the brain staggers on. But ultimately, the self, the core self, truly is tenacious.

PS If you read The Tale of the Dueling Neurosurgeons in its entirety, and you polished off all its notes, AND you devoured these web notes as well – then wow, congratulations! Drop me a line to say hello…