The Battle of the Tooth Worm

I come across a lot of strange objects in my research: books bound in human skin, prosthetic noses made of silver, iron coffins with safety devices to prevent premature burial. But perhaps one of the strangest objects I’ve seen is the one pictured on the left.

This is a depiction of the infamous tooth worm believed by many people in the past to bore holes in human teeth and cause toothaches.  But before I tell you about this fascinating piece of art, let me give you a quick lesson in dental folklore.

Tooth worms have a long history, first appearing in a Sumerian text around 5,000 BC. References to tooth worms can be found in China, Egypt and India long before the belief finally took root (pun intended) in Western Europe in the 8th century. [1]

Treatment of tooth worms varied depending on the severity of the patient’s pain. Often, practitioners would try to ‘smoke’ the worm out by heating a mixture of beeswax and henbane seed on a piece of iron and directing the fumes into the cavity with a funnel. Afterwards, the hole was filled with powdered henbane seed and gum mastic. This may have provided temporary relief, given that henbane is a mild narcotic. Many times, though, the achy tooth had to be removed altogether. Some tooth-pullers mistook nerves for tooth worms, and extracted both the tooth and the nerve in what was certainly an extremely painful procedure in a period before anaesthetics. [2]

The tooth worm came under attack in the 18th century when Pierre Fauchard—known today as the father of modern dentistry—posited that tooth decay was linked to sugar consumption and not little creatures burrowing inside the tooth. In the 1890s, W.D. Miller took this idea a step further, and discovered through a series of experiments that bacteria living inside the mouth produced acids that dissolved tooth enamel in the presence of fermentable carbohydrates.

Despite these discoveries, many people continued to believe in the existence of tooth worms even into the 20th century.

The piece of art at the top of the article is titled ‘The Tooth Worm as Hell’s Demon.’ It was created in the 18th century by an unknown artist, and is carved from ivory. It is an incredibly intricate piece when you consider it stands only a little over 4 inches tall. The two halves open up to reveal a scene of the infernal torments of a toothache, depicted as a battle with the tooth worm, complete with mini skulls, hellfire, and naked humans wielding clubs.


It is, without a doubt, one of the strangest objects I’ve come across in my research; and today, I pass this random bit of trivia on to you in the hopes that you may use it someday to revive a dying conversation at a cocktail party.

1. W. E. Gerabek, ‘The Tooth-Worm: Historical Aspects of a Popular Belief,’ Clinical Oral Investigations (April 1999): pp. 1-6.
2. Leo Kanner, Folklore of the Teeth (1928).

January 6th, 2014 | Casebooks

Piss Prophets & The Wheel of Urine

[Image: U. Binder, Epiphaniae medicorum, 1506]

I recently watched an episode of Dr Oz in which he pontificated about the colour of some unfortunate woman’s urine in front of millions of viewers. She offered up a cup of what looked like diluted molasses to the good doctor for judgement. ‘Dehydration,’ Dr Oz decreed. ‘More water!’

(As if she didn’t have a sneaking suspicion of this already from the looks of the dark, murky fluid residing in the bottom of the plastic cup.)

Watching this spectacle reminded me of the medieval urine wheel used to diagnose disease based on the colour, smell and taste of a person’s urine. And yes, I did say taste. I’ll return to that point in a minute.

Before stethoscopes, blood tests and x-rays, a pot of pee was a crucial diagnostic tool. Due to the enduring influence of the Greco-Roman physician, Galen (131-201 AD), medical practitioners believed that urine was vital in gauging the health of a person’s liver, where blood was thought to be produced. Analysing urine was the best way to determine whether a patient’s four humours (blood, phlegm, yellow and black bile) were in balance.

The wheel consisted of 20 colours ranging from ‘white as wellwater’ to ‘ruddy as pure intense gold’ and lastly ‘black as very dark horn.’ George III (1738-1820) reportedly had purple urine. This could have been a sign of a rare condition known as porphyria, which can manifest itself in the kinds of neurological symptoms for which the ‘Mad King’ was known.

[Image: Miniature of a physician examining a urine flask]

Of course, examining a person’s urine inside a dark chamber pot proved problematic, so practitioners created the matula, a round-bottomed glass flask shaped like a bladder. Indeed, the image of the doctor holding up the urine-filled flask to the light came to epitomize medicine during this period, and is still a recognized symbol today.

The smell and taste of a patient’s urine were equally important when determining a course of treatment, and often corresponded with specific colours. In 1674, the English physician Thomas Willis described the urine of a diabetic as ‘wonderfully sweet as if it were imbued with honey or sugar.’ He also noted that diabetic urine was often the colour of honey, something observed by earlier practitioners using the urine wheel. Willis went on to coin the term mellitus (literally honey sweet) in diabetes mellitus, and for a long time, the condition was known as ‘Willis’s disease.’

The urine wheel may not have been useful in diagnosing diseases as we understand them today; nevertheless, it was standard practice during the medieval period. By the 16th and 17th centuries, the printing press had made urine wheels so widespread that all sorts of people were using them, including unlicensed medical practitioners, or quacks. The practice of uroscopy—using urine to analyse a patient’s health—soon turned into uromancy, which was something altogether different.

[Image: A physician examining a flask of urine brought by a young woman]

Uromancy is the art of divination using urine. Piss prophets (as they were known) each had a different method for predicting the future. Some took omens from the urine’s colour; others from its taste. Most commonly, piss prophets ‘read the bubbles’ seconds after it hit the divination bowl. The presence of large bubbles spread far apart signified that the urinator was about to come into a lot of money. Conversely, the presence of small bubbles packed tightly together signified illness, loss or the death of a loved one. Even pregnant women visited piss prophets in the hopes of learning the sex of their babies.

Today, physicians no longer taste our urine, nor do they spend much time contemplating its smell or colour (Dr Oz aside). That said, asking a patient to pee into an impossibly tiny cup is not an uncommon request, as anyone entering a hospital or medical office today knows. One can’t help but think the experience would be much more enjoyable if the doctor, upon being presented with the warm cup of cloudy liquid, poured it into a divination bowl and told us we were all going to be rich.

Just like him.

December 6th, 2013 | Casebooks

The Dangers of [Georgian] Vanity

The other day, I walked through the makeup section of a department store just outside of Chicago. Every step of the way, I was bombarded by sales attendants trying to sell me the latest anti-aging potions. There was Rodial Snake Venom—an anti-wrinkle cream which allegedly simulates the paralysing effects of a viper bite to reduce expression lines in the face—as well as a host of other products, including Freeze 24/7, which purports to be a ‘clinically proven dream cream.’ Topping the list of quack remedies was the ‘Vampire Facelift,’ a non-surgical procedure involving the reinjection of a gel-like substance derived from the patient’s own blood.

With all these products on the market today, you might think that we are uniquely obsessed with finding eternal youth. Yet, people in the 18th century were equally concerned with turning back the hands of time, and their beauty regime could be just as futile (and toxic) as our own.

Read the full article on Huzzar: The 18th-Century Inspired Fashion and Lifestyle Webzine.

November 8th, 2013 | Casebooks

Death & Childhood in Victorian England

I remember many childhood days spent propped up on my grandmother’s couch with a tower of pillows. I’d watch the day peacefully unfold from her picture window. One month, it was bronchitis. The next, it was pneumonia. My mother—then a nursing student—rushed me in and out of doctors’ offices and emergency rooms, where I was poked, prodded and eventually sent home with a bag full of medications.

Principals were notified; classes were missed. Friends brought armfuls of heavy books home each day after school with daily assignments. I’d hear their voices in the other room but never see their faces. Contagion was always a risk.

This was the life of a sick child.

At the time, I felt incredibly sorry for myself. Why couldn’t I enjoy good health like the rest of my girlfriends? Why did I have to stay indoors day after day, and swallow pills that made me nauseated and dizzy?

Years on, however, I began to realise that I was actually very lucky.  This wasn’t just the life of a sick child. This was the life of a sick child in the 1980s.

Today, we often associate death with old age. But we don’t have to go back far in history to find a time when childhood was both dangerous and deadly.

Victorian children were at risk of dying from a lot of diseases that we’ve eradicated or can control in the 21st century, like smallpox, measles, whooping cough, diphtheria, and dysentery (to name just a few). Death was a common visitor to Victorian households; and the younger one was, the more vulnerable he or she would be.

In 1856, Archibald Tait—the future Archbishop of Canterbury—lost five children in as many weeks to scarlet fever. [1] When the fever wasn’t fatal, it nearly always weakened the child, who often died months or even years later from complications. Indeed, this is the fate of Beth in Louisa May Alcott’s famous book, Little Women (1868/9).

Tuberculosis was also a common killer in the 19th century. On 26 April 1870, Louisa Baldwin (mother of the future prime minister, Stanley Baldwin) wrote in her diary:

I paid a sad call at the Worths where 2 children seem to be at the point of dying, the poor terrible little baby has constant fits & little Madge two years old, who has been ill 12 days with congestion of the lungs. This is the second time I’ve seen them in this illness…we went into next door where we saw poor little Miss Lee evidently very near the end, but sweet and affectionate as ever. [2]

No one was immune. The great scientist, Charles Darwin, lost his 10-year-old daughter, Annie [left], to tuberculosis in 1851. In his personal memoir, the grief-stricken father wrote: ‘We have lost the joy of the household, and the solace of our old age…Oh that she could now know how deeply, how tenderly we do still & shall ever love her dear joyous face.’ [3] By the mid-19th century, tuberculosis accounted for as many as 60,000 children’s deaths per year. [4]

Literature from the period reflects the prevalence of children’s deaths in Victorian England. The dying child makes a frequent appearance in 19th-century novels. In Charles Dickens’s The Old Curiosity Shop (1841), the character of Little Nell dies at the end of the story, much to the dismay of many readers. When describing the scene to his illustrator, George Cattermole, the novelist wrote:

The child lying dead in the little sleeping room, which is behind the open screen. It is winter-time, so there are no flowers; but upon her breast and pillow, and about her bed, there may be strips of holly and berries, and such free green things. Window overgrown with ivy. The little boy who had that talk with her about angels may be by the bedside, if you like it so; but I think it will be quieter and more peaceful if she is alone. I want it to express the most beautiful repose and tranquility, and to have something of a happy look, if death can…I am breaking my heart over this story, and cannot bear to finish it. [5]

Though children died frequently during the Victorian period, a child’s death was still seen as particularly tragic. Even Dickens could not help but mourn the passing of his young, fictional character [depicted by Cattermole, right].

As a historian, I am often asked whether I would have liked to live in the past. My answer is always a resounding ‘NO!’ When you consider that only 40 per cent of children born in the 1850s reached their 60th birthday—and fewer than 10 per cent reached their 80th—I feel very lucky indeed to have been born in 1982.

My life expectancy is 78.

 

1. D. P. Helm, ‘”A Sense of Mercies”: End of Life Care in the Victorian Home’ (Master’s thesis, University of Worcester, 2012), p. 15.
2. Diary of Louisa Baldwin 1870, 26th April 1870. Baldwin papers. 705:775/8229/7 (ii), Worcestershire Record Office. Originally quoted in Helm.
3. The original manuscript is in the Darwin Archive of Cambridge University Library (DAR 210.13). You can find the entire transcript online here.
4. J. Lane, A Social History of Medicine: Health, Healing and Disease in England 1750‐1950 (London, 2001), p.142.
5. Letter from Dickens to Cattermole.

October 15th, 2013 | Casebooks

Dying the Good Death: The Kate Granger Story


Recently, I had the privilege of interviewing Kate Granger, a 31-year-old physician who was diagnosed with an aggressive form of sarcoma in 2011 and given less than 5 years to live.

Kate made headlines in British newspapers when she announced that she was going to tweet from her deathbed, using her own death as a communication tool for opening up discussions about mortality. Kate and I talked about everything from the ‘over-medicalisation’ of death in the 21st century, to how she wants to die at home surrounded by friends and family.

Click here to read the interview on Medium.

I hope you find her message as inspiring as I did.

October 9th, 2013 | Casebooks

Death Salon Cabaret: The Uncommon Corpse


I was 17 years old when I saw my first dissected body at a chiropractic school of medicine just outside of Chicago. Since then, I’ve seen thousands more, some more disturbing than others. There have been disembodied body parts floating in jars; whole bodies splayed open, covered in shellac and nailed to wooden platforms; even corrosion casts uncovering the tiniest details of the human vascular system. Each one has provoked a reaction inside me. Each one has deepened my resolve to share the stories of the people who died and the anatomists who cut open and preserved their bodies in earlier centuries.

On Friday, October 18th I’ll be talking about some of the more unsettling specimens I’ve encountered in my research at the Death Salon Cabaret in Los Angeles. The night will be hosted by Lord Whimsy, and will consist of fast-paced talks interspersed with artistic performances focusing on ‘the idea of the rare corpse that will not simply fade into the background of history…the bodies that were not lost to the ground or the pyre but insisted on staying longer and forcing the living to face their mortality.’

Speakers include Paul Koudounaris, author of The Empire of Death and Heavenly Bodies—two books which explore Europe’s catacombs and ossuaries with lavish photographs (see right); Jeff Jorgenson, owner of Elemental Cremation & Burial and advocate for green death practices; Bess Lovejoy, author of Rest in Pieces: The Curious Fates of Famous Corpses; Sarah Troop, host of The Cabinet of Curiosities Podcast; Joy Nash, Los Angeles-based actor; and Christine Colby, managing editor of Penthouse.

There will also be haunting musical performances by Gothic Beauty, Jill Tracy, and ‘Death Troubadour,’ Adam Arcuragi.

So if you find yourself in LA next month—and you aren’t faint of heart—come to the Death Salon Cabaret at the Bootleg Theatre. I promise you won’t be disappointed…though you may be advised not to eat before my presentation!

Rumour has it that even the Grim Reaper will be there for photos… I just hope he’s taller than me.

Click here to book your tickets

September 27th, 2013 | Casebooks

DO NOT SIT! A History of the Birthing Chair

I was standing on the second floor of Surgeons’ Hall in Edinburgh waiting for my film crew to begin rolling for my upcoming documentary, Medicine’s Dark Secrets, when I spied a chair (left) in the corner. At that point in the day, I was exhausted and my attention to detail was diminishing with each passing second. Heartened by the sight of a chair, I quickly made my way towards my desired rest stop. Just as I began my descent into blissful comfort, however, I noticed a sign with big bold lettering: Museum Object: Please DO NOT SIT!

Just seconds before plopping my full weight down onto an antique chair, I awkwardly manoeuvred myself back into a standing position and looked around to make sure no one had seen my faux pas.

Upon closer inspection, I realised how obvious my mistake had been. This was no ordinary chair. It had a semi-circle cut from the seat, and looked tremendously uncomfortable. Indeed, I’d have to sit with my legs straddling either side of this awkward contraption to even remain balanced on it.

This was an 18th-century birthing chair.


Today, the idea of giving birth while sitting upright in a wooden chair may seem torturous.  But long before delivery rooms, stirrups, forceps and foetal monitors, a woman gave birth at home in a chair with the aid of her midwife and other female friends, relatives and neighbours. These women were known as the ‘gossips’, for they spread the word to all the women in the community when another went into labour. The ‘gossips’ supported the mother-to-be during this time by praying with her, preparing special foods, and helping the midwife with any other menial tasks that needed doing.

When the time came, the pregnant woman would be propped up in the birthing chair. The midwife would sit below her, ready to catch the baby, while other women supported and comforted her from above. After the delivery, the exhausted mother would then be led back to her bed, which remained unsullied by the birth itself.

Over time, birthing became the purview of the medical community. Midwives were replaced by male-midwives (the precursor to the modern-day obstetrician), who introduced forceps into the delivery. Birthing chairs were modified to accommodate these changes. Take, for example, the one on the left. The arm and foot rests on this wooden chair could be adjusted for the mother’s comfort; and (most importantly), the back could fold down, converting it into a bed or an operating table—a necessary feature if forceps were to be used.

Birthing chairs were coveted pieces, and often passed down from generation to generation as family heirlooms. Little by little, however, the hospital became the locale of birth and eventually the chairs were discarded.

That said, many examples still exist today in museums around the world. Thinking back on the one in Edinburgh, I am comforted by the sign with its big, bold letters. Clearly, I was not the first to try to sit in the birthing chair; and I doubt I will be the last.

September 19th, 2013 | Casebooks

Renaissance Rhinoplasty: The 16th-Century Nose Job

[Image: Artificial nose, Europe, 1601-1800]

The 16th century was a particularly bad time for noses. In 1566, the famous astronomer, Tycho Brahe, had his sliced off during a duel and was forced to wear a replacement reportedly made of silver and gold. [1] Others lost theirs in similar fights, or to cancerous tumours that ate away the cartilage on their faces. But the biggest threat to noses during this period was the new disease sweeping through Europe: syphilis.

Before the discovery of penicillin in 1928, syphilis was incurable. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia and what today is known as ‘saddle nose’—a grotesque deformity which occurs when the bridge of the nose caves into the face and the flesh rots away.

As syphilis raged throughout 16th-century Europe, the ‘saddle nose’ became a mark of shame, symbolizing the victim’s moral and bodily corruption. Some, in desperation, turned to surgeons to help disguise their deformities. One man in particular was renowned for his skills: the Italian surgeon, Gaspare Tagliacozzi.

[Image: G. Tagliacozzi, De curtorum chirurgia per insitionem]

Before Tagliacozzi, most surgeons used the ‘Indian Method’ for nasal reconstruction. This involved partially cutting a nose-sized section of skin from the forehead, leaving it attached at the bridge of the nose to maintain a steady blood supply. The flap was then twisted into place and sewn over the damaged area, thus providing a suitable ‘replacement’ for the lost nose but leaving one’s forehead scarred.

Tagliacozzi had an entirely different approach. His process involved partially cutting a flap of skin from the upper arm, reshaping it into a nose, and then grafting it to the damaged nasal cavity (see right image). The patient’s arm would then be held in place using bandages for approximately 2 weeks while the graft attached itself to the face. Afterwards, the surgeon severed the new ‘nose’ from the arm and began reshaping and contouring the piece of skin.

A 16th-century contemporary described the surgery:

First they gave the patient a purgative. Then they took pincers and grabbed the skin in the left arm between the shoulder and the elbow and passed a large knife between the pincers and the muscle, cutting a slit in the skin. They passed a small piece of wool or linen under the skin and medicated it until the skin thickened. When it was just right, they cut the nose to fit the end of the little skin flap. Then they snipped the skin on the arm at one end and sewed it to the nose. They bound it there so artfully that it could not be moved in any way until the skin had grown onto the nose. When the skin flap was joined to the nose, they cut the other end from the arm. They skinned the lip of the mouth and sewed the flap of skin from the arm onto it, and medicated it until it was joined to the lip. Then they put a metal form on it, and fastened it there until the nose grew into it to the right proportions. It remained well formed but somewhat whiter than the face. It’s a fine operation and an excellent experience. [2]

The entire procedure could take up to 5 months, and no doubt caused considerable pain and discomfort to the patient during the process.

That aside, Tagliacozzi boasted of his skill, claiming that the noses he reconstructed were better than the originals. Yet when he died in 1599, so too did his method. Over the next several hundred years, surgeons continued to prefer the ‘Indian Method’ when performing rhinoplasty, on the grounds that Tagliacozzi’s technique left the new nose vulnerable to cold winters, when it often turned purple and fell off. On rare occasions, however, the ‘Italian Method’ was employed, as in the case of a soldier whose face was severely damaged in July 1944 (see above picture).

Unlike his surgical techniques, Tagliacozzi’s mantra persisted long after his death and is still quoted by modern-day plastic surgeons, who see him as the ‘father’ of their discipline:

We restore, rebuild, and make whole those parts which nature hath given, but which fortune has taken away. Not so much that it may delight the eye, but that it might buoy up the spirit, and help the mind of the afflicted. [3]

1. Recent forensic tests conducted on Brahe’s skeletal remains suggest that the nose may, in fact, have been made of copper.
2. Originally quoted in William Eamon, The Professor of Secrets: Mystery, Medicine and Alchemy in Renaissance Italy (2010), pp. 95-96.
3. G. Tagliacozzi, De Curtorum Chirurgia per Insitionem (1597).

September 4th, 2013 | Casebooks

A Morbid Chat with ‘The Chirurgeon’s Apprentice’

‘Death has never been more fashionable, if the popularity of Dr. Lindsey Fitzharris’s work is any guide. This glamorous medical historian’s morbid blog, The Chirurgeon’s Apprentice, has attracted 14,000 loyal adherents (and counting) in the space of 2.5 years, and she recently crowd-funded her own TV documentary to the tune of $32,000 in under 60 days. Medicine’s Dark Secrets begins filming in August.’

Read my interview with Teal Cartoons here.

July 26th, 2013 | Casebooks

YOU HAVE DIED OF DYSENTERY

Children of the 70s and 80s will likely remember Oregon Trail, the computer game where the player assumes the role of wagon leader and guides a group of settlers through the pioneer landscape of 19th-century America. You would hunt bison, shoot rabbits, ford rivers and pick up other settlers as you made your way from Missouri to Oregon. But just as you really got into the game, this would happen:

[Image: the game’s infamous ‘You have died of dysentery’ screen]

If you are like me, you probably shouted: ‘NOT AGAIN!’

So what exactly is dysentery, and why did you and all your settlers keep dying from it in Oregon Trail?

Dysentery is an intestinal inflammation that causes severe diarrhea, usually characterised by mucus or blood in the feces. Left untreated, the disease can lead to rapid loss of fluids, dehydration, and eventually death.

There are two forms of dysentery: one is caused by a bacterium, the other by an amoeba. The former is the more common in Western Europe and the United States, and is typically spread through contaminated food and water.

Outbreaks of dysentery were especially prevalent during wartime, when the disease spread rampantly through unhygienic camps. During the Mexican War (1846-48), a staggering 88% of deaths were due to infectious disease, overwhelmingly dysentery. For every man killed in battle, seven died of disease. The American Civil War was no better. You were more likely to die off the battlefield than on it, and dysentery was the primary cause. [1]

That said, civilians also died of dysentery with some frequency in the 19th century, especially those who were itinerant. Pioneers travelling the Oregon Trail wouldn’t have fared much better than soldiers fighting in war. They would have travelled in large groups—wagon after wagon trailing one another—and their access to clean water and food would have been severely limited. In 1853, one pioneer wrote in her diary: ‘Still in camp, husband and myself being sick (caused, we suppose, by drinking the river water, as it looks more like dirty suds than anything else)’.

Diseases such as tuberculosis, flu, measles and smallpox spread like wildfire through their crowded, makeshift camps. Dysentery would have been one of the leading causes of death amongst these pioneers, although it is difficult to determine just how many died from it as medical records were typically not kept.

What we do know is that roughly 20,000 people died travelling the 2,000-mile trail in the 19th century. To put that in perspective: there was an average of ten graves per mile. Burials were often hastily done right in the middle of the trail. This would allow wagons and animals to trample down the grave so that the scent of decomposition was erased and wolves wouldn’t feast on the remains. [2]

In another diary from the period, one pioneer writes: ‘A grave three feet deep and wide enough to receive the eleven victims [of a massacre] was dug, and the bodies placed in it. Wolves excavated the grave and devoured the remains…[Volunteers] gathered up the bones, placed them in a wagon box, and again buried them.’

So there you have it.  Life on the Oregon Trail was just as rough as the computer game would have us believe. Food was scarce. Roads were treacherous. And disease was rampant.

I will never again complain about the inconveniences of air travel.

 

1. V. J. Cirillo, ‘“More Fatal than Powder and Shot”: Dysentery in the US Army during the Mexican War, 1846-48,’ Perspectives in Biology and Medicine 52(3): pp. 400-13.
2. Statistics cited on National Oregon Trail/California Trail Center.

July 16th, 2013 | Casebooks