Death & Childhood in Victorian England

I remember many childhood days spent propped up on my grandmother’s couch with a tower of pillows. I’d watch the day peacefully unfold from her picture window. One month, it was bronchitis. The next, it was pneumonia. My mother—then a nursing student—rushed me in and out of doctors’ offices and emergency rooms, where I was poked, prodded and eventually sent home with a bag full of medications.

Principals were notified; classes were missed. Friends brought armfuls of heavy books and daily assignments to the house after school. I’d hear their voices in the other room but never see their faces. Contagion was always a risk.

This was the life of a sick child.

At the time, I felt incredibly sorry for myself. Why couldn’t I enjoy good health like the rest of my girlfriends? Why did I have to stay indoors day after day, and swallow pills that made me nauseated and dizzy?

Years on, however, I began to realise that I was actually very lucky.  This wasn’t just the life of a sick child. This was the life of a sick child in the 1980s.

Today, we often associate death with old age. But we don’t have to go back far in history to find a time when childhood was both dangerous and deadly.

Victorian children were at risk of dying from a host of diseases that we’ve since eradicated or can now control in the 21st century, like smallpox, measles, whooping cough, diphtheria and dysentery (to name just a few). Death was a common visitor to Victorian households, and the younger one was, the more vulnerable he or she would be.

In 1856, Archibald Tait—the future Archbishop of Canterbury—lost five children in as many weeks to scarlet fever. [1] When the fever wasn’t fatal, it nearly always weakened the child, who often died months or even years later from complications. Indeed, this is the fate of Beth in Louisa May Alcott’s famous book, Little Women (1868/9).

Tuberculosis was also a common killer in the 19th century. On 26 April 1870, Louisa Baldwin (mother of the future prime minister, Stanley Baldwin) wrote in her diary:

I paid a sad call at the Worths where 2 children seem to be at the point of dying, the poor terrible little baby has constant fits & little Madge two years old, who has been ill 12 days with congestion of the lungs. This is the second time I’ve seen them in this illness…we went into next door where we saw poor little Miss Lee evidently very near the end, but sweet and affectionate as ever. [2]

No one was immune. The great scientist, Charles Darwin, lost his 10-year-old daughter, Annie [left], to tuberculosis in 1851. In his personal memoir, the grief-stricken father wrote: ‘We have lost the joy of the household, and the solace of our old age…Oh that she could now know how deeply, how tenderly we do still & shall ever love her dear joyous face.’ [3] By the mid-19th century, tuberculosis accounted for as many as 60,000 children’s deaths per year. [4]

Literature from the period reflects the prevalence of children’s deaths in Victorian England. The dying child makes a frequent appearance in 19th-century novels. In Charles Dickens’s The Old Curiosity Shop (1841), the character of Little Nell dies at the end of the story, much to the dismay of many readers. When describing the scene to his illustrator, George Cattermole, the novelist wrote:

The child lying dead in the little sleeping room, which is behind the open screen. It is winter-time, so there are no flowers; but upon her breast and pillow, and about her bed, there may be strips of holly and berries, and such free green things. Window overgrown with ivy. The little boy who had that talk with her about angels may be by the bedside, if you like it so; but I think it will be quieter and more peaceful if she is alone. I want it to express the most beautiful repose and tranquility, and to have something of a happy look, if death can…I am breaking my heart over this story, and cannot bear to finish it. [5]

Though children died with grim regularity during the Victorian period, a child’s death was still seen as particularly tragic. Even Dickens could not help but mourn the passing of his young, fictitious character [depicted by Cattermole, right].

As a historian, I am often asked if I would have liked to live in the past. My answer is always a resounding ‘NO!’ When you consider that only 40 per cent of children born in the 1850s reached their 60th birthday—and fewer than 10 per cent reached their 80th—I feel very lucky indeed to have been born in 1982.

My life expectancy is 78.

 

1. D. P. Helm, ‘”A Sense of Mercies”: End of Life Care in the Victorian Home’ (Master’s thesis, University of Worcester, 2012), p. 15.
2. Diary of Louisa Baldwin, 26 April 1870. Baldwin papers, 705:775/8229/7 (ii), Worcestershire Record Office. Originally quoted in Helm.
3. The original manuscript is in the Darwin Archive of Cambridge University Library (DAR 210.13). You can find the entire transcript online here.
4. J. Lane, A Social History of Medicine: Health, Healing and Disease in England 1750-1950 (London, 2001), p. 142.
5. Letter from Dickens to Cattermole.

October 15th, 2013 | Casebooks | 9 Comments

Dying the Good Death: The Kate Granger Story


Recently, I had the privilege of interviewing Kate Granger, a 31-year-old physician who was diagnosed with an aggressive form of sarcoma in 2011 and given less than 5 years to live.

Kate made headlines in British newspapers when she announced that she was going to tweet from her deathbed, using her own death as a communication tool for opening up discussions about mortality. Kate and I talked about everything from the ‘over-medicalisation’ of death in the 21st century, to how she wants to die at home surrounded by friends and family.

Click here to read the interview on Medium.

I hope you find her message as inspiring as I did.

October 9th, 2013 | Casebooks | 6 Comments

Death Salon Cabaret: The Uncommon Corpse


I was 17 years old when I saw my first dissected body at a chiropractic school of medicine just outside of Chicago. Since then, I’ve seen thousands more, some more disturbing than others. There have been disembodied body parts floating in jars; whole bodies splayed open, covered in shellac and nailed to wooden platforms; even corrosion casts uncovering the tiniest details of the human vascular system. Each one has provoked a reaction inside me. Each one has deepened my resolve to share the stories of the people who died and the anatomists who cut open and preserved their bodies in earlier centuries.

On Friday, October 18th, I’ll be speaking at the Death Salon Cabaret in Los Angeles about some of the more unsettling specimens I’ve encountered in my research. The night will be hosted by Lord Whimsy, and will consist of fast-paced talks interspersed with artistic performances focusing on ‘the idea of the rare corpse that will not simply fade into the background of history…the bodies that were not lost to the ground or the pyre but insisted on staying longer and forcing the living to face their mortality.’

Speakers include Paul Koudounaris, author of The Empire of Death and Heavenly Bodies—two books which explore Europe’s catacombs and ossuaries with lavish photographs (see right); Jeff Jorgenson, owner of Elemental Cremation & Burial and advocate for green death practices; Bess Lovejoy, author of Rest in Pieces: The Curious Fates of Famous Corpses; Sarah Troop, host of The Cabinet of Curiosities Podcast; Joy Nash, Los Angeles-based actor; and Christine Colby, managing editor of Penthouse.

There will also be haunting musical performances by Gothic Beauty, Jill Tracy, and ‘Death Troubadour,’ Adam Arcuragi.

So if you find yourself in LA next month—and you aren’t faint of heart—come to the Death Salon Cabaret at the Bootleg Theatre. I promise you won’t be disappointed…though you may be advised not to eat before my presentation!

Rumour has it that even the Grim Reaper will be there for photos… I just hope he’s taller than me.

Click here to book your tickets

September 27th, 2013 | Casebooks | 7 Comments

DO NOT SIT! A History of the Birthing Chair

I was standing on the second floor of Surgeons’ Hall in Edinburgh, waiting for my film crew to begin rolling for my upcoming documentary, Medicine’s Dark Secrets, when I spied a chair (left) in the corner. At that point in the day, I was exhausted, and my attention to detail was diminishing with each passing second. Heartened by the sight of a chair, I quickly made my way towards my desired rest stop. Just as I began my descent into blissful comfort, however, I noticed a sign in big, bold lettering: ‘Museum Object: Please DO NOT SIT!’

Just seconds before plopping my full weight down onto an antique chair, I awkwardly manoeuvred myself back into a standing position and looked around to make sure no one had seen my faux pas.

Upon closer inspection, I realised how obvious my mistake had been. This was no ordinary chair. It had a semi-circle cut from the seat, and looked tremendously uncomfortable. Indeed, I’d have to sit with my legs straddling either side of this awkward contraption to even remain balanced on it.

This was an 18th-century birthing chair.


Today, the idea of giving birth while sitting upright in a wooden chair may seem torturous.  But long before delivery rooms, stirrups, forceps and foetal monitors, a woman gave birth at home in a chair with the aid of her midwife and other female friends, relatives and neighbours. These women were known as the ‘gossips’, for they spread the word to all the women in the community when another went into labour. The ‘gossips’ supported the mother-to-be during this time by praying with her, preparing special foods, and helping the midwife with any other menial tasks that needed doing.

When the time came, the pregnant woman would be propped up in the birthing chair. The midwife would sit below her, ready to catch the baby, while other women supported and comforted her from above. After the delivery, the exhausted mother would be led back to her bed, which remained unsullied by the birth itself.

Over time, birthing became the purview of the medical community. Midwives were replaced by male midwives (the precursors of the modern-day obstetrician), who introduced forceps into the delivery. Birthing chairs were modified to accommodate these changes. Take, for example, the one on the left. The arm and foot rests on this wooden chair could be adjusted for the mother’s comfort; and (most importantly), the back could fold down, converting it into a bed or an operating table—a necessary feature if forceps were to be used.

Birthing chairs were coveted pieces, and often passed down from generation to generation as family heirlooms. Little by little, however, the hospital became the locale of birth and eventually the chairs were discarded.

That said, many examples still exist today in museums around the world. Thinking back on the one in Edinburgh, I am comforted by the sign with its big, bold letters. Clearly, I was not the first to try to sit in the birthing chair; and I doubt I will be the last.

September 19th, 2013 | Casebooks | 34 Comments

Renaissance Rhinoplasty: The 16th-Century Nose Job

The 16th century was a particularly bad time for noses. In 1566, the famous astronomer, Tycho Brahe, had his sliced off during a duel and was forced to wear a replacement reportedly made of silver and gold. [1] Others lost theirs in similar fights, or to cancerous tumours that ate away the cartilage on their faces. But the biggest threat to noses during this period was the new disease sweeping through Europe: syphilis.

Before the discovery of penicillin in 1928, syphilis was incurable. Its symptoms were as terrifying as they were unrelenting. Those who suffered from it long enough could expect to develop unsightly skin ulcers, paralysis, gradual blindness, dementia and what today is known as ‘saddle nose’—a grotesque deformity which occurs when the bridge of the nose caves into the face and the flesh rots away.

As syphilis raged throughout 16th-century Europe, the ‘saddle nose’ became a mark of shame, symbolizing the victim’s moral and bodily corruption. Some, in desperation, turned to surgeons to help disguise their deformities. One man in particular was renowned for his skills: the Italian surgeon, Gaspare Tagliacozzi.

Before Tagliacozzi, most surgeons used the ‘Indian Method’ for nasal reconstruction. This involved cutting a nose-sized flap of skin from the forehead, leaving it attached at the bridge of the nose to maintain a steady blood supply. The flap was then twisted down into place and sewn over the damaged area, providing a suitable ‘replacement’ for the lost nose but leaving the forehead scarred.

Tagliacozzi had an entirely different approach. His process involved partially cutting a flap of skin from the upper arm, reshaping it into a nose, and then grafting it to the damaged nasal cavity (see right image). The patient’s arm would then be held in place using bandages for approximately 2 weeks while the graft attached itself to the face. Afterwards, the surgeon severed the new ‘nose’ from the arm and began reshaping and contouring the piece of skin.

A 16th-century contemporary described the surgery:

First they gave the patient a purgative. Then they took pincers and grabbed the skin in the left arm between the shoulder and the elbow and passed a large knife between the pincers and the muscle, cutting a slit in the skin. They passed a small piece of wool or linen under the skin and medicated it until the skin thickened. When it was just right, they cut the nose to fit the end of the little skin flap. Then they snipped the skin on the arm at one end and sewed it to the nose. They bound it there so artfully that it could not be moved in any way until the skin had grown onto the nose. When the skin flap was joined to the nose, they cut the other end from the arm. They skinned the lip of the mouth and sewed the flap of skin from the arm onto it, and medicated it until it was joined to the lip. Then they put a metal form on it, and fastened it there until the nose grew into it to the right proportions. It remained well formed but somewhat whiter than the face. It’s a fine operation and an excellent experience. [2]

The entire procedure could take up to 5 months, and no doubt caused considerable pain and discomfort to the patient during the process.

That aside, Tagliacozzi boasted of his skill, claiming that the noses he reconstructed were better than the originals. Yet when he died in 1599, so too did his method. Over the next several hundred years, surgeons continued to prefer the ‘Indian Method’ when performing rhinoplasty, on the grounds that Tagliacozzi’s technique left the new nose vulnerable to cold winters, when it often turned purple and fell off. On rare occasions, however, the ‘Italian Method’ was employed, as in the case of a soldier whose face was severely damaged in July 1944 (see above picture).

Unlike his surgical techniques, Tagliacozzi’s mantra persisted long after his death and is still quoted by modern-day plastic surgeons, who see him as the ‘father’ of their discipline:

We restore, rebuild, and make whole those parts which nature hath given, but which fortune has taken away. Not so much that it may delight the eye, but that it might buoy up the spirit, and help the mind of the afflicted. [3]

1. Recent forensic tests conducted on Brahe’s skeletal remains suggest that the nose may, in fact, have been made of copper.
2. Originally quoted in William Eamon, The Professor of Secrets: Mystery, Medicine and Alchemy in Renaissance Italy (2010), pp. 95-96.
3. G. Tagliacozzi, De Curtorum Chirurgia per Insitionem (1597).

September 4th, 2013 | Casebooks | 54 Comments

A Morbid Chat with ‘The Chirurgeon’s Apprentice’

‘Death has never been more fashionable, if the popularity of Dr. Lindsey Fitzharris’s work is any guide. This glamorous medical historian’s morbid blog, The Chirurgeon’s Apprentice, has attracted 14,000 loyal adherents (and counting) in the space of 2.5 years, and she recently crowd-funded her own TV documentary to the tune of $32,000 in under 60 days. Medicine’s Dark Secrets begins filming in August.’

Read my interview with Teal Cartoons here.

July 26th, 2013 | Casebooks | 1 Comment

YOU HAVE DIED OF DYSENTERY

Children of the 70s and 80s will likely remember Oregon Trail, the computer game where the player assumes the role of wagon leader and guides a group of settlers through the pioneer landscape of 19th-century America. You would hunt bison, shoot rabbits, ford rivers and pick up other settlers as you made your way from Missouri to Oregon. But just as you really got into the game, this would happen:

[Screenshot: the game’s infamous message, ‘You have died of dysentery.’]

If you are like me, you probably shouted: ‘NOT AGAIN!’

So what exactly is dysentery, and why did you and all your settlers keep dying from it in Oregon Trail?

Dysentery is an intestinal inflammation that causes severe diarrhoea, usually characterised by mucus or blood in the faeces. Left untreated, the disease can lead to rapid loss of fluids, dehydration and, eventually, death.

There are two forms of dysentery: one is caused by a bacterium, the other by an amoeba. The former is the more common in Western Europe and the United States, and is typically spread through contaminated food and water.

Outbreaks of dysentery were especially prevalent during wartime, when the disease spread rampantly because of the unhygienic conditions of the camps. During the Mexican War (1846-48), a staggering 88 per cent of deaths were due to infectious disease, overwhelmingly dysentery. For every man killed in battle, seven died of disease. The American Civil War was no better: you were more likely to die off the battlefield than on it, and dysentery was the primary cause. [1]

That said, civilians also died of dysentery with some frequency in the 19th century, especially those who were itinerant. Pioneers travelling the Oregon Trail wouldn’t have fared much better than soldiers fighting in war. They would have travelled in large groups—wagon after wagon trailing one another—and their access to clean water and food would have been severely limited. In 1853, one pioneer wrote in her diary: ‘Still in camp, husband and myself being sick (caused, we suppose, by drinking the river water, as it looks [more] like dirty suds than anything else)’.

Diseases such as tuberculosis, flu, measles and smallpox spread like wildfire through their crowded, makeshift camps. Dysentery would have been one of the leading causes of death amongst these pioneers, although it is difficult to determine just how many died from it as medical records were typically not kept.

What we do know is that roughly 20,000 people died travelling the 2,000-mile trail in the 19th century. To put that in perspective: there was an average of ten graves per mile. Burials were often hastily done right in the middle of the trail. This would allow wagons and animals to trample down the grave so that the scent of decomposition was erased and wolves wouldn’t feast on the remains. [2]

In another diary from the period, one pioneer writes: ‘A grave three feet deep and wide enough to receive the eleven victims [of a massacre] was dug, and the bodies placed in it. Wolves excavated the grave and devoured the remains…[Volunteers] gathered up the bones, placed them in a wagon box, and again buried them.’

So there you have it.  Life on the Oregon Trail was just as rough as the computer game would have us believe. Food was scarce. Roads were treacherous. And disease was rampant.

I will never again complain about the inconveniences of air travel.

 

1. V. J. Cirillo, ‘“More Fatal than Powder and Shot”: Dysentery in the US Army during the Mexican War, 1846-48’, Perspectives in Biology and Medicine 52(3): pp. 400-13.
2. Statistics cited on National Oregon Trail/California Trail Center.

July 16th, 2013 | Casebooks | 7 Comments

Buried Alive: 19th-Century Safety Coffins

In 1822, Dr Adolf Gutsmuth set out to conquer his fear of being buried alive by consigning himself to the grave in a ‘safety coffin’ that he had designed himself. For several hours, he remained underground, during which time he consumed a meal of soup, sausages and beer—all delivered to him through a convenient feeding tube built into the coffin.

Gutsmuth wasn’t the first to design something like this. Around 1790, Duke Ferdinand of Brunswick had the first safety coffin built; it included a window to let in light and a tube to provide a fresh supply of air. The lid of the coffin was then locked, and two keys were fitted into a special pocket sewn into his burial shroud: one for the coffin itself and one for the tomb.

The Germans were particularly ingenious when it came to safety coffins, patenting over 30 different designs in the 19th century. The best known was the brainchild of Dr Johann Gottfried Taberger: a system of ropes attached the corpse’s hands, feet and head to an above-ground bell.

Although many subsequent designs tried to incorporate this feature, it was by and large a failure. What Dr Taberger didn’t account for was the fact that the body begins to bloat and swell as it decomposes, causing it to shift inside the coffin. These tiny movements would have set the bells ringing, and visitors to the cemetery running.

Of course, it wasn’t just the Germans who were consumed with taphophobia. The writer and artist Adrian Teal recently told the story of an 18th-century Brit who had his coffin stored in the rafters of his house. And the American doctor Timothy Clark Smith was so fearful that he would catch the ‘sleeping sickness’ and be buried alive that he created a grave that continues to intrigue (and frighten) visitors to Evergreen Cemetery in New Haven, Vermont.

When Dr Smith died (aptly enough, on Halloween 1893), his body was interred in a most unusual crypt, his face positioned beneath a cement tube that ended at a piece of plate glass, allowing the unfortunate doctor to gaze upward in the event of his premature burial. Visitors to the cemetery used to report that they could peer down inside the grave and see Dr Smith’s decomposing head. Nowadays, all you can see is darkness and a bit of condensation (to see a photo, click here).

The American horror writer Edgar Allan Poe also seemed unnaturally preoccupied with thoughts of being buried alive, as the subject appears with some frequency in his writings. In the story ‘The Premature Burial’ (1844), he even describes a Taberger-like coffin with ‘a large bell [suspended from the roof of the tomb], the rope of which, it was designed, should extend through a hole in the coffin, and so be fastened to one of the hands of the corpse.’

If ropes were a failure, then the Russian Count Michel de Karnice-Karnicki’s design was an even bigger catastrophe. In 1897, he buried one of his assistants in order to demonstrate the features of his safety coffin. If the device detected movement from within, it was rigged to open a tube which would allow air to flow while simultaneously raising a flag and ringing a bell. Unfortunately, nothing went to plan and the demonstration failed miserably. The assistant survived. Karnice-Karnicki’s reputation did not.

If all this seems a bit superstitious to your modern sensibilities, consider the fact that safety coffins are still available for purchase today. In 1995, Fabrizio Caselli invented a model that includes an emergency alarm, two-way intercom, a flashlight, oxygen tank, heartbeat sensor and heart stimulator.

The fear of premature burial is far from dead, dear readers.

June 26th, 2013 | Casebooks | 25 Comments

Ray-Ban’s Predecessor? A Brief History of Tinted Spectacles

A recent conversation with Matthew Ward from History Needs You piqued my curiosity about a pair of spectacles in the Wellcome Collection [pictured left]. At first glance, you may think these oddly tinted glasses belong to the wardrobe department of a whimsical Tim Burton film. And yet, these glasses are over 200 years old, made not for the likes of Johnny Depp, but rather for an 18th-century gentleman.

This got me wondering: were these Georgian spectacles a precursor to modern-day sunglasses? Or were they something altogether different?

Here’s what I discovered.

In 1750, the optician, James Ayscough, began making double-hinged spectacles with tinted lenses, like the ones pictured above. Ayscough felt that white lenses created an ‘offensive glaring Light, very painful and prejudicial to the Eyes.’ Instead, he advised ‘green or blue glass, tho’ it tinge every Object with its own Colour.’ This would take ‘off the glaring Light from the Paper,’ and render ‘every Object so easy and pleasant, that the tenderest Eye, may thro’ it view any thing intently, without Pain.’ [1]

Were these avant-garde spectacles the Ray-Bans of their day? Not quite.

Ayscough didn’t devise these lenses to protect his patients’ eyes from the sun. Rather, he believed that white glass had a ‘softer Body than any other’ and therefore would ‘not receive so true a Figure in the polishing, as a Glass of a harder Nature.’ This resulted in a distorted lens full of ‘Specks and Veins’ which would only further impair a person’s already imperfect vision. Ayscough held tinted glass in such esteem that he even recommended it be used for the construction of telescopes and microscopes. [2]

While Ayscough’s double-hinged design was something of a new rage in the 18th century, he was not the first to use tinted glass when making spectacles (although he was one of the first to write extensively on the subject). Already by the mid-1600s, people were purchasing and wearing tinted glasses throughout England.

One such person was the famous diarist, Samuel Pepys.

Many people believe that coloured spectacles were prescribed to syphilitic patients who suffered from photosensitivity brought on by the advancement of the disease into the ocular region. There has been much speculation on whether Pepys—whose own brother died of syphilis in 1663—also suffered from lues venerea, and whether this led to his decision to purchase green tinted glasses from the spectacle-maker, John Turlington.

Although it makes for an intriguing tale, Pepys never mentions the glasses in relation to syphilis (nor does he allude to any syphilitic symptoms other than a mouth ulcer in 1660). Rather, he writes that his ‘eyes are very bad, and will be worse if not helped.’ And so on 24 December 1666, ‘I did buy me a pair of green spectacles, to see whether they will help my eyes.’ [3]

For Pepys, the purchase seems to have come from a desire to alleviate eye soreness and nothing else.

Moreover, a quick scan through 18th-century medical texts on syphilis reveals no mention of tinted glasses. Daniel Turner’s Syphilis: A Practical Dissertation on the Venereal Disease (1717) doesn’t even discuss eye-related disorders associated with the pox. By contrast, the author of the Treatise of the Venereal Disease (1789) correctly notes that syphilis can cause inflammation of the eye, but he offers no specific remedy for the condition. Similarly, William Buchan’s Observations Concerning the Prevention and Cure of the Venereal Disease (1796) makes no reference to coloured spectacles. Instead, Buchan recommends blistering plasters behind the ear or on the temple to alleviate ocular problems related to the advancement of syphilis.

It should also be noted that spectacles, like the ones featured in this article, would have been fairly expensive. Even if medical practitioners had offered them as treatment for photosensitivity brought on by ocular syphilis, the majority of those suffering from the disease would have been unable to afford them.

 

1. James Ayscough, A Short Account of the Nature and Use of Spectacles (1750), p. 13.
2. Ibid.
3. Diary of Samuel Pepys, vol. 48 (24 December 1666). For more on Pepys’s eye disorders, see Graham W. Wilson, ‘The Big Brown Eyes of Samuel Pepys’, Archives of Ophthalmology 120 (July 2002): pp. 969-975. For information on Pepys’s general health, see D. Powers, ‘The Medical History of Mr and Mrs Samuel Pepys’, The Lancet (1895): pp. 1357-1360.

June 21st, 2013 | Casebooks | 19 Comments

The Chirurgeon’s Apprentice gets a Facelift!


After 2.5 years of blogging, I’m excited to unveil a brand new look for The Chirurgeon’s Apprentice, complete with a new logo! Each of the symbols in the logo represents a story told on this site: A History of the Barber Pole; Syphilis: A Love Story; The Anatomy of a Broken Heart; and The Falciform Amputation Knife.

I hope you enjoy the website’s facelift as much as I do!

To kick things off, I’ll be posting a brand new article later this week. I don’t want to spoil the surprise. Let’s just say it’s appropriately suited to your morbid tastes.

June 10th, 2013 | Casebooks | 8 Comments