Overcrowded and Underfunded: 18th-Century Hospitals and the NHS Crisis

The problem of overcrowded hospitals in Britain is now an annually recurring one. Every year, especially in winter, operations are cancelled, treatments postponed and patients sent home because there simply isn’t bed space for them. A combination of increased admissions of the elderly in the winter months, seasonal outbreaks such as flu and norovirus, and the impact of weather-related accidents all serves to pile pressure onto an already embattled healthcare system.

Embattled Doctor!

According to the BBC, NHS and social care services are ‘at breaking point’, with an open letter warning the government that ‘things cannot go on like this’ (http://www.bbc.co.uk/news/uk-29501588). The story is now a perennial one. Every year (and in fact every couple of months) a mix of underfunding, overcrowding and staff stress puts the NHS in the headlines. Winter almost always exacerbates the problem. A year ago the outgoing NHS Chief Executive, David Nicholson, warned that the “toxic overcrowding” of accident and emergency departments in Britain not only affected service levels but could have far more serious consequences, including higher levels of patient mortality and unsustainable levels of staff stress. The president of the College of Emergency Medicine went even further, stating that the whole system was sailing dangerously close to complete failure. With the Daily Telegraph claiming that many patients were afraid to ask for help from staff pushed almost to their limits, the United Kingdom is perhaps still in the midst of what that newspaper last year called “David Cameron’s care crisis”.

Image from http://www.TheGuardian.com

It is indeed easy to think of this situation as a uniquely modern one, linked to the seemingly continual squeeze on budgets. Surely this wouldn’t have happened in the past, when well-run hospitals staffed by starchy matrons ran their (spotlessly clean) wards with military precision? In fact, if we peer back through time to hospitals even before the NHS, the situation can look remarkably familiar.

In 1772 Dr John Sharp, a philanthropist and trustee of the charity established by the late Lord Crewe, founded a charitable infirmary in the impressive medieval castle at Bamburgh on the north-east coast of England. Sharp’s brother William was a celebrated surgeon at St Bartholomew’s Hospital in London, and so the infirmary was able to benefit from the advice of a top medical man. As such it was equipped with the latest medical technologies, from mechanically operated hot and cold seawater baths to electrical machines and even an infirmary carriage to take invalid patients down to the beach for a restorative dip. Compared with many other institutions of the day, this was state of the art.

Dr Sharp

Many hospitals of the time relied on subscriptions – donations by wealthy benefactors – for their building and running. To be admitted, a patient required a letter of recommendation from a subscriber, so it was very difficult just to turn up and ask for treatment. Bamburgh was different. Funded completely by the charity, it had an open surgery – effectively an accident and emergency centre – at weekends, which meant that anyone, but especially the poor, could attend and be seen with relative ease. A quick note from a local clergyman confirming their status as a poor ‘object’ was sufficient. Unsurprisingly, though, this very accessibility made it extremely popular.

In the first year of the charity, the number of patients through its doors was a modest 206. By 1775 this had more than doubled, and in 1781 it treated 1106. By the end of that decade, the infirmary was regularly treating more than 1500 patients a year, and expending more than £250 annually on treatments and drugs. As well as outpatients, the infirmary contained around 20 beds. To give some perspective, these numbers were at times comparable with some of the ‘flagship’ hospitals in major Georgian towns such as Bath and Birmingham.

Bamburgh Castle

A staff consisting of a surgeon, two assistants and several ancillary workers alone catered for this influx of patients. On any given attendance day between 60 and 100 patients could be seen, and this put immense strain on both facilities and staff. In 1784 a freezing winter and ‘melancholy weather’ caused many poor people to perish, and admissions to rise dramatically. Outbreaks of infection also increased the pressure. The ‘malignant smallpox’ in neighbouring parishes was a constant threat to families, while the winter of 1782 also brought an outbreak of influenza at the neighbouring military barracks at Belford. This elicited a plea for infected soldiers to be treated at Bamburgh – a request declined by Dr Sharp for fear of infecting the rest of his patients.

The resident surgeon, Dr Cockayne, keenly felt these increasing pressures. Writing to Dr Sharp in the 1780s, he noted both the continual increase in his duties and the ‘vast number of patients admitted’, all of which added to his great worry and trouble. In the politest possible terms he asked for a rise in his wages – a request that saw him move from ad hoc payments to a permanent salary.

The overcrowding at Bamburgh certainly chimes with the problems faced by the NHS on a daily basis. In simple terms, there are too few staff to look after too many patients. The demands of an ever-changing medical environment increase the workload for staff, leading in turn to further questions about pay and conditions. But it is interesting to consider that, while Bamburgh infirmary faced the same socio-medical pressures as hospitals do today, the question of funding was markedly different. Bamburgh was a well-funded institution. It had abundant money to spend on facilities and equipment, and did so. And yet the pressures of increasing numbers, and the unpredictability of admissions, still threatened to overwhelm it. Does this suggest that at least some problems are not simply reducible to finance?

Many suggestions have been put forward, from streamlining the allocation of beds to increasing the range of conditions treatable by pharmacists and GPs, and even treating some conditions in patients’ own homes. Whatever the answer, it is clear that hospital overcrowding is not a new problem. Medical professionals in the past were all too familiar with the challenge of meeting increasing and uneven demand with limited resources.

Sit up Straight! Bad posture and the ‘Neck Swing’ in the 18th century.

Posture is a problematic issue for medicine. Doctors have established links between ‘bad’ posture and all manner of conditions, from spinal curvature and back pain to nerve damage and headaches, and slouching is consequently high on the government’s hit list. Why? Let’s be clear about it: back pain is as much an economic issue as a medical one. While there are clearly many causes of back pain, the BBC recently calculated that it costs the NHS over £1.3 million every day. Add to that the costs to businesses of lost working time and it is painful to the economy as well as the body. It is no coincidence that the NHS website has a number of micro-sites dedicated to suggestions for improving the way we sit and stand.

Posture chart

All manner of devices can be bought with the aim of straightening us up. Leaf through the pages of those glossy little free catalogues that often appear in the post (the ones which routinely offer walk-in baths, shooting sticks and things for kneeling on in the garden…you get the picture?) and you’ll notice a panoply of postural devices. There are corsets to force you back into position, as well as all manner of back braces to pull your shoulders back. You can buy cushions for your favourite chair that encourage you to sit in a ‘better’ (for which read less comfortable!) way, as well as special chairs that encourage you to kneel. There are even, more recently, desks to stand at instead of slumping in front of the PC screen. All tested. All clinically proven. All, usually, very expensive. Someone is making money off our drooping shoulders and crooked spines.

But devices to make us sit or stand ‘straight’ are certainly nothing new. As we’ll see, the Stuarts and Georgians got there first with a variety of more or less painful solutions. What have changed are attitudes towards posture. For the Georgians, posture was partly medical, certainly, but perhaps more of a social and cultural issue. Put another way, the ‘polite’ body stood straight and tall; to hunch over was unnatural and uncouth.

In the eighteenth century, the ideal body was straight and well proportioned. But even a cursory glance around the inhabitants of a Georgian town would confirm that many – perhaps the majority – were very far from this ideal. Vitamin deficiency caused by poor diet stunted growth, while a variety of diseases experienced through life could leave their mark. Accidents and broken bones might be cursorily treated, but a broken leg could easily leave a person with a limp. Many conditions that today are easily treatable were then left to run rampant through the body. All this meant that the ‘standard’ Georgian body was often far from the ideal.

It would be easy to assume that people simply accepted their lot and got on with their lives. Doubtless many did. But the eighteenth century also witnessed an increasing willingness to shape the body to try and bring it more in line with this elusive ideal. In 1741, Nicholas Andry published his famous ‘Orthopédie’, in which he likened the human body to a tree, which needed support as it grew and, later, as it declined. His famous image of the so-called ‘Tree of Andry’ illustrates this well.

The eighteenth century was a golden age of corrective devices. Just like today, you could buy a vast number of corsets and stays, which aimed not only to correct medical deformities, like ruptures, but to help women meet the most fashionable body shape of the day: a minuscule waist and broad bust. The experience of wearing some of these devices must have been at best uncomfortable and, at worst, excruciating.

There were, for example, steel ‘backs’ – large plates of metal inserted and lashed inside the back of the wearer’s clothing, which ‘encouraged’ them to stop slouching. Metal ‘stays’ gave the illusion of a harmonious form while simultaneously forcing the sufferer’s body back into a ‘natural’ shape. Here’s a typical advert from an eighteenth-century newspaper showing the range of available goods:

“London Daily Post and General Advertiser, February 16th, 1739
‘This is to give NOTICE
THAT the Widow of SAMUEL JOHNSON, late of Little Britain, Near West Smithfield, London, carries on the Business of making Steel Springs, and all other Kinds of Trusses, Collars, Neck Swings, Steel-Bodice, polish’d Steel-Backs, with various Instruments for the Lame, Weak or Crooked.
N.B. She attends the Female Sex herself”

Perhaps some of the most uncomfortable devices were those to correct deformities of the neck. For both sexes, having a straight neck was extremely desirable. For men, keeping the chin up was a sign of masculine strength, poise and posture. Those who slouched were mumbling weaklings, destined never to get on in business or the social sphere. The allure of the soft female neck, by contrast, lay in its swan-like grace; a crooked neck ruined the illusion of femininity and threatened the chances of a good match.

Sheldrake illustration

Many makers supplied products to help sufferers of neck problems. Metal collars, hidden under clothing, forced the chin up; if it sagged, it would rest on an uncomfortably hard metallic edge. Perhaps the most extreme of these devices, however, was the ‘neck-swing’, supposedly introduced into England from France by one ‘Monsieur Le Vacher’. This heavy apparatus fitted around, and supported, the wearer’s head and neck, after which they were suspended, feet off the ground, in an effort to elongate the spine and promote a straighter back. We have a unique testimony from someone who tried it. She described how, every morning, she was “suspended in a neck-swing, which is merely a tackle and pulley fixed to the ceiling of the room; the pulley is hooked to the head-piece of the collar, and the whole person raised so that the toes only touch the ground”. In this awkward position she remained, sometimes for long periods of time.

There seemed to be something of a vogue for postural devices in the eighteenth century, to the extent that they even entered popular culture. In the anonymous Village Memoirs: In a Series of Letters Between a Clergyman and his Family in the Country, and his Son in Town, the titular clergyman noted the vagaries of bodily fashions: “To remedy the ill effects of a Straight line, an uniform curve is now adopted – but alteration is not always improvement – and it reminds me of the conduct of the matron, who, to prevent her daughter from dropping her chin into her bosom, threw it up into the air by the aid of a steel collar – Hogarth’s Analysis has as yet been read to very little purpose”.

It is therefore interesting to note how the dialogue of posture has changed over time. Georgian postural devices sought to return the body to a state of nature, or to meet an ideal of appearance. While they certainly encouraged the return of sufferers to productivity, this was less important than creating the impression of a harmonious whole. Today it might be argued to be the other way around. While cosmetic appearance is undoubtedly important, the emphasis is firmly upon health and minimizing the pressure on a creaking health service. Our impressions of the body rarely remain static. How will the bodily ‘ideal’ translate itself in future?

17th-century remedies and the body as an experiment

I have long argued that, for people in the past, the body was a site of experiment. Today, we are constantly told that medicines should be handled with caution. In the (usually terrifying) leaflets that accompany most medicines, we are told in great detail how to use them, how not to use them and, most worryingly, the list of possible side-effects, which often seem to outweigh the benefits. One of the potential side-effects listed in my box of mild painkillers, for example, is a headache…the very reason I usually head for the painkillers! But medicines, say the manufacturers, should only be used as directed by a medical professional. Care should be taken with the dosage, and they should not be used for more than a few days. If symptoms persist, head for the nearest A&E and don’t book any holidays!

Image from http://www.theboredninja.com

We are a society that is certainly prepared to self-dose – something attested to by the shelves full of proprietary medicines in modern pharmacies. Indeed, there is a broader issue of distrust of modern biomedicine, leading people to try out alternative remedies and healers. The resurgence of medical herbalism in recent years, the popularity of herbal ‘magic bullets’ from Royal Jelly to Glucosamine, and treatments from acupuncture to yoga all attest to our willingness to consider alternatives.

Medicines

But all of these ‘alternatives’ are controlled. When we buy over-the-counter remedies they are generally mild and, unless deliberately consumed in large quantities, not dangerous. They are also strongly regulated, and have to pass years of testing before they make it onto the shelves. Alternative therapies are now generally regulated too, with professional practitioners able to gain formal qualifications and endorsements, while herbal medicines from health food shops are subject to increasing regulation and scrutiny. All in all, while we certainly consider alternatives, we do so within a defined, controlled and measured environment.

Early-modern people, however, held a different view both of their bodies and of how medicine worked. In their view, medicine was a process, and one that required continual experimentation to find what worked and what didn’t. Even a cursory glance over an early-modern remedy collection confirms this. Some remedies are highlighted – sometimes by a pointing hand or a face – to signify their value. Sometimes words like ‘probatum’ (it is proved) attest to their efficacy, or even notes like ‘this cured me’ or, my favourite, the simple ‘this I like’. Others, however, were clearly unsuitable and might be crossed out many times with thick strokes, highlighting the dissatisfaction of the patient.

A page from Wellcome Library MS 71113, p.10. See article by Elaine Leong at http://recipes.hypotheses.org/tag/lady-anne-fanshawe

It is worth mentioning that the whole concept of ‘working’ has shifted over time. Today, a remedy ‘works’ if it makes us feel better. In the seventeenth century, however, a medicine ‘worked’ if it had an effect. Therefore if a purgative was taken as a measure against, say, a cold, then provided it made the subject purge it was regarded as having ‘worked’, regardless of whether the cold got better. In this sense medicine was experimental. People consistently adapted, modified and changed recipes, adding or replacing substances, until they found something they were happy with.

This process of experimentation was, though, potentially deadly. Use too much of the wrong type of herb, plant or substance, and the results could be truly dangerous. It is often forgotten that plants are full of chemicals; it is entirely as easy to suffer an overdose using plant material as it is with modern tablets. The contents of early-modern remedies are often the butt of jokes. The use of everything from animal matter, live or dead, to breast milk and spiders’ webs is difficult to fathom from several centuries’ distance, even though it was perfectly logical to people at the time. In fact, little actual work has yet been done to assess exactly how much damage could potentially be done by people using things like animal or human dung in their efforts to make themselves better. It would be interesting to work out the levels of various compounds in some medical remedies, to gauge their potential for harm. This is not helped by the often vague doses provided in recipes. Whilst some directions might be fairly specific in terms of weight measurements, others might rely on including ‘as much as will lye on a sixpence’ or, worse, a handful. Depending on the size of the recipe-preparer’s hand, this could vary considerably!

But this experimentation also meant that virtually everyone was a scientist, involved in testing and measuring remedies against their own bodies. In some cases, though, the element of experiment was literal. Many elite gentlemen pursued an interest in science, and especially chemistry, as part of their wider intellectual pursuits. In the early 1700s, the wealthy London lawyer John Meller, latterly of Erddig in Flintshire, kept a notebook entitled ‘My Own Physical Observations’ in which he recorded details of his chemical experiments – sometimes conducted upon himself! Some of his experiments, for example, appear to be related to finding substances with which to purge himself. On more than one occasion he seems to have gone too far and suffered the consequences. We can only imagine the circumstances which led him to record that one purge had “proved too hot” for him!

17th-century toilet from Plas Mawr, Conwy (image from education.gtj.org.uk)

Our early-modern ancestors were arguably more in tune with their bodies than we are today. They continually sought new ways to relieve themselves of illnesses and symptoms, accumulating those that seemed to make things better and discarding the rest. Whilst we also do this to a degree, the stakes were much higher for them. We are protected by the various safeguards in place, and also perhaps by a reluctance to put our own health at risk. Many early-modern remedies must, though, have been harmful, and some might have resulted in permanent damage to internal organs, or even death.

Sickness and medicine are often referred to in military terms, with ‘magic bullet’ cures helping people to ‘battle’ their illnesses. In a sense, though, our forebears were engaged in single combat, each remedy, each experiment, carrying both high risk and high reward. Remember this the next time you reach for your packet of painkillers!

Name and Shame: performance and reputation in early modern medicine

Last week the issue of the performance of surgeons came under scrutiny. The health secretary, Jeremy Hunt, threatened to ‘name and shame’ any surgeons who refused to publish their performance data, including mortality rates, in league tables (http://www.bbc.co.uk/news/health-22899448). Surgeons have raised many objections to the plans, including the potential stigmatisation of those seen as under-performing (an issue that is itself problematic, say some authorities), the potential for misleading figures and, no doubt, a dent to professional pride.

The measurement of performance – at least in the sense of quantitative measurement and aggregation – together with the publication of results, is a modern phenomenon in the medical profession. It is interesting, then, to consider the issue of performance, and of public perceptions of medical practitioners, in the past.
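
As a modern aside, the aggregation at stake here is computationally trivial – the debate is about the meaning and publication of such figures, not their feasibility. Here is a minimal sketch in Python, with invented surgeons and outcomes purely for illustration, of the kind of calculation that sits behind a mortality league table:

```python
from collections import defaultdict

# Hypothetical (surgeon, patient_died) outcome records - invented data
records = [
    ("Surgeon A", True), ("Surgeon A", False), ("Surgeon A", False),
    ("Surgeon B", False), ("Surgeon B", True), ("Surgeon B", True),
]

totals = defaultdict(lambda: [0, 0])  # surgeon -> [deaths, operations]
for surgeon, died in records:
    totals[surgeon][0] += died        # a bool counts as 0 or 1
    totals[surgeon][1] += 1

# Rank by mortality rate, lowest first
league = sorted(totals.items(), key=lambda kv: kv[1][0] / kv[1][1])
for surgeon, (deaths, ops) in league:
    print(f"{surgeon}: {deaths}/{ops} operations fatal ({deaths / ops:.0%})")
```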

In the early modern period, for example, reputation was most certainly a central factor in people’s choice of medical practitioner. They wanted at least some reassurance that the man about to lance their boil or cut them for the stone was not some cack-handed amateur who would leave them bleeding to death on the kitchen table. But reputation worked at a deeper level than this. In rural communities, for example, people effectively became healers by reputation; once a cure had been attributed to them, word of the healer’s power would spread and their position would be cemented. This was generally the way that so-called ‘cunning folk’ and ‘irregular’ healers gained prominence.

It is interesting, though, to consider early-modern perceptions of ‘performance’. If we were to apply a modern measure to seventeenth-century practitioners, what sorts of figures would emerge? In truth, we have no means of accurately measuring the ‘figures’ for early modern doctors. Beyond parish registers there were no official records of causes of death outside London (where the Bills of Mortality at least provide something), and nothing like today’s patient records from which to infer case histories. Some physicians did keep casebooks, and these can often reveal interesting stories, but not enough to aggregate.

What does seem likely, though, is that, at least by modern measures, 17th-century doctors were probably highly ineffective. Mortality rates, at least for surgeons, were undoubtedly far greater than today. Major surgery (such as opening the chest cavity) was seldom attempted because of the overwhelming risk of losing the patient. Before anaesthetic, any surgical intervention was risky, whether from hypovolemic shock as the body lost too much blood, from the physical trauma of the pain and the wound itself or, perhaps even more so, from secondary infection after surgery due to unwashed hands and instruments, and dirty conditions. Even relatively minor procedures such as bloodletting carried the risk of introducing infection, and a certain number of deaths must surely have been attributable to blood poisoning or infection caused in this way.

All of this raises the question of why, if it was so risky, people elected to visit surgeons at all. Why did some surgeons, especially into the eighteenth century, gain prominence and even fame if they stood a fair chance of killing their patients? Surely people would not have given them the time of day if it were proved that they were responsible for the deaths of far more people than they saved?

The answer is that people simply had a different expectation of what medicine and surgery could do for them. This was a world of sickness in which the patient, while by no means powerless, relied on an array of defences to support them in their fight to return to health. These included domestic medicine, family and friends, books (if they were literate) and also medical practitioners. Rather than one consultation with one general practitioner, as today, people commonly consulted many healers until they found one they were happy with. They might combine treatments and seek the opinions of several, whilst still falling back on their own tried and trusted remedies.

But did they expect practitioners to heal them? They certainly hoped that they would, but also understood that they might not. Let’s imagine for a moment that an early-modern person learned that the mortality rate amongst the patients of their prospective surgeon was in excess of 70%. A surgeon with those sorts of rates in today’s league tables might well not last long on the register. But a seventeenth-century person might well view things a bit differently. Whilst acknowledging the potential danger, they could well view this as a risk worth taking – a last-ditch effort to make them well again.

This explains why people went to doctors at all, and brings us back to reputation. If a practitioner had healed at least some people then they were potentially worth visiting. The fact that many people died under their ‘care’ was not necessarily viewed as their fault; it was an artefact of living in what everyone acknowledged were dangerous times for the sick. Doctors who had had even some success were therefore a potential lifeline. More than this, they could be held up as figures of approbation, despite what might now look like a solid record of not curing! What they did offer, however, was some degree of hope where otherwise there might be none. In that case, half a loaf was better than none.

Performance, even today, relies on much more than bare statistics. The reputation of practitioners is still important; we would all ideally want to see the ‘best’ specialist or the most eminent surgeon. It is worth considering, though, how statistics can only tell part of the story, and the ways in which our perceptions of reputation have shifted over time.

Norovirus and the reporting of epidemics through history

This winter has already witnessed an unprecedented increase in cases of norovirus – the so-called ‘winter vomiting bug’. For some reason, across the globe, the infection has spread with increasing virulence and also lingered longer than normal in parts of the world now moving from spring to summer. Norovirus is an especially durable and adaptable virus. It is perfectly suited to what it does, spreading from person to person either through airborne contact with minute particles of vomit, or through surface contact with the virus…on some surfaces it can last for up to two weeks. Given that I have a pathological phobia of vomiting, this one is the stuff of nightmares!

In Britain, the Health Protection Agency is the public face of public health, charged with providing a virtual barometer of sickness. Its website contains a list of the current maladies doing the rounds and, in the case of flu and norovirus, weekly updates on the numbers of the stricken. The site also contains tips on how to prevent the spread of the virus, and some advice (if little comfort) for those who have already succumbed.

To my mind, the information on the HPA website is extremely reminiscent of the information disseminated to the public in past times of epidemic disease – say, the seventeenth-century plagues. It strikes me that authorities throughout history have had to balance the need to provide practical details of encroaching sickness with the need to avoid spreading panic. The language of sickness reporting in fact has a long history, and shows remarkably similar patterns.

The reporting of the numbers of sufferers, for example, was certainly an important element in the way the Great Plague of 1665 was recorded. In seventeenth-century London, the so-called ‘Bills of Mortality’ gave a weekly update on deaths in the city, in the form of a published pamphlet. Information for these pamphlets was gleaned from the ‘searchers of the dead’ – people (often women) who were employed to examine fresh corpses to discern the cause of their demise. Their diagnoses were diverse. In one bill dating from 1629, the causes range from predictable conditions such as measles, cold and cough, and gout to other, stranger ones such as ‘teeth and worms’, ‘excessive drinking’ and ‘suddenly’!

As the plague increased, though, the Bills of Mortality rapidly became dominated by plague deaths, and Londoners pored over the pages every week to gauge the seriousness of the situation. News of the contagion was a regular topic of conversation, and people were eager to learn if things were getting better or worse. The newly burgeoning cheap presses of the mid seventeenth century went into action, producing everything from treatises on the causes of the plague to ‘strange newes’ about the latest outbreaks or figures, and even popular cures.

The authorities were clearly worried about the danger of epidemic sickness, and took measures to try and limit its spread. One of these was to restrict popular gatherings such as fairs, to prevent the disease running rampant. A royal proclamation from 1637, for example, entreated people not to attend the popular Sturbridge Fair that year, the king ‘Forseeing the danger that might arise to his subjects in generall’.

So, the authorities published the numbers of sufferers, took preventative measures against the spread of contagion and, in general, maintained a dialogue with the public, updating them on the types of disease in circulation and on potential ways to avoid them. The popular press also served to stir up fears, however, and perpetuated public dialogue about infection. Disease and health have always been topics of conversation but, in times of contagion, they become more concentrated, and people become more engaged in dialogue about them.

Fast forward to 2013 and it is remarkable how similar the situation still is. The HPA website, for example, gives a weekly update on the numbers of norovirus sufferers – not only clinically reported cases, but an estimate resting on the assumption that for every reported case there are a further 288 or so unreported ones: people who simply decide to stay home and self-medicate. Indeed, at the present time, people are being actively discouraged from attending doctors’ surgeries, and hospital wards are being closed to the public. The impression is one of a wave of contagion breaking over the British Isles and, for me at least, one that is coming to get me!
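
That multiplier makes the arithmetic easy to reproduce. Below is a minimal sketch in Python of how such an estimate scales reported cases up to a community total; the 1:288 ratio is simply the assumption quoted above, and the weekly figure is invented for illustration, not real surveillance data.

```python
# Minimal sketch of the scaling assumption described above.
# The 288 multiplier is the figure quoted in the text; the weekly
# count below is invented for illustration, not real HPA data.

UNREPORTED_PER_REPORTED = 288  # assumed unreported cases per reported case

def estimate_community_cases(reported: int) -> int:
    """Scale clinically reported cases to an estimated community total."""
    return reported * (1 + UNREPORTED_PER_REPORTED)

reported_this_week = 3000  # hypothetical figure
print(f"{reported_this_week:,} reported cases imply "
      f"roughly {estimate_community_cases(reported_this_week):,} in total")
```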

There is indeed a fine line to tread between reporting facts and sparking panic. When SARS first emerged, there was a great deal of information (and misinformation), with various ‘experts’ calling it everything from a massive threat to humanity to simply the latest in the long procession of epidemics to afflict humankind. A few years ago, a virtual global panic was instigated by the apparent mutation of avian flu, or bird flu. This outbreak made ‘pandemic’ the buzzword of the late 2000s and, again, much space was devoted (and indeed still is, to some degree) to educating people on what it is, who has got it, and how to avoid it. In 2005, a UN health official warned that bird flu was capable of killing 150 million people worldwide. According to Dr David Nabarro, speaking to the BBC at the time, “It’s like a combination of global warming and HIV/Aids 10 times faster than it’s running at the moment.” The World Health Organization, perhaps seeing the potential panic that this could cause, immediately distanced itself from the comment. The fact that the outbreak was ultimately relatively mild emphasises the problem that epidemic disease causes for health officials. How to alert people without scaring them?

None of this is helped by the press who, like their seventeenth-century counterparts, are keen to give the largest mortality figures, or to emphasise the spread of diseases. In June 2012, for example, Reuters was still warning that a global bird flu pandemic could happen at any moment (http://www.reuters.com/article/2012/06/21/us-birdflu-pandemic-potential-idUSBRE85K1ES20120621).

The same pattern is now playing out with norovirus – although clearly this does not carry the same level of danger; here we are talking about contagion rather than mortality. Take the headline in the Western Mail newspaper of 20th December: “Norovirus: Now more wards are closing as hospitals in Wales hit”. The breathy style of this banner line emphasises speed – this is not just a straight report; ‘NOW’ it’s coming. What purpose do these reports ultimately serve? Put another way, why do we need to be told? Logically, if preventative measures are possible then it makes sense to tell as many people as possible. But often this is not the purpose of newspaper copy in times of sickness, which, to me, seems at times to be deliberately provocative.

The answer seems to be a deeply-set human interest in sickness, ultimately linked to our own mortality. Even in this apparently scientific and modern age of medicine, there are still many things which are incurable, and many diseases which have the ability to wipe us out at a stroke. It is this uncomfortable reality which perhaps continues to fascinate and frighten us. We live in an age of control, but some things are still beyond our control, and it is perhaps this innate fear of disease – of our own transience – which makes these headlines ultimately so compelling.

The NHS Bill – an historical perspective.

At the very least, the NHS Bill is provoking lively and vigorous debate. Just the other week, the proposed legislation was referred to by Ed Miliband as “David Cameron’s Poll Tax”! Objections to the proposed changes are too many and too wide-ranging to explore in detail here. But, succinctly, the main bone of contention lies in the expansion of outsourcing of NHS services to private companies – in effect the privatisation (‘modernisation’, some prefer) of the NHS – and its possible effects upon the quality and cost of patient care in England. But just what is it about privatisation in the Bill that worries people?

Privatisation is certainly a loaded term; for some it carries the implicit assumption that something will be lost in the process – that things could get worse for consumers rather than better. Are we perhaps somehow resentful of the loss or degradation of once-proud institutions like the Post Office and the NHS? Given that the latter only dates from 1948, this seems less likely, although there is certainly a residual fondness for what has been, for the most part, a success story of public health.

It is worth considering the provision of healthcare in Britain in the past, especially with regard to private enterprise. Four hundred years ago, the concept of public healthcare simply did not exist – this was the original ‘medical marketplace’. How, then, did this manifest itself in the sickness experiences of our forebears? How did these proto-consumers of healthcare cope, and what types of medicine and practitioner were available to them? What, ultimately, can we learn from them?

The early modern period was characterised by a diversity of medical service providers. These included university-trained and licensed physicians, who often catered for wealthy clients and were largely based in large towns and cities. Surgery was a separate branch of medicine, while apothecaries, although nominally banned from doing so, also provided medical advice alongside remedies and ingredients, being more accessible and more affordable for many people. At a local level was an undifferentiated mass of medical practitioners, ranging from specialists, such as oculists, bonesetters and wart-charmers, to travelling ‘doctors’ who would claim to cure anything from toothache to the ‘itch’ for a few pennies. Even the local blacksmith could be called upon to knock out a rotten tooth.

This was a true consumer market, with a massive variety of choices for the early modern patient. Most people self-medicated. Some grew their own herbs, but many remedies and ingredients were available locally, even in rural villages. Surprising as it might sound, given our perceptions of contemporary living conditions, maintaining a healthy lifestyle was also important. People invested in healthy ‘regimens’ – daily steps to staying fit, from fresh air and exercise to early modern equivalents of the tonic or health drink.

So if medicine in the early modern period was fully private, was it better? Clearly, conditions in the seventeenth century differed markedly from those which the proposed NHS Bill would create. In effect, the Bill aims to drive down costs by putting more services out to tender, giving the customer – the patient – access to care through different providers, but still essentially free at the point of delivery. The early modern marketplace, though, was patchy and uneven, with the availability of care and cure varying widely geographically, demographically and economically. In terms of public health, for example, authorities might intervene to contain epidemic outbreaks, but this did not generally extend to treatment or tangible support for the afflicted.

The closest thing to ‘official’ medical support could be found in local parish poor relief funds. Here the parish might pay for the treatment of a sick parishioner, sometimes even paying for them to travel if the most appropriate specialist was not nearby. Friends or neighbours might also be employed by the parish to care for a sick person. This phenomenon actually resonates with current questions surrounding the boundaries of public care provision. In very recent times, for example, the language of the deserving and undeserving has returned to political discussions about welfare provision – a terminology very familiar to our forebears. Could a scaling back similar to that mooted for things like housing or child benefit eventually affect the willingness of the state to fund the treatment of certain lifestyle-related conditions, caused, say, by smoking, binge-drinking or overeating?

Turning the question around, are things actually better now? Free healthcare, massively more effective drugs and treatments and a similar diversity of practitioners suggest so, but stories about people extracting their own teeth as they could neither find an NHS dentist to take them on, nor afford private care, are reminders of the failures that can still exist. According to a recent survey in a popular newspaper, four in ten adults consider dental care a luxury, while the cost of prescriptions in England is set to rise in April 2012.

Nonetheless, it is worth noting that we already engage widely with a private medical market. Like our early modern counterparts, we are vigorous self-medicators. The first recourse for many of us is the chemist (today’s local apothecary), where we purchase over-the-counter palliatives, despite the option of a cheaper prescription. Many visit private practitioners such as medical herbalists, whether individual professionals or one of the increasing number of high-street outlets. Also, the option to purchase bespoke treatment remains a way to bypass waiting lists and, dare I say it, get a ‘better’ service, perhaps in more comfortable surroundings. ‘Lifestyle’ spending, in the form of health foods and drinks, spa treatments and even private gym memberships, attests to our continuing desire to stay healthy and to try and fend off illness before it arrives – a sentiment very familiar to those in the seventeenth century. This is a market worth billions.

So, to raise the question again, what are we afraid of? There is already, as these examples suggest, a broad acceptance of the idea of private enterprise in medicine. Whether alternative therapies, such as high-street herbalists, should be banned hasn’t really been debated; whether they should be available on the NHS has. The potential problem with the intervention of the private sector – and here the experience of the early modern period does bear relevance – is the risk of uneven quality of care. People across the country in the seventeenth century faced widely varying quality in medical provision, based not only on their ability to pay, but on the lack of centralised training or regulation. The NHS provides a safety net that people in the past simply didn’t have. The danger in throwing the doors open to different companies, say in parallel with the privatisation of rail services, is that quality will again vary regionally and demographically; rather than there being consistent levels of service across the whole country, and for people at all levels of society, patients’ care will suffer. This is something that the government will have to think carefully about. Things were not always better in the past.