Caring for the elderly: Lessons from history?

According to various reports in the press this week (before petrol took over as the predominant theme!), Arlene Phillips (ex-Strictly judge and choreographer) has delivered a public rebuke to David Cameron about the state of geriatric care provision in the UK today. Depending on which newspaper you read, Ms Phillips either “grilled”, “challenged”, “confronted” or “blasted” Cameron about conditions she had recently witnessed in a Hereford hospital. Among the litany of failures she raised were elderly patients calling out for help and being ignored, and some being left alone for long periods. As some officials are noting, there is a “very real crisis” in elderly care provision, not least in the question of how to train nurses and carers for what is a very challenging role. With life expectancy ever increasing, the economic effects of an ageing population are very much in policymakers’ minds, but how to maintain dignity in later years appears less of a concern. In fact, how to care for, and indeed treat, the elderly has been a constant issue throughout history. It is worth looking back at the seventeenth century to see how things, in some ways, may even have been better.

It is easy to think of the seventeenth century as a relatively young society. After all, average life expectancy at the time was roughly 37 years. But this figure is greatly skewed by extremely high levels of infant mortality, which drag the average down. It was not the case that people reached their mid thirties and then quietly prepared to drop off the perch. If you reached that age then, because you had probably developed a good level of immunity to many common conditions, you stood a fair chance of making your three-score years and ten. As such, there was a significant (although not easily quantifiable) elderly population in Britain at that time.

In principle, the early modern period witnessed a ‘gerontocratic’ system where age was respected. It was no coincidence that the majority of officeholders, whether in central politics or local administration, were middle-aged or above. The latter years were seen as a time of mellow reflection; a time when people could look back on a life well-lived, set their affairs in order and prepare their souls for the final journey. Unlike fiery youth whose hot-headed tempers regularly got them into trouble, those in their later years had learned restraint and, perhaps more importantly, were regarded as repositories of accumulated experience. This was one view of age.

The other was more derogatory, and the figure of the bumbling old fool was a comic staple of cheap literature. Age took its toll on mental faculties and physical capacity, leaving the elderly person weak and vulnerable. In humoural medical terms, age made the body cold and moist, while bitterness and regret clouded once-sharp minds. At the extreme end, it was poor, elderly women, perhaps living alone on the fringes of society, who were most susceptible to suspicions and accusations of witchcraft.

A combination of hard physical work, poor diet and a lack of effective palliative medicines all took their toll on early modern bodies. People would have looked older at a much younger age than today; a man of forty in 1650 would probably have looked closer to a modern man’s sixty. There were also myriad deadly diseases which could weed out the elderly and weak, with winter being a particularly dangerous time. Winter was the season of epidemic fevers and influenzas which could sweep the already vulnerable away, if harsh weather and a lack of adequate heating hadn’t delivered the coup de grâce already. Sickness might be acute and fatal, but it could also be painful and lingering. In these cases, who looked after the elderly? Were they simply put out to pasture and forgotten about?

The structure of early modern families meant that it was very unusual for people, and especially married couples, to continue living with their parents into adulthood. It was a prerequisite for prospective couples to be able to demonstrate that they could afford to buy and run their own home. As such, many elderly people in this period lived with their spouse or, as commonly occurred in a period of high mortality, by themselves as widows or widowers. But we shouldn’t assume that this necessarily meant that no help was available.

There was, for instance, a wide variety of ways in which the community could intervene to look after the elderly or poor in its midst. Visiting the sick and elderly was a common social ritual, and one undertaken by neighbours as well as local clergy. Surviving diaries from ministers attest to the number of daily visits that this might involve, and while little physical care was provided, company was doubtless important.

Secondly, the parish, through the monies collected via the Poor Law, could provide support for the old and infirm. This might consist of money to purchase clothes, food or firewood, especially in hard winters. It might provide funds to cover medical expenses, such as visits by a practitioner or the purchase of specific remedies. In some cases, though, the parish might even pay people to move in and care for the elderly and infirm. In such cases, the elderly person had access to a level of care that many do not have today. The allotted person might cook and clean, perhaps administer medicines and, most importantly, provide companionship. The gratitude of the recipients of this type of care is reflected in the fact that they often bequeathed what little they had to these (to use the modern term) ‘carers’, even if it was only a few bedclothes or ‘wearing apparel’. This level of care is quite significant for the period – and also pretty shameful given the reports of the state of modern geriatric care alluded to by Ms Phillips.

We can’t argue that old age in the early modern period was some sort of golden age – it wasn’t. It was often a bleak, painful and harsh reality, especially for the very poorest who had no family, no money and few friends. There are plenty of accounts of such people who saw out their days in hovel-like cottages with only a few possessions and a meagre diet to bring them comfort. But there was a moral imperative to help those less fortunate which perhaps operated at a deeper level than it does today. Whilst we undoubtedly have the ability to provide far superior medical care and far more comfortable surroundings, to extend lives far beyond those of our early modern counterparts, and to offer a greater quality of life to those in their later years, it is important that we pay attention to mind as well as body – to dignity and respect – and this is a lesson for which we can turn to the past for inspiration.

Writing regional histories

Next week, I’ll be taking part in an event here in Swansea to talk about writing Welsh history (Wednesday 28th March, 5.30, Wallace building, cheese and wine reception, all welcome!). Aside from launching our new books, my colleagues and I will be talking about the benefits – and pitfalls – of writing and studying Welsh history academically. As Wales continues to find its feet post-devolution, and now a decade after the first wave of new national histories that emerged at the turn of the Millennium, in what direction is Welsh history heading? In many ways, it could be argued that Welsh history is more in the mainstream now than ever (if, indeed, it’s really been away). Just finishing on BBC Wales is the new ‘landmark’ series “The Story of Wales”, fronted by Huw Edwards, which is due to be broadcast on the BBC network later in the year.

I’m a Welsh historian. I work in Wales, and on Wales. My work centres on Welsh medical history of the early modern period – it’s not a crowded field. One question I have often struggled with, especially at the start of my research, is how my work should approach the central question of ‘Welshness’. What sort of history was I writing? Was it a national history of Wales, or was it something broader, a study of early modern history which happened to use Wales as its subject? I found that the fundamental issue was in fact one of emphasis: should I explore ‘Welsh medicine’ or, instead, ‘medicine in Wales’? This almost sounds like a pointless question, but the difference is important.

Looking for a ‘Welsh medicine’ certainly implies that Wales had unique elements, but it is also somewhat exclusive. If I took this route, it seemed to me that the options would be limited. In fact, I don’t believe there was a ‘Welsh medicine’, at least not in the period I study. Instead, there was a vibrant medical culture fitting in with broader patterns across early modern Europe, with elements that might be seen as adding a Welsh ‘colour’ to interpretations. But this has implications for how a study could be written. Had I drawn an imaginary line along Offa’s Dyke and simply divorced Wales from the rest of the UK, my study could have ended up as a box-ticking exercise – one where I simply told the story of medicine in Wales and quickly concluded that things weren’t all that much different. Did Welsh people believe in the humours? Yes. Tick. Did they use a variety of herbal, animal and magical remedies? Yes. Tick. Were there a variety of practitioners? Yes. Tick. And so on.

But taking ‘medicine in Wales’ as my launch point seemed to allow me more freedom to take Welsh sources and ask different questions of them. Here was the opportunity to use Wales as an example with which to address bigger questions in medical history. It was eminently possible to use Welsh sources as a lens on issues such as care of the sick – very much a current theme in medical history. I recognised quite early on that Welsh medical remedy collections would add something different to debates about the transmission of medical knowledge. Evidence from Welsh village shops, and from the contents of ordinary homes, also afforded a unique opportunity to see how the ‘medical market’ operated in an area culturally and geographically remote from London.

I wanted to write Welsh history, but to do so in a way that showed how far Wales was connected to its near and far neighbours. Now, suggesting that English-language books had an Anglicising effect on the medical language of Wales, and that Wales was dependent on large English towns and cities for its supply of patent medicines, might not be popular in some quarters. But what would we prefer? I would much rather show Wales as it was, part of a much bigger picture, than depict it in isolation. Understanding that our country was not insular and remote, and that Welsh people had access to a far broader network of knowledge than we often give them credit for, is a far more satisfying option for me. If I were to criticise the new ‘Story of Wales’, I might ask where the rest of the UK is; can we really understand what Wales is, and what it was, by studying it in isolation?

This is entirely different to saying that we shouldn’t study Welsh history at all. A strong part of what I do is to look at how factors such as the geography and topography of Wales coloured attitudes to medicine (which they surely did). Where you lived certainly contributed to your experience. Wales was unique in Britain in that it had no universities, no cities and no medical training available in the early modern period. This, again, affected and shaped the availability of medical provision. There is certainly also a need for national histories, insofar as they give us a narrative framework and context for our own country. The issue is how we use Welsh sources and to what ends.

The argument for any regional history is that it adds to our understanding of the whole. Each region – and here Welsh culture and history often come into their own – has its own nuances and peculiarities. Understanding how these were enmeshed in a bigger web of meanings helps us to get closer to the lived experience of the past. In my own field, for example, many studies of 17th-century medicine – even those of the last 20 years – have been strongly focussed upon London and southern England. Only in recent years have a larger number of regional studies begun to augment our understanding and bring us somewhat closer to a more inclusive meta-narrative of early modern medicine.

I definitely think that Welsh history is on the up, and that the writing of academic history in general is becoming more sensitive to the need for regional studies. The fact that my book, and those of my two colleagues Professor Huw Bowen and Dr Martin Johnes, all of which are about Wales, have been published by a major English academic publisher is testament to a growing market for Welsh history outside our own borders. I have certainly never found it necessary to justify my subject area; in fact, the main reaction I get is surprise that people have not looked at these sources before. But that’s an argument for another post.

But I’m looking forward to next week’s session. It’ll be a great opportunity to gauge how other people feel about this issue and, hopefully, to establish some new ways forward. Please come along and join in if you can make it.

The NHS Bill – an historical perspective

At the very least, the NHS Bill is provoking lively and vigorous debate. Just the other week, the proposed legislation was referred to by Ed Miliband as “David Cameron’s Poll Tax”! Objections to the proposed changes are too many and too wide-ranging to explore in detail here. But, succinctly, the main bone of contention lies in the expansion of the outsourcing of NHS services to private companies – in effect the privatisation (‘modernisation’, some prefer) of the NHS – and its possible effects upon the quality and cost of patient care in England. But just what is it about privatisation in the bill that worries people?

Privatisation is certainly a loaded term; for some it carries the implicit assumption that something will be lost in the process – that things could get worse for consumers rather than better. Are we perhaps resentful of the loss or degradation of once-proud institutions like the Post Office and the NHS? Given that the latter only dates from 1948, this seems less likely, although there is certainly a residual fondness for what has been, for the most part, a success story of public health.

It is worth considering the provision of healthcare in Britain in the past, and especially in terms of the question of private enterprise. Four hundred years ago, the concept of public healthcare simply did not exist – this was the original ‘medical marketplace’. How, then, did this manifest itself in the sickness experiences of our forebears? How did these proto-consumers of healthcare cope with this situation, and what types of medicine and practitioner were available to them? What, ultimately, can we learn from them?

The early modern period was characterised by a diversity of medical service providers. These included university-trained and licensed physicians, who often catered for wealthy clients and were largely based in large towns and cities. Surgery was a separate branch of medicine, while apothecaries, although nominally banned from doing so, also provided medical advice alongside remedies and ingredients, being more accessible and more affordable for many people. At a local level was an undifferentiated mass of medical practitioners, ranging from specialists such as oculists, bonesetters and wart-charmers, to travelling ‘doctors’ who would claim to cure anything from toothache to the ‘itch’ for a few pennies. Even the local blacksmith could be called upon to knock out a rotten tooth.

This was a true consumer market with a massive variety of choices for the early modern patient. Most people self-medicated. Some grew their own herbs, but many remedies and ingredients were available locally, even in rural villages. Surprising as it might sound, given our perceptions of contemporary living conditions, maintaining a healthy lifestyle was also important. People invested in healthy ‘regimens’ – daily steps to staying fit, from fresh air and exercise to early modern equivalents of the tonic or health drink.

So if medicine in the early modern period was fully private, was it better? Clearly, conditions in the seventeenth century differed markedly from those which the proposed NHS bill would create. In effect, the bill aims to drive down costs by putting more services out to tender, giving the customer – the patient – access to care through different providers, but still essentially free at the point of delivery. The early modern marketplace, though, was patchy and uneven, with the availability of care and cure varying widely geographically, demographically and economically. In terms of public health, for example, authorities might intervene to contain epidemic outbreaks, but this did not generally extend to treatment or tangible support for the afflicted.

The closest thing to ‘official’ medical support could be found in local parish poor relief funds. Here the parish might pay for the treatment of a sick parishioner, sometimes even paying for them to travel if the most appropriate specialist was not nearby. Friends or neighbours might also be employed by the parish to care for a sick person. This phenomenon resonates with current questions surrounding the boundaries of public care provision. In very recent times, for example, the language of deserving/undeserving has returned to political discussions about welfare provision – a terminology very familiar to our forebears. Could a scaling back similar to that mooted for things like housing or child benefit eventually affect the willingness of the state to fund certain lifestyle-related conditions, brought on, say, by smoking, binge-drinking or overeating?

Turning the question around, are things actually better now? Free healthcare, massively more effective drugs and treatments, and a similar diversity of practitioners suggest so, but stories about people extracting their own teeth because they could neither find an NHS dentist to take them on nor afford private care are reminders of the failures that can still exist. According to a recent survey in a popular newspaper, four in ten adults consider dental care a luxury, while the cost of prescriptions in England is set to rise in April 2012.

Nonetheless, it is worth noting that we already engage widely with a private medical market. Like our early modern counterparts, we are vigorous self-medicators. The first recourse for many of us is the chemist (the local apothecary), where we purchase over-the-counter palliatives, despite the option of a cheaper prescription. Many visit private practitioners such as medical herbalists, whether individual professionals or one of the increasing number of high-street outlets. The option to purchase bespoke treatment also remains a way to bypass waiting lists and, dare I say it, get a ‘better’ service, perhaps in more comfortable surroundings. ‘Lifestyle’ spending, in the form of health foods and drinks, spa treatments and even private gym memberships, attests to our continuing desire to stay healthy and to try and fend off illness before it arrives – a sentiment very familiar to those in the seventeenth century. This is a market worth billions.

So, to raise the question again, what are we afraid of? There is already, as these examples suggest, a broad acceptance of the idea of private enterprise in medicine. Whether alternative therapies, such as high-street herbalists, should be banned hasn’t really been debated; whether they should be available on the NHS has. The problem with the intervention of the private sector – and here the experience of the early modern period does bear relevance – is the risk of uneven quality of care. People across the country in the seventeenth century faced widely varying quality in medical provision, based not only on their ability to pay but on the lack of centralised training or regulation. The NHS provides a safety net that people in the past simply didn’t have. The danger in throwing the doors open to different companies, in a parallel with the privatisation of rail services, is that quality will again vary regionally and demographically; rather than consistent levels of service across the whole country, and for people at all levels of society, patients’ care will suffer. This is something that the government will have to think carefully about. Things were not always better in the past.

Past and Present ‘sick roles’

It always amazes me how readily people are prepared to tell complete strangers about their symptoms and maladies. Sitting in the waiting room of the doctor’s surgery, I have overheard people telling someone they’ve never met some of the most intimate details about this or that operation, a favourite doctor or some related tale of misery.

And it isn’t just in the context of medical institutions. Wherever two people strike up a conversation, it seems to me that, like the weather, the topic of health is somehow (and rather counterintuitively) an accepted topic for discussion. It might be recovery from a recent illness; it might be waiting for a forthcoming treatment; it might be, especially among friends or acquaintances, the opportunity to share the latest miracle cure for arthritis, whether Royal Jelly in the 80s or Glucosamine in the last few years.

This readiness to share information about our health, though, is certainly nothing new. Sickness has always been a social event in some measure. As part of the research for my book, I explored the ways in which early modern patients experienced sickness and constructed the role of sufferer. What this consistently showed was that sick people in the past – both consciously and unconsciously – adopted and deployed certain language and behaviours to fit social expectations of the sufferer on the one hand, and, on the other, to try and garner help and support for themselves.

In the seventeenth century, for example, the parameters of sickness were somewhat different. It has even been argued that most people felt some degree of illness for most of the time, because of the vast range of minor ailments that today would be treated easily by a trip to the chemist but were then less simple to get rid of. If this is true, then the whole concept of feeling ‘well’ in the first place is shifted. It is probably true to say that people’s expectations of health were lower; according to the Galenic humoural beliefs of the time, good health was almost an unattainable ideal anyway.

In fact, in the early modern period, there were clear levels of sickness. At the most basic level were minor ailments – troubling and worrisome, yes, but not considered dangerous. With these sorts of ailments, people essentially carried on with their daily business as best they could. If things got worse, they might stay within doors, doubtless self-dosing with some favoured remedy or, if they could afford it, consulting the local practitioner. Communities were adept at knowing when a member was ill, and word spread extremely quickly, often kicking off a stream of concerned visitors. At this level, the sick person might still function in some respects as a member of the household. A woman, for example, might still carry out some domestic tasks if she were able, while many handicrafts were also undertaken within doors.

At the more serious end, though, was sickness that required the sufferer to take to their bed. When this occurred, the patient was effectively considered to be seriously ill. Here, medical intervention was likely to be sought, and copious amounts of medication administered. It is worth remembering that early modern medicine worked on the basis that it had to be seen to ‘do’ something, even if that something was to make the patient violently ill. On one level, taking to one’s bed was an obvious reaction to sickness, but it was also a conscious signal to others. Some treatment regimes even required the time at which the patient lay down to be recorded, as this had a bearing on what medicine could be administered and when.

The words and conduct of the sick were extremely important, not least to their families. The sick person was expected in some measure to take the medicine and advice proffered to them and, in effect, to ‘act’ like a sick person. This might include what they said. The wife of the Flintshire diarist Philip Henry, for example, when suffering from a recurrent ague, repeated the same phrase, ‘sick, sick, never so sick’, as if to reinforce the point to her family.

This was also a period when medicine, and in particular medical remedies, formed part of a common and shared knowledge bank. People at all levels of society shared their favourite remedies and treatments, and this transcended social status: a servant might offer her master a favourite remedy, which was then duly noted down in a domestic remedy collection. People sent medical advice to friends by post, and sometimes fired off letters describing symptoms and seeking cures. Indeed, sickness was an extremely common theme in early modern correspondence, and sometimes the only reason for writing at all.

For those who couldn’t write, there was a strong verbal culture; people simply knew many remedies and, as the historian Adam Fox has admirably noted, were far more adept at committing large amounts of information to memory.

But by the eighteenth century, interesting changes were afoot, and the figure of the ‘heroic sufferer’ emerged. This was the era of fashionable maladies like gout, nervousness and melancholia. Rather than deploring their sickness, many Georgian society figures embraced it as a sign of their status and of their romantic duty to suffer. Gout, for example, suggested a rich diet and fast living; in this sense it was a visible status symbol – despite being incredibly painful, as satires by Cruikshank and others suggest.

But the literate sick used letters to construct a whole new persona, telling friends in martyred tones that they were suffering in various unimaginable ways, and would write again if they survived, only to resurface in a fresh missive, much recovered, a week or two later.

The fact is that sickness has always been a topic of conversation, and the similarities between us and our early modern counterparts continue to astonish me. We still have a lively culture of sharing and discussing medicine – in fact never more so than at a time when the whole structure of the NHS is under question. We still have our own favourite remedies, and self-medicate when we can. And we still visit a vast range of practitioners, from faith-healers to acupuncturists and Chinese herbalists.

However far we’ve come in terms of treatments, it’s worth considering the constancy of the human response to feeling, and being, ill.

Eye treatments – 17th-century style

I really don’t like eye tests – they worry me and I don’t know why. I get the same feeling of apprehension as I do before an appointment at the dentist (or, worse, the hygienist!) but without any real justification. There is not generally any pain involved at the optician’s, and sitting with a massive pair of round frames on, being asked in a soft voice whether the image in front of you is clearer, worse or just the same while gentle adjustments are made, is hardly horrifying.

In my vocation, though, I regularly mutter a silent thank-you to the powers that be that I wasn’t around in the period I research. Consider what things were like then. Firstly, the range of remedies for eye complaints would…well…make your eyes water. One popular remedy for sore eyes was ‘snail water’ – essentially impaling a snail on a pin and letting the juice run into the affected eye. Another involved fresh (green or yellow) goose dung, applied as part of an ointment. If that didn’t appeal, you could always get a willing family member or friend to blow powdered hen’s dung into your eyes before you went to bed at night.

Then there were the eye ‘specialists’ – the oculists, ready and willing to cater to your ophthalmic needs for as little as a few coins if you were poor. The seventeenth-century Welsh diarist Walter Powell, from Llantilio Crossenny in Monmouthshire, was a constant sufferer from eye complaints, including cataracts. He consulted several practitioners, one of whom bled him using leeches, but to no avail. Eventually, Walter went to see a cataract specialist – one Anthony Attwood, who undertook to perform the technique of ‘couching’ or ‘cooching’.

It sounds nasty, and it was. It involved passing a thin needle of silver or other metal into the eye to physically push the cataract back and away from the lens…all while the patient was awake. Whether the patient was lucid at the time, or instead fuelled with some potent alcoholic anaesthetic, is open to question. But lest it should be assumed that the only result of this procedure was instant blindness, it is worth mentioning that the sturdy Walter endured it three times, and was still able to continue his diary afterwards. Painful these procedures undoubtedly were, but we shouldn’t assume they were necessarily futile.

So, as yet another leaflet reminding me that it’s been three years since my last eye test drops onto my doormat, I can at least console myself that my local branch of Specsavers is unlikely to get the couching needle out, nor blow some form of animal excreta into my already reddening eyes.

The early modern ‘sickie’

It’s been calculated that sickness absence costs the UK around £10-12 billion every year. We are fortunate to live in a system where our employers usually foot the bill for reasonable sickness absences and generally don’t, at least to our faces, take Scrooge’s line of complaining about paying a day’s wage for no work. But how many of these lost days are through genuine sickness? One recent survey puts the estimated figure for non-genuine sickness in 2010 at an astonishing 30.4 million lost working days, costing the economy £2.7 billion. There are probably many deep and wide-ranging socio-cultural arguments to be made about the causes, rights and wrongs of ‘pulling a sickie’, but this is clearly not a small problem.

But what about sickness absences in the past? Before the late nineteenth century there was little in the way of support for workers unless their employers were particularly enlightened or charitable. Aside from a few notable exceptions, workers could simply expect to be docked pay if they did not turn up for work. The net result was probably (although I’m not arguing from statistics here) that more people simply went to work with symptoms that might today lead to a day in the house with hot tea, some mild medication and (hopefully) a little bit of sympathy.

In fact, it wasn’t until 1983 that statutory sick pay was introduced. So does this mean that our early modern ancestors had no conception of the sickie? One source I found whilst researching for the book suggests not. Just so as not to cause myself any copyright issues here, this example can be found on p. 130 of Physick and the Family.

In a book of notes and accounts from 1724 belonging to Thomas Foulkes of Holywell in Flintshire, I found a few references which seemed to suggest that he had some suspicions about his maid, Margaret. In January of that year, he noted that ‘my mayd Marg’t Jones fell sick this day, and next day did not gett out of bed’. Bearing in mind the ubiquity of sickness and the likelihood of being afflicted with something for the majority of the time, this is not particularly unusual.

But Foulkes, to put it bluntly, was keeping his eye on Margaret. The following week she went AWOL, “rambling home to her mothers” without telling Foulkes she was going. On the fifth of October she “fell sick and lay in her bed until the 8th, and then went home…”. On the one hand, it’s possible that Foulkes was just diligent in keeping records. But the suggestion here, especially in his pointed comments about her continued trips back to her mother, is that he was suspicious of whether her symptoms were entirely genuine, and was building up ammunition for when the offences became too many.

We shouldn’t necessarily assume, then, that the ‘sickie’ is a modern phenomenon. Despite the potential loss of pay (and other references even suggest that some early modern employers paid their servants when sick), the lure of a day at home has clearly proved too much for some of us right down the centuries!
