Speaking from the very inside of myself

Deborah James

 

I don’t know whether it was the vibrant colours, the softness of the felt, or the fragility of the fabric that made Claudia Stein, Research Director of WHO Europe, reach out and take the weave from my hands last year in Newcastle upon Tyne at Fuse’s International Conference on Evidence in Impact in Public Health (http://www.fuse.ac.uk/events/3rdfuseinternationalkeconference/). She stood there in front of me holding the weave, silent for a moment, looking down, and then responded: “This is the next generation of healthy children”. I was delighted. She got the metaphor.

 

 

The weave was a co-creation of the early years’ workforce from health and social care in the NHS England Cumbria and Tyne and Wear region. The project, funded by NHS England, was designed to support the health visiting workforce during the transition of their commissioning from the NHS to local authorities. In 2015, when we were designing the project, we didn’t know how the wave of austerity in public health would be felt on the ground, but we knew that the transition of commissioning, like any change, would produce both gains and losses. During times of change, we narrate our identities to help make coherence through the chaos. With this in mind, we designed a series of large workshops across the region that focused on perspective taking. I wanted the workshops to provide a space for reflection so that each person could narrate the motivational force that led them into their respective work roles: a time to stop and think about themselves and their story line. I also wanted to find out how they used the perspectives of service users in the construction of meaning in their own work. In addition, the workshops were designed to provoke curiosity about the system leadership that created the context for the changes they were facing. Could they see the change from the perspective of the policy makers who had called for action?

 

About 100 people from the health and social care workforce came together in five events across the region. In the workshops, the practitioners and workers shared their examples of best practice, worked on developing a shared understanding of the common purpose of their work in small groups, and fed back their ideas individually about the meaning of their work from multiple perspectives (themselves, their colleagues, and service users). More details about the project are available here: https://www.northumbria.ac.uk/media/6588781/supporting-transition.pdf. The participants created the metaphor as they wove their line, their story line, in fabric with the other participants’ fabric. This was the integrated health and social care workforce: an enactment of narrative testimony and witness that drew the subjects into a collective. People, connected with other people, speaking from the inside of themselves – speaking from the first person perspective.

 

Like the workers weaving their story lines, the first person position is the place from which I can access my own values, beliefs and unique understandings of the world. But the first person is also the place from which I give voice to my unformed thoughts, to my not knowing and even to my chaos. It is the first person position that I need to speak from if I stand any chance of creating generative outcomes from the narrative interface with someone else. The first person position is the antithesis of the third person specialist position, the place that, as academics or professionals, we normally set up for ourselves. I think it is an act of courage to speak from the first person. Speaking from the first person reveals something of who I am, whereas speaking from the third person reveals something of what I do. As Donald Winnicott said (1971), “‘I am’ must precede ‘I do’, otherwise ‘I do’ has no meaning.”

It strikes me that many of our interventions in the early years start from an assumption that ‘I am’ is established (in the parents, and in the child by the time he or she is ready for school). So our interventions focus on the doing: ‘do this’, ‘do it this way’, ‘best if you don’t do that’, or perhaps, ‘why do you do that?’ If the doing has no connection to the being, there can be no generative, sustained change from it. A more developmental set of assumptions might go something like this: our identities, the “I am”, are constantly evolving; that evolution is likely to be related to our ability to exchange positional perspective with another; and this positional exchange is probably most often achieved through story-telling. If these assumptions were robust foundations for supervisory practice, they would help the workforce articulate the mechanisms of change in their work with families.

During the workshops, we used Survey Monkey to ask participants about their work. The health-visiting workforce usually found it difficult to accept the perspectives of other workers and the families on their role. When asked what they did with families that brought about change, the overwhelming response was that they supported families. In the context of austerity, claiming and naming a supporting role with families is probably not enough to safeguard the ratio of relatively highly paid, highly skilled work roles. Critical reflexive supervision, with developmental assumptions about the identity work of the self and the other as outlined above, could help develop a refined and specific articulation of how ‘supporting’ families leads to change during families’ own times of transition. It would also help make the case as to why investment in the early years’ workforce will create not only the next generation of healthy children, but also a more open society.

 

Between visiting Exeter in October last year and writing this blog, I fractured my wrist. It gave me time to stop and think. Thinking now, at a time when the truth of a previously trusted positional voice is frequently questioned and challenged, I believe that we, those who may influence intervention policy and practice in the early years, have an important role in resisting the present drive towards ‘telling how’. We need to rethink and examine how we can support the evolution of the person – the person who is capable of creatively adapting to the circumstances that cause suffering, the person who is able to act with relational agency to find a way through the chaos.

Sean Connery and a Coca Cola: Alleviating Fatigue in the Air

Natasha Feiner

In the twentieth century, crew fatigue was a major health and safety concern in the aviation industry. Alongside other health issues, such as cardiovascular function and eyesight, crew fatigue was tightly regulated. Flight time limitations, which regulated the hours of work and rest of pilots and cabin crew, existed at both the state and company level. As my PhD research shows, however, flight time limitations did not completely eliminate fatigue in the aviation industry. As such, workers often had to find other ways of managing their fatigue. Based largely on the oral testimonies of ex-pilots and cabin crew, what follows outlines how crews personally managed fatigue in the air.

 

Food and Drink

For some people, eating a cooked meal on board was important. Others preferred to graze. Joshua,* a cabin attendant who worked for BOAC in the 1960s and 1970s, used to ‘cross the Atlantic’ on bananas and cheese rolls. For Jane, a cabin attendant who worked for BOAC in the 1970s, Coca Cola was important:

I must have drunk half of the cans of Coca Cola on the aircraft because it was an instant hit of sugar. Any instant hit of sugar that could keep you going. I don’t touch anything like that now, I’m on mineral water or herbal teas, the odd coffee. But at the time Coca Cola.

For others, alcohol played an important part in managing fatigue. Gerry, who worked as a pilot for BOAC throughout the 1970s and 1980s, suggested that alcohol was used as both a stimulant and a sedative:

But I have to say that that night in America when you got there, it was something, you know go and have a few beers, was a way to stay awake, socialise, I mean I know alcohol eventually sends you to sleep, but initially when it’s still a social thing, you know, it kept you going so that’s what people did.

 

Sleep

In-flight rest was very important for crews operating long-haul flights. From the 1970s, bunk-rest was mandatory on such flights. Though some crew members found this helpful, others struggled to sleep in bunks. William, who worked as a flight engineer in the 1970s and 1980s, said that bunks were not ‘conducive to good sleep’. For pilots and flight engineers, an informal culture of flight deck napping – sometimes referred to as ‘controlled rest’ – was important. Although the safety of this practice was called into question in 1972, after an entire flight deck crew was found asleep at the same time en route from Sydney to Honolulu, the ex-pilots I have spoken to suggest that it remained widespread regardless.[1] As William noted:

It was quite common. It was approved really, everybody you know captain or first officer would say to the others, ‘can I close my eyes for a minute?’ and he’d [the captain] say yes or no.

A similarly informal napping culture also existed in the cabin. According to Eleanor, who worked as cabin crew for BOAC throughout the 1970s and 1980s, crew would informally allow each other breaks. If there was a spare seat in the cabin, crew would ‘pull a blanket’ over themselves and have a twenty-minute nap. For some, sedatives played an important role in pre- and post-flight sleep. A number of the flight engineers and cabin crew I have interviewed have told me that they used sleeping tablets, often prescribed by personal GPs, during their working lives. While some took Temazepam, others were prescribed Mogadon. Though some used sleeping tablets frequently on long-haul trips, others treated their prescription as a ‘back up’, as Eleanor explained:

My doctor gave me some. I explained to him that I rarely, I rarely felt exhausted but would it be a good idea, I was seeing him about something else, would it be a good idea if I had some with me so that if I started to feel it building up with the time changes… he’d known me for a long time, he knew I wouldn’t be silly with them and take them as a matter of course. I made it clear to him I didn’t want to. Just, it was a, it was a sort of back up if you like.

 

Socialising

For pilots, socialising with the cabin crew was important. According to ex-pilot Adam, cabin crew played a key role in alleviating the fatigue of flight deck crew. Not only would cabin crew bring pilots meals and coffees, but they would also come in intermittently to chat. For Adam, this social interaction helped pilots maintain alertness. For some cabin crew, socialising with passengers was the best way of staying alert. When the first signs of fatigue began to show, Joshua would seek out celebrity passengers:

Every aircraft you had a passenger list. Every single person was down there and the important ones were highlighted so if I was feeling really really tired and I didn’t have another hour left in me, I thought I’ll go and bother Sean Connery for a minute.

Reflecting the subjective nature of fatigue, then, different modes of managing fatigue worked for different people.

 

*Pseudonyms are used throughout.

[1] Arthur Reed, ‘Ministry Inquiry over BOAC crew asleep at controls of jet flying 30,000 ft’, The Times, Dec 13 1972.

Balancing Lifestyle and Health: Reflections on a Public Engagement Workshop

‘Balancing Lifestyle and Health’ was a one-day workshop run at Exeter Library by the University of Exeter and Libraries Unlimited on 11 November 2016. Here, Dr Martin Moore discusses the event, summarises its discussion, and highlights where future collaborations might lead… 

For a slightly longer reflection, please see the discussion at the Libraries Unlimited site.

‘Balancing Lifestyle and Health’ was a one-day, participant-focused workshop. It brought together a mixed audience of service users, public health practitioners, health service professionals, clinical commissioners, representatives from community and non-governmental organisations, and members of the general public, all to discuss ideas about balanced lifestyles.

The event itself was divided into four one-hour mini-sessions, framed by a short introductory talk by the organisers. Aside from these ten-minute prologues, the sessions were given over to participants to discuss the issues raised, and to record their conversations.

The programme of the day is appended to this post. However, to briefly recap, session one focused on “living well with illness” – a topic covering everything from illness and identity to managing daily life. Session two began with an exploration of the successful Books on Prescription scheme, after which participants explored the relations between community and lifestyle balance. Session three (fittingly appearing after lunch, so as to alleviate any pre-meal guilt) tackled the tricky subject of “balanced diet”. And session four asked participants what is meant by “work-life balance”, and whether any such balance is possible or even desirable.

Broadly speaking, then, the aim of the event was simultaneously to foster critical discussion and to facilitate new links between people who might not usually have sustained contact. By putting members of the general public in touch with service commissioners and public health professionals, for instance, we sought to help everyone involved come away with new perspectives, and to enable them to develop lasting collaborations.

As might be expected from an event covering quite diverse topics, the day’s discussion was wide-ranging. There were, nonetheless, a number of themes that consistently reappeared throughout the day.

Responsibility

Perhaps unsurprisingly, a number of debates and conversations repeatedly centred on the theme of responsibility and reform, where discussants asked and answered “who is ultimately responsible for maintaining our health and wellbeing?”

Responses were diverse. Some participants raised the point that perhaps individuals have to be ultimately responsible. They argued, for instance, that we ultimately decide our eating habits, and that perhaps we have become too reliant on expert advice. The appropriate response across any number of spheres, they suggested, was to create much more of a cultural emphasis on self-care and self-management, and to orient our systems towards signposting and training, rather than “nannying” and treating.

On the other hand, many other participants highlighted broad structural problems that needed to be addressed. Maintaining health, and living well with illness, for instance, was deemed difficult within profit-oriented systems that only valued individuals for what they could consistently produce and monetise. Those “left-behind” and deemed “useless” suffered financially, physically and mentally, whilst those in work were also being pushed to their limits and scared to disclose health problems. Similarly, unhealthy working practices – such as long hours and poor pay – combined with changing geographies of work to threaten healthy diets and work-life balance. Unreasonable work demands had resulted in quick, individual “al desko” lunches, which restricted sociability and nutrition. Long commutes reduced time and energy resources available for cooking, caring, and leisure pursuits. New working cultures and even regulations were proposed, alongside important educational programmes for children and adults, to ensure healthier diets and more balanced working and social lives.

Cultural Change and Community

Culture, sociability and community also reappeared in other discussions.

Throughout the workshop concerns were expressed about community breakdown and cultures of community engagement. There was little consensus about the cause of such decline. Nonetheless, discussants highlighted community and family dislocation as a possible cause for growing rates of isolation, for declining health, and for poorer eating habits. In terms of the latter, the decline of communal eating was felt to prevent transmission of good habits and to divert attention away from social elements of consumption. More positively, participants did praise the work of community-based organisations – such as libraries and arts groups – and felt it offered a way to combat community breakdown. How we can evaluate such work in a pseudo-objective, “outcomes”-focused, funding environment, though, remained a consistent concern.

Technology

Some participants did see a role for technology here. It was suggested that digital technologies had allowed for new communities and new forms of social activity to thrive. Yet, it was admitted that novel technologies could also be a double-edged sword. Some felt that digital networks had nonetheless contributed to the loss of “real” local community. Likewise, other participants suggested that, although central to “flexible” working, new technologies also allowed work to become a “24 hour” pursuit. On the one hand, then, new technologies could enable people to better integrate work demands with family responsibilities and leisure pursuits. On the other, emails, “smart” phones, and work laptops could also risk generating new expectations that we should always be available and “connected”.

No “One-Fits-All” Model or Solution

Of course, what worked for one person was not taken to work for another. And in some respects, this was the core message of many discussions.

In dietary terms, the ideal diet varied from person to person, and a consensus appeared that there were no universal responses that could be put into operation to produce “better” or “healthier” diets. According to many participants, such a lack of agreement could be traced to an inability to find coherent messages from various expert sources (or sauces). Such advice needed to be taken with a pinch of salt (and aromatic five-spice). (My apologies for peppering this section with so many bad pun(net)s.)

Likewise, some discussants suggested that defining what was “life” and what was “work” was difficult, and that we shouldn’t assume that less work would offer greater joy and “balance”. Life (as supposedly separate from work) could actually be very stressful and demanding. Caring duties could be financially and emotionally stressful, for instance. Equally, vocational jobs, although running the risk of exploitation from demanding employers, could also be incredibly enjoyable. Thus, participants proposed, no single model was preferable, and a holistic approach was needed to ensure we could tailor diets and working lives to our own needs and preferences.

Problematics of Language

And, in many ways, these final discussions brought out one last recurrent theme: that of the problematics of language. Participants consistently questioned phrases like “balance”, “moderation”, and “work-life”, asking who defined such terms and how. This desire for a new – or at least a clearer – language also appeared in discussions of “living well” and “illness”. Both these terms came in for questioning, and it was suggested that talking about “illness” could be misleading. Similarly, the way in which labels and language could shape a sense of self was raised. For some discussants, diagnoses and labels opened up access to resources, provided a sense of solid identity, and helped explain what was wrong (and thus alleviated anxiety and self-recrimination). For others, defining oneself in terms of a disease or illness – or being defined in such a way by others – could restrict employment or social activity, and trap people in a stigma. Finding new ways to talk about these issues was considered crucial.

Where Next?

As even the preceding summary suggests, the event contained a considerable amount of discussion, agreement and debate. And if there was one criticism of the day, it was that we tried to do too much, or at least had to curb discussions that could have filled whole days in themselves.

In spite – or possibly because – of this, the event proved to be extremely well received, and we managed to fulfil our two broad briefs for the day.

However, “Balancing Lifestyle and Health” was only ever intended as an event to lay foundations.

We will soon be making contact with participants to see what sort of events they would like to see in the coming months and years. For ourselves, we believe that running half-day workshops on each of the workshop themes – diet, community, work-life, and living well with illness – would be productive. We would also like to move beyond the city, into Devon more broadly, and perhaps to think in terms of more concrete policy and workplace solutions.

For now, however, we would like to thank everyone who contributed to the day. As organisers we found it incredibly enlightening and invigorating. And we very much look forward to running further events with you in the future.

‘Let thy food become thy medicine’: healthy diets and supplements in English language self-help books 1950-2000

 


In this post, Nicos Kefalas (a PhD student with the Balance project) discusses his doctoral research into healthy eating and supplementation. Here he considers why self-help books about diet became so popular in the post-war period, and examines the ways in which this literature deployed linguistic devices to construct readers as “empowered” agents of healthy lifestyles.

In recent years, it seems that not a day passes by without public debate about healthy eating and supplements. Thousands of magazines, newspapers, websites, YouTube videos and blogs are dedicated to issues revolving around food, diet, exercise, supplementation and health. Popular discussants of healthy diets even make use of data from randomised controlled trials (RCTs), population studies, metastudies and large trials, and translate such knowledge into easily digestible graphics on various foods and supplements to demonstrate whether or not they are truly good for overall health. (Or to weed out products that could be considered ‘snake oils’.)[1]

 

My PhD examines the historical emergence and development of healthy eating, dietary advice and supplements to provide a better understanding of when, how, why and by whom ‘Healthmania’ was promoted.[2] ‘Healthmania’ is the fascination of various institutions of the West with healthy diets, foods and supplements, and the concomitant production of advice and products to help individuals attain better lifestyles, both independently and collectively. From the 1950s onwards, self-help literature, lifestyle magazines, newspapers, advertisers, science and the state accepted and promoted certain foods, diets and supplements for a number of reasons. Building upon a widespread valorisation of science on the one hand, and the fascination of the media and public with slim bodies and healthy lifestyles on the other, leading scientists, public figures and state bodies were all motivated by a mixture of personal ambition and broader political aims to promote healthy eating and supplementation.

 

Following historical work on Galenic medicine, food and health that has centred on early-modern herbals, manuals and cookery books, I am currently examining English language self-help books published between 1950 and 2000 which concentrate on nutrition, diets, food and health.

 

‘HEALTHMANIA’ IN SELF-HELP BOOKS

Self-help authors made individual agency central to their efforts to popularise ‘optimal’ and ‘ideal’ diets. It is clear that the language and advice in self-help books emphasise and promote the notion that readers have the power to manage their own health and a moral obligation to do so. The books heavily imply that the proactive, educated, well-read and well-informed individual can promote their own good health. The readers are initiated into a self-narrative in which they do not need or want the help of doctors and expensive or ineffective treatments. This newfound agency over their own health is reinforced by the growth of a consumer capitalism able to provide the foods and supplements necessary to sustain a mentality of ‘buying a better body’. Indeed, not only does the self-help literature direct its readers to the nearest health food stores, pharmacies and juice bars, but it also urges readers to consume specific foods (often exotic fruit), drinks and supplements. Some authors of self-help guides even take a further step. For example, Dr Atkins urges his readers to monitor their progress using a home urine-chemistry kit, encouraging the purchase of sophisticated surveillance technology, alongside expensive dietary consumables and further literature.

One of the reasons behind the popularity of the self-help industry lies in the fact that it stood out from mainstream science and advice for better health, which often consisted of the mantra to ‘eat less, exercise more’. By contrast, whilst self-help books sought to empower readers to make positive changes in the present and future, they nonetheless removed the blame for ill-health and obesity from their readers and instead attributed responsibility for past failures to the food industry, ‘modern foods’, fast foods, mainstream science, and modern lifestyles. Each reader is thus given the chance to feel like a revolutionary and an ‘enlightened’ consumer. This narrative was facilitated through the consistent use of motivational language, tone, expressions and symbolism. Taking Atkins’ book as an example once more, one can see how phraseology and the capitalisation of sentences urged his readers to stop being bystanders to the damage done to their bodies by modern foods (namely carbohydrates in this case) and start defending their health. Such phrases included ‘A REVOLUTION IN OUR DIET THINKING IS LONG OVERDUE’ and ‘WE ARE THE VICTIMS OF CARBOHYDRATE POISONING’.[3]

 

People turned to self-help books because they addressed contemporary anxieties of the everyday individual. The fear of hunger, for instance, is explicitly addressed by the genre as a whole. These authors knew that people fear hunger, because when they are hungry they lose control of their bodies and compromise their diets with filling – though not necessarily healthy – foods. In the short-term, authors reassured readers, this was not necessarily an issue. As Robert Haas advised his readers: ‘There is no point in worrying about cheating occasionally’. He noted, however, that such infidelities to his rules were only acceptable if ‘you stick to the Peak Performance Programme in the long run’.[4]

 

Yet, alongside questions of deprivation, self-help authors had another, seemingly paradoxical, problem to think about: luxury and efficiency. No one wants to start a boring diet that does not bring the desired results fast enough. Produced at a time when instant gratification of nearly all desires, cravings and wants could be quite simply achieved, self-help books had to portray eating healthily as an easy process built on filling, varied, tasty and luxurious foods. For example, Atkins argued that people could lose weight on: ‘Bacon and eggs for breakfast, on heavy cream in their coffee, on mayonnaise on their salads, butter sauce on their lobster; on spareribs, roast duck, on pastrami; on my special cheesecake for dessert’.

 

These and many more themes in the self-help genre play a big part in the popularisation of science and the growth of ‘Healthmania’. The readers of these books are given the knowledge to become active agents of their own health. Change towards better health is elevated to an individualistic and stoic process by which readers refuse to be ill, overweight or obese. The process of becoming healthier offered by the self-help genre promises less hunger and more ‘exotic’ and luxurious foods than the average diet recommended by the state and doctors. The advice in these books offers readers a way to avoid the dreaded visits to doctors’ practices, and thus, on the surface at least, serves in part to undermine the authority of orthodox medicine. However, even though these books seem to have been set against modern scientific theories and practices, self-help authors were using contemporary scientific ideas. These authors offered a self-criticism of the various food and health-related professions from within which they wrote, using widely accepted scientific tools and perspectives to criticise orthodox medicine and dietetics from a marginal position. Indeed, self-help books promoted their advice by using the same tools and analytics (macronutrients, micronutrients, calories, specific foods and supplements, the culture of quantification), and their authors often underwent the same training as their ‘mainstream’ counterparts. By following the advice offered by these books, readers break out of the complacency towards ill-health imposed on them by ‘modernity’ and become part of a reactionary kinship. In these works, health itself became a rational, measurable and quantifiable endeavour. More significantly, though, it became a commodity, leading to the explosion of the supplement industry and ‘Healthmania’ in general.

 

 

[1] For instance: http://www.informationisbeautiful.net/visualizations/snake-oil-supplements/

[2] This notion goes beyond the 1980s term coined by Robert Crawford, ‘healthism’, which is: ‘the preoccupation with personal health as a primary – often the primary – focus for the definition and achievement of well-being; a goal which is to be attained primarily through the modification of life styles’.

[3] R.C. Atkins, Dr Atkins’ Diet Revolution (New York, Bantam, 1972), pp. 3-5.

[4] R. Haas, (British adaptation by A. Cochrane), Eat To Win (Middlesex, Penguin, 1985), p. 157.

Life Begins at 40

Mark Jackson

It became commonplace during the twentieth century to regard the age of forty (or more recently fifty) as a tipping point in the life cycle, a moment when many people could begin to shed the financial, domestic, parental and occupational worries of youth and middle age and look forward to a more serene and comfortable period of their lives.  The belief that life after forty might present opportunities for, rather than obstacles to, happiness was given legitimacy by a post-Second World War culture that considered increased consumption and economic growth, at least in the West, as the primary route to self-realisation and emotional fulfilment.  Made possible partly by increased life expectancy, the crisis of middle age was recast as an epiphany, a moment of temporary imbalance that was necessary if age-related cognitive and economic decline were to be effectively reversed and individuals inspired to achieve the highest levels of personal satisfaction and well-being.

The origins of late-twentieth century convictions that life begins at forty were, however, less emancipatory than we might imagine.  Rather, they were rooted in reactionary attempts to preserve political stability, economic productivity and family unity.  Four days after the American army entered the First World War in 1917, Mrs Theodore Parsons was interviewed by The Pittsburgh Press, a local daily newspaper.  The author of manuals that encouraged children and young women in particular to embrace physical education as a means of cultivating intellectual advancement, health and beauty, Parsons tied her educational creed to the urgent need for women to train themselves for the `duties that war time may bring’.  `The mothers of a nation’, she argued, `are its supreme asset and as civilization advances it will be as natural for a nation to maintain its mothers as it is to-day to support its army and navy.’  Parsons’ conviction that women, as well as men, were critical to the war effort was not restricted to the young, but extended to the middle aged and elderly.

‘Most old age is premature, and attention to diet and exercise would enable men and women to live a great deal longer than they do to-day.  The best part of a woman’s life begins at forty.’ [1]

Parsons’ words constitute the first modern reference to forty as a transitional age in the fight for freedom and happiness.  But her aim was only incidentally the promotion of individual well-being.  More important for Parsons and her contemporaries were the social and military benefits of healthy ageing.  The notion that life, rather than death, began at forty was taken up most prominently by Walter B. Pitkin, Professor in Journalism at Columbia University.  Pitkin situated his self-help dogma in the context of an emergent American dream.  Science and technology had increased life expectancy, reduced the need for heavy labour in the home and workplace, and made leisure a genuine possibility for many Americans. ‘At forty’, he promised in 1932, `you will be wiser and happier than at thirty.  At fifty you will be clearer, steadier, and surer than at forty.’ [2]  Of course, the collective benefits of enhanced individual health and wealth were evident: greater consumption of services and goods would increase productivity and boost the American economy, fuelling further technological development and economic growth in a cycle of expansion.  Couched in capitalist terms, here perhaps were the seeds of the narcissistic veneration of the midlife transition that triumphed after the war.

In Britain, inter-war attention to the change of life in men and women around the age of forty adopted a different, but no less reactionary, complexion.  During the 1930s, Marie Stopes addressed the effects of ageing on married couples.  In Enduring Passion, first published in 1928, and Change of Life in Men and Women, published eight years later, Stopes questioned the inevitable decline in sexual health and satisfaction that appeared to beset previously passionate couples.  The notion of a crisis around menopause (or the equivalent decline in male virility), she argued, had been exaggerated by popular medical writers.  By preparing more effectively for the challenges generated by the unfolding stages of life, it was possible to prevent what many people regarded as the inevitable conversion of ‘happy lovers’ into ‘drabby tolerant married couples’. [3]  Stopes’ formula for surviving the crisis of middle age became one of the foundational principles of the marriage guidance movement, a development that originated in the late 1930s but subsequently emerged as one of the key features of a post-war settlement intended to restore the stability of the nuclear family.

It did not take long after the Second World War for these conservative dreams of social coherence and domestic stability to be destabilised, but covertly reinforced, by the rampant individualism of the marketplace. ‘Forty-phobia’ became an advertising slogan for companies selling nerve tonics that promised to help patients ‘feel younger as they grow older’. [4]  For men in particular, the purchase of household goods, cars, holidays and suburban houses was promoted as a means of blunting the frustrations created by the contrast between the excitement of conflict and the boredom of corporate and domestic life, a phenomenon neatly captured in the post-war fiction of Sloan Wilson and Joseph Heller. [5]  Self-fulfilment and its spiritual benefits became the mantra of popular advice books aimed at the disaffected, but affluent, middle classes struggling to redefine and relocate themselves in a changing world.

One of the fruits of this process was the creation of a novel expression of self-discovery: the midlife crisis.  Coined in 1965 by the Canadian psychoanalyst and social theorist Elliott Jaques to describe psychodynamic collapse triggered by fear of impending death, the term rapidly came to define midlife madness, a powerful tool for both explaining and legitimating the search for personal fulfilment during the middle years. [6]  The cost in terms of other people’s happiness was regarded by many as less important than the self-realisation, emotional emancipation and spiritual awakening that the crisis supposedly made possible.  But `la crise de la quarantaine’, as it became known in France, was not embraced as a means of enabling healthy ageing in women, as Parsons had anticipated, or as a pathway to a more contented and durable marriage, as Stopes had hoped.  Shaped by late twentieth-century obsessions with the autonomous individual and the gospel of consumption, the notion that life can begin again at forty has been used to reinvigorate a Western capitalist economy that can only be sustained by prolonging productivity and encouraging spending across the whole life course.

Notes

[1] `Now is the time for all women to train for the duties that war time may bring’, The Pittsburgh Press, Tuesday 10 April 1917, p. 20; Mrs Theodore Parsons, Brain Culture through Scientific Body Building, (Chicago, American School of Mental and Physical Development, 1912).

[2] Walter B. Pitkin, Life Begins at Forty, (New York, McGraw-Hill, 1932), pp. 174-5.

[3] Marie Carmichael Stopes, Enduring Passion, (London, Putnam, 1928); Marie Carmichael Stopes, Change of Life in Men and Women, (London, Putnam, 1936).

[4] An early example of this advertising strategy is `Forty-phobia (fear of the forties)’, The Times, 28 April 1938, p. 19.

[5] Sloan Wilson, The Man in the Gray Flannel Suit, (Cambridge, Mass., Da Capo Press, [1955] 2002); Joseph Heller, Something Happened, (London, Jonathan Cape, [1966] 1974).

[6] The midlife crisis is the focus of on-going research, the results of which will appear in Mark Jackson, The Midlife Crisis: A History, (London, Reaktion, forthcoming).

 

How not to kill your argument: looking after men without scapegoating women

Fred Cooper

On July 2nd 2016, the Guardian online newspaper published an anonymous open letter, written by a husband to the “wife who won’t get a job while I work myself to death.” Subtitled “the letter you always wanted to write”, it documented the sense of betrayal felt by a successful lawyer who had come to realise that the woman he once thought of as loyal and kind was “OK with my working myself to death at a high-stress career that I increasingly hate, as long as [she doesn’t] have to return to the workforce.” His wife, he explained, occupied her time with volunteering and leisure while his health deteriorated, he aged rapidly and prematurely, and he felt increasingly “used and alone.” Setting out his position in egalitarian terms, he emphasised his need for real partnership in marriage, and cited the example she was setting to their daughter, who he wanted to be “never as dependent on a man as you are on me.”

The appropriated language of equality barely masked a deep misogyny, a profound anger at the high-octane demands of modern employment misdirected away from workplace culture and the cult of productivity and folded into, in Elaine Showalter’s words, the “fiction that women push the buttons and call the shots.” The wife and her friends were caricatured as self-centred, happy to gripe but not to help: “You all complain about various financial pressures, but never once consider, at least audibly, that you could alleviate the stress on both your budgets and your burnt-out husbands by earning some money yourselves.”

Post-war concerns about the role of women in activating or exacerbating men’s executive stress, here, were re-imagined for a cultural setting in which housewives rather than working mothers were anachronistic, subversive, and dangerous. The anonymous letter bore a striking resemblance to the popular medical writing of a 1960s author and practitioner, Kenneth C. Hutchin, in all but the precise nature of the advice. Each addressed women directly, whether through the device of the disgruntled husband or that of the fatherly doctor. An expert on the illnesses of businessmen, hypertension, and heart disease, Hutchin published a book in 1962 entitled How Not to Kill Your Husband, an expanded version of a 1960 article in Family Doctor, ‘How to keep your husband alive.’

Underneath an illustration of a woman crouched behind her husband’s armchair with a revolver and a bottle of poison, “How to keep your husband alive” explained that, while “the number of women who set out to kill their husbands is surprisingly small, a great many wives could not polish them off better if they tried.” Women, Hutchin argued, rarely realised how delicate their husbands were. Making use of an infant language of risk, he depicted men in their forties and fifties with non-manual, highly responsible jobs as a fragile population, vulnerable to heart attack or stroke. In this instance, wives were not letting their husbands die by omission, but could actively provoke a fatal rupture in the soft, overwrought, morbid bodies fed by high-starch diets and enervated by sedentary labour.

For Hutchin, male health was compromised by two of the central features of post-war companionate marriage; a growing tendency for men to engage in household tasks, and a softened system of patriarchal authority in marital relationships. He illustrated his first concern with two short vignettes, each demonstrating the hidden threat of domestic contribution. In both instances, women pestered their husbands into working when they should be relaxing, with one man becoming “unable to move because of a vice-like pain in his chest” and the other undergoing coronary thrombosis later in the evening. “Dear ladies”, Hutchin concluded, “do stop finding little jobs for your husbands to do.” Work around the home should be “decently accomplished during the day”, whether by tradesmen or by the wife herself, “not waiting there, a reproach and a menace to the tired master of the house.”

Erosion of male leisure went hand in hand with erosion of male authority. Victorian wives who submitted to the “law and wisdom” of their husband’s orders were “probably happier than the modern wife” who distrusted him and disputed his word. Anger and frustration, Hutchin argued, “are dangerous emotions for tired middle-aged men with a poor coronary circulation. The wife who constantly annoys her husband does so at her own risk and his.”

As Margaret Mead observed in 1954, the implication of women’s behaviour in the illnesses of others frequently contained a “subtle form of antifeminism.” Hutchin’s demand for more support and deference in the home and the anonymous husband’s plea for financial contribution and independence indicate a troubling continuity in the identification and criticism of “pathological” female behaviour, a shared willingness to point the finger at women which destabilises the progress implied by the differences between the two critiques.

Feminists from the second half of the twentieth century onwards have made convincing cases for the transformation of identity and experience through work, but these arguments have been at their best when they put women’s needs in centre stage. The importance of feminism as a cultural and political force is sadly overlooked in present debates about preventive medicine, particularly psychiatry; we are far more likely to connect distress and disorder to biological processes than to interrogate the entrenched, systematic, structural ways in which women are driven or drawn into illness. At the same time, we are falling into ways of speaking about men’s distress which demonise and denigrate women, reducing the complexity of their feelings and actions to an imagined impact on male health and male self. There are nuanced and sensitive explorations of, for example, men’s depression, alcoholism, and suicide – look no further than Ali Haggett’s recent research – which demonstrate that these arguments can be had, persuasively and intelligently, without negating the all-too-fragile gains made in public discourses surrounding women. Anonymous husband, I hope you’ll try.

 

Kenneth C. Hutchin, ‘How to keep your husband alive’, Family Doctor, Vol. 10, No.3 (March, 1960), pp. 154-155

Kenneth C. Hutchin, How Not to Kill Your Husband (London, 1962)

Margaret Mead, ‘Some theoretical considerations on the problem of mother-child separation’, American Journal of Orthopsychiatry, Vol. XXIV, No. 3 (July 1954), pp. 471-483

Elaine Showalter, The Female Malady: Women, Madness and English Culture, 1830-1980 (New York, 1987)

 

The Costs of Balance? Diet and Regulation in Twentieth Century Diabetes Care

For the past few months, I have been working on an article about dietary management and patienthood in twentieth-century diabetes care. One key theme that has recurred throughout this research might be summed up by the question: “balance at what cost?” This is a question that patients and practitioners have had to ask consistently (if implicitly) since the 1920s, and I believe its history may have something to contribute to present discussions.

Balance: A Bit of Everything or Denial of Some Things?

Whether in popular culture or medical discourse, achieving and maintaining balance has frequently been discussed in overwhelmingly positive terms.

For patients with diabetes, medical advice for much of the twentieth and twenty-first centuries has been that metabolic balance is potentially life-saving.[i] That is, glycaemic control has been seen as the best safeguard against a range of short-term problems, as well as the means to prevent the onset of – or reduce the likelihood of developing – devastating long-term complications.

Yet, whilst we might associate balance with phrases like “all things in moderation”, balance in diabetes care has historically been construed in terms of restriction and regulation. Dietary management, that is, has often entailed either removing, carefully controlling, or substantially increasing the intake of certain foodstuffs. Amongst other things, hyper-vigilance and the inability to have a little of everything have been two of the costs associated with balance for individuals (and often their familial and social relations).

Changing Dietary Regulations

This is not to say that patterns of dietary prescription have remained static over time, especially for patients who required insulin. (Weight reduction through calorie control remained a constant imperative for overweight patients not using insulin.)

For instance, in the first few decades of the twentieth century, doctors discussed metabolic balance in relation to a careful restriction of carbohydrates. In fact, in some cases this even extended to banning certain foodstuffs. Lists of “forbidden foods” were often included on dietary sheets, and one doctor even declared that he had a “feeling that it is better for the patient to forget about the taste of bread altogether than be exposed to the temptation when he takes a fraction.”[ii]

These strictures altered considerably over the following decades. During the 1930s and 1940s, doctors began to prescribe higher carbohydrate allowances, and in later decades they dropped careful fat and protein regulation to reflect more “normal” dietary patterns. After two decades’ worth of debate about the role of “free diets” in care, higher fibre intakes became a feature of discussions in the 1970s and 1980s, following the testing of theories designed in colonial sites of research. By century’s end, programmes for “normalisation” slowly began to move to the fore, resulting in strategies like DAFNE.[iii] Gradually, “forbidden” foods became controlled foods.

Calculating Costs

Regardless of these moving goalposts, patients and their families rarely adhered fully to their medical instructions. For some, this was simply a case of not being able to afford the diets. The financial cost of diabetes diets regularly ran considerably higher than average expenditure on non-medical diets.

For other patients, the logistics of careful management were just as much of a barrier. Surveillance – with its constant vigilance, measurement, inscription and assessment – required more than considerable energy. It also demanded action, and access to a whole material culture. (Pens, records, dietary sheets, scales, jugs, and urine testing equipment.) The time and materials required were rarely accessible in the workplace, and regulating an individual’s diet often had unacceptable consequences for spouses, families, employers, and friends. Indeed, for patients regulating their diet in public, one result might be to mark them out, risking negative comments, social awkwardness and anxieties, as well as problems at social events, such as meals or celebrations. Under such circumstances, patients were either physically unable to manage their diets, pressured into certain actions, or encouraged to weigh their priorities and find (quite rationally) that other relationships and activities were worth the possible costs.

This is also to say nothing of the extent to which prescriptions were adapted for cultural, class-based, or dietary preferences. Or, indeed, of palatability. As one patient (who was also a doctor) challenged readers in The Lancet: “you try getting through 3-4oz butter with 1oz of bread to spread it on!”[iv]

Patients and Costing Balance

What I have taken away from this research is that many of the costs that follow the pursuit of balance are deeply embedded into social and cultural life, and can’t all be altered by educational, motivational, or bureaucratic programmes and imperatives.[v] The challenges of the past, in other words, are not dead, but are in many respects still with us.

In fact, we might ask whether there is a broader discussion to be had, one concerning the values attached to balance, and the extent to which reasons for patients not following advice should be conceived as “problems”. In many respects, this has been the starting point for many strategies designed to investigate and improve “compliance” and “adherence” in the past four decades.[vi]

To be sure, I believe that patient education is vital. Equally, co-operation between health care teams, patients and families is often integral to effective care, just as glycaemic control can be protective of long-term health. (Though not a guarantee.)

Nonetheless, at some point it will be necessary to consider the limits to these strategies, and – to a certain degree – the desirability of consistent attempts to problematize and alter behaviour. For those of us without the condition, it is also worth thinking clearly about the costs involved in management before rushing to moral judgement. One patient (again a practitioner) perhaps put it best when writing just after the Second World War:

“Each meal should be an elegant satisfaction of appetite rather than a problem in arithmetic and a trial of self-abnegation”.[vii]

Things have changed significantly since 1946, but the daily challenges to patients are still considerable, and the mental, emotional and physical effects of management remain.

Notes

[i] Diabetes UK, ‘Self-Monitoring of Blood Glucose Levels For Adults With Type 2 Diabetes’, Diabetes UK Online, 2016. Available at: https://www.diabetes.org.uk/About_us/What-we-say/Diagnosis-ongoing-management-monitoring/Self-monitoring-of-blood-glucose-levels-for-adults-with-Type-2-diabetes/. Accessed on: 25th March 2016.

[ii] George Graham, ‘On the Present Position of Insulin Therapy’, The Lancet, Vol.204, 1924, 1265-6.

[iii] Or Dose Adjusted For Normal Eating: Anon, ‘DAFNE’, DAFNE Online, 2016. Available at: http://www.dafne.uk.com. Accessed on 25th March 2016.

[iv] Anon, ‘Disabilities: 21. Diabetes’, The Lancet, Vol. 253, 1949, p.116.

[v] Which have been popular solutions of the recent past: NHS England, Action on Diabetes, (January 2014). Online Document. Accessed on 16 June 2015. Available at: https://www.england.nhs.uk/ourwork/qual-clin-lead/diabetes-prevention/action-for-diabetes/. With some notable successes: R.P. Narayanan, J.M. Mason, J. Taylor, A.F. Long, T. Gambling, J.P. New, J.M. Gibson, R.J. Young, ‘Telemedicine to improve glycaemic control: 3-year results from the Pro-Active Call Centre Treatment Support (PACCTS) trial’, Diabetic Medicine, 2012. Available online: http://0-onlinelibrary.wiley.com.lib.exeter.ac.uk/enhanced/doi/10.1111/j.1464-5491.2011.03352.x.

[vi] Jeremy A. Greene, ‘Therapeutic Infidelities: ‘Noncompliance’ Enters the Medical Literature, 1955-1975’, Social History of Medicine, Vol.17, 2004, 327-43.

[vii] C.C. Forsyth, T.W.G. Kinnear, and D.M. Dunlop, ‘Diet in Diabetes’, BMJ, Vol.1, 1951, p.1099.

A Question of ‘Public Engagement’

Ayesha Nathoo

 

Over the last year, I have been involved in a number of public events related to my work on the history of therapeutic relaxation. These have included talks, displays and practical workshops at the “Being Human” festival of the humanities, the “Secret Garden Party” and the “Wilderness festival” (in collaboration with Guerilla Science and NOW live events), a “Friday Late Spectacular” and a discussion evening on “Rest and Relaxation in the Modern World” as part of Hubbub, at the Wellcome Collection, London.

 

 

The aims, scale, content and audiences varied for each of these events, but together they have left me reappraising my role as a historian, and reflecting on notions of expertise in such public forums. The central topics which comprise my research – ‘rest’, ‘balance’, ‘stress’ and ‘relaxation’ – affect us all, and many audience members were drawn to the events because of pre-existing interests in these matters. Others stumbled across the events by chance with little idea of what to expect or gain from them. At the music festivals, the historical material distinguished my workshops from the myriad other practice-based workshops on offer (such as yoga, mindfulness and massage); elsewhere the practical content differentiated my work from other more traditional academic contributions.

 

I am particularly interested in understanding relaxation as a taught practice, and the material, audio and visual culture that has furthered its development over the last hundred years in Western, secular, therapeutic contexts. Aural instruction, given in classes or on cassettes, was a key method for teaching and learning relaxation in the post-war decades and is central to understanding the growth of such practices in both clinical and domestic settings. As well as the instructions themselves, the tone of voice, pace, pauses, and type of medium would have affected how relaxation was understood, distributed and practised, so I have been keen to track down and share some of the original audio recordings to better understand these experiential and pedagogical aspects. Where these have been unavailable, I have instead endeavoured to recreate the ways in which different protagonists have taught relaxation, piecing together printed publications, archival material and oral historical resources to do so.

 

Many of those who participated in the workshops were curious to learn more about the techniques that I discussed – such as yoga, meditation or autogenic training – and their relationship to health and wellbeing. Yet as I was presenting the material primarily as a historian, rather than as a practitioner or advocate of any particular therapeutic practice, some unexpected tensions seemed to arise. Whilst the historical material inspired much interest, most centrally I found that people wanted to evaluate the specific techniques: What was going to work best for them for particular ailments or for general wellbeing? What is the best strategy for alleviating anxiety or chronic pain? Would I recommend relaxation for childbirth? Did I have copies of relaxation instructions that they could take away? Why was I talking about Progressive Muscle Relaxation, one lady asked, when the Alexander Technique (which she had taught for 20 years) was far superior? Was mindfulness meditation really a form of relaxation? Was it best to concentrate on relaxing the mind or the body? What is the evidence base for therapeutic relaxation? Why bother with daily relaxation training if there is a more accessible pharmaceutical option?

 

Although comparable questions have partly spurred my own interest in this research area, speaking as a historian I have felt uneasy about offering responses. The short, practical, personal questions are not compatible with in-depth answers that address broader medical, social and political contexts, such as the rise of individualism and the mass media, and changes in healthcare, lifestyle and biomedical models. Yet that is what has created and shaped successive demands for and critiques of therapeutic relaxation; contemporary concerns and understandings derive from these past contexts. This is the long and complex narrative that I am researching and whilst I certainly hope that it will have policy implications and be relevant to today’s medical landscape, I do not feel equipped to offer personal advice. Neither am I sure that I should be doing so.


I would speculate that this kind of professional reticence is the majority view amongst historians, and yet it is somewhat frustrating for interested lay audiences. If a professional researcher is investigating a particular subject, why should they not state their opinions, based on the knowledge gained from the research? I have come across this at various points during past research, on topics ranging from media coverage of the possible link between autism and the MMR vaccination to organ transplantation and donation. ‘Should I give my child the vaccination?’, mothers repeatedly asked me. ‘Were there any reasons not to sign the organ register?’ ‘Did I think there should be an opt-out clause to increase donation rates?’ It is not that I had not given these matters enough thought – I had mulled over them extensively – but I questioned whether it was my role as a historian to influence other people’s present-day decisions authoritatively, certainly without the time and space to substantiate my points of view. Were I to offer such views, the aim would not be to give a ‘balanced’ view in the sense of ‘objectively’ presenting a full range of arguments for and against.


The personal is the historical: knowledge and memories of the past shape views and actions for the future. A historian’s personal stance can therefore generally be inferred from their authored work, amongst the layers of interpretation and the selection of sources. Perhaps, then, scholars should meet the challenge of articulating their own views more explicitly in public contexts, where audience members often lead conversations and set agendas, and where the boundaries of expertise are fluid. As ‘public engagement’ becomes an increasingly significant part of academic life, it seems timely and important to open up these discussions.

‘On Balance: Lifestyle, Mental Health and Wellbeing’: Musings on Multidisciplinarity, from a Historian.


Ali Haggett

The first of three major conferences to be held in conjunction with the Lifestyle, Health and Disease project took place on the 25th and 26th of June at the University of Exeter’s Streatham Campus. Focusing broadly on the strand of research concerned with mental health and wellbeing, the remit of the conference was to explore the ways in which changing notions of ‘balance’ have been used to understand the causes of mental illness; to rationalise new approaches to its treatment; and to validate advice relating to balance in work and family life. Drawing on a range of approaches and methodologies, the multidisciplinary conference attracted scholars from Britain and the United States with diverse backgrounds, including history, anthropology, psychiatry, psychology and clinical medicine. On the evening of the 24th of June, as a prelude to the event, we hosted a public panel debate on the topic of ‘Defeating Depression: What Hope?’ at the Royal Albert Memorial Museum in Exeter. A photo gallery and a summary of the evening can be found on the Exeter Blog.


With my research still at a formative stage, I hoped that the contributions from other scholars might provoke new lines of enquiry, or stimulate interesting alternative approaches to our work. One of the questions I am particularly interested in is why the concept of balance in mental health and wellbeing has become influential at certain points in recent history. As the conference progressed, and with the public panel event also in mind, I found myself wondering what a future historian might make of the contemporary themes and concerns that emerged. It struck me that many of the anxieties articulated by non-historians were not new, but had surfaced at regular junctures in modern history. At the heart of a number of papers, and evident in the contributions to the public debate, was a palpable dissatisfaction with the status quo – with ‘modern’ and perhaps ‘unbalanced’ ways of living and their effects on health. These concerns are reminiscent of those put forward much earlier, during the early and mid twentieth century, by proponents of the social medicine movement, who were critical of rising consumerism and the breakdown of traditional values and kinship ties, and who were keen to reduce the burden of sickness by pressing for social improvements.[1] Misgivings about the current ‘neo-liberal’ political climate were also evident: in some circles, the principles of free-market individualism are held to undermine collective action, community spirit and kinship, leading to disempowerment and ultimately to ill health. Nor did the prevailing interventionist, biomedical model of medicine practised in the West escape criticism: some of the concerns raised resonated strongly with the ideas put forward by proponents of the ‘biopsychosocial’ model of medicine from the 1970s, which highlighted the importance of the social context of disease.[2] A number of papers raised important questions about the ways in which the current medical model appears to foreground the treatment of mental illness and underplay approaches to prevention. Speakers at the conference and contributors to the public debate noted, with some disquiet, that responsibility for protecting and maintaining mental health had increasingly shifted to the individual, rather than to the ‘group’, the employer or the wider socio-economic environment.


Perhaps unsurprisingly, anxieties about mental illness and the field of psychiatry that had first materialised during the 1960s, and developed within the ‘anti-psychiatry’ movement, were still conspicuous at the conference – anxieties about the classification, diagnosis and labelling of mental disorders, and unease about the misapplication of ‘norms’, rating scales and the concept of ‘risk management’ in medicine. The disquiet that emerged during the 1960s was, of course, also intimately associated with the contemporary counterculture and with broader concerns about the conformity and emptiness of the post-war world. Such ideas were evident in the literature of the period, from authors such as George Orwell, William H. Whyte, David Riesman and Herbert Marcuse, who all variously disapproved of the social and cultural changes that took place in mid-century Britain and the United States.[3]


Defined by the Oxford Dictionary as ‘a situation in which different elements are in the correct proportions’, the concept of ‘balance’ remains at the core of all debates about mental health, whether we are talking about chemical imbalance, work-life balance or cognitive and mindful approaches to human behaviour. The papers delivered at the conference by my fellow historians neatly exposed the ways in which many of the themes discussed have emerged in the past, and how they often reveal more about broader concerns, tensions and uncertainties about new ways of living and their effects on health than they do about epidemiological trends in mental illness. While historians are uniquely placed to add this important context, the joy of combining insights from several disciplines is that we can begin to redefine problems and reach solutions through new understandings. On a personal level, the contributions from other disciplines reminded me that perhaps, as an idealistic historian, I am sometimes distanced from the harsh realities of clinical practice. Collectively, the papers also prompted me to think about new ways of conceptualising and measuring what is ‘balanced’ in life and in health – and to question the ways in which balance is somehow taken to be inherently desirable, or essential. There is no doubt that the global burden of mental ill health has become one of the most pressing social and medical problems of our time, and overcoming the challenges it poses will require the knowledge of more than one discipline. As scholars engaged in research into mental health and wellbeing, we are all, in different ways, with different approaches and often with different opinions, ultimately pursuing the shared goal of improving mental health and wellbeing in our society.


The conference organisers would like to thank the Wellcome Trust for supporting the conference, and the following speakers for their contributions:

Professor David Healy, Dr Matthew Smith, Professor Jonathan Metzl, Dr Nils Fietje, Professor Ed Watkins, Dr James Davies, Dr Ayesha Nathoo, Professor Michelle Ryan, Mr Frederick Cooper, Professor Femi Oyebode, Dr James Fallon and Dr Alex Joy.

[1] As examples: Stephen Taylor, ‘The suburban neurosis’, Lancet, 26 March 1938; and James L. Halliday, Psychosocial Medicine: A Study of the Sick Society (London: William Heinemann, 1948).

[2] See George L. Engel, ‘The need for a new medical model: a challenge for biomedicine’, Science (1977), 196, 129-36.

[3] An interesting discussion of the political and social context within which the antipsychiatry movement grew can be found in Nick Crossley, ‘R. D. Laing and the British anti-psychiatry movement: a socio-historical analysis’, Social Science and Medicine (1998), Vol. 47, No. 7, 877-89.

The Health of Pilots: Burnout, Fatigue, and Stress in Past and Present

Natasha Feiner

On 24 March 2015 a Germanwings Airbus crashed 100 kilometres northwest of Nice in the French Alps after a constant descent that began one minute after the last routine contact with air traffic control. All 144 passengers and six aircrew members were killed.

The tragic crash attracted significant media attention and it was not long before scrutiny turned to co-pilot Andreas Lubitz. German prosecutors said that they had found indications that Lubitz had concealed an illness from his employer, hiding a sick note for the day of the crash. Whilst some reports focused on Lubitz’s history of depression, others investigated ‘burnout’. Der Spiegel reporter Matthias Gebauer tweeted in March that Lubitz had been suffering from ‘burnout-syndrome’ when he took time out of pilot training in 2009.[1]

The term ‘burnout’ was coined by Herbert Freudenberger in 1974 and is still widely used in Germany (and, to a lesser extent, the UK and America) today. Symptoms include long-term exhaustion and diminished interest in work, and the condition is often assumed to be the result of chronic occupational stress.

The media discussion of burnout among pilots that followed the Germanwings crash has brought the issue of pilot health into sharp relief. Several countries have implemented new cockpit regulations, and there has been significant discussion of how pilots (and the airlines that employ them) should best deal with stress, personal problems and exhaustion. These issues have their historical antecedents in late twentieth-century discussions of ‘pilot fatigue’.

It is widely acknowledged today that commercial airline pilots are employed in one of the most stressful occupations of the modern age. Before the Second World War this issue was rarely discussed outside academic circles. Traditionally conceived by the public as heroic and superhuman, early pilots were held up as paragons of masculine strength and vigour, able to manage great responsibilities with little (if any) impact on their physical or mental health.

Although fatigue was recognised as a potential problem in the 1950s, it was not until the 1960s that the relationship between flying, fatigue and the health of pilots was first discussed in the mainstream media. Newspaper articles highlighted the stressful nature of the pilot’s job and, from the early 1970s, a series of alarmist reports described incidents of pilots falling asleep at the controls. In one such report a pilot flying over Japan was said to have “nodded off” and then woken to find the rest of his flight crew asleep:

‘In the report… the BOAC captain said that when he felt himself dozing he shook himself, looked around the flight deck and found his two co-pilots and flight engineer asleep. “I immediately called for black coffee to bring everyone round” [he said]’.[2]

The increased media interest in ‘pilot fatigue’ coincided with a period of industrial strife amongst pilots, who were experiencing radical changes not only in the type of aircraft they were asked to fly but also in management and working conditions. These issues came to the fore in 1961, when the airline BEA released its summer flying schedules. The proposed schedules were intensive and many BEA pilots questioned their implications for safety. Long duty periods and inadequate rest breaks would, it was argued, cause dangerous fatigue that might increase the likelihood of accidents.

BEA relented and allowed an investigation of ‘pilot fatigue’. Carried out by the aviation medicine physician H. P. Ruffell Smith, the investigation measured flight time limitations using a system of points rather than the traditional system of hours. The subsequent report suggested that BEA pilots should fly no more than 18 points per day, with extra points assigned to especially stressful or fatiguing operations, such as take-off and landing. Ruffell Smith’s report was never published and BEA did not enforce his recommendations. The problem of ‘pilot fatigue’ was not solved.

In the years that followed, a number of high-profile air disasters occurred, many of which were later attributed to ‘pilot fatigue’. In 1966 a Britannia plane crashed near Ljubljana, Yugoslavia, killing 98 people. One year later another plane crashed, this time in Stockport, killing 72 people. Then, in 1972, a BEA Trident crashed in Staines, killing 118 people. The Trident crash, in particular, caught media attention because the pilot in charge of the plane, Stanley Key, had made ‘numerous complaints’ about the length of the working day prior to his death.[3]

As a result, in 1972 the pilots’ union BALPA revived its campaign to reduce working hours, shifting its focus to the dangers that ‘pilot fatigue’ posed to passengers. By emphasising these potential dangers, BALPA was able to convince airlines to carry out a further investigation into flight time limitations and pilot workload. Based on the results of the investigation, in 1975 the Civil Aviation Authority published strict regulations on flight times with the aim of avoiding ‘excessive fatigue’.[4]

Whilst the problem of ‘pilot fatigue’ did not come to a neat conclusion in 1975 (BALPA continues to campaign on the issue to this day), the working conditions of pilots were drastically improved by the introduction of strict flight time limitations.[5] Such changes would arguably not have taken place without the support of the British media. The alarmist nature of newspaper reports on ‘pilot fatigue’ forced airlines to take the health of pilots seriously, for fear of further frightening (and consequently losing) customers.

One would hope that the British media could play a similarly positive role today, following the Germanwings tragedy, by encouraging airlines (as well as employers more generally) to re-evaluate their mental health policies. Although many initial newspaper reports about Lubitz were (sadly) insensitive and stigmatising, several recent articles have used discussion of the Germanwings crash as a platform for encouraging greater awareness and understanding of mental health.[6] The tragedy may yet engender a re-evaluation of mental health and stress in the workplace, as the Trident crash did for ‘pilot fatigue’ in 1972.


[1] Gebauer is quoted in this news report: http://www.independent.co.uk/news/world/europe/germanwings-crash-copilot-andreas-lubitz-who-crashed-plane-suffered-burnout-says-friend-10137076.html [last accessed 23/06/15]

[2] The Times, 13 December 1972, p. 1.

[3] The Times, 29 November 1972, p. 4.

[4] The Avoidance of Excessive Fatigue in Aircrews: Requirements Document, (London, 1975), p. 1.

[5] For more information on BALPA’s current ‘Focus on Fatigue’ campaign see: http://www.balpa.org/Campaigns/Focus-on-Fatigue.aspx [last accessed 23/06/15].

[6] Alastair Campbell (‘Time to Change’ ambassador) on the stigma and taboo surrounding mental health: http://www.huffingtonpost.co.uk/alastair-campbell/andreas-lubitz-would-we-be-blaming-cancer_b_6961386.html [last accessed 23/06/15].