
Speaking from the very inside of myself

Deborah James


I don’t know whether it was the vibrant colours, the softness of the felt, or the fragility of the fabric that made Claudia Stein, Research Director of WHO Europe, reach out and take the weave from my hands last year in Newcastle upon Tyne, at Fuse’s 3rd International Conference on Knowledge Exchange in Public Health (http://www.fuse.ac.uk/events/3rdfuseinternationalkeconference/). She stood there in front of me, holding the weave, silent for a moment, looking down, and then responded: “This is the next generation of healthy children”. I was delighted. She got the metaphor.


The weave was a co-creation of the early years’ workforce from health and social care in the NHS England Cumbria and Tyne and Wear region. The project, funded by NHS England, was designed to support the health visiting workforce during the transition of their commissioning from the NHS to local authorities. In 2015, when we were designing the project, we didn’t know how the wave of austerity in public health would be felt on the ground, but we knew that the transition of commissioning, like any change, would produce both gains and losses. During times of change, we narrate our identities to help create coherence amid the chaos. With this in mind, we designed a series of large workshops across the region that focused on perspective taking. I wanted the workshops to provide a space for reflection so that each person could narrate the motivational force that led them into their respective work roles – a time to stop and think about themselves and their story line. I also wanted to find out how they used the perspectives of service users in the construction of meaning in their own work. In addition, the workshops were designed to provoke curiosity about the system leadership that created the context for the changes they were facing. Could they see the change from the perspective of the policy makers who had called for action?


About 100 people from the health and social care workforce came together in five events across the region. In the workshops, the practitioners and workers shared examples of best practice, worked in small groups on developing a shared understanding of the common purpose of their work, and fed back their ideas individually about the meaning of their work from multiple perspectives (their own, their colleagues’, and service users’). More details about the project are available here: https://www.northumbria.ac.uk/media/6588781/supporting-transition.pdf. The participants created the metaphor as they wove their line – their story line – on fabric, in with the other participants’ fabric. This was the integrated health and social care workforce: an enactment of narrative testimony and witness that drew the subjects into a collective. People, connected with other people, speaking from the inside of themselves – speaking from the first person perspective.


Like the workers weaving their story lines, the first person position is the place from which I can access my own values, beliefs and unique understandings of the world. But the first person is also the place from which I give voice to my unformed thoughts, to my not knowing and even to my chaos. It is the first person position that I need to speak from if I stand any chance of creating generative outcomes from the narrative interface with someone else. The first person position is the antithesis of the third person specialist position, the place that, as academics or professionals, we normally set up for ourselves. I think it is an act of courage to speak from the first person. Speaking from the first person reveals something of who I am, whereas speaking from the third person reveals something of what I do. As Donald Winnicott (1971) said, “‘I am’ must precede ‘I do’, otherwise ‘I do’ has no meaning.”

It strikes me that many of our interventions in the early years start from an assumption that ‘I am’ is established (in the parents, and in the child by the time he or she is ready for school). So our interventions focus on the doing: ‘do this’, or ‘do it this way’, ‘best if you don’t do that’, or perhaps, ‘why do you do that?’ If the doing has no connection to the being, there can be no generative, sustained change from it. A more developmental set of assumptions might go something like this: our identities, the “I am”, are constantly evolving; that evolution is likely to be related to our ability to exchange positional perspective with another; and this positional exchange is probably most often achieved through story-telling. If these assumptions were robust foundations for supervisory practice, they would help the workforce articulate the mechanisms of change in their work with families.

During the workshops, we used Survey Monkey to ask participants about their work. The health-visiting workforce usually found it difficult to accept other workers’ and families’ perspectives on their role. When asked what they did with families that brought about change, the overwhelming response was that they supported families. In the context of austerity, claiming and naming a supporting role with families is probably not enough to safeguard the ratio of relatively high-paid, high-skilled work roles. Critical reflexive supervision, with developmental assumptions about the identity work of the self and the other as outlined above, could help develop a refined and specific articulation of how ‘supporting’ families leads to change during families’ own times of transition. It would also help make the case for why investment in the early years’ workforce will create not only the next generation of healthy children, but also a more open society.


Between visiting Exeter last October and writing this blog, I fractured my wrist. It gave me time to stop and think. Thinking now, at a time when the truth of previously trusted positional voices is frequently questioned and challenged, I believe that we – those who may influence intervention policy and practice in the early years – have an important role in resisting the present drive towards ‘telling how’. We need to rethink and examine how we can support the evolution of the person – the person who is capable of creatively adapting to the circumstances that cause suffering, the person who is able to act with relational agency to find a way through the chaos.

Balancing Lifestyle and Health: Reflections on a Public Engagement Workshop

‘Balancing Lifestyle and Health’ was a one-day workshop run at Exeter Library by the University of Exeter and Libraries Unlimited on 11 November 2016. Here, Dr Martin Moore discusses the event, summarises its discussion, and highlights where future collaborations might lead… 

For a slightly longer reflection, please see the discussion at the Libraries Unlimited site

‘Balancing Lifestyle and Health’ was a one-day, participant-focused workshop. It brought together a mixed audience of service users, public health practitioners, health service professionals, clinical commissioners, representatives from community and non-governmental organisations, and members of the general public, all to discuss ideas about balanced lifestyles.

The event itself was divided into four one-hour mini-sessions, each framed by a short introductory talk by the organisers. Aside from these ten-minute prologues, the sessions were given over to participants to discuss the issues raised, and to record their conversations.

The programme of the day is appended to this post. However, to briefly recap, session one focused on “living well with illness” – a topic covering everything from illness and identity to managing daily life. Session two began with an exploration of the successful Books on Prescription scheme, after which participants explored the relations between community and lifestyle balance. Session three (fittingly appearing after lunch, so as to alleviate any pre-meal guilt) tackled the tricky subject of the “balanced diet”. And session four asked participants what was meant by “work-life balance”, and whether any such balance was possible or even desirable.

Broadly speaking, then, the aim of the event was simultaneously to foster critical discussion and to facilitate new links between people who might not usually have sustained contact. By putting members of the general public in touch with service commissioners and public health professionals, for instance, we sought to help everyone involved come away with new perspectives, and to enable them to develop lasting collaborations.

As might be expected from an event covering quite diverse topics, the day’s discussion was wide-ranging. There were, nonetheless, a number of themes that consistently reappeared throughout the day.

Responsibility

Perhaps unsurprisingly, a number of debates and conversations repeatedly centred on the theme of responsibility and reform, with discussants asking and answering the question: “who is ultimately responsible for maintaining our health and wellbeing?”

Responses were diverse. Some participants raised the point that perhaps individuals have to be ultimately responsible. They argued, for instance, that we ultimately decide our eating habits, and that perhaps we have become too reliant on expert advice. The appropriate response across any number of spheres, they suggested, was to create much more of a cultural emphasis on self-care and self-management, and to orient our systems towards signposting and training, rather than “nannying” and treating.

On the other hand, many other participants highlighted broad structural problems that needed to be addressed. Maintaining health, and living well with illness, for instance, was deemed difficult within profit-oriented systems that only valued individuals for what they could consistently produce and monetise. Those “left-behind” and deemed “useless” suffered financially, physically and mentally, whilst those in work were also being pushed to their limits and scared to disclose health problems. Similarly, unhealthy working practices – such as long hours and poor pay – combined with changing geographies of work to threaten healthy diets and work-life balance. Unreasonable work demands had resulted in quick, individual “al desko” lunches, which restricted sociability and nutrition. Long commutes reduced the time and energy available for cooking, caring, and leisure pursuits. New working cultures and even regulations were proposed, alongside important educational programmes for children and adults, to ensure healthier diets and more balanced working and social lives.

Cultural Change and Community

Culture, sociability, and community also reappeared in other discussions.

Throughout the workshop, concerns were expressed about community breakdown and cultures of community engagement. There was little consensus about the cause of such decline. Nonetheless, discussants highlighted community and family dislocation as a possible cause of growing rates of isolation, of declining health, and of poorer eating habits. In terms of the latter, the decline of communal eating was felt to prevent the transmission of good habits and to divert attention away from the social elements of consumption. More positively, participants did praise the work of community-based organisations – such as libraries and arts groups – and felt it offered a way to combat community breakdown. How to evaluate such work in a pseudo-objective, “outcomes”-focused funding environment, though, remained a consistent concern.

Technology

Some participants did see a role for technology here. It was suggested that digital technologies had allowed new communities and new forms of social activity to thrive. Yet, it was admitted that novel technologies could also be a double-edged sword. Some felt that digital networks still contributed to the loss of “real” local community. Likewise, other participants suggested that, although central to “flexible” working, new technologies also allowed work to become a “24 hour” pursuit. On the one hand, then, new technologies could enable people to better integrate work demands with family responsibilities and leisure pursuits. On the other, emails, “smart” phones, and work laptops could also risk generating new expectations that we should always be available and “connected”.

No “One-Fits-All” Model or Solution

Of course, what worked for one person was not taken to work for another. And in some respects, this was the core message of many discussions.

In dietary terms, the ideal diet varied from person to person, and a consensus emerged that there were no universal responses that could be put into operation to produce “better” or “healthier” diets. According to many participants, such a lack of agreement could be traced to an inability to find coherent messages from various expert sources (or sauces). Such advice needed to be taken with a pinch of salt (and aromatic five-spice). (My apologies for peppering this section with so many bad pun(net)s.)

Likewise, some discussants suggested that defining what was “life” and what was “work” was difficult, and that we shouldn’t assume that less work would offer greater joy and “balance”. Life (as supposedly separate from work) could actually be very stressful and demanding. Caring duties could be financially and emotionally stressful, for instance. Equally, vocational jobs, although running the risk of exploitation by demanding employers, could also be incredibly enjoyable. Thus, participants proposed, no single model was preferable, and a holistic approach was needed to ensure we could tailor diets and working lives to our own needs and preferences.

Problematics of Language

And, in many ways, these final discussions brought out one last recurrent theme: the problematics of language. Participants consistently questioned phrases like “balance”, “moderation”, and “work-life”, asking who defined such terms and how. This desire for a new – or at least a clearer – language also appeared in discussions of “living well” and “illness”. Both these terms came in for questioning, and it was suggested that talking about “illness” could be misleading. Similarly, the way in which labels and language could shape a sense of self was raised. For some discussants, diagnoses and labels opened up access to resources, provided a sense of solid identity, and helped explain what was wrong (and thus alleviated anxiety and self-recrimination). For others, defining oneself in terms of a disease or illness – or being defined in such a way by others – could restrict employment or social activity, and trap people in stigma. Finding new ways to talk about these issues was considered crucial.

Where Next?

As can be seen even from the preceding summary, the event contained a considerable amount of discussion, agreement and debate. And if there was one criticism of the day, it was that we tried to do too much, or at least had to curb discussions that could have filled whole days in themselves.

In spite – or possibly because – of this, the event proved to be extremely well received, and we managed to fulfil our two broad briefs for the day.

However, “Balancing Lifestyle and Health” was only ever intended as an event to lay foundations.

We will soon be making contact with participants to see what sort of events they would like to see in the coming months and years. For ourselves, we believe that running half-day workshops on each of the workshop themes – diet, community, work-life, and living well with illness – would be productive. We would also like to move beyond the city, into Devon more broadly, and perhaps to think in terms of more concrete policy and workplace solutions.

For now, however, we would like to thank everyone who contributed to the day. As organisers we found it incredibly enlightening and invigorating. And we very much look forward to running further events with you in the future.

‘Let thy food become thy medicine’: healthy diets and supplements in English language self-help books 1950-2000



In this post, Nicos Kefalas (a PhD student with the Balance project) discusses his doctoral research into healthy eating and supplementation. Here he considers why self-help books about diet became so popular in the post-war period, and examines the ways in which this literature deployed linguistic devices to construct readers as “empowered” agents of healthy lifestyles.

In recent years, it seems that not a day passes by without public debate about healthy eating and supplements. Thousands of magazines, newspapers, websites, YouTube videos and blogs are dedicated to issues revolving around food, diet, exercise, supplementation and health. Popular discussants of healthy diets even make use of data from randomised controlled trials (RCTs), population studies, meta-studies and large trials, and translate such knowledge into easily digestible graphics on various foods and supplements to demonstrate whether or not they are truly good for overall health, or to weed out products that could be considered ‘snake oil’.[1]


My PhD examines the historical emergence and development of healthy eating, dietary advice and supplements to provide a better understanding of when, how, why and by whom ‘Healthmania’ was promoted.[2] ‘Healthmania’ is the fascination of various institutions of the West with healthy diets, foods and supplements, and the concomitant production of advice and products to help individuals attain better lifestyles, both independently and collectively. From the 1950s onwards, self-help literature, lifestyle magazines, newspapers, advertisers, science and the state accepted and promoted certain foods, diets and supplements for a number of reasons. Building upon a widespread valorisation of science on the one hand, and the fascination of the media and public with slim bodies and healthy lifestyles on the other, leading scientists, public figures and state bodies were all motivated by a mixture of personal ambition and broader political aims to promote healthy eating and supplementation.


Following historical work on Galenic medicine, food and health that has centred on early-modern herbals, manuals and cookery books, I am currently examining English language self-help books published between 1950 and 2000 which concentrate on nutrition, diets, food and health.


‘HEALTHMANIA’ IN SELF-HELP BOOKS

Self-help authors made individual agency central to their efforts to popularise ‘optimal’ and ‘ideal’ diets. The language and advice in self-help books emphasise and promote the notion that readers have the power to manage their own health and a moral obligation to do so. The books heavily imply that the proactive, educated, well-read and well-informed individual can promote their own good health. Readers are initiated into a self-narrative in which they do not need or want the help of doctors and expensive or ineffective treatments. This newfound agency over their own health is reinforced by the growth of a capitalism able to provide the foods and supplements necessary to sustain a mentality of ‘buying a better body’. Indeed, not only does the self-help literature direct its readers to the nearest health food stores, pharmacies and juice bars, but it also urges readers to consume specific foods (often exotic fruit), drinks and supplements. Some authors of self-help guides even take a further step. Dr Atkins, for example, urges his readers to monitor their progress using a home urine-chemistry kit, encouraging the purchase of sophisticated surveillance technology alongside expensive dietary consumables and further literature.

One of the reasons behind the popularity of the self-help industry lies in the fact that it stood out from mainstream science and advice for better health, which often consisted of the mantra ‘eat less, exercise more’. Whilst self-help books sought to empower readers to make positive changes in the present and future, they nonetheless removed the blame for ill-health and obesity from their readers, attributing responsibility for past failures instead to the food industry, ‘modern foods’, fast foods, mainstream science, and modern lifestyles. Each reader was thus given the chance to feel like a revolutionary and an ‘enlightened’ consumer. This narrative was facilitated through the consistent use of motivational language, tone, expressions and symbolism. Taking Atkins’ book as an example once more, one can see how phraseology and capitalisation urged his readers to stop being bystanders to the damage done to their bodies by modern foods (namely, in this case, carbohydrates) and start defending their health. Such phrases included ‘A REVOLUTION IN OUR DIET THINKING IS LONG OVERDUE’ and ‘WE ARE THE VICTIMS OF CARBOHYDRATE POISONING’.[3]


People turned to self-help books because they addressed the contemporary anxieties of the everyday individual. The fear of hunger, for instance, is explicitly addressed by the genre as a whole. These authors knew that people fear hunger because, when hungry, they lose control of their bodies and compromise their diets with filling – though not necessarily healthy – foods. In the short term, authors reassured readers, this was not necessarily an issue. As Robert Haas advised his readers: ‘There is no point in worrying about cheating occasionally’. He noted, however, that such infidelities to his rules were only acceptable if ‘you stick to the Peak Performance Programme in the long run’.[4]


Yet, alongside questions of deprivation, self-help authors had another, seemingly paradoxical, problem to think about: luxury and efficiency. No one wants to start a boring diet that does not bring the desired results fast enough. Produced at a time when instant gratification of nearly all desires, cravings and wants could be achieved quite simply, self-help books had to portray healthy eating as an easy process built on filling, varied, tasty and luxurious foods. For example, Atkins argued that people could lose weight on: ‘Bacon and eggs for breakfast, on heavy cream in their coffee, on mayonnaise on their salads, butter sauce on their lobster; on spareribs, roast duck, on pastrami; on my special cheesecake for dessert’.


These and many more themes in the self-help genre played a big part in the popularisation of science and the growth of ‘Healthmania’. The readers of these books were given the knowledge to become active agents of their own health. Change towards better health was elevated to an individualistic and stoic process by which readers refused to be ill, overweight or obese. The process of becoming healthier offered by the self-help genre promised less hunger and more ‘exotic’ and luxurious foods than the average diet recommended by the state and doctors. The advice in these books offered readers a way to avoid the dreaded visit to the doctor’s practice, and thus, on the surface at least, served in part to undermine the authority of orthodox medicine. However, even though these books seemed to stand against modern scientific theories and practices, self-help authors drew on contemporary scientific ideas. They offered a criticism from within the various food- and health-related professions from which they wrote, using widely accepted scientific tools and perspectives to criticise orthodox medicine and dietetics from a marginal position. Indeed, self-help books promoted their advice by using the same tools and analytics – macronutrients, micronutrients, calories, specific foods and supplements, the culture of quantification – and often their authors had undergone the same training as their ‘mainstream’ counterparts. By following the advice offered by these books, readers broke out of the complacency towards ill-health imposed on them by ‘modernity’ and became part of a reactionary kinship. In these works, health itself became a rational, measurable and quantifiable endeavour. More significantly, though, it became a commodity, leading to the explosion of the supplement industry and of ‘Healthmania’ in general.


[1] For instance: http://www.informationisbeautiful.net/visualizations/snake-oil-supplements/

[2] This notion goes beyond the term ‘healthism’, coined by Robert Crawford in 1980, which is: ‘the preoccupation with personal health as a primary – often the primary – focus for the definition and achievement of well-being; a goal which is to be attained primarily through the modification of life styles’.

[3] R.C. Atkins, Dr Atkins’ Diet Revolution (New York, Bantam, 1972), pp. 3-5.

[4] R. Haas, (British adaptation by A. Cochrane), Eat To Win (Middlesex, Penguin, 1985), p. 157.

Life Begins at 40

Mark Jackson

It became commonplace during the twentieth century to regard the age of forty (or more recently fifty) as a tipping point in the life cycle, a moment when many people could begin to shed the financial, domestic, parental and occupational worries of youth and middle age and look forward to a more serene and comfortable period of their lives.  The belief that life after forty might present opportunities for, rather than obstacles to, happiness was given legitimacy by a post-Second World War culture that considered increased consumption and economic growth, at least in the West, as the primary route to self-realisation and emotional fulfilment.  Made possible partly by increased life expectancy, the crisis of middle age was recast as an epiphany, a moment of temporary imbalance that was necessary if age-related cognitive and economic decline were to be effectively reversed and individuals inspired to achieve the highest levels of personal satisfaction and well-being.

The origins of late-twentieth century convictions that life begins at forty were, however, less emancipatory than we might imagine. Rather, they were rooted in reactionary attempts to preserve political stability, economic productivity and family unity. Four days after the American army entered the First World War in 1917, Mrs Theodore Parsons was interviewed by The Pittsburgh Press, a local daily newspaper. The author of manuals that encouraged children and young women in particular to embrace physical education as a means of cultivating intellectual advancement, health and beauty, Parsons tied her educational creed to the urgent need for women to train themselves for the ‘duties that war time may bring’. ‘The mothers of a nation’, she argued, ‘are its supreme asset and as civilization advances it will be as natural for a nation to maintain its mothers as it is to-day to support its army and navy.’ Parsons’ conviction that women, as well as men, were critical to the war effort was not restricted to the young, but extended to the middle aged and elderly.

‘Most old age is premature, and attention to diet and exercise would enable men and women to live a great deal longer than they do to-day.  The best part of a woman’s life begins at forty.’ [1]

Parsons’ words constitute the first modern reference to forty as a transitional age in the fight for freedom and happiness. But her aim was only incidentally the promotion of individual well-being. More important for Parsons and her contemporaries were the social and military benefits of healthy ageing. The notion that life, rather than death, began at forty was taken up most prominently by Walter B. Pitkin, Professor in Journalism at Columbia University. Pitkin situated his self-help dogma in the context of an emergent American dream. Science and technology had increased life expectancy, reduced the need for heavy labour in the home and workplace, and made leisure a genuine possibility for many Americans. ‘At forty’, he promised in 1932, ‘you will be wiser and happier than at thirty. At fifty you will be clearer, steadier, and surer than at forty.’ [2] Of course, the collective benefits of enhanced individual health and wealth were evident: greater consumption of services and goods would increase productivity and boost the American economy, fuelling further technological development and economic growth in a cycle of expansion. Couched in capitalist terms, here perhaps were the seeds of the narcissistic veneration of the midlife transition that triumphed after the war.

In Britain, inter-war attention to the change of life in men and women around the age of forty adopted a different, but no less reactionary, complexion. During the 1930s, Marie Stopes addressed the effects of ageing on married couples. In Enduring Passion, first published in 1928, and Change of Life in Men and Women, published eight years later, Stopes questioned the inevitable decline in sexual health and satisfaction that appeared to beset previously passionate couples. The notion of a crisis around menopause (or the equivalent decline in male virility), she argued, had been exaggerated by popular medical writers. By preparing more effectively for the challenges generated by the unfolding stages of life, it was possible to prevent what many people regarded as the inevitable conversion of ‘happy lovers’ into ‘drabby tolerant married couples’. [3] Stopes’ formula for surviving the crisis of middle age became one of the foundational principles of the marriage guidance movement, a development that originated in the late 1930s but subsequently emerged as one of the key features of a post-war settlement intended to restore the stability of the nuclear family.

It did not take long after the Second World War for these conservative dreams of social coherence and domestic stability to be destabilised, but covertly reinforced, by the rampant individualism of the marketplace. ‘Forty-phobia’ became an advertising slogan for companies selling nerve tonics that promised to help patients ‘feel younger as they grow older’. [4]  For men in particular, the purchase of household goods, cars, holidays and suburban houses was promoted as a means of blunting the frustrations created by the contrast between the excitement of conflict and the boredom of corporate and domestic life, a phenomenon neatly captured in the post-war fiction of Sloan Wilson and Joseph Heller. [5]  Self-fulfilment and its spiritual benefits became the mantra of popular advice books aimed at the disaffected, but affluent, middle classes struggling to redefine and relocate themselves in a changing world.

One of the fruits of this process was the creation of a novel expression of self-discovery: the midlife crisis. Coined in 1965 by the Canadian psychoanalyst and social theorist Elliott Jaques to describe psychodynamic collapse triggered by fear of impending death, the term rapidly came to define midlife madness, a powerful tool for both explaining and legitimating the search for personal fulfilment during the middle years. [6] The cost in terms of other people’s happiness was regarded by many as less important than the self-realisation, emotional emancipation and spiritual awakening that the crisis supposedly made possible. But ‘la crise de la quarantaine’, as it became known in France, was not embraced as a means of enabling healthy ageing in women, as Parsons had anticipated, or as a pathway to a more contented and durable marriage, as Stopes had hoped. Shaped by late twentieth-century obsessions with the autonomous individual and the gospel of consumption, the notion that life can begin again at forty has been used to reinvigorate a Western capitalist economy that can only be sustained by prolonging productivity and encouraging spending across the whole life course.

Notes

[1] ‘Now is the time for all women to train for the duties that war time may bring’, The Pittsburgh Press, Tuesday 10 April 1917, p. 20; Mrs Theodore Parsons, Brain Culture through Scientific Body Building, (Chicago, American School of Mental and Physical Development, 1912).

[2] Walter B. Pitkin, Life Begins at Forty, (New York, McGraw-Hill, 1932), pp. 174-5.

[3] Marie Carmichael Stopes, Enduring Passion, (London, Putnam, 1928); Marie Carmichael Stopes, Change of Life in Men and Women, (London, Putnam, 1936).

[4] An early example of this advertising strategy is ‘Forty-phobia (fear of the forties)’, The Times, 28 April 1938, p. 19.

[5] Sloan Wilson, The Man in the Gray Flannel Suit, (Cambridge, Mass., Da Capo Press, [1955] 2002); Joseph Heller, Something Happened, (London, Jonathan Cape, [1966] 1974).

[6] The midlife crisis is the focus of on-going research, the results of which will appear in Mark Jackson, The Midlife Crisis: A History, (London, Reaktion, forthcoming).


How not to kill your argument: looking after men without scapegoating women

Fred Cooper

On July 2nd 2016, the Guardian online newspaper published an anonymous open letter, written by a husband to the “wife who won’t get a job while I work myself to death.” Subtitled “the letter you always wanted to write”, it documented the sense of betrayal felt by a successful lawyer who had come to realise that the woman he once thought of as loyal and kind was “OK with my working myself to death at a high-stress career that I increasingly hate, as long as [she doesn’t] have to return to the workforce.” His wife, he explained, occupied her time with volunteering and leisure while his health deteriorated, he aged rapidly and prematurely, and he felt increasingly “used and alone.” Setting out his position in egalitarian terms, he emphasised his need for real partnership in marriage, and cited the example she was setting to their daughter, who he wanted to be “never as dependent on a man as you are on me.”

The appropriated language of equality barely masked a deep misogyny, a profound anger at the high-octane demands of modern employment misdirected away from workplace culture and the cult of productivity and folded into, in Elaine Showalter’s words, the “fiction that women push the buttons and call the shots.” The wife and her friends were caricatured as self-centred, happy to gripe but not to help: “You all complain about various financial pressures, but never once consider, at least audibly, that you could alleviate the stress on both your budgets and your burnt-out husbands by earning some money yourselves.”

Post-war concerns about the role of women in activating or exacerbating men’s executive stress, here, were re-imagined for a cultural setting in which housewives rather than working mothers were anachronistic, subversive, and dangerous. The anonymous letter bore a striking resemblance to the popular medical writing of a 1960s author and practitioner, Kenneth C. Hutchin, in all but the precise nature of the advice. Each addressed women directly, whether through the device of the disgruntled husband or that of the fatherly doctor. An expert on the illnesses of businessmen, hypertension, and heart disease, Hutchin published a book in 1962 entitled How Not to Kill Your Husband, an expanded version of a 1960 article in Family Doctor, ‘How to keep your husband alive.’

Underneath an illustration of a woman crouched behind her husband’s armchair with a revolver and a bottle of poison, “How to keep your husband alive” explained that, while “the number of women who set out to kill their husbands is surprisingly small, a great many wives could not polish them off better if they tried.” Women, Hutchin argued, rarely realised how delicate their husbands were. Making use of an infant language of risk, he depicted men in their forties and fifties with non-manual, highly responsible jobs as a fragile population, vulnerable to heart attack or stroke. In this instance, wives were not letting their husbands die by omission, but could actively provoke a fatal rupture in the soft, overwrought, morbid bodies fed by high-starch diets and enervated by sedentary labour.

For Hutchin, male health was compromised by two of the central features of post-war companionate marriage; a growing tendency for men to engage in household tasks, and a softened system of patriarchal authority in marital relationships. He illustrated his first concern with two short vignettes, each demonstrating the hidden threat of domestic contribution. In both instances, women pestered their husbands into working when they should be relaxing, with one man becoming “unable to move because of a vice-like pain in his chest” and the other undergoing coronary thrombosis later in the evening. “Dear ladies”, Hutchin concluded, “do stop finding little jobs for your husbands to do.” Work around the home should be “decently accomplished during the day”, whether by tradesmen or by the wife herself, “not waiting there, a reproach and a menace to the tired master of the house.”

Erosion of male leisure went hand in hand with erosion of male authority. Victorian wives who submitted to the “law and wisdom” of their husband’s orders were “probably happier than the modern wife” who distrusted him and disputed his word. Anger and frustration, Hutchin argued, “are dangerous emotions for tired middle-aged men with a poor coronary circulation. The wife who constantly annoys her husband does so at her own risk and his.”

As Margaret Mead observed in 1954, the implication of women’s behaviour in the illnesses of others frequently contained a “subtle form of antifeminism.” Hutchin’s demand for more support and deference in the home and the anonymous husband’s plea for financial contribution and independence indicate a troubling continuity in the identification and criticism of “pathological” female behaviour, a shared willingness to point the finger at women which destabilises the progress implied by the differences between the two critiques.

Feminists from the second half of the twentieth century onwards have made convincing cases for the transformation of identity and experience through work, but these arguments have been at their best when they put women’s needs centre stage. The importance of feminism as a cultural and political force is sadly overlooked in present debates about preventive medicine, particularly psychiatry; we are far more likely to connect distress and disorder to biological processes than to interrogate the entrenched, systematic, structural ways in which women are driven or drawn into illness. At the same time, we are falling into ways of speaking about men’s distress which demonise and denigrate women, reducing the complexity of their feelings and actions to an imagined impact on male health and the male self. There are nuanced and sensitive explorations of, for example, men’s depression, alcoholism, and suicide – look no further than Ali Haggett’s recent research – which demonstrate that these arguments can be had, persuasively and intelligently, without negating the all-too-fragile gains made in public discourses surrounding women. Anonymous husband, I hope you’ll try.


Kenneth C. Hutchin, ‘How to keep your husband alive’, Family Doctor, Vol. 10, No.3 (March, 1960), pp. 154-155

Kenneth C. Hutchin, How Not to Kill Your Husband (London, 1962)

Margaret Mead, ‘Some theoretical considerations on the problem of mother-child separation’, American Journal of Orthopsychiatry, Vol. XXIV, No. 3 (July 1954), pp. 471-483

Elaine Showalter, The Female Malady: Women, Madness and English Culture, 1830-1980 (New York, 1987)


The Costs of Balance? Diet and Regulation in Twentieth Century Diabetes Care

For the past few months, I have been working on an article about dietary management and patienthood in twentieth-century diabetes care. One key theme that has recurred throughout this research might be summed up by the question: “balance at what cost?” This is a question that patients and practitioners have had to ask consistently (if implicitly) since the 1920s, and I believe its history may have something to contribute to present discussions.

Balance: A Bit of Everything or Denial of Some Things?

Whether in popular culture or medical discourse, achieving and maintaining balance has frequently been discussed in overwhelmingly positive terms.

For patients with diabetes, medical advice for much of the twentieth and twenty-first centuries has been that metabolic balance is potentially life-saving.[i] That is, glycaemic control has been seen as the best safeguard against a range of short-term problems, as well as the means to prevent the onset of – or reduce the likelihood of developing – devastating long-term complications.

Yet, whilst we might associate balance with phrases like “all things in moderation”, balance in diabetes care has historically been construed in terms of restriction and regulation. Dietary management, that is, has often entailed either removing, carefully controlling, or substantially increasing the intake of certain foodstuffs. Amongst other things, hyper-vigilance and the inability to have a little of everything have been two of the costs associated with balance for individuals (and often their familial and social relations).

Changing Dietary Regulations

This is not to say that patterns of dietary prescription have remained static over time, especially for patients who required insulin. (Weight reduction through calorie control remained a constant imperative for overweight patients not using insulin.)

For instance, in the first few decades of the twentieth century, doctors discussed metabolic balance in relation to a careful restriction of carbohydrates. In fact, in some cases this even extended to banning certain foodstuffs. Lists of “forbidden foods” were often included on dietary sheets, and one doctor even declared that he had a “feeling that it is better for the patient to forget about the taste of bread altogether than be exposed to the temptation when he takes a fraction.”[ii]

These strictures altered considerably in the following decades. During the 1930s and 1940s, doctors began to prescribe higher carbohydrate allowances, and in subsequent decades they dropped careful fat and protein regulation to help reflect more “normal” dietary patterns. After two decades’ worth of debate about the role of “free diets” in care, higher fibre intakes became a feature of discussions in the 1970s and 1980s, following the testing of theories designed in colonial sites of research. By century’s end, programmes for “normalisation” slowly began to move to the fore, resulting in strategies like DAFNE.[iii] Gradually, “forbidden” foods became controlled foods.

Calculating Costs

Regardless of these moving goalposts, patients and their families rarely adhered fully to their medical instructions. For some, this was simply a case of not being able to afford the diets: the financial cost of diabetic diets regularly ran considerably higher than average expenditure on non-medical diets.

For other patients, the logistics of careful management were just as much of a barrier. Surveillance – with its constant vigilance, measurement, inscription and assessment – required more than considerable energy. It also demanded action, and access to a whole material culture. (Pens, records, dietary sheets, scales, jugs, and urine testing equipment.) The time and materials required were rarely accessible in the workplace, and regulating an individual’s diet often had unacceptable consequences for spouses, families, employers, and friends. Indeed, for patients regulating their diet in public, one result might be to mark them out, risking negative comments, social awkwardness and anxieties, as well as problems at social events, such as meals or celebrations. Under such circumstances, patients were either physically unable to manage their diets, pressured into certain actions, or encouraged to weigh their priorities and find (quite rationally) that other relationships and activities were worth the possible costs.

This is also to say nothing of the extent to which prescriptions were adapted to cultural, class-based, or dietary preferences – or, indeed, of palatability. As one patient (who was also a doctor) challenged readers in The Lancet: “you try getting through 3-4oz butter with 1oz of bread to spread it on!”[iv]

Patients and Costing Balance

What I have taken away from this research is that many of the costs that follow the pursuit of balance are deeply embedded in social and cultural life, and cannot all be altered by educational, motivational, or bureaucratic programmes and imperatives.[v] The challenges of the past, in other words, are not dead, but are in many respects still with us.

In fact, we might ask whether there is a broader discussion to be had, one concerning the values attached to balance, and the extent to which reasons for patients not following advice should be conceived as “problems”. In many respects, this has been the starting point for many strategies designed to investigate and improve “compliance” and “adherence” in the past four decades.[vi]

To be sure, I believe that patient education is vital. Equally, co-operation between health care teams, patients and families is often integral to effective care, just as glycaemic control can be protective of long-term health. (Though not a guarantee.)

Nonetheless, at some point it will be necessary to consider the limits to these strategies, and – to a certain degree – the desirability of consistent attempts to problematize and alter behaviour. For those of us without the condition, it is also worth thinking clearly about the costs involved in management before rushing to moral judgement. One patient (again a practitioner) perhaps put it best when writing just after the Second World War:

“Each meal should be an elegant satisfaction of appetite rather than a problem in arithmetic and a trial of self-abnegation”.[vii]

Things have changed significantly since 1946, but the daily challenges facing patients are still considerable, and the mental, emotional and physical effects of management remain.

Notes

[i] Diabetes UK, ‘Self-Monitoring of Blood Glucose Levels For Adults With Type 2 Diabetes’, Diabetes UK Online, 2016. Available at: https://www.diabetes.org.uk/About_us/What-we-say/Diagnosis-ongoing-management-monitoring/Self-monitoring-of-blood-glucose-levels-for-adults-with-Type-2-diabetes/. Accessed on: 25th March 2016.

[ii] George Graham, ‘On the Present Position of Insulin Therapy’, The Lancet, Vol.204, 1924, 1265-6.

[iii] Or Dose Adjusted For Normal Eating: Anon, ‘DAFNE’, DAFNE Online, 2016. Available at: http://www.dafne.uk.com. Accessed on 25th March 2016.

[iv] Anon, ‘Disabilities: 21. Diabetes’, The Lancet, Vol. 253, 1949, p.116.

[v] Which have been popular solutions of the recent past: NHS England, Action on Diabetes, (January 2014). Online Document. Accessed on 16 June 2015. Available at: https://www.england.nhs.uk/ourwork/qual-clin-lead/diabetes-prevention/action-for-diabetes/. With some notable successes: R.P. Narayanan, J.M. Mason, J. Taylor, A.F. Long, T. Gambling, J.P. New, J.M. Gibson, R.J. Young, ‘Telemedicine to improve glycaemic control: 3-year results from the Pro-Active Call Centre Treatment Support (PACCTS) trial’, Diabetic Medicine, 2012. Available online: http://0-onlinelibrary.wiley.com.lib.exeter.ac.uk/enhanced/doi/10.1111/j.1464-5491.2011.03352.x.

[vi] Jeremy A. Greene, ‘Therapeutic Infidelities: ‘Noncompliance’ Enters the Medical Literature, 1955-1975’, Social History of Medicine, Vol.17, 2004, 327-43.

[vii] C.C. Forsyth, T.W.G. Kinnear, and D.M. Dunlop, ‘Diet in Diabetes’, BMJ, Vol.1, 1951, p.1099.

A Question of ‘Public Engagement’

Ayesha Nathoo


Over the last year, I have been involved in a number of public events related to my work on the history of therapeutic relaxation. These have included talks, displays and practical workshops at the “Being Human” festival of the humanities, the “Secret Garden Party” and the “Wilderness festival” (in collaboration with Guerilla Science and NOW live events), a “Friday Late Spectacular” and a discussion evening on “Rest and Relaxation in the Modern World” as part of Hubbub, at the Wellcome Collection, London.


The aims, scale, content and audiences varied for each of these events, but together they have left me reappraising my role as a historian, and reflecting on notions of expertise in such public forums. The central topics which comprise my research – ‘rest’, ‘balance’, ‘stress’ and ‘relaxation’ – affect us all, and many audience members were drawn to the events because of pre-existing interests in these matters. Others stumbled across the events by chance, with little idea of what to expect or gain from them. At the music festivals, the historical material distinguished my workshops from the myriad other practice-based workshops on offer (such as yoga, mindfulness and massage); elsewhere, the practical content differentiated my work from other, more traditional academic contributions.


I am particularly interested in understanding relaxation as a taught practice, and the material, audio and visual culture that has furthered its development over the last hundred years in Western, secular, therapeutic contexts. Aural instruction given in classes or on cassettes was a key method for teaching and learning relaxation in the post-war decades, and is central to understanding the growth of such practices in both clinical and domestic settings. As well as the instructions themselves, the tone of voice, pace, pauses, and type of medium would have affected how relaxation was understood, distributed and practised, so I have been keen to track down and share some of the original audio recordings to better understand these experiential and pedagogical aspects. Where these have been unavailable, I have instead endeavoured to recreate the ways in which different protagonists taught relaxation, piecing together printed publications, archival material and oral historical resources to do so.


Many of those who participated in the workshops were curious to learn more about the techniques that I discussed – such as yoga, meditation or autogenic training – and their relationship to health and wellbeing. Yet as I was presenting the material primarily as a historian, rather than as a practitioner or advocate of any particular therapeutic practice, some unexpected tensions seemed to arise. Whilst the historical material inspired much interest, most centrally I found that people wanted to evaluate the specific techniques: What was going to work best for them for particular ailments or for general wellbeing? What is the best strategy for alleviating anxiety or chronic pain? Would I recommend relaxation for childbirth? Did I have copies of relaxation instructions that they could take away? Why was I talking about Progressive Muscle Relaxation, one lady asked, when the Alexander Technique (which she had taught for 20 years) was far superior? Was mindfulness meditation really a form of relaxation? Was it best to concentrate on relaxing the mind or the body? What is the evidence base for therapeutic relaxation? Why bother with daily relaxation training if there is a more accessible pharmaceutical option?


Although comparable questions have partly spurred my own interest in this research area, speaking as a historian I have felt uneasy about offering responses. The short, practical, personal questions are not compatible with in-depth answers that address broader medical, social and political contexts, such as the rise of individualism and the mass media, and changes in healthcare, lifestyle and biomedical models. Yet it is precisely these contexts that have created and shaped successive demands for, and critiques of, therapeutic relaxation; contemporary concerns and understandings derive from them. This is the long and complex narrative that I am researching, and whilst I certainly hope that it will have policy implications and be relevant to today’s medical landscape, I do not feel equipped to offer personal advice. Neither am I sure that I should be doing so.


I would speculate that this kind of professional reticence is the majority view amongst historians, and yet it is somewhat frustrating for interested lay audiences. If a professional researcher is investigating a particular subject, then why should they not state their opinions based on the knowledge gained from that research? I have come across this at various other points during past research, on topics ranging from media coverage of the possible link between autism and the MMR vaccination to organ transplantation and donation. ‘Should I give my child the vaccination?’, mothers repeatedly asked me. ‘Were there any reasons not to sign the organ register?’ ‘Did I think there should be an opt-out clause to increase donation rates?’ It is not that I had not given enough thought to these matters – I had mulled over them extensively – yet I questioned my role as a historian in authoritatively influencing other people’s present-day decisions, certainly without the time and space to substantiate my points of view. Even then, the aim would not be to give a ‘balanced’ view in the sense of ‘objectively’ presenting a full range of arguments for and against.


The personal is the historical: knowledge and memories of the past shape views and actions for the future. And so a historian’s personal stance can generally be inferred from their authored work, amongst the layers of interpretation and the selection of sources. Perhaps, then, scholars should meet the challenge of more explicitly articulating their own views in public contexts, where audience members often lead conversations and set agendas, and where the boundaries of expertise are fluid. As ‘public engagement’ becomes an increasingly significant part of academic life, it seems timely and important to open up these discussions.

Coding Social Normality in Physiological Balance

Martin Moore

Every now and then, your research will turn up a source which so perfectly embodies your research interests that you can’t wait to share it with colleagues.

I had this experience recently when looking for material for my current work on the physiological balance strand of this project. Searching through the collections of the Wellcome Library, I came across this particularly fascinating video:

http://wellcomelibrary.org/player/b1665853x#?asi=0&ai=0

It was produced in 1959 in association with the Hammersmith and University College Hospitals of London, and aimed to walk a particularly thin and challenging clinical line: introducing newly diagnosed patients to the causes and dangers of diabetes, as well as covering its treatment and reassuring them of its manageability.

Of course, the video and its origins can be interpreted in any number of ways. Its hospital production could be seen as highlighting the prominent role of these institutions in the care of patients. Its introduction of clinician and dietitian could be seen as evidence of expanding hospital care teams. Or its very existence could be read as testament to yet another way in which diabetes and its care have sat at the forefront of technological and clinical innovation in British medicine.[1]

Diabetes and the Culture of Capitalism

For me, however, it seemed to lay out perfectly how social and political norms have historically been embedded – or, in the terms of theorists like Stuart Hall, encoded – within instructions for ensuring physiological balance, as well as indicating how normality has provided a mechanism for encouraging patients to follow medical advice.[2] Examples can be found even within its first minute.

The video opens, for example, with images of passengers alighting from a train, and pedestrians and shoppers passing down a high street. Over the top of these images we hear an RP-accented narrator inform viewers that diabetes is a condition that affects approximately 3 out of every 200 people in Britain, or 6 in every crowded train or street. However, the successfully “balanced diabetics”, we are informed, “go unnoticed in the crowd. They no longer suffer from diabetes, they have learned to manage the condition and they live usefully and normally with other people”. (00.00.43 – 00.01.09)

Ostensibly, the images of passengers, pedestrians and shoppers are included to ground potentially abstract figures of prevalence into familiar and concrete situations. A potentially “silent” disease with a prevalence rate of 1.5 per cent is thereby transformed into a condition that affects on average six people on a train – perhaps people viewers may know and with whom they regularly travel.
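(A quick check of the arithmetic may help here, on the assumption – mine, not the video’s – that a “crowded” train or street holds roughly 400 people: a prevalence of 3 in every 200 is 3 ÷ 200 = 1.5 per cent, and 1.5 per cent of 400 is 400 × 0.015 = 6, the narrator’s six people per train.)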

Along with the narration, though, these images also serve to represent and reinforce a particular view of normality. Pictures of well-heeled commuters, for instance, clearly link the notion of the “useful” individual to figures of the productive, though professional, worker, whilst footage of shoppers and high-street stores juxtaposes and implicitly connects normality to acts of consumption. The latter was likely a powerful image, given post-war Britain’s recent boom in consumer goods and the recent end of rationing.

Moving attention away from an undifferentiated, supposedly normal populace in this footage, and speaking to the newly diagnosed viewer, the video then promises that such lives and activities can (once again) be within reach of the diabetic if they are “well-balanced”, patently tying the importance of maintaining physiological balance to desired social normality. It is, moreover, a normality consistent with the values of professional work, consumption, and individual responsibility central to the political and economic logic of contemporary British capitalism, then characterised by a certain hybridity: a liberalisation of production and consumption in some markets, but a cultural tendency towards professional economic management and planning more generally.[3] The culture of medicine, in other words, was clearly influenced by the political, social and economic context within which it operated, as well as by emergent themes of individualism in public health medicine more broadly.[4]

Normality, Balance and Patient Discipline

The link between physiological balance, normal social lives, and following medical advice is strengthened a few minutes later in the video, after the doctor has – with the help of some animated scales – described the cause of diabetes as an imbalance between insulin, dietary sugar and “our requirements”. (00.01.27-00.02.17)

In a subsequent exchange, a dietitian seeks to stress the importance of weight reduction to achieving balance in conversation with a stereotypical “fat” diabetic (to use the video’s terms; interestingly, this type of patient is represented by a businessman from “the city”, Mr Anderson). Initially upset by his new dietary prescription, Mr Anderson tries to relate his weight and diet to his working conditions, but the dietitian interrupts these efforts and overcomes his resistance by suggesting that “this [diet] is going to alter your habits, but it won’t be the end of the world for you, and you must do it for your own sake”. (00.03.55-00.04.45). Though concerned by the extent to which his new diet might alter his activity at important business dinners, Mr Anderson is begrudgingly convinced by this argument, and attention turns to “our thin friend”, Miss Smith (00.04.45-00.04.50). Having accepted medical advice, Mr Anderson was now in a position to balance his physiology; and though this required some alteration to his dietary practice, it would benefit him and allow him to perform his broader role in society.

Future Research

In the future, I hope to follow up my interest in these representational devices with further research into their use and reception. Through oral history interviews with patients and practitioners, as well as other material such as magazines and medical journals, I hope to trace how various actors decoded these messages[5] – to ask, in many respects, what the limits of medical intervention and regulation were.

I will also look to broaden thematically into questions of gender, class and ethnicity. This video, for instance, is very much focused on patients seen at the time to occupy the Registrar General’s classes I-III (then deemed the most liable to diabetes), and on white patients, despite the presence of black individuals both in British clinics and in the video’s non-medical footage. I want to know when such material altered its boundaries in this respect, and to map this onto the changing demographics of treatment. Similarly, the video raises interesting questions about the power of gender ideals and assumptions in shaping practice; I hope to trace how such ideals affected patients, and how they changed over time.

[1] For a short and accessible overview: R.B. Tattersall, Diabetes: The Biography, (Oxford: Oxford University Press, 2009). For a more in-depth, but very engaging, view of this history in the US: Chris Feudtner, Bittersweet: Diabetes, Insulin and the Transformation of Illness, (Chapel Hill: University of North Carolina Press, 2003).

[2] For an introduction see: Stuart Hall, ‘Encoding/Decoding’, in Meenaskshi Gigi Durham and Douglas M. Kellner, Media and Cultural Studies: Keyworks, 2nd Edition, (Oxford: Blackwell, 2006), 163-73.

[3] For an introduction to debates about post-war economic policy: Neil Rollings, ‘Poor Mr Butskell: A Short Life, Wrecked by Schizophrenia’, Twentieth Century British History, Vol.5, (No.2, 1994), 183-205. And on planning and professionalism: Glen O’Hara, From Dreams to Disillusionment: Economic and Social Planning in 1960s Britain, (Basingstoke: Palgrave Macmillan, 2007); Harold Perkin, The Rise of Professional Society: England since 1880, (London: Routledge, 1989).

[4] On contemporary developments in public health, and regulated individualism: Virginia Berridge, ‘Medicine and the Public: The 1962 Report of the Royal College of Physicians and the New Public Health’, Bulletin of the History of Medicine, Vol.81, (No.1, 2007), 286-311; Dorothy Porter, Health Citizenship: Essays in Social Medicine and Biomedical Politics, (Berkeley: University of California Press, 2011), 154-181.