Life Begins at 40

Mark Jackson

It became commonplace during the twentieth century to regard the age of forty (or more recently fifty) as a tipping point in the life cycle, a moment when many people could begin to shed the financial, domestic, parental and occupational worries of youth and middle age and look forward to a more serene and comfortable period of their lives.  The belief that life after forty might present opportunities for, rather than obstacles to, happiness was given legitimacy by a post-Second World War culture that considered increased consumption and economic growth, at least in the West, as the primary route to self-realisation and emotional fulfilment.  Made possible partly by increased life expectancy, the crisis of middle age was recast as an epiphany, a moment of temporary imbalance that was necessary if age-related cognitive and economic decline were to be effectively reversed and individuals inspired to achieve the highest levels of personal satisfaction and well-being.

The origins of late-twentieth-century convictions that life begins at forty were, however, less emancipatory than we might imagine.  Rather, they were rooted in reactionary attempts to preserve political stability, economic productivity and family unity.  Four days after the United States entered the First World War in 1917, Mrs Theodore Parsons was interviewed by The Pittsburgh Press, a local daily newspaper.  The author of manuals that encouraged children and young women in particular to embrace physical education as a means of cultivating intellectual advancement, health and beauty, Parsons tied her educational creed to the urgent need for women to train themselves for the ‘duties that war time may bring’.  ‘The mothers of a nation’, she argued, ‘are its supreme asset and as civilization advances it will be as natural for a nation to maintain its mothers as it is to-day to support its army and navy.’  Parsons’ conviction that women, as well as men, were critical to the war effort was not restricted to the young, but extended to the middle aged and elderly.

‘Most old age is premature, and attention to diet and exercise would enable men and women to live a great deal longer than they do to-day.  The best part of a woman’s life begins at forty.’ [1]

Parsons’ words constitute the first modern reference to forty as a transitional age in the fight for freedom and happiness.  But her aim was only incidentally the promotion of individual well-being.  More important for Parsons and her contemporaries were the social and military benefits of healthy ageing.  The notion that life, rather than death, began at forty was taken up most prominently by Walter B. Pitkin, Professor of Journalism at Columbia University.  Pitkin situated his self-help dogma in the context of an emergent American dream.  Science and technology had increased life expectancy, reduced the need for heavy labour in the home and workplace, and made leisure a genuine possibility for many Americans. ‘At forty’, he promised in 1932, ‘you will be wiser and happier than at thirty.  At fifty you will be clearer, steadier, and surer than at forty.’ [2]  Of course, the collective benefits of enhanced individual health and wealth were evident: greater consumption of services and goods would increase productivity and boost the American economy, fuelling further technological development and economic growth in a cycle of expansion.  Couched in capitalist terms, here perhaps were the seeds of the narcissistic veneration of the midlife transition that triumphed after the war.

In Britain, inter-war attention to the change of life in men and women around the age of forty adopted a different, but no less reactionary, complexion.  During the 1930s, Marie Stopes addressed the effects of ageing on married couples.  In Enduring Passion, first published in 1928, and Change of Life in Men and Women, published eight years later, Stopes questioned the inevitable decline in sexual health and satisfaction that appeared to beset previously passionate couples.  The notion of a crisis around menopause (or the equivalent decline in male virility), she argued, had been exaggerated by popular medical writers.  By preparing more effectively for the challenges generated by the unfolding stages of life, it was possible to prevent what many people regarded as the inevitable conversion of ‘happy lovers’ into ‘drabby tolerant married couples’. [3]  Stopes’ formula for surviving the crisis of middle age became one of the foundational principles of the marriage guidance movement, a development that originated in the late 1930s but subsequently emerged as one of the key features of a post-war settlement intended to restore the stability of the nuclear family.

It did not take long after the Second World War for these conservative dreams of social coherence and domestic stability to be destabilised, but covertly reinforced, by the rampant individualism of the marketplace. ‘Forty-phobia’ became an advertising slogan for companies selling nerve tonics that promised to help patients ‘feel younger as they grow older’. [4]  For men in particular, the purchase of household goods, cars, holidays and suburban houses was promoted as a means of blunting the frustrations created by the contrast between the excitement of conflict and the boredom of corporate and domestic life, a phenomenon neatly captured in the post-war fiction of Sloan Wilson and Joseph Heller. [5]  Self-fulfilment and its spiritual benefits became the mantra of popular advice books aimed at the disaffected, but affluent, middle classes struggling to redefine and relocate themselves in a changing world.

One of the fruits of this process was the creation of a novel expression of self-discovery: the midlife crisis.  Coined in 1965 by the Canadian psychoanalyst and social theorist Elliott Jaques to describe psychodynamic collapse triggered by fear of impending death, the term rapidly came to define midlife madness, a powerful tool for both explaining and legitimating the search for personal fulfilment during the middle years. [6]  The cost in terms of other people’s happiness was regarded by many as less important than the self-realisation, emotional emancipation and spiritual awakening that the crisis supposedly made possible.  But ‘la crise de la quarantaine’, as it became known in France, was not embraced as a means of enabling healthy ageing in women, as Parsons had anticipated, or as a pathway to a more contented and durable marriage, as Stopes had hoped.  Shaped by late twentieth-century obsessions with the autonomous individual and the gospel of consumption, the notion that life can begin again at forty has been used to reinvigorate a Western capitalist economy that can only be sustained by prolonging productivity and encouraging spending across the whole life course.

Notes

[1] ‘Now is the time for all women to train for the duties that war time may bring’, The Pittsburgh Press, Tuesday 10 April 1917, p. 20; Mrs Theodore Parsons, Brain Culture through Scientific Body Building (Chicago, American School of Mental and Physical Development, 1912).

[2] Walter B. Pitkin, Life Begins at Forty (New York, McGraw-Hill, 1932), pp. 174-5.

[3] Marie Carmichael Stopes, Enduring Passion (London, Putnam, 1928); Marie Carmichael Stopes, Change of Life in Men and Women (London, Putnam, 1936).

[4] An early example of this advertising strategy is ‘Forty-phobia (fear of the forties)’, The Times, 28 April 1938, p. 19.

[5] Sloan Wilson, The Man in the Gray Flannel Suit (Cambridge, Mass., Da Capo Press, [1955] 2002); Joseph Heller, Something Happened (London, Jonathan Cape, [1966] 1974).

[6] The midlife crisis is the focus of ongoing research, the results of which will appear in Mark Jackson, The Midlife Crisis: A History (London, Reaktion, forthcoming).

The Costs of Balance? Diet and Regulation in Twentieth Century Diabetes Care

For the past few months, I have been working on an article about dietary management and patienthood in twentieth-century diabetes care. One key theme that has recurred throughout this research might be summed up by the question: “balance at what cost?” This is a question that patients and practitioners have had to ask consistently (if implicitly) since the 1920s, and I believe its history may have something to contribute to present discussions.

Balance: A Bit of Everything or Denial of Some Things?

Whether in popular culture or medical discourse, achieving and maintaining balance has frequently been discussed in overwhelmingly positive terms.

For patients with diabetes, medical advice for much of the twentieth and twenty-first centuries has been that metabolic balance is potentially life-saving.[i] That is, glycaemic control has been seen as the best safeguard against a range of short-term problems, as well as the means to prevent the onset of – or reduce the likelihood of developing – devastating long-term complications.

Yet, whilst we might associate balance with phrases like “all things in moderation”, balance in diabetes care has historically been construed in terms of restriction and regulation. Dietary management, that is, has often entailed either removing, carefully controlling, or substantially increasing the intake of certain foodstuffs. Amongst other things, hyper-vigilance and the inability to have a little of everything have been two of the costs associated with balance for individuals (and often their familial and social relations).

Changing Dietary Regulations

This is not to say that patterns of dietary prescription have remained static over time, especially for patients who required insulin. (Weight reduction through calorie control remained a constant imperative for overweight patients not using insulin.)

For instance, in the first few decades of the twentieth century, doctors discussed metabolic balance in relation to a careful restriction of carbohydrates. In fact, in some cases this even extended to banning certain foodstuffs. Lists of “forbidden foods” were often included on dietary sheets, and one doctor even declared that he had a “feeling that it is better for the patient to forget about the taste of bread altogether than be exposed to the temptation when he takes a fraction.”[ii]

These strictures altered considerably over the following decades. During the 1930s and 1940s, doctors began to prescribe higher carbohydrate allowances, and in later decades they dropped careful fat and protein regulation to reflect more “normal” dietary patterns. After two decades’ worth of debate about the role of “free diets” in care, higher fibre intakes became a feature of discussions in the 1970s and 1980s, following the testing of theories developed in colonial sites of research. By century’s end, programmes for “normalisation” slowly began to move to the fore, resulting in strategies like DAFNE.[iii] Gradually, “forbidden” foods became controlled foods.

Calculating Costs

Regardless of these moving goalposts, patients and their families rarely adhered fully to medical instructions. For some, this was simply a case of not being able to afford the prescribed diets. The financial cost of diabetes diets regularly ran considerably higher than average expenditure on non-medical diets.

For other patients, the logistics of careful management were just as much of a barrier. Surveillance – with its constant vigilance, measurement, inscription and assessment – required more than considerable energy. It also demanded action, and access to a whole material culture. (Pens, records, dietary sheets, scales, jugs, and urine testing equipment.) The time and materials required were rarely accessible in the workplace, and regulating an individual’s diet often had unacceptable consequences for spouses, families, employers, and friends. Indeed, for patients regulating their diet in public, one result might be to mark them out, risking negative comments, social awkwardness and anxieties, as well as problems at social events, such as meals or celebrations. Under such circumstances, patients were either physically unable to manage their diets, pressured into certain actions, or encouraged to weigh their priorities and find (quite rationally) that other relationships and activities were worth the possible costs.

This is also to say nothing of the extent to which prescriptions were adapted for cultural, class-based, or dietary preferences. Or, indeed, of palatability. As one patient (who was also a doctor) challenged readers in The Lancet: “you try getting through 3-4oz butter with 1oz of bread to spread it on!”[iv]

Patients and Costing Balance

What I have taken away from this research is that many of the costs that follow the pursuit of balance are deeply embedded in social and cultural life, and can’t all be altered by educational, motivational, or bureaucratic programmes and imperatives.[v] The challenges of the past, in other words, are not dead, but are in many respects still with us.

In fact, we might ask whether there is a broader discussion to be had, one concerning the values attached to balance, and the extent to which patients’ reasons for not following advice should be conceived as “problems”. In many respects, conceiving them as problems has been the starting point for the strategies designed to investigate and improve “compliance” and “adherence” over the past four decades.[vi]

To be sure, I believe that patient education is vital. Equally, co-operation between health care teams, patients and families is often integral to effective care, just as glycaemic control can be protective of long-term health. (Though not a guarantee.)

Nonetheless, at some point it will be necessary to consider the limits to these strategies, and – to a certain degree – the desirability of consistent attempts to problematize and alter behaviour. For those of us without the condition, it is also worth thinking clearly about the costs involved in management before rushing to moral judgement. One patient (again a practitioner) perhaps put it best when writing just after the Second World War:

“Each meal should be an elegant satisfaction of appetite rather than a problem in arithmetic and a trial of self-abnegation”.[vii]

Things have changed significantly since 1946, but the daily challenges facing patients are still considerable, and the mental, emotional and physical effects of management remain.

Notes

[i] Diabetes UK, ‘Self-Monitoring of Blood Glucose Levels For Adults With Type 2 Diabetes’, Diabetes UK Online, 2016. Available at: https://www.diabetes.org.uk/About_us/What-we-say/Diagnosis-ongoing-management-monitoring/Self-monitoring-of-blood-glucose-levels-for-adults-with-Type-2-diabetes/. Accessed 25 March 2016.

[ii] George Graham, ‘On the Present Position of Insulin Therapy’, The Lancet, Vol. 204, 1924, 1265-6.

[iii] Or Dose Adjustment For Normal Eating: Anon, ‘DAFNE’, DAFNE Online, 2016. Available at: http://www.dafne.uk.com. Accessed 25 March 2016.

[iv] Anon, ‘Disabilities: 21. Diabetes’, The Lancet, Vol. 253, 1949, p. 116.

[v] Which have been popular solutions of the recent past: NHS England, Action on Diabetes (January 2014). Online document. Available at: https://www.england.nhs.uk/ourwork/qual-clin-lead/diabetes-prevention/action-for-diabetes/. Accessed 16 June 2015. With some notable successes: R.P. Narayanan, J.M. Mason, J. Taylor, A.F. Long, T. Gambling, J.P. New, J.M. Gibson, R.J. Young, ‘Telemedicine to improve glycaemic control: 3-year results from the Pro-Active Call Centre Treatment Support (PACCTS) trial’, Diabetic Medicine, 2012. Available online: http://0-onlinelibrary.wiley.com.lib.exeter.ac.uk/enhanced/doi/10.1111/j.1464-5491.2011.03352.x.

[vi] Jeremy A. Greene, ‘Therapeutic Infidelities: “Noncompliance” Enters the Medical Literature, 1955-1975’, Social History of Medicine, Vol. 17, 2004, 327-43.

[vii] C.C. Forsyth, T.W.G. Kinnear, and D.M. Dunlop, ‘Diet in Diabetes’, BMJ, Vol. 1, 1951, p. 1099.

A Question of ‘Public Engagement’

Ayesha Nathoo

Over the last year, I have been involved in a number of public events related to my work on the history of therapeutic relaxation. These have included talks, displays and practical workshops at the “Being Human” festival of the humanities, the “Secret Garden Party” and the “Wilderness festival” (in collaboration with Guerilla Science and NOW live events), a “Friday Late Spectacular” and a discussion evening on “Rest and Relaxation in the Modern World” as part of Hubbub, at the Wellcome Collection, London.

The aims, scale, content and audiences varied for each of these events, but together they have left me reappraising my role as a historian, and reflecting on notions of expertise in such public forums. The central topics that comprise my research – ‘rest’, ‘balance’, ‘stress’ and ‘relaxation’ – affect us all, and many audience members were drawn to the events because of pre-existing interests in these matters. Others stumbled across the events by chance, with little idea of what to expect or gain from them. At the music festivals, the historical material distinguished my workshops from the myriad other practice-based workshops on offer (such as yoga, mindfulness and massage); elsewhere, the practical content differentiated my work from other, more traditional academic contributions.

I am particularly interested in understanding relaxation as a taught practice, and the material, audio and visual culture that has furthered its development over the last hundred years in Western, secular, therapeutic contexts. Aural instructions given in classes or on cassettes were key methods for teaching and learning relaxation in the post-war decades, and are central to understanding the growth of such practices in both clinical and domestic settings. As well as the instructions themselves, the tone of voice, pace, pauses, and type of medium would have affected how relaxation was understood, distributed and practised, so I have been keen to track down and share some of the original audio recordings to better understand these experiential and pedagogical aspects. Where these have been unavailable, I have instead endeavoured to recreate the ways in which different protagonists taught relaxation, piecing together printed publications, archival material and oral historical resources to do so.

Many of those who participated in the workshops were curious to learn more about the techniques that I discussed – such as yoga, meditation or autogenic training – and their relationship to health and wellbeing. Yet as I was presenting the material primarily as a historian, rather than as a practitioner or advocate of any particular therapeutic practice, some unexpected tensions seemed to arise. Whilst the historical material inspired much interest, I found that, above all, people wanted to evaluate the specific techniques: What was going to work best for them for particular ailments or for general wellbeing? What is the best strategy for alleviating anxiety or chronic pain? Would I recommend relaxation for childbirth? Did I have copies of relaxation instructions that they could take away? Why was I talking about Progressive Muscle Relaxation, one lady asked, when the Alexander Technique (which she had taught for 20 years) was far superior? Was mindfulness meditation really a form of relaxation? Was it best to concentrate on relaxing the mind or the body? What is the evidence base for therapeutic relaxation? Why bother with daily relaxation training if there is a more accessible pharmaceutical option?

Although comparable questions have partly spurred my own interest in this research area, speaking as a historian I have felt uneasy about offering responses. The short, practical, personal questions are not compatible with in-depth answers that address broader medical, social and political contexts, such as the rise of individualism and the mass media, and changes in healthcare, lifestyle and biomedical models. Yet it is precisely those contexts that have created and shaped successive demands for, and critiques of, therapeutic relaxation; contemporary concerns and understandings derive from them. This is the long and complex narrative that I am researching, and whilst I certainly hope that it will have policy implications and be relevant to today’s medical landscape, I do not feel equipped to offer personal advice. Neither am I sure that I should be doing so.

I would speculate that this kind of professional reticence is the majority view amongst historians, and yet it is somewhat frustrating for interested lay audiences. If a professional researcher is investigating a particular subject, then why should they not state their opinions based on the knowledge gained from that research? I have come across this at various other points during past research, on topics ranging from the media coverage of the possible link between autism and the MMR vaccination to organ transplantation and donation. ‘Should I give my child the vaccination?’, mothers repeatedly asked me. ‘Were there any reasons not to sign the organ register?’ ‘Did I think there should be an opt-out clause to increase donation rates?’ It is not that I had not given enough thought to these matters – I had mulled over them extensively – yet I questioned my role as a historian in authoritatively influencing other people’s present-day decisions, certainly without allowing the time and space to substantiate my points of view. Even then, the aim would not be to give a ‘balanced’ view in the sense of ‘objectively’ presenting a full range of arguments for and against.

The personal is the historical: Knowledge and memories of the past shape views and actions for the future. And so a historian’s personal stance can generally be inferred from their authored work, amongst the layers of interpretations and the selection of sources. Perhaps then scholars should meet the challenge of more explicitly articulating their own views in public contexts, where audience members often lead conversations and set agendas and where the boundaries of expertise are fluid. As ‘public engagement’ becomes an increasingly significant part of academic life, it seems timely and important to open up these discussions.

‘On Balance: Lifestyle, Mental Health and Wellbeing’: Musings on Multidisciplinarity from a Historian

Ali Haggett

The first of three major conferences to be held in conjunction with the Lifestyle, Health and Disease project took place on the 25th and 26th of June at the University of Exeter’s Streatham Campus. Focusing broadly on the strand of research concerned with mental health and wellbeing, the conference set out to explore the ways in which changing notions of ‘balance’ have been used to understand the causes of mental illness; to rationalise new approaches to its treatment; and to validate advice relating to balance in work and family life. Drawing on a range of approaches and methodologies, the multidisciplinary conference attracted scholars from Britain and the United States with diverse backgrounds, including history, anthropology, psychiatry, psychology and clinical medicine. On the evening of the 24th of June, as a prelude to the event, we hosted a public panel debate on the topic of ‘Defeating Depression: What Hope?’ at the Royal Albert Memorial Museum in Exeter. A photo gallery and a summary of the evening can be found on the Exeter Blog.

With this research still at a formative stage, I hoped that the contributions from other scholars might provoke new lines of enquiry, or stimulate interesting alternative approaches to our work. One of the questions I am particularly interested in is: why does the concept of balance in mental health and wellbeing become influential at certain times in our recent history? As the conference progressed, and with the public panel event also in mind, I found myself wondering what a future historian might make of the contemporary themes and concerns that emerged from this conference. It struck me that many of the anxieties articulated by non-historians were not new, but had surfaced at regular junctures in modern history. At the heart of a number of papers, and evident in the contributions to the public debate, was a palpable dissatisfaction with the status quo – with ‘modern’ and perhaps ‘unbalanced’ ways of living and their effects on health. These concerns are reminiscent of those put forward much earlier, during the early and mid twentieth century, by proponents of the social medicine movement, who were critical of rising consumerism and the breakdown of traditional values and kinship ties, and who were keen to reduce the burden of sickness by pressing for social improvements.[1] Misgivings about the current ‘neo-liberal’ political climate were also evident: in some circles, the principles of free-market individualism are held to undermine collective action, community spirit and kinship, leading to disempowerment and ultimately to ill health. The prevailing interventionist, biomedical model of medicine practised in the West did not escape criticism. Some of the concerns raised resonated strongly with the ideas put forward by proponents of the ‘biopsychosocial’ model of medicine from the 1970s, which highlighted the importance of the social context of disease.[2] A number of papers raised important questions about the ways in which the current medical model appears to foreground the treatment of mental illness and underplay approaches to prevention. Speakers at the conference and contributors to the public debate noted, with some disquiet, that responsibility for protecting and maintaining mental health has increasingly shifted to the individual, rather than to the ‘group’, the employer or the wider socio-economic environment.

Perhaps unsurprisingly, anxieties about mental illness and the field of psychiatry that first materialised during the 1960s and developed within the ‘anti-psychiatry’ movement were still conspicuous at the conference – anxieties about the classification, diagnosis and labelling of mental disorders, and unease about the misapplication of ‘norms’, rating scales and the concept of ‘risk management’ in medicine. The disquiet that emerged during the 1960s was, of course, also intimately associated with the contemporary counterculture and broader concerns about the conformity and emptiness of the post-war world. Such ideas were evident in the literature of the period, from authors such as George Orwell, William H. Whyte, David Riesman and Herbert Marcuse, who all variously disapproved of the social and cultural changes taking place in mid-century Britain and the United States.[3]

Defined by the Oxford Dictionary as ‘a situation in which different elements are in the correct proportions’, the concept of ‘balance’ remains at the core of all debates about mental health, whether we are talking about chemical imbalance, work-life balance or cognitive and mindful approaches to human behaviour. The papers delivered at the conference by my fellow historians neatly exposed the ways in which many of the themes discussed have emerged in the past, and often reveal more about broader concerns, tensions and uncertainty about new ways of living and their effects on health than they do about epidemiological trends in mental illness. While historians are uniquely placed to add this important context, the joy of combining insights from several disciplines is that we are able to begin to redefine problems and reach solutions through new understandings. On a personal level, the contributions from other disciplines reminded me that perhaps, as an idealistic historian, I am sometimes distanced from the harsh realities of clinical practice. Collectively, the papers also prompted me to think about new ways of conceptualising and measuring what is ‘balanced’ in life and in health – and perhaps also to question the ways in which balance is somehow taken to be inherently desirable, or essential. There is no doubt that the global burden of mental ill health has become one of the most pressing social and medical problems of our time. Overcoming the challenges faced will require the knowledge of more than one discipline. As scholars engaged in research into mental health and wellbeing, we are all, in different ways, with different approaches – and often with different opinions – ultimately seeking the shared goal of improving mental health and wellbeing in our society.

The conference organisers would like to thank the Wellcome Trust for supporting the conference, and the following speakers for their contributions:

Professor David Healy, Dr Matthew Smith, Professor Jonathan Metzl, Dr Nils Fietje, Professor Ed Watkins, Dr James Davies, Dr Ayesha Nathoo, Professor Michelle Ryan, Mr Frederick Cooper, Professor Femi Oyebode, Dr James Fallon and Dr Alex Joy.

[1] As examples: Stephen Taylor, ‘The suburban neurosis’, Lancet, 26 March 1938 and James L. Halliday, Psychosocial Medicine: A Study of the Sick Society (London, William Heinemann, 1948).

[2] See George L. Engel, ‘The need for a new medical model: a challenge for biomedicine’, Science (1977), 196, 129-36.

[3] An interesting discussion of the political and social context within which the antipsychiatry movement grew can be found in Nick Crossley, ‘R. D. Laing and the British anti-psychiatry movement: a socio-historical analysis’, Social Science and Medicine (1998), Vol. 47, No. 7, 877-89.