Life Begins at 40

Mark Jackson

It became commonplace during the twentieth century to regard the age of forty (or more recently fifty) as a tipping point in the life cycle, a moment when many people could begin to shed the financial, domestic, parental and occupational worries of youth and middle age and look forward to a more serene and comfortable period of their lives.  The belief that life after forty might present opportunities for, rather than obstacles to, happiness was given legitimacy by a post-Second World War culture that considered increased consumption and economic growth, at least in the West, as the primary route to self-realisation and emotional fulfilment.  Made possible partly by increased life expectancy, the crisis of middle age was recast as an epiphany, a moment of temporary imbalance that was necessary if age-related cognitive and economic decline were to be effectively reversed and individuals inspired to achieve the highest levels of personal satisfaction and well-being.

The origins of late-twentieth-century convictions that life begins at forty were, however, less emancipatory than we might imagine.  Rather, they were rooted in reactionary attempts to preserve political stability, economic productivity and family unity.  Four days after the United States entered the First World War in 1917, Mrs Theodore Parsons was interviewed by The Pittsburgh Press, a local daily newspaper.  The author of manuals that encouraged children and young women in particular to embrace physical education as a means of cultivating intellectual advancement, health and beauty, Parsons tied her educational creed to the urgent need for women to train themselves for the ‘duties that war time may bring’.  ‘The mothers of a nation’, she argued, ‘are its supreme asset and as civilization advances it will be as natural for a nation to maintain its mothers as it is to-day to support its army and navy.’  Parsons’ conviction that women, as well as men, were critical to the war effort was not restricted to the young, but extended to the middle aged and elderly.

‘Most old age is premature, and attention to diet and exercise would enable men and women to live a great deal longer than they do to-day.  The best part of a woman’s life begins at forty.’ [1]

Parsons’ words constitute the first modern reference to forty as a transitional age in the fight for freedom and happiness.  But her aim was only incidentally the promotion of individual well-being.  More important for Parsons and her contemporaries were the social and military benefits of healthy ageing.  The notion that life, rather than death, began at forty was taken up most prominently by Walter B. Pitkin, Professor of Journalism at Columbia University.  Pitkin situated his self-help dogma in the context of an emergent American dream.  Science and technology had increased life expectancy, reduced the need for heavy labour in the home and workplace, and made leisure a genuine possibility for many Americans.  ‘At forty’, he promised in 1932, ‘you will be wiser and happier than at thirty.  At fifty you will be clearer, steadier, and surer than at forty.’ [2]  Of course, the collective benefits of enhanced individual health and wealth were evident: greater consumption of services and goods would increase productivity and boost the American economy, fuelling further technological development and economic growth in a cycle of expansion.  Couched in capitalist terms, here perhaps were the seeds of the narcissistic veneration of the midlife transition that triumphed after the war.

In Britain, inter-war attention to the change of life in men and women around the age of forty adopted a different, but no less reactionary, complexion.  During the 1930s, Marie Stopes addressed the effects of ageing on married couples.  In Enduring Passion, first published in 1928, and Change of Life in Men and Women, published eight years later, Stopes questioned the inevitable decline in sexual health and satisfaction that appeared to beset previously passionate couples.  The notion of a crisis around menopause (or the equivalent decline in male virility), she argued, had been exaggerated by popular medical writers.  By preparing more effectively for the challenges generated by the unfolding stages of life, it was possible to prevent what many people regarded as the inevitable conversion of ‘happy lovers’ into ‘drabby tolerant married couples’. [3]  Stopes’ formula for surviving the crisis of middle age became one of the foundational principles of the marriage guidance movement, a development that originated in the late 1930s but subsequently emerged as one of the key features of a post-war settlement intended to restore the stability of the nuclear family.

It did not take long after the Second World War for these conservative dreams of social coherence and domestic stability to be destabilised, but covertly reinforced, by the rampant individualism of the marketplace. ‘Forty-phobia’ became an advertising slogan for companies selling nerve tonics that promised to help patients ‘feel younger as they grow older’. [4]  For men in particular, the purchase of household goods, cars, holidays and suburban houses was promoted as a means of blunting the frustrations created by the contrast between the excitement of conflict and the boredom of corporate and domestic life, a phenomenon neatly captured in the post-war fiction of Sloan Wilson and Joseph Heller. [5]  Self-fulfilment and its spiritual benefits became the mantra of popular advice books aimed at the disaffected, but affluent, middle classes struggling to redefine and relocate themselves in a changing world.

One of the fruits of this process was the creation of a novel expression of self-discovery: the midlife crisis.  Coined in 1965 by the Canadian psychoanalyst and social theorist Elliott Jaques to describe psychodynamic collapse triggered by fear of impending death, the term rapidly came to define midlife madness, a powerful tool for both explaining and legitimating the search for personal fulfilment during the middle years. [6]  The cost in terms of other people’s happiness was regarded by many as less important than the self-realisation, emotional emancipation and spiritual awakening that the crisis supposedly made possible.  But ‘la crise de la quarantaine’, as it became known in France, was not embraced as a means of enabling healthy ageing in women, as Parsons had anticipated, or as a pathway to a more contented and durable marriage, as Stopes had hoped.  Shaped by late twentieth-century obsessions with the autonomous individual and the gospel of consumption, the notion that life can begin again at forty has been used to reinvigorate a Western capitalist economy that can only be sustained by prolonging productivity and encouraging spending across the whole life course.


[1] ‘Now is the time for all women to train for the duties that war time may bring’, The Pittsburgh Press, Tuesday 10 April 1917, p. 20; Mrs Theodore Parsons, Brain Culture through Scientific Body Building (Chicago, American School of Mental and Physical Development, 1912).

[2] Walter B. Pitkin, Life Begins at Forty (New York, McGraw-Hill, 1932), pp. 174-5.

[3] Marie Carmichael Stopes, Enduring Passion (London, Putnam, 1928); Marie Carmichael Stopes, Change of Life in Men and Women (London, Putnam, 1936).

[4] An early example of this advertising strategy is ‘Forty-phobia (fear of the forties)’, The Times, 28 April 1938, p. 19.

[5] Sloan Wilson, The Man in the Gray Flannel Suit (Cambridge, Mass., Da Capo Press, [1955] 2002); Joseph Heller, Something Happened (London, Jonathan Cape, [1966] 1974).

[6] The midlife crisis is the focus of ongoing research, the results of which will appear in Mark Jackson, The Midlife Crisis: A History (London, Reaktion, forthcoming).


A Question of ‘Public Engagement’

Ayesha Nathoo


Over the last year, I have been involved in a number of public events related to my work on the history of therapeutic relaxation. These have included talks, displays and practical workshops at the “Being Human” festival of the humanities, the “Secret Garden Party” and the “Wilderness festival” (in collaboration with Guerilla Science and NOW live events), a “Friday Late Spectacular” and a discussion evening on “Rest and Relaxation in the Modern World” as part of Hubbub, at the Wellcome Collection, London.



The aims, scale, content and audiences varied for each of these events, but together they have left me reappraising my role as a historian, and reflecting on notions of expertise in such public forums. The central topics of my research – ‘rest’, ‘balance’, ‘stress’ and ‘relaxation’ – affect us all, and many audience members were drawn to the events because of pre-existing interests in these matters. Others stumbled across the events by chance with little idea of what to expect or gain from them. At the music festivals, the historical material distinguished my workshops from the myriad other practice-based workshops on offer (such as yoga, mindfulness and massage); elsewhere the practical content differentiated my work from other more traditional academic contributions.


I am particularly interested in understanding relaxation as a taught practice, and the material, audio and visual culture that has furthered its development over the last hundred years in Western, secular, therapeutic contexts. Aural instructions given in classes or on cassettes were key methods for teaching and learning relaxation in the post-war decades and are central to understanding the growth of such practices in both clinical and domestic settings. As well as the instructions themselves, the tone of voice, pace, pauses, and type of medium would have affected how relaxation was understood, distributed and practised, so I have been keen to track down and share some of the original audio recordings to better understand these experiential and pedagogical aspects. Where these have been unavailable, I have instead endeavoured to recreate the ways in which different protagonists have taught relaxation, piecing together printed publications, archival material and oral historical resources to do so.


Many of those who participated in the workshops were curious to learn more about the techniques that I discussed – such as yoga, meditation or autogenic training – and their relationship to health and wellbeing. Yet as I was presenting the material primarily as a historian, rather than as a practitioner or advocate of any particular therapeutic practice, some unexpected tensions seemed to arise. Whilst the historical material inspired much interest, most centrally I found that people wanted to evaluate the specific techniques: What was going to work best for them for particular ailments or for general wellbeing? What is the best strategy for alleviating anxiety or chronic pain? Would I recommend relaxation for childbirth? Did I have copies of relaxation instructions that they could take away? Why was I talking about Progressive Muscle Relaxation, one lady asked, when the Alexander Technique (which she had taught for 20 years) was far superior? Was mindfulness meditation really a form of relaxation? Was it best to concentrate on relaxing the mind or the body? What is the evidence base for therapeutic relaxation? Why bother with daily relaxation training if there is a more accessible pharmaceutical option?


Although comparable questions have partly spurred my own interest in this research area, speaking as a historian I have felt uneasy about offering responses. The short, practical, personal questions are not compatible with in-depth answers that address broader medical, social and political contexts, such as the rise of individualism and the mass media, and changes in healthcare, lifestyle and biomedical models. Yet that is what has created and shaped successive demands for and critiques of therapeutic relaxation; contemporary concerns and understandings derive from these past contexts. This is the long and complex narrative that I am researching and whilst I certainly hope that it will have policy implications and be relevant to today’s medical landscape, I do not feel equipped to offer personal advice. Neither am I sure that I should be doing so.


I would speculate that this kind of professional reticence is a majority view amongst historians, and yet it is somewhat frustrating for interested lay audiences. If a professional researcher is investigating a particular subject, then why should they not state their opinions based on the knowledge gained from the research? I have come across this at various other points during past research, on topics ranging from the media coverage of the possible link between autism and the MMR vaccination to organ transplantation and donation. ‘Should I give my child the vaccination?’, mothers repeatedly asked me. ‘Were there any reasons not to sign the organ register?’ ‘Did I think there should be an opt-out clause to increase donation rates?’ It is not that I had not given enough thought to these matters – I had extensively mulled over them – yet I questioned my role as a historian to authoritatively influence other people’s present-day decisions, certainly without allowing the time and space to substantiate my points of view. Even given that time and space, however, the aim would not be to give a ‘balanced’ view in the sense of ‘objectively’ presenting a full range of arguments for and against.


The personal is the historical: Knowledge and memories of the past shape views and actions for the future. And so a historian’s personal stance can generally be inferred from their authored work, amongst the layers of interpretations and the selection of sources. Perhaps then scholars should meet the challenge of more explicitly articulating their own views in public contexts, where audience members often lead conversations and set agendas and where the boundaries of expertise are fluid. As ‘public engagement’ becomes an increasingly significant part of academic life, it seems timely and important to open up these discussions.

‘On Balance: Lifestyle, Mental Health and Wellbeing’: Musings on Multidisciplinarity from a Historian


Ali Haggett

The first of three major conferences to be held in conjunction with the Lifestyle, Health and Disease project took place on the 25th and 26th of June at the University of Exeter’s Streatham Campus. Focusing broadly on the strand of research concerned with mental health and wellbeing, the remit of the conference was to explore the ways in which changing notions of ‘balance’ have been used to understand the causes of mental illness; to rationalise new approaches to its treatment; and to validate advice relating to balance in work and family life. Drawing on a range of approaches and methodologies, the multidisciplinary conference attracted scholars from Britain and the United States with diverse backgrounds, including history, anthropology, psychiatry, psychology and clinical medicine. On the evening of the 24th June, as a prelude to the event, we began by hosting a public panel debate on the topic of ‘Defeating Depression: What Hope?’ at the Royal Albert Memorial Museum in Exeter. A photo gallery and a summary of the evening can be found on the Exeter Blog.


With the research still at a formative stage, I hoped that the contributions from other scholars might provoke new lines of enquiry, or stimulate interesting alternative approaches to our work. One of the questions I am particularly interested in is: why does the concept of balance in mental health and wellbeing become influential at certain times in our recent history? As the conference progressed, and with the public panel event also in mind, I found myself wondering what a future historian might make of the contemporary themes and concerns that emerged from this conference. It struck me that many of the anxieties that were articulated by non-historians were not new, but that they had surfaced at regular junctures in modern history. At the heart of a number of papers, and evident from the contributions to the public debate, was a palpable dissatisfaction with the status quo – with ‘modern’ and perhaps ‘unbalanced’ ways of living and their effects on health. These concerns are reminiscent of those put forward much earlier, during the early and mid-twentieth century, by proponents of the social medicine movement, who were critical of rising consumerism and the breakdown of traditional values and kinship ties, and who were keen to reduce the burden of sickness by pressing for social improvements.[1] Misgivings about the current ‘neo-liberal’ political climate were evident, where, in some circles, the principles of free-market individualism are held to undermine collective action, community spirit and kinship, leading to disempowerment and ultimately to ill health. The prevailing interventionist, biomedical model of medicine practised in the West did not escape criticism.
Some of the concerns raised resonated strongly with the ideas put forward by proponents of the ‘biopsychosocial’ model of medicine from the 1970s, which highlighted the importance of the social context of disease.[2] A number of papers raised important questions about the ways in which the current medical model appears to foreground the treatment of mental illness and underplay approaches to prevention. Speakers at the conference and contributors to the public debate noted, with some disquiet, that responsibility for protecting and maintaining mental health had increasingly shifted to the individual, instead of the ‘group’, the employer or the wider socio-economic environment.


Perhaps unsurprisingly, anxieties about mental illness and the field of psychiatry that first materialised during the 1960s and developed within the ‘anti-psychiatry’ movement were still conspicuous at the conference – anxieties about the classification, diagnosis and labelling of mental disorders; unease about the misapplication of ‘norms’, rating scales and the concept of ‘risk management’ in medicine. The disquiet that emerged during the 1960s was of course also intimately associated with the contemporary counter culture and broader concerns about the conformity and emptiness of the post-war world. Such ideas were evident in the literature of the period from authors such as George Orwell, William H. Whyte, David Riesman and Herbert Marcuse, who all variously disapproved of the social and cultural changes that took place in mid-century Britain and the United States.[3]


Defined by the Oxford Dictionary as ‘a situation in which different elements are in the correct proportions’, the concept of ‘balance’ remains at the core of all debates about mental health, whether we are talking about chemical imbalance, work-life balance or cognitive and mindful approaches to human behaviour. The papers delivered at the conference by my fellow historians neatly exposed the ways in which many of the themes discussed have emerged in the past and often reveal more about broader concerns, tensions and uncertainty about new ways of living and their effects on health than they do about epidemiological trends in mental illness. While historians are uniquely placed to add this important context, the joy of combining insights from several disciplines is that we are able to begin to redefine problems and reach solutions through new understandings. On a personal level, the contributions from other disciplines reminded me that perhaps, as an idealistic historian, I am sometimes distanced from the harsh realities of clinical practice. Collectively, the papers also prompted me to think about new ways of conceptualising and measuring what is ‘balanced’ in life and in health – and perhaps also to question the ways in which balance is somehow taken to be inherently desirable, or essential. There is no doubt that the global burden of mental ill health has become one of the most pressing social and medical problems of our time. Overcoming the challenges faced will require the knowledge of more than one discipline. As scholars engaged in research into mental health and wellbeing, we are all, in different ways, and with different approaches – and often with different opinions – ultimately seeking a shared goal of fostering ways to improve mental health and wellbeing in our society.


The conference organisers would like to thank the Wellcome Trust for supporting the conference and the following speakers for their contributions:

Professor David Healy, Dr Matthew Smith, Professor Jonathan Metzl, Dr Nils Fietje, Professor Ed Watkins, Dr James Davies, Dr Ayesha Nathoo, Professor Michelle Ryan, Mr Frederick Cooper, Professor Femi Oyebode, Dr James Fallon and Dr Alex Joy.

[1] As examples: Stephen Taylor, ‘The suburban neurosis’, Lancet, 26 March 1938 and James L. Halliday, Psychosocial Medicine: A Study of the Sick Society (London, William Heinemann, 1948).

[2] See George L. Engel, ‘The need for a new medical model: a challenge for biomedicine’, Science (1977), 196, 129-36.

[3] An interesting discussion of the political and social context within which the antipsychiatry movement grew can be found in Nick Crossley, ‘R. D. Laing and the British anti-psychiatry movement: a socio-historical analysis’, Social Science and Medicine (1998), Vol. 47, No. 7, 877-89.