Starvation in humans: Evolutionary background and contemporary implications

https://doi.org/10.1016/j.mad.2005.03.018

Abstract

Although there is extensive evidence that caloric restriction (CR) extends lifespan in several species, the evidence base for humans is weak. We are still at the stage of applying inductive reasoning and of framing hypotheses to be tested. It is known that genetic background contributes about 25% of the variation in human longevity, but it is thought unlikely that any genes conferring longer lifespan have been positively selected to do so. It is more likely that any such benefits are unintended consequences arising from other adaptations. If there is an association between CR and longevity in humans, it may have been selected by previous exposures to famine. This paper briefly reviews the historical evidence on the extent and frequency of famines in human history. It is concluded that starvation has been one of the major selective pressures on the human genome and has left abundant evidence of adaptive survival traits. Many of these are mediated through effects on reproduction. However, interpretation of the possible links between these energy-sparing mechanisms and any association between CR and ageing is handicapped by an absence of data on the latter, and the question will remain a matter of debate for many years to come.

Introduction

Evidence from several species studied under rigorous laboratory conditions confirms that caloric restriction (CR) can extend lifespan by a substantial amount (reviewed by Longo and Finch, 2003). As summarised in other papers from this symposium, the mechanisms of these effects are still vigorously debated, and the possibility of their extrapolation to primates rests, so far, on insubstantial evidence. The exploration of the topic from a human perspective must therefore rely, for the time being, on inductive reasoning, which it is hoped can direct the research agenda. Prominent amongst this reasoning are questions related to the evolution of genetic variants and pathways that might influence the rate of ageing. The current debate generally considers three leading explanatory paradigms for ageing from among the scores that have been suggested (Hughes and Reynolds, 2005; Kirkwood and Austad, 2000): mutation accumulation (an accumulation of late-acting deleterious mutations within the germ line) (e.g. Medawar, 1952); antagonistic pleiotropy (a trade-off between benefit at an early age and harm at later ages) (Williams, 1957); and the disposable soma theory (the concept that limited lifetime somatic resources must be traded between investment in growth, maintenance and reproduction) (Kirkwood, 1977; Kirkwood, 1996). These, and other possible paradigms, are not necessarily mutually exclusive.

A key question is whether these are evolved mechanisms or serendipitous effects arising as unintended consequences. Genes are certainly involved in determining the rate of ageing (Cournil and Kirkwood, 2001), and recent molecular studies are starting to reveal some of the possible mechanisms, including genes influencing DNA repair and stress damage (Kirkwood, 2003). This does not, however, imply that such genes have been under positive selection pressure favouring longevity beyond the age of natural reproduction. Longevity traits display their benefits after the period of peak reproductive capacity (and may actually compete with fecundity traits), and in wild animals extrinsic hazards such as predation and infection prevent most individuals from ageing at all; it is therefore highly unlikely that such genes would have been selected except by accidental association with other survival traits. Variations on this general reasoning may occur in a few species, including humans, in which elderly family members can aid the survival of their grandchildren (Sear et al., 2002).

A supplementary question is whether any genetic effects on ageing have been selected in relation to caloric restriction and might explain the association between CR and longevity. It is pertinent to consider the evolutionary forces that may have moulded the selection of any such associations; the purpose of this paper is to provide the historical and evolutionary background against which to judge the emerging experimental evidence and to frame new hypotheses.

A large part of the human race is currently over-nourished. The resulting energy surplus is sufficiently powerful to overcome natural body-weight regulatory mechanisms, leading to gradual weight gain and obesity (Prentice, 1997). The excess adipose tissue and its macrophage infiltrate are sources of chronic inflammatory mediators that drive a range of pathological outcomes including dyslipidaemia, insulin resistance and hypertension. Diabetes, heart disease and cancers are among the many serious co-morbidities associated with obesity (World Health Organisation, 1998).

These disease outcomes have been estimated to shorten the lifespan of an obese person by between 8 and 13 years, depending on the age of onset of the obesity (Fontaine et al., 2003). Successful weight loss can reduce the incidence of diabetes and other co-morbidities, cause remission of existing symptoms and reduce all-cause mortality (World Health Organisation, 1998). Thus, there is no doubt that caloric restriction of an overweight human will, on average, reverse some of what would otherwise be an accelerated mortality. In the current paper, it is assumed that this self-evident effect is not the central issue in the debate; the question relates instead to the natural biology of any possible impact of CR on lifespan in non-overweight humans.

This caveat is also worth considering when interpreting the experimental data from small animals and primates kept in captivity (Longo and Finch, 2003). These animals are likely also in a state of energy excess under most laboratory conditions, as evidenced by gradual fat accumulation, and their levels of physical activity are certainly lower than normal. CR paradigms under such conditions might therefore simply be recreating a more natural and healthy body composition in terms of lean:fat ratios.

Modern humans are meal eaters and generally display highly characteristic temporal eating patterns according to their culture. The poorest people may consume only a single meal each day, but two or three meals plus additional episodes of snacking are generally the norm. In healthy people, finely tuned hormonal mechanisms orchestrate the inter-meal adaptations in fuel selection that maintain the organism in an appropriate metabolic state for survival. In evolutionary time, the maintenance of vigilance and fight-or-flight responses, together with reproductive functions, would have been the dominant selective drivers. In adult humans, these states are readily maintained for periods of up to 18 h or so after a meal. Once these 18 h have elapsed, hepatic glycogen stores are severely depleted and the metabolically expensive process of gluconeogenesis is required to maintain optimal cerebral function. The status of glycogen stores in muscle will depend on the extent to which the muscles have been exercised and may range from completely depleted to almost fully replete if the person has been inactive since the last meal. In general, therefore, and based upon the extent to which hepatic glycogen can maintain glucose homeostasis between meals, intervals longer than 18 h can be considered the onset of caloric restriction. In the special case of pregnancy, a phenomenon described as ‘accelerated starvation’ (Prentice et al., 1983) reduces this interval to something less than 18 h.
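A rough calculation illustrates where a figure of this order comes from (using assumed round numbers that are not taken from this paper): if post-prandial hepatic glycogen stores are of the order of 80–100 g, and the net drain on blood glucose from the brain and other obligate glucose consumers is roughly 5 g/h, then the store can support glucose homeostasis for about

\[ t \approx \frac{\text{hepatic glycogen}}{\text{net glucose demand}} \approx \frac{90\ \mathrm{g}}{5\ \mathrm{g\,h^{-1}}} \approx 18\ \mathrm{h}. \]

Both input values vary with meal composition, body size and activity, so this is an order-of-magnitude sketch rather than a physiological constant.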

During the next 48–96 h, the organism initiates a number of self-protective adaptations designed to maximise the use of its fuel resources, protect protein mass and minimise the loss of essential micronutrients. Chief amongst these is the induction of enzymes that allow the brain to utilise ketone bodies in place of glucose. This reduces the need for gluconeogenesis and hence the rate of protein breakdown. Other adaptations include a reduction in metabolic rate mediated by a T3/T4 shunt in thyroid metabolism (Prentice et al., 1991).
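The protein-sparing value of this ketone adaptation can be illustrated with classical fasting-physiology estimates (assumed textbook values, not data from this paper): the brain of an unadapted adult consumes roughly 100–120 g of glucose per day, and gluconeogenesis yields only about 1 g of glucose per 1.75–2 g of protein. Meeting brain demand from protein alone would therefore cost something like

\[ 120\ \mathrm{g\ glucose/day} \times 2\ \mathrm{\frac{g\ protein}{g\ glucose}} \approx 240\ \mathrm{g\ protein/day}, \]

an unsustainable drain on lean tissue. If keto-adaptation cuts the brain's glucose requirement to roughly 40 g/day, the corresponding protein cost falls to around 80 g/day, a several-fold sparing that greatly extends survival time.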

After these first 4 or 5 days of fasting, the system reaches a state of relative stability in the rate at which it loses vital resources, though a series of further behavioural adaptations comes into play when survival is seriously threatened by famine (Prentice, 2001). The time over which any individual can survive severe or total caloric restriction may range from a little over a month in thin individuals to a year or more in obese ones.
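The range quoted here is consistent with simple energy-store arithmetic (illustrative assumed values only): adipose tissue supplies roughly 7,000–7,700 kcal per kg, and energy expenditure during starvation falls to perhaps 1,300–1,600 kcal per day, giving

\[ t_{\mathrm{survival}} \approx \frac{m_{\mathrm{fat}} \times 7{,}500\ \mathrm{kcal/kg}}{1{,}500\ \mathrm{kcal/day}}. \]

On these assumptions, a thin adult with around 6 kg of expendable fat would reach roughly 30 days, whereas a severely obese adult with 60 kg or more of fat could, in principle, exceed 300 days; in practice, protein losses and micronutrient deficiencies, rather than fat exhaustion alone, often set the limit.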

Starvation in humans: from short-term hunger to fatal famine

Food shortages, starvation and famine have probably been among the strongest external evolutionary pressures that have moulded human adaptive responses. The current consensus is that hunter-gatherer populations would have repeatedly suffered food shortages of a day or two, but would rarely have faced outright famine (Diamond, 1993). This is because they lived in small groups, had adopted a highly omnivorous and variable diet, and were sufficiently mobile to move to fresh hunting grounds.

Historical evidence of starvation in human populations

Elsewhere we have described in greater detail a selection of the mass of historical evidence that records the influence of widespread catastrophic famine on the human race (Prentice, 2001). These records document extreme privation accompanied by mass mortality. The frequent references to cannibalism as a means of survival, even by parents of their own children, provide a terrible validation of the extent of human suffering. A small selection of some of the best known historical references to

Contemporary implications of biological adaptations for surviving famine

Table 2 briefly lists the areas of human function through which energy can be stored in readiness for possible famine, or spared in times of actual famine. These have been discussed in greater detail elsewhere (Prentice, 2001; Prentice et al., 2005). In the current context it is only pertinent to consider which, if any, may have possible direct or indirect effects on the rate of ageing and hence might explain the putative link between CR and longevity. Although behavioural adaptations may

Conclusions

In conclusion, there is extensive evidence that the human genome has been under pressure from famine, and that this has led to the selection of numerous adaptive traits that protect reproductive capacity and aid survival. However, as argued by Kirkwood (2003), it is teleologically unlikely that any of these will have been positively selected to enhance survival beyond the age of childrearing, though it is quite possible that they do so incidentally.

There is very little doubt that modest CR would

References (38)

  • A.G. Dulloo et al. An adipose-specific contribution of thermogenesis in body weight regulation. Int. J. Obes. (2001)
  • A. Dulloo et al. Uncoupling proteins: their roles in adaptive thermogenesis and substrate metabolism. Br. J. Nutr. (2001)
  • B. Fagan. Floods, Famines and Emperors (2000)
  • K.R. Fontaine et al. Years of life lost due to obesity. J. Am. Med. Assoc. (2003)
  • R.E. Frisch. Malnutrition and fertility. Science (1982)
  • K.A. Hughes et al. Evolutionary and mechanistic theories of aging. Annu. Rev. Entomol. (2005)
  • W.C. Jordan. The Great Famine: Northern Europe in the Early Fourteenth Century (1996)
  • A.J. Keys et al. The Biology of Human Starvation (1950)
  • T.B.L. Kirkwood. Evolution of ageing. Nature (1977)