Humankind has existed for 2·5 million years but only in the past 10 000 years have we been exposed to wheat; wheat (gluten) could therefore be considered a novel introduction to humankind's diet. Prior to 1939, the rationing system had already been devised, creating an imperative to increase agricultural production. Thus, it was agreed in 1941 that there was a need to establish a Nutrition Society, and the very roots of the Society were geared towards increasing the production of wheat. This goal was achieved and, by the end of the 20th century, global wheat output had expanded 5-fold. Perhaps as a result, the epidemiology of coeliac disease (CD), or gluten-sensitive enteropathy, has changed. CD now affects 1 % or more of all adults. Despite this, delays in diagnosis are common: for every adult patient diagnosed, approximately three to four cases remain undetected. This review explores humankind's relationship with gluten, wheat chemistry, the rising prevalence of modern CD and the new entity of non-coeliac gluten or wheat sensitivity. The nutritional interventions of a diet low in fermentable oligo-, di- and mono-saccharides and polyols and a gluten-free diet (GFD) for irritable bowel syndrome, and the evidence to support this approach (including our own published work), are also reviewed. There appears to be a rising interest in the GFD as a ‘lifestyler’, ‘free from’ or ‘clean eater’ choice, which is a cause for concern: such restrictive diets may have nutritional implications whose long-term effects require further exploration.
Symposium 2: Chrono-nutrition in the urban environment
Conference on ‘Improving nutrition in metropolitan areas’
The urban environment has changed vastly over recent decades, which has also had an impact on our sleep and dietary patterns, and possibly on health outcomes. Some studies have shown that sleep duration and sleep quality have declined over past decades, especially in children. In parallel, our lifestyle and dietary patterns have also changed, including more shift work, more meals eaten outside the home or family setting and more irregular eating patterns, including breakfast skipping and late-night eating. The new area of research in the nutritional sciences studying the impact of the timing of eating on health outcomes is called chrono-nutrition, and it combines elements of nutritional research with chrono-biology. The objectives of this paper were to discuss secular trends in sleep patterns and related dietary patterns, to introduce basic concepts and mechanisms of chrono-nutrition and to discuss the evidence for the importance of sleep and chrono-nutrition in relation to health outcomes. Overall, chrono-nutrition may mediate the relationships between sleep, diet and urbanisation, and more research is needed to elucidate its importance for metabolic health and its impact on public health.
The circadian disruption experienced by shift-workers is suggested to be a risk factor for developing overweight and metabolic dysfunction. The conflicting time signals produced by shifted activity, shifted food intake and exposure to light at night in the shift-worker are proposed to be the cause of the loss of internal synchrony and the consequent adverse effects on body weight and metabolism. Because food-elicited signals have proven to be potent entraining signals for peripheral oscillations, here we review the findings from experimental models of shift-work and assess whether they provide evidence of a causal association between shifted feeding schedules, circadian disruption and altered metabolism. We identified four main experimental models that mimic the conditions of shift-work: forced sleep deprivation, forced activity during the normal rest phase, exposure to light at night and shifted food timing. Considerable variability in the intensity and duration of the protocols was observed, leading to a diversity of effects. A common result was the disruption of temporal patterns of activity; however, not all studies explored the temporal patterns of food intake. According to studies that use the timing of food intake as an experimental model of shift-work, and studies that evaluate shifted food consumption, the timing of food intake may be a determining factor in the loss of circadian and metabolic balance.
Presently, about 12 % of the population is 65 years or older, and by the year 2030 that figure is expected to reach 21 %. In order to promote the well-being of the elderly and to reduce the costs associated with health care demands, increased longevity should be accompanied by attenuation of ageing. Energy restriction, which limits the amount of energy consumed to 60–70 % of daily intake, and intermittent fasting, in which food is available ad libitum every other day, extend the life span of mammals and prevent or delay the onset of major age-related diseases, such as cancer, diabetes and cataracts. Recently, we have shown that well-being can be achieved by resetting the circadian clock and inducing robust catabolic circadian rhythms via timed feeding. In addition, the clock mechanism regulates metabolism, and major metabolic proteins are key factors in the core clock mechanism. Therefore, it is necessary to increase our understanding of circadian regulation of metabolism and longevity and to design new therapies based on this regulation. This review will explore the present data in the field of circadian rhythms, ageing and metabolism.
Life on earth has evolved over the past several billion years under conditions of relatively bright days and dark nights. The widespread adoption of electric lighting during the past century exposed animals, both human and non-human, to significant light at night for the first time in their evolutionary history. Endogenous circadian clocks depend on light to entrain to the external daily environment, and seasonal rhythms depend on clear nightly melatonin signals to assess the time of year. Thus, light at night can derange temporal adaptations. Indeed, disruption of naturally evolved light–dark cycles results in several physiological and behavioural changes with potentially serious implications for physiology, behaviour and mood. In this review, data from night-shift workers on their elevated risk for metabolic disorders, as well as data from animal studies, will be discussed. Night-shift workers are predisposed to obesity and dysregulated metabolism that may result from disrupted circadian rhythms. Although studies in human subjects are correlative, animal studies have revealed several mechanisms through which light at night may exert its effects on metabolism by disrupting circadian rhythms that are associated with inflammation, both in the brain and in the periphery. Disruption of the typical timing of food intake appears to be a key link between light at night and subsequent metabolic dysregulation. Strategies to avoid the effects of light at night on body mass dysregulation should be pursued.
Symposium 3: Building a healthier environment
By virtue of reducing dietary energy density, low-calorie sweeteners (LCS) can be expected to decrease overall energy intake and thereby decrease body weight. Such effects will be limited by the amount of sugar replaced by LCS and by the dynamics of appetite and weight control (e.g. acute compensatory eating, and the increase in appetite and decrease in energy expenditure that accompany weight loss). Consistent with these predictions, short-term intervention studies show incomplete compensation for the consumption of LCS v. sugar, and longer-term intervention studies (from 4 weeks to 40 months in duration) show small decreases in energy intake and body weight with LCS v. sugar. Despite this evidence, there are claims that LCS undermine weight management. Three such claims are that: (1) LCS disrupt the learned control of energy intake (the sweet taste confusion hypothesis); (2) exposure to sweetness increases desire for sweetness (the sweet tooth hypothesis); (3) consumers might consciously overcompensate for ‘calories saved’ when they know they are consuming LCS (the conscious overcompensation hypothesis). None of these claims stands up to close examination. In any case, the results of the intervention studies comparing LCS v. sugar indicate that the effect of energy dilution outweighs any tendency LCS might conceivably have to increase energy intake.
The burden of obesity contributes to increasing health inequality and places healthcare systems under huge strain. Modern society could broadly be described as supporting unhealthful eating patterns and sedentary behaviour; that is, as obesogenic. Obesity prevention and treatment have focused on educational and behavioural interventions, with limited overall success. A sustainable approach is to address the environments that promote less healthy eating, high energy intake and sedentary behaviour. Approaches that modify the environment have the potential to assist in the prevention of this complex condition. The present paper focuses on food environments within the context of obesogenic environments. Takeaway and fast food, now a fixture of our diet, is usually nutrient-poor and energy-dense. A ‘concentration effect’ has been observed, whereby fast food and takeaway outlets cluster in more deprived areas. Food access and food intake are associated; however, there are methodological challenges in establishing the effect of the food environment on obesity. While the evidence base relating to the role of the food environment in the obesity crisis is imperfect, policy, practice, civic society and industry must work together and take action now where current evidence suggests a change. Shaping the environment to better support healthful eating decisions has the potential to be a key aspect of a successful obesity prevention intervention.
Symposium 4: Interventions to improve nutrition in urban areas
Childhood obesity is a serious challenge for public health. The problem begins early, with most excess childhood weight gained before starting school. In 2016, the WHO estimated that 41 million children under 5 years of age were overweight or obese. Once established, obesity is difficult to reverse, is likely to persist into adult life and is associated with increased risk of CVD, type 2 diabetes and certain cancers. Preventing obesity is therefore of high importance. However, its development is multi-factorial and prevention is a complex challenge. Modifiable lifestyle behaviours such as diet and physical activity are the best-known determinants of obesity. More recently, early-life factors have emerged as key influences on obesity in childhood. Understanding risk factors and how they interact is important for informing interventions that aim to prevent obesity in early childhood. Available evidence supports multi-component interventions as effective in obesity prevention. However, relatively few interventions are available in the UK and only one, TrimTots, has been evaluated in randomised controlled trials and shown to be effective at reducing obesity risk in preschool children (age 1–5 years). BMI was lower in children immediately after completing TrimTots compared with waiting-list controls, and this effect was sustained at long-term follow-up, 2 years after completion. Developing and evaluating complex interventions for obesity prevention is a challenge for clinicians and researchers. In addition, parents encounter barriers to engaging with interventions. This review considers early-life risk factors for obesity, highlights evidence for preventive interventions and discusses barriers and facilitators to their success.
Alongside declining activity levels, energy needs fall in older age and eating less is to be expected. However, as total food consumption declines, intakes of many nutrients are also likely to fall; while energy requirements may be met, other nutrient needs may not. Although this highlights the importance of nutrient-dense foods and overall diet quality in older age to ensure nutrient intakes are sufficient, maintaining or increasing diet quality may be difficult at a time when food access and preparation are becoming more challenging and diets may be more monotonous. Poor nutrition is common, even in developed settings. Older malnourished adults are more likely to have poorer health outcomes, longer hospital stays and increased mortality. Thus, apart from the evident personal costs, the economic burden of disease-related malnutrition is significant, and effective preventive strategies to promote good nutrition among older populations are needed. In particular, there is a need for wider recognition of malnutrition risk among older adults, including implementation of routine screening of nutritional status and early diagnosis. Design of future interventions to support older community-dwelling adults requires a clear understanding of the personal and contextual influences that affect patterns of food choice and consumption, including consideration of the importance of social and psychological factors. In addition, there are opportunities to intervene earlier in the lifecourse; the most effective preventive efforts to promote good nutrition in older age may need to start ahead of age-related changes in physiology and function, including in younger adulthood and at the retirement transition.
With the focus of care shifting from the hospital to the community, supportive nutritional care for older people is becoming an important issue to address in the community, since undernutrition has serious consequences both for quality of life and for health care costs. Several modifiable nutritional risk factors relate to undernutrition. Unfortunately, the problem of undernutrition (and its risk) is aggravated by a lack of alertness among, for example, health care staff, leading to insufficient attention to systematic screening and nutritional care. Only a few of the existing screening tools have been validated among older people receiving support at home. Few studies have assessed the beneficial effect of nutritional support among older people in their own homes, and it was recently concluded that such studies have shown limited effects. One reason may be that the nutritional interventions performed have not taken the multiple aforementioned nutritional risk factors into consideration when formulating the action/treatment plan and hence have not used a multidisciplinary approach. Another reason may be that the intervention studies have not used validated screening tools to identify those older people most likely to benefit from nutritional support. However, three recent studies have used a multidisciplinary approach, and two have demonstrated a beneficial effect on the quality of life of older people and on health care costs. These findings suggest that, when planning nutritional intervention studies for older people receiving support at home, modifiable nutritional risk factors should be taken into consideration and a multidisciplinary approach considered.
Household food insecurity is a serious public health concern in high-income countries. Canada and the USA regularly monitor household food insecurity, while in other countries, such as the UK, it has been the rapid rise of food bank usage that has drawn increased attention to this longstanding, but largely overlooked, problem. This review evaluates evidence on interventions intended to reduce household food insecurity in high-income countries. Research on social protection interventions suggests that both cash transfers and food subsidies (e.g. the US Supplemental Nutrition Assistance Program) reduce household food insecurity. In contrast, research on community-level interventions, such as food banks and other food programmes, suggests limited impacts. Although food banks have become a common intervention for food insecurity in high-income countries, evidence suggests that their reliance on donations of volunteer time and food makes them inevitably limited in the assistance they are able to provide. The stigma people feel when using food banks may also make them untenable. Alternatives to, or enhancements of, food banks, such as community shops or community kitchens, have become common, but evidence also suggests they may be limited in effectiveness if they do not reach people experiencing food insecurity. This review highlights the difficulty of trying to address household food insecurity with community-based food interventions when solutions likely lie upstream in social protection policies.
Postgraduate Symposium
The ability of human subjects to synthesise sufficient vitamin D through sunlight exposure can be limited. Thus, diet has become an important contributor to vitamin D intake and status; however, only a few foods (e.g. egg yolk, oily fish) are naturally rich in vitamin D. Therefore, vitamin D-enriched foods, produced by supplementing animals' diets with vitamin D, and vitamin D fortification of foods have been proposed as strategies to increase vitamin D intake. A review of vitamin D-enrichment studies identified evidence that the cholecalciferol (vitamin D3) and calcifediol (25(OH)D3) content of eggs, fish and milk increased in response to vitamin D3 supplementation of the diets of hens, fish or cows. However, evidence from supplementation studies with hens showed that only dietary 25(OH)D3, not vitamin D3 supplementation, resulted in a pronounced increase of 25(OH)D3 in the eggs. Furthermore, evidence from randomised controlled trials indicated that an oral 25(OH)D3 supplement could be absorbed faster and could raise serum 25(OH)D concentration more efficiently than vitamin D3 supplementation. Moreover, the relative effectiveness of 25(OH)D3 in increasing vitamin D status varied between 3·13 and 7·14 times that of vitamin D3, probably owing to differences in the characteristics of the investigated subjects or in study design. Therefore, vitamin D-enriched or fortified foods using 25(OH)D3 would appear to have advantages over those using vitamin D3. Further well-controlled studies are needed to assess the effects of 25(OH)D3-enriched or fortified foods in the general population and in clinical patients.
Vitamin D is a unique nutrient. First, it acts as a pro-hormone and, second, the requirement for vitamin D can be met both by endogenous synthesis from sunlight exposure and by dietary sources. This complicates the determination of dietary requirements for vitamin D, which, along with the definition of optimal vitamin D status, has been highly controversial and much debated in recent years. Adolescents are a population group at high risk of low vitamin D status, which is concerning given the important role of vitamin D, and calcium, in promoting normal bone mineralisation and attainment of peak bone mass during this rapid growth phase. Dietary vitamin D recommendations are important from a public health perspective in helping to avoid deficiency and to optimise vitamin D status for health. However, limited experimental data from winter-based dose–response randomised trials in adolescents have hindered the development of evidence-based dietary requirements for vitamin D in this population group. This review will highlight how specifically designed randomised trials, and the approach adopted for estimating such requirements, can lead to improved recommendations. Such data indicate that vitamin D intakes of between 10 and about 30 µg/d may be required to avoid deficiency and ensure adequacy in adolescents, considerably greater than the current recommendations of 10–15 µg/d. Finally, this review will consider the implications for public health policy, in terms of future refinements of vitamin D requirement recommendations and the prioritisation of public health strategies to help prevent vitamin D deficiency.
Iodine deficiency (ID) in women of childbearing age remains a global public health concern, mainly through its impact on fetal and infant neurodevelopment. While iodine status is improving globally, ID is still prevalent in pregnancy, when requirements increase. More than 120 countries have implemented salt iodisation and food fortification, strategies that have been partially successful. Supplementation during pregnancy is recommended in some countries and is supported by the WHO where mandatory salt iodisation is not in place. The UK is listed as one of the ten countries with the lowest iodine status globally, with approximately 60 % of pregnant women not meeting the WHO recommended intake. Without mandatory iodine fortification or a recommendation for supplementation in pregnancy, the UK population depends on dietary sources of iodine. Both women and healthcare professionals have low knowledge and awareness of iodine, its sources and its role in health. Dairy and seafood products are the richest sources of iodine, and their consumption is essential to support adequate iodine status. Increasing iodine through the diet might be possible if iodine-rich foods are repositioned in the diet, as they currently contribute only about 13 % of the average energy intake of adult women. This review examines the use of iodine-rich foods in parallel with other public health strategies to increase iodine intake, and highlights the rare opportunity in the UK for randomised trials, owing to the lack of mandatory fortification programmes.
Symposium 5: The role of regulation to improve nutrition
The global burden of obesity leads to significant morbidity and has major economic implications. In April 2018, Britain will join a growing number of countries attempting to tackle this using fiscal measures when the UK Soft Drinks Industry Levy is introduced. We review recent evidence from natural experiments of the impact of health-related food and drink taxes on consumer behaviour, and discuss the possible consequences of these approaches on purchases and health. We highlight some of the potential indirect consequences and the importance of robust prospective evaluation.
The past half-century has been characterised by major technological developments and massive societal change, which have profoundly changed how food is produced, processed, sold and consumed. These days we are faced with a huge choice of pre-packaged foods when we shop in modern supermarkets, and we can buy foods all year round, in and out of season. For decades now, the need to guide choice has been recognised, and retailers and many manufacturers have provided increasing amounts of on-pack information and signposting; more recently, UK retailers have led the way in championing front-of-pack information provision to supplement the standard back-of-pack nutrient composition table. From a European perspective, the present paper summarises developments in nutrition labelling information and signposting, the legislation that controls on-pack declarations, and the research conducted to assess whether or not the information is used, understood and supports healthier choices. It also considers whether more could be done to influence behaviour change positively, giving examples of approaches identified in the research.
The objective of the present paper is to draw lessons from policy development on sustainable diets. It considers the emergence of sustainable diets as a policy issue and reviews the environmental challenge to nutrition science as to what a ‘good’ diet is for contemporary policy. It explores the variations in how sustainable diets have been approached by policy-makers. The paper considers how international United Nations and European Union (EU) policy engagement now centres on the 2015 Sustainable Development Goals and Paris Climate Change Accord, which require changes across food systems. The paper outlines national sustainable diet policy in various countries: Australia, Brazil, France, the Netherlands, Qatar, Sweden, UK and USA. While no overarching common framework for sustainable diets has appeared, a policy typology of lessons for sustainable diets is proposed, differentiating (a) orientation and focus, (b) engagement styles and (c) modes of leadership. The paper considers the particularly tortuous rise and fall of UK governmental interest in sustainable diet advice. Initial engagement in the 2000s turned to disengagement in the 2010s, yet some advice has emerged. The 2016 referendum to leave the EU has created a new period of policy uncertainty for the UK food system. This might marginalise attempts to generate sustainable diet advice, but could also be an opportunity for sustainable diets to be a goal for a sustainable UK food system. The role of nutritionists and other food science professions will be significant in this period of policy flux.
Symposium 1: The global impact of urbanisation
Childhood obesity is a common concern across global cities and threatens sustainable urban development. Initiatives to improve nutrition and encourage physical exercise are promising but have yet to exert a significant influence on prevention. Childhood obesity in London is associated with distinct ethnic and socio-economic patterns. Ethnic inequalities in health-related behaviour endure, underpinned by inequalities in employment, housing and access to welfare services, and by discrimination. Addressing these growing concerns requires a clearer understanding of the socio-cultural, environmental and economic contexts of urban living that promote obesity. We explore opportunities for prevention using asset-based approaches to nutritional health and well-being, with a particular focus on adolescents from diverse ethnic backgrounds living in London. We focus on the important role that community engagement and multi-sectoral partnerships play in improving the nutritional outcomes of London's children. London's children and adolescents grow up in the rich cultural mix of a global city whose local streets are characterised by diversity in ethnicities, languages, religions, foods and customs, creating complex and fluid identities. Growing up with such everyday diversity, we argue, can enhance the quality of life of London's children and strengthen their social capital. The Determinants of young Adult Social well-being and Health (DASH) longitudinal study of about 6500 of London's young people demonstrated the positive impact of cultural diversity. Born to parents from over a hundred countries and exposed to multi-lingual households and religious practices, these young people demonstrated strong psychological resilience and a sense of pride derived from cultural straddling, despite material disadvantage and discrimination. Supporting the potential contribution of such socio-cultural assets is in keeping with the values of social justice and equitable and sustainable development. Our work signals the importance of community engagement and multi-sectoral partnerships, involving, for example, schools and faith-based organisations, to improve the nutrition of London's children.