More than 3 million individuals are estimated to be at risk of malnutrition in the UK, of whom about 93% live in the community. BAPEN's Nutrition Screening Week surveys, using criteria based on the ‘Malnutrition Universal Screening Tool’ (‘MUST’), revealed that 28% of individuals on admission to hospital and 30–40% of those admitted to care homes in the previous 6 months were malnourished (medium+high risk using ‘MUST’). About three-quarters of hospital admissions and about a third of care home admissions came from their own homes, with a malnutrition prevalence of 24% in each case. Outpatient studies using ‘MUST’ showed that 16–20% of patients were malnourished, and malnutrition was associated with more hospital admissions and a longer length of stay. In sheltered housing, 10–14% of the tenants were found to be malnourished, with an overall estimated absolute prevalence of malnutrition that exceeded that in hospitals. In all cases, the majority of subjects were at high risk of malnutrition. These studies have helped establish the magnitude of the malnutrition problem in the UK and identified the need for integrated strategies between and within care settings. While hospitals provide a good opportunity to identify malnourished patients among the more than 10 million patients admitted annually, and the five- to six-fold greater number attending outpatient departments, commissioners and providers of healthcare services should be aware that much of the malnutrition present in the UK originates in the community, before admission to hospitals or care homes or attendance at outpatient clinics.
In 2007, the estimated cost of disease-related malnutrition in the UK was in excess of £13×10⁹. At any point in time, only about 2% of the more than 3 million individuals at risk of malnutrition were in hospital, 5% in care homes and the remainder in the community (2–3% in sheltered housing). Some government statistics (England) grossly underestimated the prevalence of malnutrition on admission to and discharge from hospital (1000–3000 cases annually between 1998 and 2008), which is less than 1% of the prevalence (about 3 million in 2007–2008) established by national surveys using criteria based on the ‘Malnutrition Universal Screening Tool’ (‘MUST’). The incidence of malnutrition-related deaths in hospitals, according to government statistics (242 deaths in England in 2007), was also <1% of an independent estimate, which was as high as 100 000/year. Recent healthcare policies have reduced the number of hospital and care home beds and encouraged care closer to home. Such policies have raised issues about education and training of the homecare workforce, including 6 million insufficiently supported informal carers (10% of the population), about the commissioning process, and about difficulties in implementing nutritional policies in a widely distributed population. The four devolved nations of the UK (England, Scotland, Northern Ireland and Wales) have developed their own healthcare policies to deal with malnutrition. These generally aim to span all care settings and various government departments in a co-ordinated manner, but their effectiveness remains to be properly evaluated.
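As a rough cross-check of the distribution described above, the quoted percentages can be converted into absolute numbers. A minimal sketch, assuming a round 3 million total and a 2·5% midpoint for sheltered housing (both illustrative figures, not data from the surveys):

```python
# Back-of-envelope arithmetic for the figures quoted above.
# The 3 million total and the 2.5% sheltered-housing midpoint are assumptions.
AT_RISK_TOTAL = 3_000_000

shares = {
    "hospital": 0.02,            # ~2% at any point in time
    "care homes": 0.05,          # ~5%
    "sheltered housing": 0.025,  # quoted as 2-3%; midpoint used here
}
shares["rest of community"] = 1 - sum(shares.values())

for setting, fraction in shares.items():
    print(f"{setting:>18}: {fraction:6.1%} ~ {fraction * AT_RISK_TOTAL:>9,.0f} people")

# Government statistics versus survey-based/independent estimates (both <1%):
print(f"recorded hospital cases / survey prevalence: {3_000 / AT_RISK_TOTAL:.2%}")
print(f"recorded deaths / independent estimate:      {242 / 100_000:.2%}")
```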
With the considerable cost of disease-related malnutrition to individuals and to society (estimated to be >£13×10⁹ for the UK, 2007 prices), there is a need for effective and evidence-based ways of preventing and treating this condition. The wide range of oral nutritional supplements that may be prescribed for the dietary management of malnutrition and other conditions accounts for only about 1% (about £99×10⁶, 2007 data) of the prescribing budget in England. Systematic reviews and meta-analyses consistently suggest that ready-made, multi-nutrient liquids which may be prescribed can improve energy and nutrient intake and body weight, and confer a variety of clinical and functional benefits in a number of patient groups. Meta-analyses have repeatedly shown that oral nutritional supplements produce significant reductions in complications (e.g. infections) and mortality, and a recent meta-analysis shows a reduction in hospital admissions (OR 0·56 (95% CI 0·41, 0·77), six randomised controlled trials). Such benefits suggest that the appropriate use of oral nutritional supplements should form an integral part of the management of malnutrition, particularly as there is currently a lack of evidence for alternative oral nutrition strategies (e.g. food fortification and counselling). As with all therapies, compliance with oral nutritional supplements needs to be maximised and their use monitored. To ensure that those at risk of malnutrition are identified and treated appropriately, there is a need to embed national and local policies into routine clinical practice. In doing so, the economic burden of this costly condition can be curtailed. As recently suggested by the National Institute for Health and Clinical Excellence, substantial cost savings could be made if screening and treatment of malnourished patients were undertaken.
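A pooled odds ratio like the one quoted above (OR 0·56; 95% CI 0·41, 0·77) is typically produced by inverse-variance meta-analysis. A minimal fixed-effect sketch, with six invented (OR, SE) pairs standing in for the six trials (illustrative values only, not the trial data):

```python
import math

# Fixed-effect (inverse-variance) pooling of per-trial log odds ratios.
def pool_fixed_effect(log_ors, ses, z=1.96):
    weights = [1 / se**2 for se in ses]  # weight = 1 / variance of ln(OR)
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    # Return (pooled OR, lower CI bound, upper CI bound) on the OR scale.
    return tuple(math.exp(pooled + s * z * se_pooled) for s in (0, -1, 1))

# Invented per-trial ORs and standard errors, for illustration only.
log_ors = [math.log(x) for x in (0.50, 0.70, 0.45, 0.80, 0.55, 0.60)]
ses = [0.35, 0.40, 0.30, 0.45, 0.25, 0.50]

or_, lo, hi = pool_fixed_effect(log_ors, ses)
print(f"pooled OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```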
The terminology used for describing intervention groups in randomised controlled trials (RCT) on the effect of intravenous fluid on outcome in abdominal surgery has been imprecise, and the lack of standardised definitions of the terms ‘standard’, ‘restricted’ and ‘liberal’ has led to some confusion and difficulty in interpreting the literature. The aims of this paper were to clarify these definitions and to use them to perform a meta-analysis of nine RCT on primarily crystalloid-based peri-operative intravenous fluid therapy in 801 patients undergoing elective open abdominal surgery. Patients who received more or less fluids than those who received a ‘balanced’ amount were considered to be in a state of ‘fluid imbalance’. When ‘restricted’ fluid regimens were compared with ‘standard or liberal’ fluid regimens, there was no difference in post-operative complication rates (risk ratio 0·96 (95% CI 0·56, 1·65), P=0·89) or length of hospital stay (weighted mean difference (WMD) −1·77 (95% CI −4·36, 0·81) d, P=0·18). However, when the fluid regimens were reclassified and patients were grouped into those who were managed in a state of fluid ‘balance’ or ‘imbalance’, the former group had significantly fewer complications (risk ratio 0·59 (95% CI 0·44, 0·81), P=0·0008) and a shorter length of stay (WMD −3·44 (95% CI −6·33, −0·54) d, P=0·02) than the latter. When the imprecise terminology was used, there was no apparent difference between the effects of fluid-restricted and standard or liberal fluid regimens on outcome in patients undergoing elective open abdominal surgery. However, patients managed in a state of fluid balance fared better than those managed in a state of fluid imbalance.
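The risk ratios and confidence intervals quoted here follow the standard large-sample log construction. A minimal sketch with invented 2×2 counts (not data from these trials):

```python
import math

def risk_ratio_ci(events_a: int, n_a: int, events_b: int, n_b: int, z: float = 1.96):
    """Risk ratio of group A vs group B with a 95% CI (log-normal approximation)."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of ln(RR) under the usual large-sample approximation.
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical counts chosen only for illustration: 30/200 complications in
# one arm versus 51/200 in the other.
rr, lo, hi = risk_ratio_ci(30, 200, 51, 200)
print(f"RR {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```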
Concerns about the over-prescription of peri-operative fluids, particularly normal saline, culminated in the recent publication of UK national guidelines on fluid prescription during and after surgery. A working group comprising members of the nutrition support team, surgeons, anaesthetists and pharmacists therefore sought to reduce the overall levels of fluid prescription and to limit normal saline usage in our large teaching hospital by producing written local fluid-prescribing guidelines and holding a series of fluid-prescription education sessions for consultants and junior staff. Ideally, the success of such measures would have been determined by studies on fluid balance, body weight and/or measured body water in large numbers of individual patients in a large cluster-randomised controlled trial. However, this would have proved logistically difficult and very costly, especially as it is notoriously difficult to rely on the accuracy of daily fluid-balance charts in large numbers of patients on busy post-operative surgical wards. We therefore undertook a pragmatic study, comparing historical data on fluid type/volume prescribed (from both individual and ward-level pharmacy records), oedema status and clinical outcomes from 2002 with two prospective audits of similar data carried out during 2008 and 2009. Our data showed that in the comparable, elective surgical patients within each audit, there was a decline in total intravenous fluids prescribed over the first 5 post-operative days from 21·1 litres per patient in 2002 to 14·2 litres per patient in 2009 (P<0·05), while pharmacy records showed that the proportion of 0·9% saline supplied declined from 60% to 35% of all fluids supplied to the surgical wards involved, with a concomitant increase in the use of 4%/0·18% dextrose-saline and Hartmann's solution. Alongside these changes in fluid prescribing, the number of patients with clinically apparent oedema declined from 53% in 2002 to 36% in 2009; gut function returned more quickly (6 d in 2002 v. 4 d in 2009, P<0·05) and length of stay improved (from 13 d in 2002 to 10 d in 2009, P<0·05). Although we accept that other factors might have contributed to the observed changes in these clinical parameters, we believe that the measures to reduce fluid and saline administration were the major contributors to these improved clinical outcomes.
Symposium 4: Whose fault was it anyway? Competencies in training
Parenteral nutrition (PN) line sepsis is a common and yet poorly managed complication in hospitalised patients receiving PN. Making a clinical diagnosis is difficult, as the clinical picture can be very non-specific and definitions of what constitutes line infection vary. Once there is clinical suspicion, proving it with microbiological techniques is not an exact science. Traditional techniques have required the removal of the PN line so that microbiologists can analyse it for infection. This has obvious drawbacks, as it is often not easy to replace the line in these patients, and the line is often later proven not to be the source of the sepsis. Although the gold-standard technique still requires removal of the line, there have been developments in diagnosing line infection while conserving the line. These include intra-luminal brushings of the line, differential blood cultures and simple swabs of the line hub. These techniques are not as sensitive, but reduce the problems caused by removing and re-inserting the line in these patients. The definition of PN line sepsis varies between institutions. Rates can be expressed as a raw number of cases or, more correctly, as a number of cases per 1000 line days, to standardise rates between units of differing sizes. Apparent rates can also be altered if the diagnostic criteria are too strict or too lax. Accurate diagnosis of PN line sepsis remains difficult in modern medical practice.
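Expressing infections per 1000 line days, as described above, is simple arithmetic but can change how units compare. A minimal sketch with invented figures for two hypothetical units:

```python
# Standardised catheter-sepsis rate: cases per 1000 line (catheter) days.
def sepsis_rate_per_1000_line_days(cases: int, line_days: int) -> float:
    return cases / line_days * 1000

# Invented figures: Unit A sees more raw cases, Unit B has the higher true rate.
print(sepsis_rate_per_1000_line_days(12, 8_000))  # Unit A: 1.5 per 1000 line days
print(sepsis_rate_per_1000_line_days(5, 1_250))   # Unit B: 4.0 per 1000 line days
```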
Symposium 5 (Joint Nutrition Society and BAPEN Symposium): Too many pies: metabolic competencies in obesity
The objective of the present review is to provide an overview of the metabolic effects of pro-inflammatory cytokine production during infection and injury; to highlight the disadvantages of pro-inflammatory cytokine production and inflammatory stress on morbidity and mortality of patients; to identify the influence of genetics and adiposity on inflammatory stress in patients and to indicate how nutrients may modulate the inflammatory response in patients. Recent research has shown clearly that adipose tissue actively secretes a wide range of pro- and anti-inflammatory cytokines. Paradoxically, although inflammation is an essential part of the response of the body to infection, surgery and trauma, it can adversely affect patient outcome. The metabolic effects of inflammation are mediated by pro-inflammatory cytokines. Metabolic effects include insulin insensitivity, hyperlipidaemia, muscle protein loss and oxidant stress. These effects, as well as being present during infective disease, are also present in diseases with a covert inflammatory basis. These latter diseases include obesity and type 2 diabetes mellitus. Inflammatory stress also increases during ageing. The level of cytokine production, within individuals, is influenced by single nucleotide polymorphisms (SNP) in cytokine genes. The combination of SNP controls the relative level of inflammatory stress in both overt and covert inflammatory diseases. The impact of cytokine genotype on the intensity of inflammatory stress derived from an obese state is unknown. While studies remain to be done in the latter context, evidence shows that these genomic characteristics influence morbidity and mortality in infectious disease and diseases with an underlying inflammatory basis and thereby influence the cost of in-patient obesity. Antioxidants and n-3 PUFA alter the intensity of the inflammatory process. Recent studies show that genotypic factors influence the effectiveness of immunonutrients. A better understanding of this aspect of nutrient–gene interactions and of the genomic factors that influence the intensity of inflammation during disease will help in the more effective targeting of nutritional therapy.
The worldwide obesity epidemic over the last 20 years has led to a dramatic increase in the prevalence of non-alcoholic fatty liver disease, the hepatic manifestation of the metabolic syndrome. Estimates of prevalence vary depending on the population studied and the methods used to assess hepatic fat content, but are commonly quoted as between 10 and 30% of the adults in the Western hemisphere. Fatty liver develops when fatty acid uptake and synthesis in the liver exceeds fatty acid oxidation and export as TAG. Studies of pathogenesis point to insulin resistance, lipotoxicity, oxidative stress and chronic inflammation being central to the development and progression of the disease. A proportion of individuals with fatty liver develop progressive disease, though large prospective longitudinal studies are lacking. Nevertheless, fatty liver is associated with increased all-cause and liver-related mortality compared with the general population. Management of fatty liver centres around lifestyle and dietary measures to induce controlled and sustained weight loss. Management of cardiovascular risk factors aims to reduce mortality, while certain dietary interventions have been shown to reduce steatosis and inflammation. Specific pharmacological treatments also show promise, but their use is not widespread. A multi-system and multi-disciplinary approach to the management of this disorder is proposed.
Obesity has been described as the health crisis of the 21st century. It is a chronic lifelong medical condition, whose pattern often starts in childhood, and it is worsening demographically in every developed country. The cost of treating the many medical conditions associated with obesity threatens to overwhelm healthcare resources. Medical treatments produce at most 10% weight loss in the severely obese, with high failure rates. In this article, we review the available evidence that bariatric surgery achieves long-term reduction in weight, reduced mortality and improvement in most, if not all, obesity-related comorbidities. There is a need for daily multivitamins and extra minerals, especially with gastric bypass, and nutritional deficiencies of vitamins D and B12, Ca, Fe and folate need monitoring and prevention. Currently there is no medical therapy on the near horizon that will match the effect of surgery, which, if done safely, remains the only effective therapy. Bariatric surgery is cost-effective, and health providers should embrace the development and rapid expansion of services.
This review details the practicalities of providing nutrition support to obese patients who experience complications following bariatric surgery and highlights some of the nutritional challenges encountered by this group of patients. Bariatric surgery to treat morbid obesity has increased significantly internationally over the past decade, with hospital admissions rising annually. The gastric bypass is currently the most commonly performed procedure. The complication rate can be up to 16%, with a considerable proportion of complications having nutritional implications. Treatment can involve avoidance of oral diet and the use of nutrition support, i.e. enteral or parenteral nutrition. Opposition to nutrition support can be encountered. It is useful to clarify the aims of nutrition support, these being: the avoidance of overfeeding and its consequences, preservation of lean body mass and promotion of healing. Evidence suggests that hypoenergetic nutrition is not harmful and may actually be beneficial. There is a lack of consensus regarding the optimum method of predicting the nutritional requirements of the obese, acutely unwell patient. The literature suggests that predictive equations are fairly accurate compared with measured energy expenditure in free-living obese patients before and after bariatric surgery. However, these findings cannot be directly applied to obese patients experiencing complications of bariatric surgery, who will be acutely unwell and exhibiting an inflammatory response. It is therefore necessary to refer to the literature on energy expenditure in hospitalised obese patients to help guide practice. More research examining the energy and protein requirements of obese patients needing nutrition support following bariatric surgery is urgently required.
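The abstract refers to predictive equations without naming them. As one widely used example (an assumption here, not an equation endorsed by the paper), the Mifflin-St Jeor equation estimates resting energy expenditure from weight, height, age and sex:

```python
# Mifflin-St Jeor estimate of resting energy expenditure (kcal/d).
# Shown only as an example of a predictive equation; the review does not
# specify which equations it assessed.
def mifflin_st_jeor(weight_kg: float, height_cm: float, age_y: int, male: bool) -> float:
    ree = 10 * weight_kg + 6.25 * height_cm - 5 * age_y
    return ree + 5 if male else ree - 161

# Hypothetical patient: 130 kg, 172 cm, 45-year-old man.
print(f"Estimated REE: {mifflin_st_jeor(130, 172, 45, male=True):.0f} kcal/d")  # ~2155
```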
Clinical response to medication can differ between patients. Among the known sources of variability is an individual's nutrition status. This review defines some pharmacokinetic terms, provides relevant body size metrics and describes the physiologic influences of protein–energy malnutrition and obesity on drug disposition. Weight-based drug dosing, which presumes a healthy BMI, can be problematic in the protein–energy malnourished or obese patient. The use of total body weight, lean body weight, or an adjusted body weight depends on the drug and how it is differently handled in malnutrition or obesity. Most of the recognized influences are seen in drug distribution and drug elimination as a result of altered body composition and function. Distribution characteristics of each drug are determined by several drug-related factors (e.g. tissue affinity) in combination with body-related factors (e.g. composition). Drug elimination occurs through metabolic and excretory pathways that can also vary with body composition. The current data are limited to select drugs that have been reported in small studies or case reports. In the meantime, a rational approach to evaluate the potential influences of malnutrition and obesity can be used clinically based on available information. Antimicrobials are discussed as a useful example of this approach. Further advancement in this field would require collaboration between experts in body composition and those in drug disposition. Until more data are available, routine monitoring by the clinician of the protein–energy malnourished or obese patient receiving weight-based drug regimens is necessary.
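As a concrete illustration of choosing a dosing weight, the sketch below uses the Devine ideal-body-weight formula and a 0·4 adjustment factor. Both are common conventions assumed here for illustration; the abstract does not specify formulas, and the appropriate factor varies by drug:

```python
# Devine ideal body weight (IBW) and a drug-dependent adjusted body weight.
# These specific formulas and the 0.4 factor are illustrative assumptions.
def ideal_body_weight_kg(height_cm: float, male: bool) -> float:
    inches_over_5ft = max(0.0, height_cm / 2.54 - 60)
    return (50.0 if male else 45.5) + 2.3 * inches_over_5ft

def adjusted_body_weight_kg(tbw_kg: float, height_cm: float, male: bool,
                            factor: float = 0.4) -> float:
    ibw = ideal_body_weight_kg(height_cm, male)
    return ibw + factor * (tbw_kg - ibw)

# Hypothetical obese patient: 120 kg total body weight, 175 cm, male.
print(f"IBW:   {ideal_body_weight_kg(175, True):.1f} kg")            # ~70.5 kg
print(f"AdjBW: {adjusted_body_weight_kg(120, 175, True):.1f} kg")    # ~90.3 kg
```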
Drugs have the potential to interact with nutrients, leading to reduced therapeutic efficacy of the drug, nutritional risk or increased adverse effects of the drug. Despite significant interest in such interactions going back more than 40 years, the occurrence and clinical significance of many drug–nutrient interactions remain unclear. However, interactions involving drugs with a narrow therapeutic margin, such as theophylline and digoxin, and those that require careful blood monitoring, such as warfarin, are likely to be those of clinical significance. Drugs can affect nutrition as a result of changes in appetite and taste, as well as by influencing the absorption or metabolism of nutrients. Moreover, foods and supplements can also interact with drugs, of which grapefruit juice and St John's wort are key examples. Significant numbers of people take both supplements and medication and are potentially at risk from interactions. Professionals responsible for the care of patients, such as pharmacists, dietitians, nurses and doctors, should therefore check whether supplements are being taken, while for researchers this is an area worthy of significant further study, particularly in the context of increasingly complex drug regimens and the plethora of new drugs.
There are many factors that can influence nutritional intake: food availability, physical capability, appetite, the presence of gastrointestinal symptoms and the perception of food are examples. Drug therapy can negatively influence nutritional intake through its effects on these factors, predominantly via side effects. This review gives a brief overview of each of these factors and of how drug therapy can affect them, with specific examples in each section and an indication of the impact on nutritional status. It aims to assist the clinician in identifying the effects of drug therapy on nutritional intake and provides advice on appropriate intervention. A drug history and side-effect review should form an integral part of nutritional assessment. Early identification and effective therapeutic use of alternative drug therapy can also positively influence nutritional intake.
Satellite Symposium: Throw another fish on the fire: the role of n-3 in inflammation
Lipids traditionally used in artificial nutrition are based on n-6 fatty acid-rich vegetable oils like soyabean oil. This may not be optimal because it may present an excessive supply of linoleic acid. One alternative to the use of soyabean oil is its partial replacement by fish oil, which contains n-3 fatty acids. These fatty acids influence inflammatory and immune responses and so may be useful in particular situations where those responses are not optimal. Fish oil-containing lipid emulsions have been used in parenteral nutrition in adult patients post-surgery (mainly gastrointestinal). This has been associated with alterations in patterns of inflammatory mediators and in immune function and, in some studies, a reduction in length of intensive care unit (ICU) and hospital stay. Peri-operative administration of fish oil may be superior to post-operative administration. Parenteral fish oil has been used in critically ill adults. Here the influence on inflammatory processes, immune function and clinical endpoints is not clear, since there are too few studies and those that are available report contradictory findings. Fish oil is included in combination with other nutrients in various enteral formulas. In post-surgical patients and in those with mild sepsis or trauma, there is clinical benefit from a formula including fish oil and arginine. A formula including fish oil, borage oil and antioxidants has demonstrated marked benefits on gas exchange, ventilation requirement, new organ failures, ICU stay and mortality in patients with acute respiratory distress syndrome, acute lung injury or severe sepsis.
Many patients in the intensive care unit are malnourished or unable to eat. Feeding them correctly has the potential to reduce morbidity and even mortality, but is a very complex procedure. The inflammatory response induced by surgery, trauma or sepsis will alter metabolism, change the ability to utilise nutrients and can lead to rapid loss of lean mass. Both overfeeding and underfeeding macronutrients can be harmful, but it generally seems optimal to give less during metabolic stress and immobility and to increase provision during recovery. Physical intolerance of feeding, such as diarrhoea or delayed gastric emptying, is common in the intensive care unit. Diarrhoea can be treated with fibre or peptide feeds and anti-diarrhoeal drugs; however, the use of probiotics is controversial. Gastric dysfunction can often be overcome with prokinetic drugs or small-bowel feeding tubes. New feeds with nutrients such as n-3 fatty acids that have the potential to attenuate excessive inflammatory responses show great promise in improving metabolism and substrate utilisation. The importance of changing nutrient provision according to metabolic and physical tolerance cannot be overstated and, although expert groups have produced many guidelines on nutritional support of the critically ill, correct interpretation and implementation can be difficult without a dedicated nutrition healthcare professional such as a dietitian or a multidisciplinary nutritional support team.
Session 1
Symposium on ‘Food supply and quality in a climate-changed world’
The concept of local food has gained traction in the media, engaged consumers and offered farmers a new marketing tool. Positive claims about the benefits of local food are probably not harmful when made by small-scale producers at the local level; however, greater concern would arise should such claims be echoed in policy circles. This review examines the evidence base supporting claims about the environmental and health benefits of local food. The results do not offer any support for claims that local food is universally superior to non-local food in terms of its impact on the climate or the health of consumers. Indeed several examples are presented that demonstrate that local food can on occasions be inferior to non-local food. The analysis also considers the impact on greenhouse gas emissions of moving the UK towards self-sufficiency. Quantitative evidence is absent on the changes in overall emissions that would occur if the UK switched to self-sufficiency. A qualitative assessment suggests the emissions per item of food would probably be greater under a scenario of self-sufficiency than under the current food system. The review does not identify any generalisable or systematic benefits to the environment or human health that arise from the consumption of local food in preference to non-local food.
Food production and security will be a major issue for supplying an increasing world population. The problem will almost certainly be exacerbated by climate change. There is a projected need to double food production by 2050. In recent times, the trend has been for incremental modest yield increases for most crops. There is an urgent need to develop integrated and sustainable approaches that will significantly increase both production per unit land area and the resource use efficiency of crops. This review considers some key processes involved in plant growth and development with some examples of ways in which molecular technology, plant breeding and genetics may increase the yield and resource use efficiency of wheat. The successful application of biotechnology to breeding is essential to provide the major increases in production required. However, each crop and each specific agricultural situation presents specific requirements and targets for optimisation. Some increases in production will come about as new varieties are developed which are able to produce satisfactory crops on marginal land presently not considered appropriate for arable crops. Other new varieties will be developed to increase both yield and resource use efficiency on the best land.
Symposium on ‘Food supply and quality in a climate-changed world’
Humans require at least 20 inorganic elements (‘minerals’) for normal functioning. However, much of the world's population is probably deficient in one or more essential minerals and at increased risk of physiological disorders. Addressing these ‘hidden hungers’ is a challenge for the nutrition and agriculture sectors. Mineral deficiencies among populations are typically identified from dietary surveys because (1) minerals are acquired primarily from dietary sources and (2) (bio)assays of mineral status can be unreliable. While dietary surveys are likely to under-report energy intakes, they show that 9% of all UK and US adults consume Ca and Mg, and 14% consume K, at quantities below the UK lower reference nutrient intake, and these adults are therefore at risk of deficiency. Low dietary Ca, Mg and K intakes can be caused by energy-malnourishment and by cultural and economic factors driving dietary conservatism. For example, cereal grains routinely displace vegetables and fruits in the diet. Cereal grains have low concentrations of several minerals, notably Ca, as a consequence of their physiology. Low grain mineral concentrations are compounded when cereal crops are grown in soils of low mineral phytoavailability and when grain is processed. In this paper, the impact of increased vegetable consumption and horticultural biofortification, i.e. enhancing crop mineral content through breeding and agronomy, on intakes of the major minerals Ca, Mg and K is assessed. Despite the low energy intake from horticultural crops generally, increased vegetable consumption and biofortification would significantly improve dietary intakes of Ca, Mg and K.
Session 2
Symposium on ‘Food supply and quality in a climate-changed world’
Ruminant farming is an important component of the human food chain. Ruminants can use offtake from land unsuitable for cereal crop cultivation via interaction with the diverse microbial population in their rumens. The rumen is a continuous flow fermenter for the digestion of ligno-cellulose, with microbial protein and fermentation end-products incorporated by the animal directly or during post-ruminal digestion. However, ruminal fermentation is inefficient in capturing the nutrient resource presented, resulting in environmental pollution and generation of greenhouse gases. Methane is generated as a consequence of ruminal fermentation and poor retention of ingested forage nitrogen causes nitrogenous pollution of water and land and contributes to the generation of nitrous oxide. One possible cause is the imbalanced provision of dietary substrates to the rumen micro-organisms. Deamination of amino acids by ammonia-producing bacteria liberates ammonia which can be assimilated by the rumen bacteria and used for microbial protein synthesis. However, when carbohydrate is limiting, microbial growth is slow, meaning low demand for ammonia for microbial protein synthesis and excretion of the excess. Protein utilisation can therefore be improved by increasing the availability of readily fermentable sugars in forage or by making protein unavailable for proteolysis through complexing with plant secondary products. Alternatively, realisation that grazing cattle ingest living cells has led to the discovery that plant cells undergo endogenous, stress-mediated protein degradation due to the exposure to rumen conditions. This presents the opportunity to decrease the environmental impact of livestock farming by using decreased proteolysis as a selection tool for the development of improved pasture grass varieties.
The paper considers some of the reasons why governments develop food policies, gives examples of what is in food policies at the Scottish and UK levels and explores ways of effectively providing balanced evidence for policy development. It discusses the challenges of exchanging knowledge between the science and policy communities, given their different languages and cultures, highlighting the need for greater mutual understanding of roles and responsibilities. It draws on experience in the Scottish Government of developing the government's ‘Recipe for Success – Scotland's National Food and Drink Policy’ through engagement with stakeholders, scientists and analysts and touches on the more complex nature of the Department for International Development's contribution to meeting the first Millennium Development Goal. It compares the need for collation and analysis of existing evidence during the development of policy with the desirability of providing policy direction for longer-term strategic research, and the challenges of connecting policy expectations with researchable questions. The paper concludes by emphasising the need to focus research in the short term on mitigation of climate change through decreasing greenhouse gas emissions associated with the production of food, while also taking account of economic, health and broader environmental sustainability objectives. A further challenge is to communicate complexity and uncertainty in ways that enable decision-makers, from consumers to policy-makers, to make informed choices. Longer-term research needs to focus on the opportunities and risks associated with adapting to climate change.