About two billion people in the world suffer from micronutrient deficiencies(Reference Bailey, West and Black1). The most common micronutrient deficiencies are those of iron, iodine, zinc, vitamin A and folate. Women and children in low- and middle-income countries (LMIC) are the most affected(Reference Black, Allen and Bhutta2, Reference Muthayya, Rah and Sugimoto3). These deficiencies compromise several essential bodily and cellular functions, leading to a range of disabilities including adverse pregnancy outcomes, poor growth, impaired cognitive development, compromised immunity and even death(Reference Christian and Stewart4, Reference Kennedy, Nantel and Shetty5). Consequently, the scale-up of proven interventions addressing micronutrient deficiencies and anaemia is being promoted to achieve the global nutrition targets and the nutrition-related sustainable development goals(6).
Among the proven evidence-based interventions proposed in the Lancet Nutrition Series, fortification of staple foods and supplementation are considered to be cost-effective in preventing and controlling micronutrient deficiencies(Reference Bhutta, Das and Rizvi7). Fortification of staple foods was introduced in industrialised countries as early as the 1920s(Reference Allen, De Benoist and Dary8). Micronutrient deficiencies that were then widespread have since been greatly reduced or even eliminated(Reference Mannar and Hurrell9). To replicate this success in LMIC, food fortification is now being initiated and promoted in many African and South-east Asian countries where the burden of multiple micronutrient deficiencies is believed to be high(Reference Osendarp, Martinez and Garrett10).
This review recalls the basic principles for effective fortification, highlights current fortification practices and illustrates the possible risks of adverse effects related to excessive intakes. The focus on the risks of excessive intake is deliberate, as the benefits of fortification in preventing micronutrient deficiencies are widely known and many reviews exist on the subject.
Basic principles for effective fortification
Effective implementation of a micronutrient fortification programme requires documented evidence that the deficiency in question exists and that fortification will correct it(Reference Allen, De Benoist and Dary8). Micronutrient deficiencies are best determined through biochemical measurements, but determining whether a confirmed deficiency can be addressed through fortification requires food consumption surveys that identify and quantify the nutrient gap to be filled(Reference Allen, De Benoist and Dary8, Reference Gibson, Carriquiry and Gibbs11). Ideally, nationally representative data should provide food and nutrient intake information for different population and physiological groups, because fortified foods can indiscriminately reach all these groups(Reference Allen, De Benoist and Dary8).
The main goal of an effective fortification programme is to correct inadequate nutrient intake while keeping excessive intake acceptably low(Reference Allen, De Benoist and Dary8). This requires at least 2 d of intake data from a subset of the population to estimate the usual intake distribution, which can then be shifted upwards so that the intake of each nutrient meets the estimated average requirement for 97·5 % of the population, while no more than 2·5 % of the population reaches or exceeds the tolerable upper intake level(Reference Allen, De Benoist and Dary8) (Fig. 1). Although these basic principles should govern every fortification programme, very few LMIC have at least one national food consumption survey(Reference Huybrechts, Aglago and Mullee12). For example, in 2017 only four sub-Saharan African countries (Ethiopia, Nigeria, South Africa and Uganda) had at least one nationally representative food consumption survey (Fig. 2(a)). In contrast, many more countries without a national food consumption survey have introduced mandatory food fortification programmes or have legislation mandating grain fortification (Fig. 2(b)). Perhaps the implementation of fortification programmes in the absence of data on micronutrient intake has been precipitated by the assumptions that: fortification is a proven strategy; micronutrient deficiencies in the diet are likely to exist; and the risk of any adverse effect is rare or non-existent.
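The distribution-shifting logic described above can be sketched computationally. The following is a minimal illustration only: the log-normal intake distribution, the estimated average requirement (EAR) of 8 mg/d, the tolerable upper intake level (UL) of 40 mg/d and the 3 mg/d fortification increment are all hypothetical values chosen for demonstration, not actual reference values for any nutrient.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative (not actual) reference values for a hypothetical nutrient, mg/d
EAR = 8.0   # estimated average requirement
UL = 40.0   # tolerable upper intake level

# Simulate a usual-intake distribution for 10 000 individuals
# (log-normal, a common shape for dietary intake data)
usual_intake = rng.lognormal(mean=np.log(9.0), sigma=0.5, size=10_000)

def coverage(intake):
    """EAR cut-point estimate of inadequacy, and the share above the UL."""
    inadequate = np.mean(intake < EAR)  # proportion below the EAR
    excessive = np.mean(intake > UL)    # proportion above the UL
    return inadequate, excessive

# Before fortification
pre_inadequate, pre_excess = coverage(usual_intake)

# Fortification shifts the whole distribution upwards by the extra
# nutrient delivered through the food vehicle (here a flat 3 mg/d)
post_inadequate, post_excess = coverage(usual_intake + 3.0)

print(f"inadequate: {pre_inadequate:.1%} -> {post_inadequate:.1%}")
print(f"excessive:  {pre_excess:.1%} -> {post_excess:.1%}")
```

The design question a real programme must answer is how large the shift can be before the proportion above the UL exceeds 2·5 %, which is why usual intake distributions from at least 2 d of data are a prerequisite.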
Evidence from existing intake and biochemical data
The few existing nationally representative intake data confirm the high prevalence of inadequate intake of one or more micronutrients. However, they also indicate that excessive intakes of iodine, vitamin A, iron and folic acid exist.
Risk of excessive iodine intake
A closer look at the Nigerian Food Consumption Survey reveals that not only does inadequate iodine intake exist, but a significant proportion of the population consumes more than adequate amounts of iodine, and about 30 % of children, 20 % of women and close to 10 % of pregnant women have possible excessive intakes. These estimates were based on urinary iodine concentration assays(Reference Maziya-Dixon, Akinyele and Oguntona13). A more recent study conducted in Djibouti, Kenya and Tanzania showed that urinary iodine concentration exceeded the threshold for excessive iodine intake among school-age children, whereas thyroglobulin, the thyroid-specific glycoprotein on which iodine is stored and thyroid hormone is formed, was high in all population groups(Reference Farebrother, Zimmermann and Abdallah14). Indeed, according to the Iodine Global Network, excess iodine intake is present in eleven countries(Reference Network15), primarily due to over-iodisation of salt, possibly high sodium intake and high iodine concentrations in groundwater(Reference Farebrother, Zimmermann and Abdallah14).
The effects of such excessive iodine intake on thyroid function have not been well studied. Acute excess iodine intake is well tolerated because of homeostatic regulation that transiently shuts down thyroid hormone synthesis(Reference Leung and Braverman16); however, chronic exposure to excessive iodine intake is less well tolerated and can lead to overt hyperthyroidism among individuals with sub-clinical hyperthyroidism(Reference Cooper and Biondi17). This risk is particularly high in historically iodine-deficient regions where nodular goitre is more prevalent(Reference Pearce18), and thus requires attention.
Risk of excessive vitamin A intake
Vitamin A deficiency, a risk factor for blindness and for mortality from measles and diarrhoea, remains a public health concern in much of Africa and south Asia(Reference Stevens, Bennett and Hennocq19). However, recent evidence suggests that there is a significant decreasing trend in the proportion of children aged 6–59 months affected by vitamin A deficiency(Reference Stevens, Bennett and Hennocq19). This decreasing trend could largely be ascribed to vitamin A supplementation, fortification, reduced infection rates and possibly increased economic status and knowledge that enable households to consume vitamin A-rich foods(Reference Stevens, Bennett and Hennocq19). However, an emerging risk of excessive vitamin A intake is present as a result of overlapping interventions. Multiple foods are now fortified with vitamin A (e.g. oil, sugar, cereal flours), which substantially increases the risk of excessive intake that would not have been the case with the consumption of only one vitamin A-fortified food vehicle(Reference Tanumihardjo, Kaliwile and Boy20) (Fig. 3).
Excessive vitamin A intake and increased liver vitamin A accumulation have been reported from studies in countries such as Zambia and Guatemala, where concurrent vitamin A interventions such as fortification of sugar, cooking oil and starch, together with vitamin A supplementation, are given to children(Reference Mondloch, Gannon and Davis21–Reference Tanumihardjo23). These interventions are happening with little or no adequate assessment of vitamin A status, thereby increasing the risk of adverse effects(Reference Tanumihardjo, Gannon and Kaliwile24). Children turning yellow in the mango season as a result of excessive vitamin A intake have been reported(Reference Tanumihardjo, Gannon and Kaliwile25), yet very little effort exists to guide a more careful implementation of vitamin A interventions(Reference Tanumihardjo, Mokhtar and Haskell26). This is unfortunate, given that excessive vitamin A (retinol) has been linked with increased bone fracture in men(Reference Michaëlsson, Lithell and Vessby27) and women(Reference Feskanich, Singh and Willett28). Given that the problem of excessive vitamin A intake grows with increased coverage of the various vitamin A interventions, monitoring population vitamin A intake is critical(Reference Tanumihardjo, Kaliwile and Boy20).
Risk of excessive folic acid intake
Deficiency of folate (the organic form found naturally in food) impairs several bodily functions. For example, folate deficiency can impair erythrocyte synthesis, leading to macrocytic anaemia, and increases the risk of neural tube defects in fetuses(Reference Reynolds, Biller and Ferro29, Reference Berry, Li and Erickson30). Consequently, fortification with folic acid (the synthetic form) has been implemented in over seventy-five countries worldwide(Reference Santos, Lecca and Cortez-Escalante31). This was reported to have decreased the incidence of neural tube defects in the USA by 19–32 %(Reference Boulet, Yang and Mai32, Reference Williams, Mai and Mulinare33). This success has led many countries in Africa and Asia, where folate deficiencies are expected to remain high, to plan or implement folic acid fortification.
As previously illustrated, many of these countries have neither adequate intake data nor sufficient information on folate concentrations in their food composition tables. In the absence of such information, the implementation of mandatory folic acid fortification can be risky if excessive amounts are consumed(Reference Patel and Sobczyńska-Malefora34). This is because, unlike the natural form (folate), folic acid must undergo a two-step reduction to tetrahydrofolate by dihydrofolate reductase before it can be used in metabolic processes, and this reduction is not always efficient(Reference Scaglione and Panzavolta35). Moreover, the absorption and biotransformation of folic acid to its active form, 5-methyltetrahydrofolate, becomes saturated when folic acid intake is excessive, leading to the appearance of unmetabolised folic acid in the circulation(Reference Powers36).
As with inadequate folate intake, excess folic acid intake is associated with adverse effects. The prolonged presence of unmetabolised folic acid has been associated with increased risk of cancer and with insulin resistance in children(Reference Krishnaveni, Veena and Karat37, Reference Mason, Dickstein and Jacques38). In addition, excess folic acid intake masks vitamin B12 deficiency and, at very high concentrations, has been associated with hepatotoxicity(Reference Patel and Sobczyńska-Malefora34). Given that high levels of unmetabolised folic acid are increasingly detected in countries with mandatory folic acid fortification(Reference Pfeiffer, Hughes and Lacher39, Reference Pasricha, Drakesmith and Black40), ensuring that interventions are guided by evidence is critical.
Risk of excessive iron intake
A key nutrient that fortification and supplementation programmes often target to prevent or treat anaemia is iron. This is informed by the wealth of information on the health benefits of iron fortification and supplementation, which include reduced adverse pregnancy outcomes, lower risk of anaemia and improved cognitive function(Reference Bhutta, Das and Rizvi7, Reference Gera, Sachdev and Boy41). Consequently, fortification or supplementation with iron has been listed as a proven and efficacious intervention that, if scaled up, could substantially reduce morbidity and mortality(Reference Bhutta, Das and Rizvi7). In line with the global nutrition targets for 2025 that aim to reduce the prevalence of anaemia by 50 % (relative to prevalence figures in 2012), the WHO proposes daily iron supplementation for pregnant women and fortification of wheat flour, maize flour and rice in settings where these are staples(6, 42). Moreover, WHO strongly recommends home fortification of complementary foods with multiple micronutrient powders to improve iron status and reduce anaemia among infants and young children (6–23 months of age)(42).
In Ethiopia, a national food consumption survey was conducted to inform the design and implementation of fortification programmes(43). Although dietary diversity and consumption of animal-source foods were extremely low, iron intake was high. Among women of reproductive age, the prevalence of inadequate intake was about 15 %, whereas excessive intake was estimated at between 60 and 80 %. Similarly, smaller studies from different parts of the country confirm the low prevalence of inadequate intake and iron deficiency, even among children and pregnant women whose physiological demands are high(Reference Baye, Guyot and Icard-Verniere44–Reference Gebreegziabher and Stoecker47). Simulation of fortification with iron, even at lower doses (about 6 mg) than the WHO recommendation (12·5 mg), significantly increased the risk of excessive intake among infants and young children(Reference Abebe, Haki and Baye45).
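The dose-comparison logic behind such a simulation can be sketched as follows. This is an illustrative sketch only: the log-normal distribution of usual iron intakes and the tolerable upper intake level (UL) of 40 mg/d are assumed values for demonstration, not the actual Ethiopian data or reference values; only the 6 and 12·5 mg doses come from the text above.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical usual iron intakes (mg/d) for a population with an
# already-high baseline; the distribution and UL are assumptions
usual_iron = rng.lognormal(mean=np.log(15.0), sigma=0.5, size=10_000)
UL_IRON = 40.0  # illustrative tolerable upper intake level, mg/d

def excess_after_fortification(dose_mg):
    """Share of the population exceeding the UL if a daily dose is added."""
    return np.mean(usual_iron + dose_mg > UL_IRON)

# Compare no fortification, a reduced dose and the WHO-recommended dose
for dose in (0.0, 6.0, 12.5):
    print(f"{dose:>4} mg/d added -> {excess_after_fortification(dose):.1%} above UL")
```

Because the whole intake distribution shifts upwards, even a modest added dose moves the upper tail past the UL when baseline intakes are already high, which is the core of the concern raised for Ethiopia.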
A closer look at the sources of the high iron intake, despite relatively low consumption of iron-fortified or animal-source foods, revealed that much of the iron came from soil contamination of staple cereals during traditional threshing, which takes place under the hooves of cattle(Reference Baye, Mouquet-Rivier and Icard-Vernière48, Reference Guja and Baye49). Although little is known about the bioavailability of this iron, Guja and Baye(Reference Guja and Baye49), using a haemoglobin depletion–repletion rat assay, indicated that extrinsic iron from soil contamination can contribute to haemoglobin regeneration. This is supported by observations from Malawi showing low iron deficiency among rural women whose foods were contaminated by acidic soils(Reference Gibson, Wawer and Fairweather-Tait50). Additional sources of iron, such as wear of metal screws during milling(Reference Icard-Vernière, Hama and Guyot51) and groundwater iron(Reference Rahman, Ahmed and Rahman52, Reference Ahmed, Khan and Shaheen53), can also contribute to high iron intake in LMIC.
There is mounting evidence of the possible adverse effects associated with the provision of iron in excess or to iron-replete individuals. In iron-replete infants and young children, exposure to iron-fortified foods has been associated with decreased growth(Reference Lönnerdal54, Reference Idjradinata, Watkins and Pollitt55), impaired cognitive development(Reference Hare, Arora and Jenkins56, Reference Wessling-Resnick57) and increased diarrhoea, possibly due to a shift of the gut microbiota towards more pathogenic bacteria(Reference Paganini and Zimmermann58, Reference Jaeggi, Kortman and Moretti59). In pregnant women, there is emerging evidence suggesting that high iron status, as reflected by high haemoglobin or serum ferritin values, can be associated with adverse effects including preterm birth, impaired fetal growth and gestational diabetes(Reference Dewey and Oaks60–Reference Tamura, Goldenberg and Johnston65). More research in this area is urgently needed to better understand this emerging evidence, which comes largely from observational studies.
Summary and programmatic implications
Both inadequate and excessive micronutrient intakes may co-exist in LMIC, and both are associated with adverse health outcomes. This calls for better design and implementation of micronutrient interventions to maximise benefits and minimise adverse outcomes. Before implementing micronutrient interventions, the basic principles must be adhered to: documented evidence that the deficiency in question exists and that fortification will correct it. This can be supported by dietary intake assessments and biochemical screening that help diagnose nutrient deficiencies, estimate the gap to be filled and assess the risk of excessive intake. Targeting micronutrient interventions, although programmatically challenging, should be considered whenever possible. To this end, recent advances in point-of-care diagnostics such as the Cornell NutriPhone(Reference Mehta, Colt and Lee66) and the density-based fractionation of erythrocytes(Reference Hennek, Kumar and Wiltschko67) hold promise as minimally invasive screening tools with potential in-field application. Closer monitoring of appropriate fortification of foods, and of any possible risk from overlapping interventions, is also needed.
Conclusion
Many LMIC do not have even one nationally representative food consumption survey; hence, the extent and magnitude of excessive intakes and related adverse effects remain largely unknown. Moreover, even when intake data exist, the proportion of the population exceeding the upper limit is rarely reported. Future research should focus not only on inadequate intake, but also on the risks of excessive intakes. Furthermore, the validity of upper limits for specific age groups, such as infants and young children, is still debated and thus requires more studies. Operational research on how best to target micronutrient interventions, and on whether more bioavailable fortificants provided at lower doses are more effective and safe, is also needed.
Acknowledgements
I would like to thank the ANEC VIII organisers.
Financial Support
None.
Conflict of Interest
None.
Authorship
The author had sole responsibility for all aspects of preparation of this paper.