This study provides a comprehensive overview of the relationship between dietary patterns and serum concentrations of essential and toxic trace elements in an adult cohort living in Galicia (north-western Spain). Using a combination of univariate tests, Random Forest models and logistic regression analysis, we identified specific food–element associations that can be interpreted within broader dietary patterns, helping to contextualize the nutritional impact on the trace element profile of the sample population. In parallel with our approach, recent studies have applied machine learning techniques, particularly Random Forest and XGBoost, to explore complex relationships between diet and micronutrient status in different populations and settings. These include models predicting iron, zinc, and copper deficiencies and anaemia risk in children [31], as well as identifying micronutrient intake patterns associated with undiagnosed hypertension in older adults [32] and also comparing machine learning with traditional regression methods to predict serum folate and vitamin B12 levels in healthy individuals [33]. Together, these examples highlight the growing importance of machine learning in nutritional epidemiology and support its use as a complementary tool alongside traditional dietary approaches for enhancing biomarker-based dietary assessment.
Among the essential elements analyzed, selenium clearly stands out, with the most consistent associations across all analytical approaches. This result is particularly important given that selenium is the trace element with the most evident deficiency in our study population [19]. These findings indicate that circulating selenium is highly responsive to dietary intake and should not be interpreted as evidence of overall nutritional adequacy. Rather, they support the use of serum selenium as a biomarker of selenium exposure and selenium-specific nutritional status, informing the development of targeted dietary strategies in populations lacking fortification policies.
Previous analyses in this population have identified distinct dietary patterns consistent with a traditional Atlantic diet, characterized by higher consumption of fish, legumes and oils, and lower intake of sweets and processed foods [26]. The associations observed in the present study between specific food groups and serum selenium status align well with these previously described patterns, reinforcing the relevance of dietary combinations rather than isolated foods in shaping selenium status.
Fish consumption emerged as the most relevant positive dietary predictor of selenium levels in our study, consistent with its recognized role as a major dietary source of this element [34,35,36]. It is also well established that nuts, particularly Brazil nuts, are rich in selenium [37]. However, in our cohort, a positive trend for nut consumption was observed only in the logistic regression model and did not reach statistical significance. This probably reflects the relatively limited intake of nuts in the dietary patterns of this population, in which fish remains a more prevalent and impactful source of selenium. These findings highlight the need to consider not only the nutrient density of individual foods, but also their actual frequency of consumption and cultural relevance in regional diets. This interpretation is further supported by a recent randomized controlled trial demonstrating that daily intake of Brazil nut butter, providing 55 µg of selenium, significantly improved serum selenium and selenoprotein P concentrations in both vegans and omnivores, performing comparably to a standardized selenium supplement [38].
A positive association between selenium and legume intake was also observed, along with a negative association with oils (Table 1). However, legumes are not considered important dietary sources of selenium [36], suggesting that this association may be indirect. One plausible explanation is the existence of an overall healthier dietary pattern: individuals who consume more fish—which is the main dietary contributor to selenium in this cohort—may also consume more legumes and less oil. To explore this hypothesis, we analyzed the correlations between fish intake and the consumption of other food groups (Fig. 4). A positive correlation between consumption of fish and of legumes was observed, but no significant association between consumption of fish and of oil was found. This suggests that the link between selenium and legumes may reflect a shared dietary pattern, while the inverse association with oils could arise from other mechanisms. In fact, excessive lipid intake has been shown to impair intestinal function and nutrient absorption. High-fat diets may alter the intestinal mucosa, disrupt gut microbiota composition and increase oxidative stress [39, 40], all of which could reduce the bioavailability of micronutrients such as selenium. In addition, recent animal studies have demonstrated that selenium supplementation can restore intestinal barrier function and antioxidant defence under conditions of oxidative stress [41]. These findings reinforce the hypothesis that oil intake may influence selenium status through physiological rather than behavioural mechanisms.
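The shared-dietary-pattern check described above can be sketched as follows. The snippet uses synthetic intake data in place of the cohort's FFQ records; the variable names, distributions, and the induced fish–legume correlation are illustrative assumptions, not the study's actual data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 200

# Hypothetical weekly intake frequencies (servings/week)
fish = rng.poisson(4, n).astype(float)
legumes = 0.5 * fish + rng.normal(0, 1.5, n)  # assumed shared pattern with fish
oils = rng.normal(7, 2, n)                    # assumed independent of fish

rho_leg, p_leg = spearmanr(fish, legumes)
rho_oil, p_oil = spearmanr(fish, oils)
print(f"fish-legumes: rho={rho_leg:.2f}, p={p_leg:.3g}")
print(f"fish-oils:    rho={rho_oil:.2f}, p={p_oil:.3g}")
```

With data structured like this, a significant positive fish–legume coefficient alongside a null fish–oil coefficient would support attributing the legume association to a shared pattern while leaving the oil association to other mechanisms, as argued above.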
In addition to its central role in selenium status, fish consumption also proved to be a key dietary predictor of iodine levels, highlighting its dual contribution to essential trace element intake. This finding is consistent with those of previous studies reporting positive associations between fish consumption and circulating concentrations of both selenium and iodine [42,43,44,45]. We also identified a positive association between iodine status and cereal consumption. While cereals are not direct sources of iodine, this relationship likely reflects the use of iodized salt in the manufacturing of industrial bakery products, particularly bread, which is a staple in the Spanish diet. Some processed cereals and breakfast products may also contain added iodized salt, contributing indirectly to iodine intake. In Spain, the use of iodized salt, which is regulated at approximately 60 mg of iodine per kilogram, has been promoted as a public health strategy to ensure adequate iodine intake [46, 47]. Similar strategies have proven successful in other high-income countries, including Denmark, the Netherlands, and Australia, where mandatory or voluntary fortification of bread with iodized salt has contributed substantially to improving population iodine status [48, 49]. Recent evidence from Germany likewise confirms that use of iodized salt remains the strongest determinant of iodine status at the population level [48,49,50]. These findings underscore how food processing practices and policy-driven interventions can significantly influence micronutrient intake, complementing individual dietary choices.
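The regulated fortification level of roughly 60 mg of iodine per kilogram of salt implies a non-trivial iodine contribution from bread. A back-of-envelope calculation, assuming a typical salt content of about 1.5 g per 100 g of bread (an assumed value, not taken from the study):

```python
# Back-of-envelope estimate of iodine from iodized salt in bread.
IODINE_PER_KG_SALT_MG = 60    # regulated Spanish fortification level (see text)
SALT_G_PER_100G_BREAD = 1.5   # assumed typical salt content of bread
BREAD_SERVING_G = 100

salt_g = SALT_G_PER_100G_BREAD * BREAD_SERVING_G / 100
iodine_ug = salt_g / 1000 * IODINE_PER_KG_SALT_MG * 1000  # mg -> ug
print(f"~{iodine_ug:.0f} ug iodine per {BREAD_SERVING_G} g bread serving")
# ~90 ug iodine per 100 g bread serving
```

Under these assumptions, a 100 g serving of bread made entirely with iodized salt would supply about 90 µg of iodine, more than half the 150 µg/day adult reference intake, which illustrates why bread can dominate population iodine status where fortification is in place.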
For the remaining essential elements, i.e. iron, zinc, copper and manganese, no consistent dietary associations were identified. It should be acknowledged that serum concentrations of zinc, copper and manganese are not considered optimal biomarkers of habitual dietary intake in well-nourished populations, due to tight homeostatic regulation [2]. As a result, circulating levels of these elements tend to remain within relatively narrow ranges and are often insensitive to moderate variations in dietary intake. While some weak or inconsistent associations were observed in the univariate analyses, none of the remaining essential elements showed strong or consistent predictive patterns in the multivariate models. Importantly, serum concentrations of these elements were within normal ranges for all individuals in the cohort, indicating adequate nutritional status across the population. This probably reflects the efficiency of homeostatic mechanisms, particularly intestinal regulation of absorption and excretion, which help maintain serum mineral levels when dietary intake is close to dietary reference values. For instance, zinc homeostasis involves the dynamic regulation of intestinal transporters (ZIP4, ZNT1) and endogenous faecal excretion, making serum zinc relatively insensitive to moderate variability in intake in well-nourished individuals [51]. Similarly, iron absorption is tightly controlled by enterocyte transport mechanisms and hepcidin-dependent signalling pathways that adapt dynamically to iron stores in the body [52]. However, in settings where dietary intake is chronically insufficient, such as in low-income populations with low consumption of animal-source foods, these regulatory mechanisms may no longer compensate for low nutrient intake, and biochemical markers then more clearly reflect underlying deficiencies.
This has been particularly well documented for iron and zinc, as serum levels of these minerals tend to decline in populations with limited access to meat, fish, or dairy, leading to common outcomes such as anaemia, stunted growth and increased infection risk [53]. These examples highlight how the absence of strong serum–diet associations in well-nourished populations may reflect effective homeostasis, whereas in deficient contexts, dietary patterns become more directly evident in biomarker levels.
Regarding toxic elements, the most consistent associations were observed for mercury and arsenic. In both cases, fish consumption was positively associated with serum levels, in line with numerous previous studies identifying fish, particularly large and long-lived species, as primary sources of exposure to these metals in humans [54,55,56,57]. The fact that fish was also a key contributor to optimal selenium and iodine status in our cohort underscores its complex nutritional role as a source of both essential and potentially toxic trace elements. This dual contribution highlights the need to balance the benefits and risks associated with fish intake in dietary recommendations. In the study population, overall exposure to arsenic and mercury remained within the normal reference intervals for non-exposed healthy individuals and did not reach levels associated with adverse health outcomes, even among older adults in whom some bioaccumulation was observed [19]. Importantly, the benefits of fish consumption can be further maximized by prioritizing small and medium-sized species, which have lower levels of mercury and arsenic due to their shorter lifespan and lower trophic level [58, 59]. This dietary approach enables promotion of fish intake as part of public health strategies without compromising toxicological safety.
It should be noted that blood concentrations of certain toxic elements primarily reflect recent exposure rather than long-term intake. In the case of cadmium, circulating levels are considered markers of relatively recent exposure, whereas cumulative or long-term exposure is more accurately assessed using urinary cadmium concentrations [60]. This limitation is even more pronounced for arsenic, for which urinary arsenic is widely recognized as the most appropriate biomarker of chronic exposure [61]. These matrix-specific considerations may partly explain the modest or inconsistent associations observed between dietary intake estimates derived from FFQs and circulating levels of some toxic elements.
In addition to fish, arsenic levels were positively associated with wine and fruit consumption (Table 1). The association with wine may be explained by the historical use of arsenic-based fungicides in vineyards, leading to residual contamination in soils that can persist for decades and be transferred to the final product through the soil–vine–wine pathway. Although arsenic concentrations in wine are generally low and below legal thresholds, previous studies conducted in wine-producing regions have documented measurable levels of arsenic in grapes and wine, especially in soils with a legacy of contamination [62, 63]. By contrast, the observed positive association between arsenic and fruit consumption was unexpected and should be interpreted with caution. Given the absence of recent evidence identifying fresh fruit as a major contributor to total arsenic exposure in European diets, this finding may instead reflect broader dietary patterns, such as increased fruit consumption among individuals with higher fish intake (Fig. 4).
Several noteworthy negative associations were also observed in our study, particularly between cereal consumption and the levels of arsenic, nickel, chromium and copper. One plausible explanation involves the high content of phytic acid (or phytates) in whole grains, which can chelate divalent and trivalent metal ions, such as As³⁺, Ni²⁺, Cr³⁺ and Cu²⁺, in the gastrointestinal tract, thereby reducing their intestinal absorption. These metals share physicochemical properties such as positive charge and coordination capacity, which facilitate their interaction with the negatively charged phosphate groups of phytic acid. This mechanism is well documented for essential elements such as iron and zinc, particularly in plant-based diets including low consumption of animal-source foods [64]. Emerging evidence further suggests that phytates may also bind toxic metals such as cadmium, lead, chromium and nickel [65]. Although direct human data are limited for some of these toxic elements, our findings support the hypothesis that cereal-rich diets could influence the bioavailability of such elements. This warrants further investigation to determine whether such dietary patterns may offer a protective effect against heavy metal absorption, particularly in settings with environmental exposure.
Other important negative associations included the inverse relationship between dairy intake and arsenic levels, probably mediated by the competitive effect of calcium on intestinal absorption. Recent in vitro digestion studies show that calcium can reduce the bioavailability of arsenic in simulated gastrointestinal conditions, lowering the availability of As(III) and As(V) by approximately 40–70%, depending on the digestive phase (gastric or intestinal) [66]. Likewise, a negative association between wine consumption and serum zinc levels was found. This finding is consistent with evidence from clinical studies showing that chronic alcohol intake can impair intestinal integrity and reduce zinc absorption, leading to lower serum concentrations in individuals with alcohol use disorder [67], as also reported in broader clinical assessments of trace element imbalances in chronic alcohol consumers [68]. Although a similar trend was observed for molybdenum, further research is needed to clarify the mechanisms involved and determine whether this reflects a true malabsorptive effect or broader dietary patterns.
From a multivariate perspective, the Random Forest analysis using all food groups as predictors across the full panel of trace elements suggested that the most influential dietary contributors to the overall mineral profile were fish, dairy products, fruit and vegetables. This finding aligns with recent global studies highlighting these food groups as some of the most micronutrient-dense components of the human diet. In particular, non-starchy vegetables, seafood, dairy and organ meats have consistently been ranked among the top sources of multiple vitamins and minerals, including selenium, iodine, zinc and iron [9, 69, 70]. In parallel, other authors have emphasized the importance of nutrient-rich foods using standardized indices, such as the Nutrient Rich Foods (NRF) score, which systematically classifies foods on the basis of their contribution to essential nutrient intake [71]. By contrast, less frequently consumed or lower-density foods, such as legumes, eggs, nuts, sweets and fats, exhibited limited explanatory power for the overall mineral status, with some exceptions like selenium in nuts. These findings further highlight the importance of prioritizing inherently nutrient-dense foods in population diets to address both deficiencies and imbalances in micronutrient profiles.
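The Random Forest importance ranking can be illustrated with a minimal sketch on synthetic data. The food groups, effect sizes, and the scikit-learn configuration below are illustrative assumptions for demonstration, not the study's actual model or data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 300
food_groups = ["fish", "dairy", "fruit", "vegetables", "legumes", "eggs", "nuts"]

# Hypothetical intake frequencies (servings/week) for each food group
X = rng.poisson(3, size=(n, len(food_groups))).astype(float)

# Synthetic serum selenium driven mainly by fish (assumption mirroring the text)
selenium = 60 + 8 * X[:, 0] + 2 * X[:, 2] + rng.normal(0, 5, n)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X, selenium)

# Rank food groups by impurity-based importance
ranking = sorted(zip(food_groups, rf.feature_importances_), key=lambda t: -t[1])
for name, imp in ranking:
    print(f"{name:10s} {imp:.3f}")
```

Repeating this fit for each element in the serum panel and aggregating the importance rankings would yield the kind of cross-element contributor profile described above; impurity-based importances are a convenient first pass, though permutation importance is often preferred when predictors are correlated.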
Among the trace elements analyzed, selenium was the only one for which a logistic regression model was constructed, given its high prevalence of inadequacy in this cohort and its public health relevance [19]. In contrast to other essential elements, for which serum levels were consistently within the adequate range across the population, selenium exhibited sufficient clinical variability to justify a predictive model based on dietary intake. This model identified regular fish consumption as the most consistent dietary predictor of adequate selenium status, reinforcing the well-established role of seafood as a key source of this micronutrient. Notably, individuals with sufficient serum selenium levels reported an average fish intake equivalent to approximately four servings per week, suggesting a realistic dietary threshold that could guide future nutritional recommendations.
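A logistic model of this kind can be sketched as follows. The simulated intake distributions, the effect sizes, and the encoding of the ~4 servings/week threshold are illustrative assumptions, not the study's fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300

# Hypothetical weekly intakes (servings/week)
fish_servings = rng.poisson(3.5, n).astype(float)
nuts_servings = rng.poisson(1.0, n).astype(float)

# Synthetic adequacy indicator: odds of adequate serum selenium rise with
# fish intake, centred on an assumed ~4 servings/week threshold
logit = 1.2 * (fish_servings - 4) + 0.3 * nuts_servings
adequate = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([fish_servings, nuts_servings])
model = LogisticRegression().fit(X, adequate)
or_fish, or_nuts = np.exp(model.coef_[0])
print(f"OR per extra fish serving/week: {or_fish:.2f}")
print(f"OR per extra nut serving/week:  {or_nuts:.2f}")
```

Exponentiated coefficients read directly as odds ratios per additional weekly serving, which is the natural scale for translating such a model into food-based recommendations.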
While fish was the most consistent contributor, a positive trend was also observed for nut consumption, although statistical significance was not retained in the logistic model. Given the low habitual intake of nuts in this population, this signal (despite the high variability in dietary data) should not be dismissed. Nuts, particularly Brazil nuts, are known to contain high concentrations of selenium, and promotion of nut consumption could serve as a complementary strategy in selenium-deficient groups.
Interestingly, the model also revealed a non-significant but consistent negative association between cereal consumption and selenium status. This trend echoes the inverse associations observed for other divalent and trivalent metal ions (e.g. arsenic, nickel, copper, chromium) and may reflect the inhibitory effect of phytates present in whole grains on mineral absorption. Likewise, a negative trend with intake of fat, particularly oil, aligns with previous hypotheses suggesting that high-fat diets may impair selenium absorption by altering intestinal mucosa or oxidative balance. Although these secondary findings require cautious interpretation, they open interesting avenues for further research and suggest that, in addition to promoting selenium-rich foods, moderation of certain dietary components, such as refined cereals and unhealthy fats, could contribute to optimizing selenium bioavailability.
This study has several strengths that contribute to its scientific and practical value. First, it is based on a well-characterized adult cohort with detailed dietary data and serum trace element concentrations, enabling an integrative analysis of nutritional biomarkers in a real-life population setting. The combination of traditional statistical methods, machine learning techniques and logistic regression models provided complementary perspectives and strengthened the robustness of the associations observed. In particular, the consistent findings regarding selenium reinforce the value of this approach for identifying food-based predictors of micronutrient status. Furthermore, the inclusion of multiple trace elements (both essential and toxic) offers a comprehensive overview of the dietary influences on mineral balance, beyond single-nutrient analyses.
Nonetheless, certain limitations must be acknowledged. The cross-sectional design limits the ability to infer causality, as observed associations may be influenced by reverse causation or residual confounding factors that cannot be fully accounted for in observational studies [72]. Dietary intake was assessed through self-reported food frequency questionnaires, which are inherently subject to recall bias, misreporting and measurement error, particularly for foods consumed irregularly or seasonally [73]. Such uncertainty may attenuate true associations between habitual intake and biomarker concentrations.
In addition, serum concentrations reflect relatively recent exposure rather than long-term nutritional status for several trace elements, and they may be influenced by physiological, metabolic or inflammatory factors unrelated to diet [2, 21]. This limitation is particularly relevant for toxic elements such as cadmium and arsenic, for which blood concentrations primarily reflect recent exposure, while cumulative exposure is more accurately assessed using urinary biomarkers. The lack of arsenic speciation further limits the ability to distinguish between organic and inorganic forms and to identify specific dietary exposure sources. Finally, the relatively low frequency of intake of certain selenium-rich foods (e.g. nuts) in this population may have reduced the statistical power to detect associations with smaller effect sizes, especially in the context of high interindividual variability in dietary reporting.