Assessing Human Iron Kinetics Using Stable Iron Isotopic Techniques

There are four primary approaches for utilizing stable iron isotopes to evaluate iron kinetics in humans: (i) the fecal recovery method; (ii) the plasma appearance method; (iii) the erythrocyte iron incorporation method; and (iv) the isotope dilution method.

4.1 Fecal Isotope Recovery

Iron absorption can be estimated by measuring the difference between the amount of oral stable isotope ingested and the amount excreted in feces. Typically, complete collections of feces are obtained over a period of 7–10 days following oral isotope dosing, assuming that the majority of non-absorbed isotope in the intestine will have been excreted during this period [20]. The approach applies the principles of a chemical balance method but with enhanced accuracy because the amount of stable iron isotope present in feces can be directly traced back to the labeled supplement, distinguishing it from endogenous iron lost in stool from shed intestinal cells. Fecal monitoring can be used to measure iron absorption because there is no appreciable fecal loss of iron once absorbed and negligible urinary iron excretion.

The equations used to calculate absorption are as follows:

(i)

\(\text{isotope absorbed (mg)} = \text{isotope administered (mg)} - \text{isotope recovered in feces (mg)};\)

(ii)

\(\text{fractional absorption} = \text{isotope absorbed (mg)}/\text{isotope administered (mg)};\)

(iii)

\(\text{percentage absorption} = \text{fractional absorption} \times 100.\)

The method can also be used to estimate the total absorption of non-heme iron from the diet, assuming that fractional absorption is comparable between the added tracer in the test meal and dietary non-heme iron. On the day of oral dose administration, total dietary intake is assessed through complete collections of duplicate meals or weighed diets. Then, the following equation is used:

(i)

\(\text{total non-heme iron absorbed (mg)} = \text{fractional absorption} \times \text{total dietary non-heme iron intake (mg)}.\)
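As a minimal illustration of these calculations, the steps can be combined as in the following Python sketch; the function name and numerical values are hypothetical and do not come from any of the cited studies.

```python
def fecal_recovery_absorption(isotope_administered_mg, isotope_in_feces_mg,
                              dietary_nonheme_iron_mg=None):
    """Fecal recovery calculations: absorbed isotope, fractional and percentage
    absorption, and (optionally) total absorbed dietary non-heme iron, assuming
    the tracer and dietary non-heme iron share the same fractional absorption."""
    absorbed_mg = isotope_administered_mg - isotope_in_feces_mg
    fractional = absorbed_mg / isotope_administered_mg
    result = {
        "isotope_absorbed_mg": absorbed_mg,
        "fractional_absorption": fractional,
        "percentage_absorption": fractional * 100,
    }
    if dietary_nonheme_iron_mg is not None:
        result["total_nonheme_iron_absorbed_mg"] = fractional * dietary_nonheme_iron_mg
    return result

# Example: 4 mg of oral tracer given, 3 mg recovered in feces over 7-10 days,
# and 12 mg of dietary non-heme iron consumed on the dosing day.
print(fecal_recovery_absorption(4.0, 3.0, 12.0))
# {'isotope_absorbed_mg': 1.0, 'fractional_absorption': 0.25,
#  'percentage_absorption': 25.0, 'total_nonheme_iron_absorbed_mg': 3.0}
```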

Fecal recovery of orally administered stable iron isotopes has been compared to the erythrocyte iron incorporation method in infants [21] and has been used in studies on iron absorption in infants and adults [22,23,24]. Fairweather-Tait et al. [25] performed a study in newborn infants (n = 36) to measure iron absorption from 58Fe-labeled lactoferrin compared with 58Fe-labeled ferric citrate, using a fecal recovery period of 3 days after an oral dose. However, a fecal marker was not administered, and fecal collections might not have been complete for all infants. There was considerable variation in isotope recovery, and no significant difference in iron absorption was observed between the groups.

An advantage of the fecal recovery method is that it is non-invasive, as it does not require a blood sample. Additionally, the tracer dose required may be lower than that needed for other isotopic techniques. However, it has significant drawbacks. It is labor intensive and time consuming for subjects and researchers. Dietary intake data are often imprecise. Subject compliance is crucial, as even minor losses of excreted isotope can lead to overestimations of absorbed iron. Short collection periods lead to an overestimate of absorption due to incomplete recovery of the unabsorbed isotope. Collection periods need to be of sufficient duration to ensure the complete fecal excretion of isotopic iron trapped in the gut mucosa but not absorbed into the circulation; otherwise, the method only measures the luminal disappearance of iron. Contamination of stool collection containers with environmental iron can reduce the fecal isotopic enrichment, leading to an overestimation of net absorption. Given that daily dietary iron intake amounts to only milligrams and the fraction of iron absorbed from the gut is usually low, these small discrepancies can sharply bias results. Because of these challenges and limitations, this approach is now seldom used to assess iron absorption.

4.2 Plasma Stable Isotope Appearance

Stable isotope appearance curves after an oral tracer dose can be measured in whole blood, plasma, or serum to detect the appearance of small amounts of absorbed labeled iron with high precision. This method was originally developed using iron radioisotopes [26]. It can differentiate absorbed iron from circulating body iron, even with doses as low as 5 mg [27]. This provides an advantage over total serum (non-isotopic) iron curves, which are typically informative only for supplemental oral doses exceeding 25 mg of iron [6]. This is because a pharmacologic dose of iron is required to elicit a serum iron response that is significantly different from basal intra-individual and diurnal variations [28]. Stable isotope appearance curves have been validated as a reliable measure of iron absorption [17]. These curves can assess the rate, quantity, and pattern of absorption, as well as the clearance rate of absorbed iron. In a study by Troesch et al. [29], iron from ferrous sulfate was absorbed rapidly and cleared from the serum at a rate comparable to earlier studies estimating the clearance of radioactive iron isotopes from plasma [26]. While stable isotope appearance curves can be measured in whole blood [27], measurements in serum or plasma may enhance sensitivity and resolution [29].

In this approach, a baseline blood sample is collected after an overnight fast before tracer consumption, followed by samples at regular intervals (e.g., at 15 and 30 min and at 1, 1.5, 2, 3, 4, 6, 8, 12, and 24 h post-dose) [30]. To minimize postprandial serum iron fluctuations, subjects may receive standardized low-iron vegetarian meals on the day of the study [30]. The maximum (peak) plasma iron concentration and the time to reach maximal plasma concentration can be estimated from data fitted to a one-compartment model assuming first-order absorption [30]. The area under the curve (AUC) can be calculated using the trapezoidal rule [30]. A final blood sample can be collected 14 days later to determine the erythrocyte isotopic composition for calculating overall iron bioavailability from the test dose (as described in the next section).
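The AUC can be computed directly from the sampled concentrations. The following Python sketch applies the trapezoidal rule to a hypothetical sampling schedule; the time points and concentrations are illustrative only, not data from any cited study.

```python
import numpy as np

# Hypothetical sampling schedule (h) and serum tracer concentrations (µg/L);
# the numbers are illustrative only.
time_h = np.array([0, 0.25, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 12, 24])
conc = np.array([0.0, 1.2, 2.8, 4.5, 5.1, 4.8, 3.9, 3.0, 1.9, 1.2, 0.6, 0.1])

# Area under the concentration-time curve (AUC) by the trapezoidal rule
auc = float(np.sum(np.diff(time_h) * (conc[:-1] + conc[1:]) / 2.0))

# Model-free summaries: observed peak concentration and time of peak
c_max = float(conc.max())
t_max = float(time_h[conc.argmax()])

print(f"AUC = {auc:.1f} µg*h/L, Cmax = {c_max:.1f} µg/L at t = {t_max} h")
```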

An early study using plasma appearance of a stable isotope to measure iron absorption was done in 9 healthy UK women studied serially during pregnancy [31]. Isotope ratios in serum were measured following the oral administration of 5 mg of 54FeSO4 and the intravenous injection of 200 μg of 57FeSO4. The authors calculated the AUC for enrichment in the iron isotope ratios, rather than concentrations, for both the oral and intravenous labels. Fractional iron absorption was then calculated using the ratio of the two AUCs. The study reported a significant six-fold increase in iron absorption over a normal pregnancy.
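One plausible way to combine the two AUCs is sketched below, under the assumption that fractional absorption is obtained by comparing the dose-normalized oral enrichment AUC with the dose-normalized intravenous enrichment AUC (the intravenous tracer being, by definition, fully available to the plasma pool). This is a hedged illustration, not the authors' exact computation, and the AUC values are hypothetical.

```python
# Illustrative sketch only: compare dose-normalized enrichment AUCs for the
# oral and intravenous labels to approximate fractional absorption.
auc_oral = 2.0      # AUC of oral 54Fe enrichment (arbitrary units x h), hypothetical
auc_iv = 4.0        # AUC of intravenous 57Fe enrichment (arbitrary units x h), hypothetical
dose_oral_mg = 5.0  # 5 mg 54Fe as FeSO4, oral (study design)
dose_iv_mg = 0.2    # 200 µg 57Fe as FeSO4, intravenous (study design)

fractional_absorption = (auc_oral / dose_oral_mg) / (auc_iv / dose_iv_mg)
print(f"Fractional absorption ~ {fractional_absorption:.1%}")  # ~2.0%
```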

The absorption profile of iron fortificants may determine their ability to generate non-transferrin-bound iron and, consequently, their potential safety. Ferrous iron is typically absorbed more rapidly than ferric iron, but differences at the fortification level cannot be distinguished with non-isotopically labeled serum iron curves. Using stable isotope appearance curves measured over 8 hours after oral dosing in a crossover design, Troesch et al. [29] measured iron absorption profiles in 16 healthy Swiss women from 6 mg of iron as FeSO4 with ascorbic acid and from 6 mg as NaFeEDTA, as well as the non-transferrin-bound iron response following the meals. Iron from FeSO4 was more rapidly absorbed, resulting in a 35% greater relative AUC during the first 2 hours compared with NaFeEDTA, but neither compound increased non-transferrin-bound iron. This was the first study to demonstrate that stable isotope appearance curves are a useful tool for comparing iron absorption profiles from different iron compounds in fortified foods.

Husmann et al. [30] measured serum stable iron isotope appearance to investigate the mechanism by which prebiotics increase iron absorption [32]. In a crossover design, iron-depleted women (n = 11) received two 14-mg iron doses as labeled (57Fe, 58Fe) ferrous fumarate 14 days apart, with and without prebiotics, in random order. Multiple blood samples were collected over 24 hours to characterize plasma appearance, and a final sample was collected 14 days later to determine fractional iron absorption (Fig. 2). The appearance data were fitted using non-linear mixed-effects modeling to a one-compartment model with first-order absorption, and the AUC and the time to reach maximal serum isotope concentration were calculated. The absorption kinetics suggested that prebiotics did not influence gastric emptying but were consistent with enhancement of iron absorption in the proximal, rather than the distal, gut.

Fig. 2

Serum stable iron isotope appearance curves after administration of two 14-mg iron doses of ferrous fumarate (FeFum) to healthy Swiss women (n = 11). One dose was labeled with 57Fe and administered alone and one was labeled with 58Fe and administered with 15 g of prebiotic galacto-oligosaccharides (GOS). The participants were followed for 24 hours (h). Data are presented as means and standard errors. From reference [30] with permission
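A simplified version of the model fit described above can be sketched as follows: an ordinary per-curve least-squares fit of a one-compartment, first-order-absorption (Bateman-type) model, rather than the non-linear mixed-effects approach used in the study. The function name, data, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def one_compartment_oral(t, a, ka, ke):
    """One-compartment model with first-order absorption (Bateman-type curve).
    The parameter a lumps the bioavailable dose and apparent distribution volume."""
    return a * ka / (ka - ke) * (np.exp(-ke * t) - np.exp(-ka * t))

# Hypothetical serum tracer concentrations (µg/L) over 24 h; not study data.
t = np.array([0.25, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 12, 24])
c = np.array([1.1, 2.6, 4.3, 5.0, 4.9, 4.0, 3.1, 2.0, 1.3, 0.6, 0.1])

params, _ = curve_fit(one_compartment_oral, t, c, p0=[6.0, 1.5, 0.2])
a, ka, ke = params

t_max = np.log(ka / ke) / (ka - ke)          # time of peak for this model
c_max = one_compartment_oral(t_max, *params)
auc_model = a / ke                           # model-predicted AUC from 0 to infinity

print(f"ka = {ka:.2f}/h, ke = {ke:.2f}/h, Tmax = {t_max:.1f} h, "
      f"Cmax = {c_max:.1f} µg/L, AUC = {auc_model:.1f} µg*h/L")
```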

The plasma isotope appearance method has limitations, including the burden on participants owing to the need for frequent blood draws, as well as the time and cost of numerous isotopic analyses. The technique assumes there are no marked differences in blood volume between individuals in the population being evaluated. An advantage of this approach is that it does not rely on assumptions about isotopic incorporation into erythrocytes or estimates of hemoglobin iron content.

4.3 Erythrocyte Iron Incorporation Method

The erythrocyte iron incorporation method assesses iron bioavailability by measuring the incorporation of stable isotopes into erythrocytes 14 days after administration (Fig. 3). This method evaluates not only the absorption of the iron dose but also its transport to the bone marrow and utilization by developing erythrocytes (the two components of iron bioavailability). Concurrent administration of an intravenous tracer allows discrete measurement of iron utilization versus absorption [33] as discussed below.

Fig. 3

Overview of two stable iron isotope methods: erythrocyte iron incorporation and isotope dilution. The erythrocyte iron incorporation method assesses iron bioavailability by measuring both the absorption of an orally administered stable isotope and its utilization to form new erythrocytes. Erythrocyte incorporation can be directly measured by giving a second tracer intravenously at the same time as the oral label, or incorporation can be assumed to be 80% (in healthy adults). Incorporation is complete by 10–12 days post-dosing. When the labeled erythrocytes reach the end of their lifespan, they will be broken down and the tracer will enter the circulating body iron pool, and gradually distribute through all tissues. After 12 months in adults, all body iron will be uniformly enriched with the tracer. Thereafter, the isotope dilution method can be used to assess iron absorption (and loss). Iron of natural composition acts as a tracer diluting the ad hoc modified isotopic signature in body iron. The concentration of the isotope tracer can only decrease owing to the influx of iron with a natural isotopic composition. The rate of decrease in the tracer concentration in circulation, expressed as the slope of tracer concentration over time, is proportional to iron absorption

Following absorption of an oral dose, most iron is incorporated into erythrocytes within 10–12 days [14]. It is unclear whether this process may take longer in infants. An early study suggested complete iron incorporation into erythrocytes in infants required 28 days [34], but a recent study reported it is achieved within 14 days [35]. Once incorporated, isotopic enrichment remains stable throughout the lifespan of the erythrocyte (~120 days), making it a reliable indicator of iron absorption following administration of single oral doses, multiple oral doses, or combined oral and intravenous iron isotopes.

The erythrocyte iron incorporation method relies on several assumptions, many of which were validated in early studies using iron radioisotopes [36, 37]. These assumptions include: (i) a constant fraction of absorbed iron is incorporated into erythrocytes; (ii) hemoglobin in erythrocytes contains a constant concentration of iron; and (iii) once absorbed iron has been incorporated into erythrocytes, it remains within the erythrocyte and does not exchange with iron in plasma. However, recent data suggest that erythrocytes can release small amounts of iron into the plasma [38], indicating that this final assumption may not be entirely true.

Studies employing the erythrocyte iron incorporation method often utilize a study design where two or three oral iron isotopes are administered simultaneously to compare different conditions within the same subject [19]. This approach reduces the potential bias from estimating blood volume and the fraction of absorbed iron incorporated into erythrocytes (discussed below), as the primary focus is typically on comparing the relative bioavailability between different test conditions. If achieving sufficient enrichment requires a large dose of isotope, or if native iron intake is low, the isotope dose may be spread over several doses [39] or meals [40]. Single meal studies are useful for indicating whether dietary compounds inhibit or enhance iron absorption, but they may overestimate the effects of these compounds [41]. This is because these studies typically focus on the effects of a single enhancer (e.g., ascorbic acid) or a single inhibitor (e.g., phytic acid) on iron absorption, rather than the effects of mixed meals [42, 43]. Additionally, these studies may not accurately reflect long-term iron absorption from fortified foods or supplements, as they do not account for adaptive changes in absorption that occur with an increased iron intake. Comparative studies indicate that the effect of enhancers and inhibitors on non-heme iron absorption in single meals may be approximately twice as strong as in whole diets consumed over several weeks [44, 45]. Two weeks after the final administration of the isotopes, a blood sample is collected to measure the relative abundance of each isotope in erythrocytes. Most studies employ the two lowest abundance isotopes, 58Fe and 57Fe, but 54Fe can also be used if a third isotope is needed [19].

4.3.1 Calculation of the Circulating Iron Pool

Analyzing the collected isotopic data involves several steps. The initial calculation estimates the size of the circulating iron pool, using the formula:

$$\text{Circulating iron (mg)} = \text{blood volume (L)} \times \text{hemoglobin concentration (g/L)} \times 3.47\,(\text{mg iron/g hemoglobin}).$$
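In code, this step reduces to a single multiplication; the blood volume and hemoglobin values in the example below are hypothetical.

```python
def circulating_iron_mg(blood_volume_l, hemoglobin_g_per_l):
    """Circulating iron (mg) = blood volume (L) x Hb (g/L) x 3.47 mg Fe per g Hb."""
    return blood_volume_l * hemoglobin_g_per_l * 3.47

# Hypothetical adult with a 4.5-L blood volume and a hemoglobin of 135 g/L
print(circulating_iron_mg(4.5, 135.0))  # ~2108 mg of circulating iron
```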

Blood volume is influenced by multiple factors including body size and composition, sex and age, certain diseases, and pregnancy. It can be measured directly or estimated using established formulas based on weight and/or height. Direct assessment methods include the carbon monoxide re-breathing method [46], intravenous dosing with radioisotope-tagged erythrocytes or albumin, or the use of dyes such as indocyanine green [47]. However, these direct methods are costly and require specialized resources, limiting their use in some research settings. Therefore, in most studies, blood volume estimates are derived from equations considering combinations of body weight, height, surface area, lean body mass, and age. Estimations of blood volume specific to infants, children, and healthy non-pregnant adults have been published [48,49,50,51,52,53]. It should be noted that the normative groups used to develop equations for estimating blood volume may be limited and therefore might not be applicable to all populations. Typically, to estimate blood volume, studies in infants use 80–85 mL/kg body weight [54], studies in children use 65 mL/kg [55], and studies in adults use normative equations based on both height and weight, for women [52] and men [53].

Adult blood volume for women can be estimated using the formula of reference [52]:

$$\text{Blood volume (mL)} = \left( c_{\mathrm{H}} \times \text{height} \right) + \left( c_{\mathrm{W}} \times \text{weight} \right) - 1369.$$

For men, adult blood volume is estimated using the formula of reference [53]:

$$\text{Blood volume (mL)} = \left( c_{\mathrm{H}} \times \text{height} \right) + \left( c_{\mathrm{W}} \times \text{weight} \right) - 2820,$$

where \(c_{\mathrm{H}}\) and \(c_{\mathrm{W}}\) are the sex-specific height and weight coefficients reported in the respective references.
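The blood volume estimates above can be wrapped in simple helper functions. Because the height and weight coefficients of the adult equations are given in the original references, the sketch below leaves them as parameters; the per-kilogram values for infants and children are those quoted above, and the example numbers are hypothetical.

```python
def blood_volume_infant_ml(weight_kg, ml_per_kg=80.0):
    """Infant blood volume; studies typically assume 80-85 mL/kg [54]."""
    return weight_kg * ml_per_kg

def blood_volume_child_ml(weight_kg):
    """Child blood volume; studies typically assume 65 mL/kg [55]."""
    return weight_kg * 65.0

def blood_volume_adult_ml(height_cm, weight_kg, coef_height, coef_weight, intercept_ml):
    """Adult blood volume from a sex-specific linear equation of the form
    BV (mL) = coef_height*height + coef_weight*weight - intercept_ml,
    with intercepts of 1369 mL for women [52] and 2820 mL for men [53];
    the height and weight coefficients must be taken from those references."""
    return coef_height * height_cm + coef_weight * weight_kg - intercept_ml

# Example: a 9-kg infant and a 25-kg child (illustrative values)
print(blood_volume_infant_ml(9.0))   # 720 mL
print(blood_volume_child_ml(25.0))   # 1625 mL
```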

4.3.2 Calculation of Erythrocyte Incorporation and Fractional Iron Absorption

After calculating circulating iron, the total amounts of stable isotope tracers incorporated into erythrocytes can be assessed. While radioiron concentrations can be measured directly using techniques such as Geiger counting or liquid scintillation counting, stable isotope concentrations are determined from the isotopic composition of a whole blood sample measured by mass spectrometry (inductively coupled plasma mass spectrometry or thermal ionization mass spectrometry).

In a study using the stable isotopes 57Fe (A) and 58Fe (B) as tracers, the measured isotopic ratios R57/56 and R58/56 can be expressed as follows, considering that the tracers are not purely mono-isotopic [56]:

$$R_{57/56} = \frac{a_{\mathrm{nat}}^{57} \times n_{\mathrm{nat}} + a_{A}^{57} \times n_{A} + a_{B}^{57} \times n_{B}}{a_{\mathrm{nat}}^{56} \times n_{\mathrm{nat}} + a_{A}^{56} \times n_{A} + a_{B}^{56} \times n_{B}},$$

$$R_{58/56} = \frac{a_{\mathrm{nat}}^{58} \times n_{\mathrm{nat}} + a_{A}^{58} \times n_{A} + a_{B}^{58} \times n_{B}}{a_{\mathrm{nat}}^{56} \times n_{\mathrm{nat}} + a_{A}^{56} \times n_{A} + a_{B}^{56} \times n_{B}},$$

where \(a_{\mathrm{nat}}^{i}\) represents the abundance of isotope i in natural iron, \(a_{j}^{i}\) represents the abundance of isotope i in tracer j (A or B, respectively), \(n_{\mathrm{nat}}\) is the amount of circulating natural iron in mmol, and \(n_{A}\) and \(n_{B}\) are the amounts of circulating tracers in mmol.

Circulating Fe, \(n_{\mathrm{tot}}\), in mmol is the sum of \(n_{\mathrm{nat}}\), \(n_{A}\), and \(n_{B}\), and is equal to circulating Fe in mg divided by the atomic mass of Fe (55.845 g/mol).

The equations can be arranged in the following form:

$$\left(R_{57/56} \times a_{\mathrm{nat}}^{56} - a_{\mathrm{nat}}^{57}\right)\frac{n_{\mathrm{nat}}}{n_{\mathrm{tot}}} + \left(R_{57/56} \times a_{A}^{56} - a_{A}^{57}\right)\frac{n_{A}}{n_{\mathrm{tot}}} + \left(R_{57/56} \times a_{B}^{56} - a_{B}^{57}\right)\frac{n_{B}}{n_{\mathrm{tot}}} = 0,$$

$$\left(R_{58/56} \times a_{\mathrm{nat}}^{56} - a_{\mathrm{nat}}^{58}\right)\frac{n_{\mathrm{nat}}}{n_{\mathrm{tot}}} + \left(R_{58/56} \times a_{A}^{56} - a_{A}^{58}\right)\frac{n_{A}}{n_{\mathrm{tot}}} + \left(R_{58/56} \times a_{B}^{56} - a_{B}^{58}\right)\frac{n_{B}}{n_{\mathrm{tot}}} = 0,$$

$$\frac{n_{\mathrm{nat}}}{n_{\mathrm{tot}}} + \frac{n_{A}}{n_{\mathrm{tot}}} + \frac{n_{B}}{n_{\mathrm{tot}}} = 1.$$

This system of three linear equations can be solved using Cramer’s rule [57, 58] for nA/ntot and nB/ntot, which represent the concentrations of tracers A and B in the total circulating iron. If a third tracer is administered, the equations are extended to a system of four linear equations, solved in the same way.
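A minimal Python sketch of this step is shown below. The natural abundances are standard IUPAC values; the tracer compositions and measured isotope ratios are hypothetical placeholders that, in practice, would come from the tracer certificates of analysis and the mass spectrometric measurement.

```python
import numpy as np

# Natural isotopic abundances of iron (standard IUPAC values, approximate).
a_nat = {"54": 0.05845, "56": 0.91754, "57": 0.02119, "58": 0.00282}

# Hypothetical isotopic compositions of the enriched tracers A (57Fe-enriched)
# and B (58Fe-enriched).
a_A = {"54": 0.0002, "56": 0.0330, "57": 0.9650, "58": 0.0018}
a_B = {"54": 0.0003, "56": 0.0250, "57": 0.0040, "58": 0.9707}

# Hypothetical isotope ratios measured in whole blood 14 days after dosing.
R_57_56 = 0.0245
R_58_56 = 0.0040

# Coefficient matrix of the three linear equations in
# x = [n_nat/n_tot, n_A/n_tot, n_B/n_tot].
M = np.array([
    [R_57_56 * a_nat["56"] - a_nat["57"], R_57_56 * a_A["56"] - a_A["57"], R_57_56 * a_B["56"] - a_B["57"]],
    [R_58_56 * a_nat["56"] - a_nat["58"], R_58_56 * a_A["56"] - a_A["58"], R_58_56 * a_B["56"] - a_B["58"]],
    [1.0, 1.0, 1.0],
])
b = np.array([0.0, 0.0, 1.0])

# Cramer's rule: replace each column of M with b in turn.
det_M = np.linalg.det(M)
x = np.array([
    np.linalg.det(np.column_stack([b if j == i else M[:, j] for j in range(3)])) / det_M
    for i in range(3)
])
frac_nat, frac_A, frac_B = x
print(f"n_A/n_tot = {frac_A:.5f}, n_B/n_tot = {frac_B:.5f}")
```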

The amounts in mmol of isotopic tracers incorporated into erythrocytes can then be calculated:

$${}^{A}\mathrm{Fe}_{\mathrm{inc}} = \frac{n_{A}}{n_{\mathrm{tot}}} \times n_{\mathrm{tot}}, \quad {}^{B}\mathrm{Fe}_{\mathrm{inc}} = \frac{n_{B}}{n_{\mathrm{tot}}} \times n_{\mathrm{tot}}.$$

In healthy adults and children, a significant portion of absorbed iron, but not all, is incorporated into erythrocytes. Studies have reported mean erythrocyte incorporation rates of 80–85% in US men [59], ~80% in iron-depleted US women [60], 93% in Thai women [61], ~80% in British men [62], and 73–76% in Ghanaian infants [63]. Iron bioavailability studies typically assume that 80% of iron absorbed from an oral dose in adults is incorporated into erythrocytes, and 75% is incorporated in infants. This shared assumption allows comparisons between studies and is applied in the following equations:

Calculation of the fractional iron absorption of tracers A and B:

$${}^{A}\mathrm{FA} = \frac{{}^{A}\mathrm{Fe}_{\mathrm{inc}}}{{}^{A}\mathrm{Fe}_{\mathrm{dose}}} \times \frac{1}{0.8}, \quad {}^{B}\mathrm{FA} = \frac{{}^{B}\mathrm{Fe}_{\mathrm{inc}}}{{}^{B}\mathrm{Fe}_{\mathrm{dose}}} \times \frac{1}{0.8},$$

where \({}^{A}\mathrm{Fe}_{\mathrm{inc}}\) and \({}^{B}\mathrm{Fe}_{\mathrm{inc}}\) are taken from the equation above, \({}^{A}\mathrm{Fe}_{\mathrm{dose}}\) and \({}^{B}\mathrm{Fe}_{\mathrm{dose}}\) are the administered doses in mmol, and the assumed incorporation of the absorbed isotope into erythrocytes is 80%.
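A short sketch of this final step, assuming the conventional 80% incorporation, follows; the incorporated amount and dose in the example are illustrative.

```python
def fractional_absorption(fe_inc_mmol, fe_dose_mmol, incorporation=0.80):
    """Fractional absorption of an oral tracer, assuming a fixed fraction of
    absorbed iron (80% in adults, ~75% in infants) is incorporated into
    erythrocytes by 14 days post-dose."""
    return (fe_inc_mmol / fe_dose_mmol) / incorporation

# Example with illustrative values: 0.020 mmol of tracer A incorporated into
# erythrocytes after an oral dose of 6 mg 57Fe (6 / 56.94 ~ 0.105 mmol).
fa = fractional_absorption(0.020, 6 / 56.94)
print(f"Fractional absorption ~ {fa:.1%}")   # ~23.7%
```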

4.3.3 Intravenous Administration of Tracers

In certain physiological conditions, the rate at which absorbed iron incorporates into erythrocytes can vary, and assuming an 80% incorporation rate may introduce bias. One study reported that erythrocyte iron incorporation is decreased in women receiving iron supplements [64], and another showed that incorporation is inversely related to iron stores [45, 60]. These variations in erythrocyte iron incorporation between healthy individuals generally do not affect comparative studies where individuals serve as their own controls. However, direct measurement of incorporation can be beneficial in specific cases. For instance, it may be advantageous in pregnant women, where iron incorporation rates can fluctuate [65], and in individuals with genetic or infectious conditions that affect blood volume, turnover of erythrocytes, or recycling of iron by splenic macrophages [33, 61, 66].

In such cases, the fraction of circulating iron isotope that incorporates into erythrocytes can be directly measured by administering an intravenous iron dose. Ferrous citrate is commonly used because of its lower allergenic potential compared with other forms of intravenous iron [67]. Typically, 1 hour after the oral isotope has been administered, an aqueous solution containing 100 μg of 54Fe or 58Fe as iron citrate is slowly infused over 50 minutes [33]. This rate of intravenous infusion of iron is based on the estimated 2-μg/min plasma appearance of iron normally absorbed from the gastrointestinal tract [62]. Subjects undergoing an intravenous infusion of iron should be monitored under medical supervision. The estimated circulating iron pool size is used to determine the amount of intravenous isotope incorporated into erythrocytes, expressed as a percentage of the intravenous dose administered [61]. This percentage is then assumed to reflect the erythrocyte incorporation of absorbed oral label, using the directly measured incorporation rate instead of the assumed 80%. This approach assumes similar kinetics of intravenous and oral iron tracers after absorption.

4.3.4 Normalization to a Reference Dose or to Iron Status

Non-heme iron absorption varies depending on an individual’s iron status. In iron-deficient individuals, iron absorption is up-regulated, and fractional iron absorption of an oral dose typically is higher than in iron-sufficient individuals. To reduce inter-subject differences in absorption, one approach is to administer a stable isotope as a reference dose at the same time the other stable iron isotope is given with the test meal or compound of interest. The reference dose represents iron absorption under ideal conditions. A common reference dose is 3 mg of iron as FeSO4 combined with 30 mg of ascorbic acid [68]. To facilitate comparisons across subjects, absorption data from the reference dose are standardized to a set reference value, typically 40%. The fraction of iron absorbed from the test meal relative to the reference dose is then calculated accordingly [8, 69].

For normalizing iron absorption data to a reference dose, the following equation can be used:

$$\mathrm{Absorption}_{\mathrm{N}} = \mathrm{Absorption}_{\mathrm{O}} \times 40/\mathrm{Absorption}_{\mathrm{Ref}},$$

where absorption is expressed as a percentage of the total administered dose, and the subscripts N, O, and Ref denote the normalized, observed, and reference-dose values, respectively.
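A worked sketch of this normalization, with hypothetical absorption values:

```python
def normalize_to_reference_dose(absorption_observed_pct, absorption_reference_pct,
                                reference_value_pct=40.0):
    """Normalize observed test-meal absorption to a reference-dose absorption
    standardized at 40% (the conventional value for a 3-mg FeSO4 + ascorbic
    acid reference dose)."""
    return absorption_observed_pct * reference_value_pct / absorption_reference_pct

# Example: a subject absorbs 8% of the test-meal tracer and 32% of the
# reference dose; normalized absorption = 8 * 40 / 32 = 10%.
print(normalize_to_reference_dose(8.0, 32.0))  # 10.0
```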

Another approach to normalization utilizes the inverse relationship observed between iron absorption and serum ferritin [44]. The equation adjusts measured iron absorption to a specified reference serum ferritin, usually 25 or 40 µg/L. This approach allows for comparisons of iron absorption across individuals with varying serum ferritin levels [70].

The equation for normalizing absorption data to a reference serum ferritin level is:

$$\log \left( \mathrm{absorption}_{\mathrm{N}} \right) = \log \left( \mathrm{absorption}_{\mathrm{O}} \right) + \log \left( \mathrm{ferritin}_{\mathrm{O}} \right) - \log \left( \mathrm{ferritin}_{\mathrm{R}} \right),$$

where absorptionN is the normalized dietary iron absorption (%), absorptionO is the observed dietary iron absorption (%), ferritinO is the observed serum ferritin level (μg/L), and ferritinR is the reference serum ferritin level (μg/L).
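A sketch of this adjustment with hypothetical values is shown below; because the relationship is log-linear, the correction is equivalent to scaling the observed absorption by the ratio of observed to reference ferritin.

```python
import math

def normalize_to_reference_ferritin(absorption_observed_pct, ferritin_observed_ug_l,
                                    ferritin_reference_ug_l=40.0):
    """Adjust observed absorption to a reference serum ferritin (commonly 25 or
    40 µg/L), using the log-linear inverse relationship between absorption and
    serum ferritin: log(A_N) = log(A_O) + log(SF_O) - log(SF_R)."""
    log_a_n = (math.log10(absorption_observed_pct)
               + math.log10(ferritin_observed_ug_l)
               - math.log10(ferritin_reference_ug_l))
    return 10 ** log_a_n

# Example: 12% observed absorption at a serum ferritin of 20 µg/L corresponds
# to 12 * 20 / 40 = 6% at the 40 µg/L reference.
print(normalize_to_reference_ferritin(12.0, 20.0, 40.0))  # 6.0
```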

Galetti et al. [71], based on a pooled analysis of existing data from over 1000 healthy women given oral stable isotopes, proposed an alternative equation for normalizing individual absorption values to a common reference ferritin value. In this study, serum ferritin levels of 15, 30, and 50 µg/L corresponded to fractional iron absorption values of 15.8%, 10.0%, and 5.8%, respectively [71].
