Introduction
Iron is a vital component of human cellular physiology. It is incorporated into numerous proteins, including hemoglobin, catalases, lipoxygenases, ribonucleotide reductase, and cytochromes, and allows for a wide spectrum of biochemical processes, such as oxygen transport, energy production, DNA synthesis, immune defense, gene regulation, steroid and hormone synthesis, and drug metabolism [1, 2]. Its versatility stems from the ability to participate in oxidation-reduction reactions by transitioning between ferric (3+) and ferrous (2+) states. The same chemical quality that makes iron critical for cell growth and survival also allows ferrous iron to react with peroxides to form destructive hydroxyl or lipid radicals [2, 3]. To maintain appropriate iron levels throughout the body, iron homeostasis is tightly regulated at the cellular, tissue, and systemic levels [3].
Patients with chronic kidney disease (CKD), especially those on hemodialysis (HD), experience marked alterations in iron balance and tissue distribution because of reduced iron absorption, increased iron losses, and impaired mobilization of iron from stores [4]. Without intervention, this dysfunction of iron homeostasis is an important contributor to anemia in patients with CKD. Multiple iron preparations are available for oral or intravenous (IV) administration, and iron therapy is an integral part of anemia management in CKD, particularly among patients on HD. However, considerable uncertainty persists about the initiation, optimal dosing, and monitoring of iron therapy. In particular, there are concerns about adverse long-term consequences of IV iron therapy. This article aims to review factors influencing iron balance in patients with CKD and to propose a new lexicon for describing iron balance in this population. It will also review the clinical implications (or lack thereof) of positive iron balance as suggested by evidence from CKD populations. This review attempts to present a balanced, multidisciplinary perspective on the literature in this area but does not use a systematic formalized assessment of the quality of evidence or its possible biases.
Normal Iron Balance and Stores at the Tissue and Systemic Levels
Under physiologic conditions, the regulation of iron occurs in a virtually closed system, with 1–2 mg/day of iron absorbed to balance the 1–2 mg/day of iron lost [3]. Total body iron content is approximately 2–5 g, and is thus much larger than daily turnover; its representative distribution is summarized in Figure 1.
Fig. 1.
Systemic iron homeostasis in healthy individuals. Not represented: ∼250 mg of iron transported to the fetus over the course of pregnancy. Approximate values subject to significant person-to-person variation.
The ability to export iron for the purposes of systemic iron handling is limited to several cell types, mainly enterocytes (following gastrointestinal [GI] absorption), macrophages, hepatocytes, and placental cells [5]. The transmembrane efflux channel, ferroportin, serves as the only known exporter of iron from these cells [1, 3, 5]. The degree to which ferroportin is cell membrane-bound or internalized/degraded is regulated by the hormone hepcidin [5-7]. Hepcidin is synthesized by the liver and, upon binding to ferroportin, causes internalization and lysosomal degradation of the hepcidin-ferroportin complex. The removal of ferroportin from cell membranes decreases iron export (to plasma) from absorption or storage sites. In the setting of high hepcidin levels, plasma iron continues to be consumed for erythropoiesis and other iron-dependent processes, but is not replaced, leading to hypoferremia [5, 7]. Owing to rapid renal clearance and a high basal secretion rate by the liver, hepcidin rapidly and tightly regulates iron levels and stores [8]. The iron-lowering effects of hepcidin have been compared to the glucose-lowering effects of insulin [7].
Iron status is a key modulator of hepcidin secretion but hepcidin expression can be impacted by a variety of factors (Fig. 2) [6-10]. In 2014, the protein erythroferrone was identified as an “erythroid regulator” of hepcidin, mediating the coordination of iron supply with the iron requirements of hemoglobin synthesis by erythroblasts. Secreted by erythropoietin-stimulated erythroblasts, erythroferrone inhibits hepcidin production, thereby releasing iron from storage cells and increasing iron availability for erythropoiesis [7, 8, 11].
Under physiologic conditions, extracellular transferrin-bound iron is transported throughout the body but mainly to the bone marrow for erythropoiesis. Representing the ratio of serum iron to total iron-binding capacity, transferrin saturation (TSAT) is an important measure of iron status [12]. While methodologic considerations such as diurnal variation and changes associated with dietary intake of iron can impact TSAT levels, TSAT provides insight into iron availability. Among patients without disease, TSAT of 20–45% is considered normal.
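As a rough illustration of how TSAT is derived from routine laboratory values, the ratio described above can be sketched as follows (a minimal sketch; the function name and example values are illustrative, not taken from the source):

```python
def tsat_percent(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Transferrin saturation: serum iron expressed as a percentage of
    total iron-binding capacity (TIBC)."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Example: serum iron 90 µg/dL with TIBC 300 µg/dL gives a TSAT of 30%,
# at the upper end of the 20-45% range considered normal in health.
print(tsat_percent(90, 300))  # → 30.0
```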
Iron, originating mainly from recycled senescent erythrocytes, is transiently stored in macrophages of the spleen, liver (Kupffer cells), and bone marrow, but hepatocytes store the majority of iron in healthy individuals [13]. Intracellular iron that is not utilized is stored within ferritin, a hollow protein capable of holding up to 4,500 iron atoms [1, 12]. While the iron storage function of intracellular ferritin is well documented, the physiologic role of serum/extracellular ferritin – secreted by hepatocytes and macrophages – is unclear. Serum ferritin is iron-poor and, while it may impact iron delivery in some physiologic states, it appears to also have a role in proinflammatory signaling, angiogenesis, and immune regulation [14]. Serum ferritin concentrations are related to intracellular ferritin concentrations in macrophages and hepatocytes, thus providing an indirect estimate of iron stores. However, inflammation increases serum ferritin independently of cellular iron stores. Intracellularly, when iron stores increase, ferritin can be degraded into hemosiderin, an insoluble complex easily visualized by microscopy [13].
Alterations in Iron Homeostasis Associated with CKD
In CKD, iron homeostasis is disordered because of increased losses, reduced iron intake/absorption, increased storage, and reduced mobilization. These changes are considerably greater among those patients undergoing chronic HD. Patients with CKD not on HD and those on HD experienced 3.15 and 6.27 mL/day of GI blood losses, respectively, compared to 0.83 mL/day in control subjects [4, 15]. A separate study also reported GI losses of 5 mL/day among patients on HD [16]. Using these values, it is reasonable to estimate 1–2 L/year of blood loss in HD patients solely due to GI losses. Patients on HD also experience iron loss because of the HD procedure itself and from blood draws. Estimates for the volume of total blood lost vary considerably, with most in the range of 2–5 L/year for patients receiving long-term HD [4, 17, 18]. In some cases, the blood or iron loss among patients receiving HD has been documented well above this range [19].
At a normal hemoglobin concentration, 1 mL of blood contains 0.5 mg of iron. Assuming a 20% lower hemoglobin concentration in CKD patients, a total blood loss of 2–5 L/year corresponds to an iron loss of 0.8–2.0 g. While a recent Japanese study calculated a lower average yearly iron loss of 500 mg secondary to HD and associated laboratory tests [20], others have proposed an average yearly iron loss of 2–3 g for patients undergoing chronic HD (including physiologic iron losses) [21, 22]. Marked variability across individuals and institutions and the inability to accurately assess iron/blood losses among individual patients pose a challenge for physicians attempting to replace lost iron.
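The arithmetic above can be made explicit. In this sketch, the 0.5 mg/mL iron content of blood and the 20% hemoglobin reduction are taken from the text; the function name and structure are illustrative assumptions:

```python
IRON_MG_PER_ML_AT_NORMAL_HB = 0.5  # iron content of 1 mL blood at normal hemoglobin

def annual_iron_loss_g(blood_loss_l_per_year: float,
                       hb_fraction_of_normal: float = 0.8) -> float:
    """Estimate yearly iron loss (g) from blood loss, scaling the per-mL
    iron content by the patient's hemoglobin relative to normal."""
    ml_per_year = blood_loss_l_per_year * 1000
    mg_per_year = ml_per_year * IRON_MG_PER_ML_AT_NORMAL_HB * hb_fraction_of_normal
    return mg_per_year / 1000

# 2-5 L/year of blood loss at 80% of normal hemoglobin:
print(annual_iron_loss_g(2))  # → 0.8 (g/year)
print(annual_iron_loss_g(5))  # → 2.0 (g/year)
```

This reproduces the 0.8–2.0 g/year range cited above, while underscoring how sensitive the estimate is to the assumed blood-loss volume.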
In addition to increased iron losses, individuals with CKD are also at increased risk for reduced dietary iron intake and absorption. Reduced protein intake secondary to decreased appetite and adherence to low-protein diets can result in reduced iron available for absorption [4]. Concomitant use of certain medications, such as proton pump inhibitors, can inhibit absorption of iron by enterocytes, further contributing to iron deficiency [23, 24].
Beyond the changes in iron absorption and losses, patients with CKD exhibit reduced availability of stored iron as hepcidin levels are greatly increased. Compared with controls, adults with stage 2–4 CKD and those on HD demonstrated median serum hepcidin levels that were increased more than 3.5-fold and up to 9-fold, respectively [25, 26]. The underlying cause for the increase in hepcidin is likely multifactorial; chronic inflammation and increased infection rates, reductions in the renal clearance of hepcidin, and, paradoxically, the use of IV iron therapy have all been implicated [7, 25, 27]. This hepcidin “block” results in reduced bioavailability of iron and the increased potential for iron-restricted erythropoiesis. Accordingly, among patients on HD, higher (relative to healthy individuals) values of iron parameters (e.g., serum ferritin and TSAT) are needed to maintain sufficient iron supply for erythropoiesis. A summary of iron homeostasis in patients with CKD on HD is presented in Figure 3.
Fig. 3.
Systemic iron homeostasis in individuals with CKD on hemodialysis and IV iron. Approximate values subject to significant person-to-person variation.
Given that disorders in both iron balance and iron distribution can impact iron availability, “iron deficiency” (alternatively referred to as “iron restriction”) should be viewed as a mismatch between iron supply and demand, rather than as an indicator of reduced body iron content. A state of iron deficiency can occur in the presence of decreased, normal, or increased total body iron (Fig. 4). As a consequence of increased losses and decreased absorption, most untreated patients with CKD have reduced iron stores and will progress to a state of absolute iron deficiency unless treated with transfusion or iron supplementation [28, 29]. In the setting of normal total body iron stores, increased iron demands can still result in a supply-demand mismatch. The result is functional iron deficiency, a state of inadequate delivery of iron to the bone marrow in the setting of adequate iron stores caused by impaired iron mobilization (from the reticuloendothelial system [RES]) and/or increased bone marrow iron demand (as might be secondary to reduced red cell life span and/or erythropoiesis-stimulating agents [ESA] use) [4, 7, 30]. Even among patients with increased total body iron, reduced iron availability can result in functional iron deficiency. For example, in the setting of inflammation, patients can have high ferritin levels, low TSAT, and increased iron stores but still experience restricted erythropoiesis resulting from “reticuloendothelial blockade” [31].
Fig. 4.
Iron deficiency represents an iron supply-demand mismatch. Intended to demonstrate relative associations between iron supply and demand; not representative of a numerical relationship between these parameters.
Targeting Effective Erythropoiesis in Patients with CKD
Effective erythropoiesis in patients with CKD requires adequate body iron and availability of that iron to the bone marrow. Current treatment guidelines recognize the utility of IV iron for replacing iron stores and the potential to result in a “clinically meaningful erythropoietic response” [32]. Kidney Disease: Improving Global Outcomes (KDIGO) guidelines recommend the use of IV iron for adult CKD patients receiving ESA with TSAT ≤30% and serum ferritin ≤500 ng/mL for whom an increase in hemoglobin concentration or a decrease in ESA dose is desired. Although there are variations in the specific TSAT and serum ferritin cut-offs cited by different organizations, guidance about the initiation of iron is very similar [32-36]. Naturally, the use of IV iron should be guided by considerations beyond iron status; patients with known hypersensitivity to IV iron preparations and those with active infections should not receive therapy [32]. Caution is also warranted in patients with coexisting conditions that may put them at risk of experiencing adverse effects from iron (e.g., preexisting liver disease).
Developing a Lexicon for Iron Status in Patients with CKD
Existing terminology regarding a patient’s iron status is inconsistent. For example, a KDIGO Controversies Conference defined iron overload as “a condition of increased total body iron content that is possibly associated with a time-dependent risk of organ dysfunction” and pathologic iron overload as “a condition of increased body iron content associated with signs of organ dysfunction that are presumably caused by excess iron.” Thus, “iron overload” is defined on the basis of potential or manifest adverse consequences and connotes a potential harmful or unfavorable state.
The authors of the present article propose to focus on “iron balance” to describe changes in total body iron content irrespective of the consequences. Because iron intake and loss cannot be accurately measured, precise quantification of iron balance is currently impossible, but use of this term emphasizes the need to consider therapeutic iron dosing in the context of iron losses and the net balance between the two.
“Negative iron balance” describes a state in which body iron content is decreasing, secondary to relative decreases in iron intake and/or relative increases in iron losses. Over time, negative iron balance can progress to functional and absolute iron deficiency and anemia. As outlined above, CKD patients undergoing HD (without iron treatment) are at particularly high risk for developing negative iron balance, resulting in reduced iron stores and compounded by reduced iron availability [4, 28, 37].
States in which body iron content is increasing represent “positive iron balance.” Such a state could be secondary to increases in iron intake and/or decreases in iron loss. For a patient undergoing chronic HD, iron losses are typically stable over time. Therefore, positive iron balance among HD patients generally results from increases in iron intake due to IV iron treatment. Given the above-mentioned estimates of iron loss in CKD patients on HD, IV iron therapy in excess of 2–3 g/year is likely to result in positive iron balance. The authors believe that the term “positive iron balance” more accurately describes a state of iron homeostasis than “iron overload.” While distinctions between negative and positive iron balance are conceptually straightforward, determining the iron balance and iron stores for a given patient can be challenging in clinical practice (Fig. 5).
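The reasoning behind the 2–3 g/year threshold can be expressed as a simple balance calculation (a conceptual sketch only, since true losses cannot be measured precisely in an individual patient; the function name and example values are illustrative):

```python
def net_iron_balance_g(iv_iron_g_per_year: float,
                       estimated_losses_g_per_year: float) -> float:
    """Net yearly iron balance: positive values imply accruing iron stores
    (positive iron balance); negative values imply negative iron balance."""
    return iv_iron_g_per_year - estimated_losses_g_per_year

# IV iron of 4 g/year against estimated losses of 2.5 g/year:
print(net_iron_balance_g(4.0, 2.5))  # → 1.5 (g/year accruing)
```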
It is suggested that the term “iron toxicity” be reserved to describe the adverse clinical consequences of iron therapy or disorders of iron metabolism. Iron toxicity should be considered distinct from the hypersensitivity or other acute reactions that can occur with IV iron administration. While iron toxicity is more likely to occur with increasingly positive iron balance, positive iron balance does not automatically lead to iron toxicity, nor is it a prerequisite for iron toxicity. For example, the risk for iron toxicity may be elevated when levels of non-transferrin bound iron (NTBI) – iron that is neither bound to transferrin nor incorporated into heme or ferritin – are increased despite stable/decreased iron stores.
IV Iron Use in CKD
Multiple guidelines recommend the initiation of IV iron therapy prior to beginning ESA therapy and in combination with it, provided TSAT and ferritin levels are below proposed thresholds. Thus current KDIGO guidelines do not recommend the routine use of iron supplementation in patients with TSAT >30% or serum ferritin >500 ng/mL [32]. Despite these guidelines, mean serum ferritin levels have increased in recent years in the United States and were approximately 800 ng/mL in late 2016 [38, 39]. These increases in ferritin levels are presumably, in part, the result of increased IV iron use to compensate for reduced ESA use [39].
The risk associated with IV iron treatment and high ferritin concentrations remains a matter of uncertainty. To date, no randomized controlled studies have examined the safety or efficacy of chronically maintaining patients at serum ferritin levels greater than 500–800 ng/mL [32]. However, some observational evidence suggests that markedly elevated iron measures (e.g., serum ferritin >1,500 ng/mL or TSAT >50%) or high doses of IV iron (e.g., ≥300 or ≥400 mg/month) are associated with increased mortality among HD patients (with most receiving ESA ± IV iron) [40-44]. Given the potential for residual confounding, particularly by background inflammation, causality cannot be established from these observational studies. Additionally, these findings have not been consistently observed across studies [45, 46]. A pair of randomized trials examining the safety and efficacy of oral vs IV iron formulations (REVOKE and FIND-CKD) has demonstrated conflicting results [47, 48]. Whereas REVOKE demonstrated a higher risk of serious adverse events, cardiovascular serious adverse events, and infection resulting in hospitalization among non-HD patients receiving IV iron (vs. oral iron), FIND-CKD did not identify a safety signal among patients receiving low (targeting a ferritin of 100–200 μg/L) or high (targeting a ferritin of 400–600 μg/L) doses of IV iron as compared to patients receiving only oral iron or the subgroup of patients that attained a ferritin of 800 μg/L or greater [47-49]. The differences between these 2 studies and potential explanations for their findings have been discussed previously [50, 51].
There is also uncertainty about the efficacy of intentionally targeting higher ferritin or TSAT values [32]. In an observational study, Gaweda et al. [52] demonstrated no further increase in hemoglobin in ESA-treated HD patients with TSAT >30% or ferritin >350–500 ng/mL. In contrast, in the DRIVE Study, patients on HD with anemia, TSAT ≤25%, and serum ferritin levels 500–1,200 ng/mL benefited from IV iron therapy as assessed by hemoglobin levels. As all patients had TSAT ≤25%, the increase in hemoglobin suggests functional iron deficiency. Such a benefit was also observed in the subset of patients with baseline serum ferritin levels >800 ng/mL [53, 54], but the effects of therapy beyond 12 weeks and on other clinical outcomes remain unclear.
A complete review of the risks and benefits of IV iron therapy is beyond the scope of this article, but a recent review by Fishbane of the available evidence suggests there is no clear association between IV iron therapy and mortality or other risk events. Nevertheless, Fishbane concluded that continued treatment with IV iron among patients with serum ferritin levels above 500 ng/mL is “particularly hard to justify” [55]. However, although increased serum ferritin has been associated with adverse outcomes in non-renal conditions such as heart failure in the general population [56], it may be inappropriate to generalize such findings to the CKD population given the unique impact of CKD on iron regulation, as well as the impact of parenteral iron and ESA on iron homeostasis.
When Does Positive Iron Balance Become Iron Toxicity?
Across therapeutic areas and disease states, the transition from positive iron balance to iron toxicity appears to be dependent on factors beyond the magnitude of body iron content. For instance, the cellular pattern of iron deposition within the liver during states of positive iron balance appears to greatly impact the risk of tissue damage and subsequent adverse outcomes [57-59]. When “excess” iron is predominantly deposited in hepatocytes (i.e., parenchymal deposition), as might be observed in hereditary HFE-associated hemochromatosis, tissue damage is common [57]. In such conditions, iron deposition into Kupffer cells is a later finding [59]. In contrast, when reticuloendothelial deposition of iron predominates, as is expected with IV iron administration, tissue damage is less frequent. Data from other disease states (e.g., hemochromatosis) suggest that a “spill over” of iron from macrophages is dependent upon both the rapidity and magnitude of iron accumulation. In the setting of CKD and IV iron therapy, iron would be expected to “spill over” into hepatocytes if NTBI is present. Clinically relevant concentrations of NTBI would be expected if the iron-carrying capacity of transferrin is saturated.
In current practice, IV iron can be administered via large doses at long intervals (“load and hold”) or maintenance dosing with smaller doses at regular, shorter intervals [60, 61]. There are concerns that higher doses of IV iron could lead to temporary oversaturation of transferrin and greater concentrations of NTBI [62, 63]. Although large randomized controlled trials comparing different iron replacement strategies are lacking, available data suggest a more favorable risk:benefit profile with maintenance dosing in patients on HD [60].
The distribution and regulation of iron stores within the body can also be impacted by ESA use. Effective erythropoiesis requires the flow of iron from the RES to the bone marrow. As previously described, stimulation of erythropoiesis by ESA causes erythroferrone secretion, which suppresses hepcidin, allowing the release of iron from macrophages and enterocytes via ferroportin [7, 27]. ESA treatment thus helps offset the increased hepcidin levels and iron loading within macrophages that can occur with IV iron therapy, supporting a synergistic relationship between the therapies.
In genetic disorders of iron regulation that can be associated with iron toxicity, iron accumulates in the liver before it appears in the endocrine organs (in particular, the pancreas); iron accumulation in the heart appears even later [4, 57]. MRI can accurately determine hepatic iron content but cannot differentiate between parenchymal and RES distribution of iron within the liver. Since only the former is associated with toxicity, the presence of increased hepatic iron (i.e., increased liver iron content [LIC]) on MRI in itself should not be viewed as a sign of iron toxicity. On the other hand, iron accumulation in the pancreas and heart should be viewed as a radiologic surrogate for iron toxicity. As reviewed in Table 1, cardiac iron deposition has rarely been observed in HD patients in modern studies (i.e., in the ESA era) [64-67].
Assessing Iron Stores in Patients with CKD
Many laboratory/imaging tests can give insight into a patient’s iron status (Fig. 6). Unfortunately, no single measure is ideal for identifying positive iron balance or increased stores, singling out those patients at risk of developing iron toxicity, and guiding IV iron dosing in clinical practice [21, 52, 68].
Whereas low serum ferritin concentrations have high specificity for iron deficiency, high serum ferritin levels are not necessarily indicative of positive iron balance or increased iron stores. As an acute phase reactant, serum ferritin levels are increased in the setting of inflammation (as is often observed in CKD). Serum ferritin levels can also be impacted by sex, age, race/ethnicity, infection, malignancy, hyperthyroidism, liver disease, ascorbic acid status, and heavy alcohol intake [30, 31]. Thus, serum ferritin concentrations should not be used as the sole factor guiding iron therapy [32, 54]. The inadequacy of using serum ferritin as a marker for iron stores in patients on HD and IV iron is further evidenced by a poor correlation between serum ferritin levels and MRI-assessed LIC [69].
Transferrin is also subject to limitations as a marker for iron status. As a negative acute phase reactant, transferrin levels are reduced by inflammation [70]. Transferrin levels, and by extension TSAT, can therefore also be impacted by infection, malignancy, progesterone, and recent iron intake [30]. TSAT should be measured in the morning after ≥8 h of fasting to reduce the impact of diurnal variation, and at least 1 week after high-dose IV iron to avoid spurious elevations [30, 71]. Although current guidelines do not recommend initiating IV iron when TSAT >30%, there is some evidence that targeting TSAT >30% (and <50%) can reduce ESA usage by increasing erythropoietic responsiveness [72]. Whereas there are some reports of patients with TSAT >50% benefiting from continued IV iron therapy, in the general population, TSAT >55% has been associated with increased mortality [71, 73]. In the absence of convincing data to the contrary, it seems prudent to avoid TSAT >50% in patients on HD to minimize the risk of iron toxicity. At present, there are no evidence-based recommendations regarding the appropriate use of IV iron among patients with elevated serum ferritin values and reduced TSAT.
LIC is the established standard for assessing body iron stores and can be accurately quantified by several modalities [52, 74]. Liver biopsy, although largely replaced by MRI, remains the only method capable of discriminating between parenchymal and RES patterns of iron deposition. MRI-assessed LIC does not correlate well with conventional markers of iron status in patients with CKD and, to date, no studies have shown a correlation between increased LIC and adverse outcomes (e.g., morbidity or mortality) among patients on HD [4, 69]. Although increased liver iron stores, as assessed by MRI, do not appear to predict iron toxicity in patients with CKD, MRI may be helpful to rule out extra-hepatic iron deposition.
Data from CKD populations and other disease states suggest LIC is not predictive of extra-hepatic iron [9, 64-67, 75]. Uptake of iron by extra-hepatic organs appears dependent on the presence of NTBI [75]. NTBI is not bound to transferrin or complexed with heme or ferritin, but generally bound to citrate, other acid anions, and albumin [76]. Labile plasma iron (LPI) is the component of plasma NTBI that is redox active, able to permeate cells, chelatable, and thought to account for the tissue damage associated with iron toxicity [76, 77]. In a study of 71 dialysis patients assessed 1 week after iron infusion, 80% had no detectable LPI. Although there was a correlation between TSAT and detectable LPI, normal TSAT did not rule out the potential for detectable LPI [71, 78]. While promising, the clinical utility of LPI assays is still to be determined.
Future Directions
There is a dearth of direct evidence about the relationship between iron balance, iron stores, and functional consequences in the CKD population. Although increased LIC has been seen in HD patients, such increases have not been correlated with adverse clinical outcomes (as they have in other conditions). The histologic effects of long-term IV iron therapy among patients on HD are largely unknown. In an autopsy study of 50 HD patients performed between 1976 and 1979, marked hepatosplenic siderosis was observed in 24 patients [79]. Hepatic iron accumulation began in the Kupffer cells and progressed to the hepatocytes over time. Many of these same patients demonstrated marrow iron depletion and some showed iron deposition in the lungs; none had marked iron deposition in the heart. The results of this study may not be generalizable to modern-day clinical practice – in the absence of ESA therapy, iron accumulation was likely secondary to regular red cell transfusions rather than to the administration of parenteral iron. Furthermore, it is unclear whether manifestations of iron toxicity affected any of these patients. To examine the effects of IV iron on the liver with modern-day clinical practice, liver biopsy studies conducted in at-risk patients undergoing hepatobiliary procedures may be warranted. In the absence of histologic samples, noninvasive measures of hepatic fibrosis could help determine whether an association between IV iron and liver disease exists in the HD population.
It is hoped that the ongoing, prospective Proactive IV Iron Therapy for Haemodialysis Patients (PIVOTAL) randomized controlled trial will help clarify the risks and benefits of IV iron therapy in patients receiving ESA therapy [4, 80]. New alternatives to ESA, including hypoxia-inducible factor (HIF) stabilizers, are in development and may increase iron supply to the marrow. HIF activation inhibits hepcidin secretion by the liver, most likely indirectly through enhanced erythroferrone following the stimulation of erythropoiesis rather than through a direct effect, but may also act directly on the duodenum to increase iron absorption [81]. It is not yet clear to what extent IV or oral iron use will be needed to support effective erythropoiesis in patients receiving these agents. The optimal role of newer, oral iron-containing phosphate binders, one of which (i.e., ferric citrate) is associated with substantial systemic iron absorption, has not yet been incorporated into evidence-based recommendations [32, 82].
Conclusions
Although iron stores are difficult to measure in clinical practice, available data suggest that IV iron dosing in excess of approximately 2–3 g per year is likely to exceed typical iron losses in patients undergoing HD. Such dosing would be expected to place patients into a state of positive iron balance and increasing iron stores. Positive iron balance and increasing iron stores do not necessarily represent pathological states among patients with CKD receiving IV iron. While there is a theoretical risk of iron toxicity associated with IV iron, available (mainly short-to-intermediate duration) data suggest that current practices, despite inducing a state of positive iron balance, are not associated with overt clinical toxicity. Judicious use of IV iron therapy in this population, in particular, avoiding TSAT >45–50%, would seem to pose low risk to the vasculature, heart, and endocrine organs. It stands to reason that as the duration of a positive balance state is extended in CKD patients, the potential for long-term organ toxicity increases. Long-term studies are therefore urgently needed to obtain data that will allow definitive guidance and enable better assessment of the risk:benefit ratio.
Disclosure Statements
This article is based on discussions at an international advisory board meeting, sponsored by Vifor Fresenius Medical Care Renal Pharma, which manufactures several iron replacement therapies. Northstar Strategic Consulting, LLC, provided assistance with the writing and editing of the manuscript. Funding for these services was provided by Vifor Fresenius Medical Care Renal Pharma. The article sponsor had no role in the collection, analysis, and interpretation of data; writing the report; or the decision to submit the report for publication. The final decision on the main points to be communicated, including the conclusions drawn, was made by consensus of the authors. Dr. George R. Aronoff: Consultant to Vifor, Rockwell Medical, AstraZeneca, Hospira, Fresenius USA. Speaker bureau for Keryx Biopharmaceuticals. Owner of Dosis, Inc. Employee of Renal Ventures Management, LLC, and DaVita, Inc. Dr. Bruce R. Bacon: Consultant to Novartis. Dr. Carlo Brugnara: Consultant to Sysmex Diagnostics, one Scientific Advisory Board meeting for Keryx Pharmaceuticals in 2016. Dr. Kai-Uwe Eckardt: Consultant to Akebia, Astra Zeneca, Bayer, Johnson & Johnson, Sandoz/Hexal and Vifor. Grant support: Amgen, Astra Zeneca, Fresenius, and Vifor. Dr. Tomas Ganz: Consultant to Akebia, Vifor, Gilead, Keryx Pharma, and La Jolla Pharmaceutical Company. Scientific founder, shareholder, and consultant to Intrinsic Life Sciences and Silarus Pharma. Dr. Iain C. Macdougall: Received speakers’ fees, honoraria, and consultancy fees from Akebia, AMAG, Amgen, Astellas, Bayer, FibroGen, GlaxoSmithKline, Pharmacosmos, and Vifor Pharma. Dr. Julio Núñez: Received board membership fees and travel expenses from Novartis, Roche Diagnostics, Abbott, Rovi, and Vifor. Dr. Adam J. Perahia: Employee of NorthStar Strategic Consulting, LLC. Dr. Jay B. Wish: Consultant to Akebia, Vifor, Pfizer, and AstraZeneca. Speakers bureau for Keryx Pharma and Pfizer. Dr. John C. Wood: Consultant to ApoPharma, BiomedInformatics, WorldCareClinical, Vifor, Ionis, and Celgene.
