Uremic Toxins and their Relation to Dialysis Efficacy

Toxin retention is felt to be a major contributor to the development of uremia in patients with advanced chronic kidney disease and end-stage renal disease (ESRD). Uremic retention compounds are classically divided into 3 categories: small solutes, middle molecules, and protein-bound toxins. Compounds comprising the first category, for which the upper molecular weight limit is generally considered to be 500 Da, possess a high degree of water solubility and minimal or absent protein binding. The second category of middle molecules has largely evolved now to be synonymous with peptides and proteins that accumulate in uremia. Although not precisely defined, low-molecular weight proteins as a class have a molecular weight spectrum ranging from approximately 500 to 60,000 daltons. The final category of uremic retention compounds is protein-bound uremic toxins (PBUTs). As opposed to the above small, highly water-soluble toxins, which are largely by-products of protein metabolism, PBUTs have diverse origins and possess chemical characteristics that preclude the possibility of circulation in an unbound form despite being of low molecular weight. This review is the first in a series of papers designed to provide the current state of the art for extracorporeal treatment of ESRD. Subsequent papers in this series will address membranes, mass transfer mechanisms, and future directions. For small solutes and middle molecules, particular emphasis is placed on the important clinical trials that comprise the evidence base regarding the influence of dialytic solute removal on outcome. Because such trials do not exist for PBUTs, the discussion here is instead focused on solute characteristics and renal elimination mechanisms.


Introduction
One of the major functions of the kidney is to eliminate waste products and toxins generated from a variety of metabolic processes [1]. Normal kidney function provides efficient elimination of these solutes, allowing for control of their blood and tissue concentrations at relatively low levels. On the other hand, toxin retention is felt to be a major contributor to the development of uremia in patients with advanced chronic kidney disease (CKD) and end-stage renal disease (ESRD) [2].
In the classic taxonomy, uremic retention compounds are divided into 3 categories [3]: small solutes, middle molecules, and protein-bound toxins. Compounds comprising the first category, for which the upper molecular weight limit is generally considered to be 500 Da, possess a high degree of water solubility and minimal or absent protein binding [4]. Despite having significant kinetic differences, both urea and creatinine are considered to be representative molecules (surrogates) for the small solute class. Nevertheless, it remains a matter of debate whether these 2 solutes themselves are toxic per se. Moreover, while organic compounds in this category historically have been the focus of researchers, there is renewed interest in the potentially toxic effects of inorganic compounds, including sodium, potassium, water, hydrogen ion, phosphate, and calcium, along with the treatment-related determinants of their removal. These topics are addressed in this review.
The second category of middle molecules has largely evolved now to be synonymous with peptides and proteins that accumulate in uremia [5]. Although not precisely defined, low-molecular weight proteins as a class have a molecular weight spectrum ranging from approximately 500 to 60,000 daltons [6]. Thus, peptides with as few as 10 amino acids and proteins nearly as large as albumin comprise this group. In patients with intact kidney function, these compounds are initially filtered by the glomerulus and subsequently undergo catabolism with reclamation of the constituent amino acids at the level of the proximal tubule [7,8]. While the kidney is not the sole organ responsible for detoxification of these compounds, renal elimination accounts for 30-80% of total metabolic removal.
The final category of uremic retention compounds, one which has received much less attention than the other two, is protein-bound uremic toxins (PBUTs) [9,10]. As opposed to the above small, highly water-soluble toxins, which are largely by-products of protein metabolism, PBUTs have diverse origins and possess chemical characteristics that preclude the possibility of circulation in an unbound form, despite also being of low molecular weight (< 500 daltons). These organic molecules typically have ionic and/or hydrophobic characteristics and bind avidly to albumin in the blood. Under conditions of normal kidney function, they are eliminated primarily by organic anion transporters (OATs) residing in the proximal tubule [11,12]. Uremia is associated with elevated concentrations of both bound and unbound forms of PBUTs, with both reduced renal elimination and impaired albumin binding considered to be important factors [13]. Attention has focused on the metabolic products of the gut microbiome as the source of many PBUTs, including indoxyl sulfate (IS) and p-cresol [14].
This review is the first in a series of papers designed to provide the current state of the art for extracorporeal treatment of ESRD. Subsequent papers in this series will address membranes, mass transfer mechanisms, and future directions. The common approach for all papers in the series will be not only to describe the current understanding and evidence base for these topics but also to identify the major questions that still need to be addressed in future research. This first installment is not intended to be a comprehensive assessment of each toxin category, as several excellent reviews for each class of toxins have been published recently. For small solutes and middle molecules, particular emphasis is placed on the important clinical trials that comprise the evidence base regarding the influence of dialytic solute removal on outcome. Because such trials do not exist for PBUTs, the discussion here is instead focused on solute characteristics and renal elimination mechanisms.
Even though the main subject of this series is dialytic toxin elimination, it is worthwhile to note that phenomena beyond toxin accumulation play roles in uremia: oxidative stress, carbamylation, peroxidation, and organ crosstalk are all examples. However, an extensive discussion of these pathophysiologic processes is beyond the scope of this review. Moreover, uremia may be associated with a deficiency rather than excess of specific critical compounds, including natural antioxidants, micronutrients, and certain amino acids. Although such deficiencies can actually be exacerbated by dialysis, this topic likewise is not addressed in this review.

Background
The concept that uremic pathophysiology is predicated upon the retention of molecules normally excreted by the kidney was established nearly 200 years ago and is broadly accepted now [15]. Based on early clinical evidence that uremia could be ameliorated with extracorporeal dialysis treatments [16], interest turned toward the role that small diffusible solutes play in the uremic syndrome, with a particular focus on a small organic molecule, urea, as being a representative toxin for this class. Although individual molecules and classes of molecules having characteristics quite different from urea had also been characterized as putative uremic toxins by the middle of the last century [17], urea has been the primary solute used clinically for the evaluation of ESRD patients undergoing dialysis. However, debate regarding not only urea's toxicity per se but also its surrogacy status for the small uremic toxin class continues to this day; these questions are addressed below, along with the uremic toxicity of small inorganic compounds.
Urea has numerous attributes that account for its enduring use as a marker guiding the prescription and delivery of chronic dialysis therapies. First, from a quantitative perspective, urea is generated at the greatest rate of all solutes in the body and attains the highest blood concentration [18]. Second, its production is linked closely to protein metabolism: approximately 90% of nitrogen generated by the catabolism of protein is incorporated into urea [19,20]. This relationship allows certain inferences to be made about dietary protein intake in a stable ESRD patient [21]. Third, over the last several decades, a relatively good understanding of the kinetic behavior of urea under conditions of different dialysis therapies has been established through rigorous clinical research [22]. Finally, the blood concentration of urea is easily and widely measured, and clinicians are facile in the interpretation of laboratory-reported values.
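The nitrogen arithmetic behind this dietary inference can be sketched in a few lines. The two constants below (protein is roughly 16% nitrogen by mass; approximately 90% of catabolized nitrogen appears as urea, as cited above) are textbook approximations; the function name and example values are illustrative and not a clinical normalized protein catabolic rate formula.

```python
def protein_intake_estimate(urea_n_g_per_day, nitrogen_fraction=0.16,
                            urea_fraction=0.9):
    """Back-of-envelope estimate of dietary protein intake (g/day)
    in a metabolically stable patient, from the urea nitrogen
    generation rate. Assumes protein is ~16% nitrogen and ~90% of
    catabolized nitrogen is incorporated into urea."""
    return urea_n_g_per_day / (nitrogen_fraction * urea_fraction)

# ~9 g/day of urea nitrogen generation implies roughly 60 g/day of protein
print(round(protein_intake_estimate(9.0), 1))
```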
The first rigorous kinetic analysis, published in 1951 [23], primarily employed dialysance to quantify the removal of urea and several other putative uremic solutes during batch (recirculating) dialysis. After the initial description of computer-based approaches to dialysis quantification in the 1960s [24], Gotch et al. [25] in the early 1970s described computerized kinetic modeling with urea as the target molecule for the purpose of individualizing prescription and delivery of intermittent treatments. At the same time, an alternative view that the basis for such modeling should be larger uremic toxins ("middle molecules") was proposed by several investigators [26,27]. Specifically, the "square meter-hour" hypothesis [28] suggested the effects of membrane surface area and treatment time were relatively greater in the removal of such larger compounds, whereas blood and dialysate flow rate were the critical determinants of small solute removal for the low-permeability cellulosic filters used at that time.
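The intuition behind the square meter-hour hypothesis can be made concrete with the classic countercurrent single-pass clearance relation, in which clearance depends on the mass transfer-area coefficient (KoA, proportional to membrane area) and the blood and dialysate flow rates. The sketch below uses that standard relation with illustrative round-number flows and KoA values; it is not a model of any specific dialyzer or trial.

```python
from math import exp

def dialyzer_clearance(koa, qb, qd):
    """Countercurrent dialyzer clearance (mL/min) from the standard
    single-pass relation, given KoA and blood (qb) / dialysate (qd)
    flow rates in mL/min. Assumes qb < qd. Illustrative sketch."""
    r = qb / qd
    x = (koa / qb) * (1 - r)
    return qb * (exp(x) - 1) / (exp(x) - r)

# Small solute (urea-like, high KoA): clearance rises strongly with
# blood flow. Larger solute (low KoA): clearance stays near KoA, i.e.,
# it tracks membrane area and is nearly insensitive to flow rates --
# the intuition behind the "square meter-hour" hypothesis.
for qb in (200, 300, 400):
    print(qb,
          round(dialyzer_clearance(800, qb, 500), 1),   # high-KoA solute
          round(dialyzer_clearance(40, qb, 500), 1))    # low-KoA solute
```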

Major Trials
Attempting to reconcile these differing views, Lowrie et al. [29] performed a comprehensive evaluation designed to correlate clinical parameters with alterations in blood concentrations of both small solute and middle molecule surrogates. Specifically, patients were treated with different dialysis regimens having predicted BUN (blood urea nitrogen) and middle molecule (vitamin B12) concentrations based on dialyzer KoA values [30] and a single-pool kinetic model [31]. While middle molecule concentrations were only theoretical, actual BUN values closely adhered to model predictions. These regimens were produced by the following prescription ranges: blood flow rate, 260 or 300 mL/min; dialysate flow rate, 125-1,500 mL/min; treatment time, 4-5 h; and (regenerated cellulose) membrane surface area, 0.64-1.5 m2. Several clinical parameters were measured, including laboratory values (BUN, creatinine, alkaline phosphatase, phosphate, cholesterol, triglycerides) and putative markers of uremic neuropathy (ulnar nerve conduction velocity, median sensory latency time). The investigators summarized as follows: "In conclusion, these data suggest alterations in uremic abnormalities which occur with dialysis correlate with changes in the concentration of small molecules in the blood as predicted by models described in the literature. No inferences concerning acceptable concentrations for small molecules can be drawn, however, and concentration is determined by the balance between removal and generation rates." A further conclusion was: "Taken together, these observations suggest that clinical dialysis protocols should be structured with due consideration given to the removal of small molecular weight substances, such as urea and creatinine, though in and of themselves these substances may be nontoxic."
The Lowrie et al. [29] study served as a precursor to the more robust National Cooperative Dialysis Study (NCDS), in which the parameters of time-averaged concentration of urea (TACu) and dialysis time acted as surrogates for small solute and middle molecule removal, respectively, in a 2 × 2 experimental design [32]. The single-pool urea kinetic model was again employed to guide dialysis prescription. As was the case in the Lowrie et al. [29] study, only dialyzers composed of low permeability regenerated cellulose membranes were used [33]. Withdrawal from the 52-week study (primary endpoint) was significantly higher in the patients dialyzed on high target TACu regimens (90 mg/dL) than in patients having a low target TACu (50 mg/dL), irrespective of dialysis time (Fig. 1). In addition, in the 1-year period immediately following the study, significantly more deaths occurred in the high TACu groups than in the low TACu groups. Finally, treatment time had a statistically significant effect on morbidity only in those patients in the high target TACu groups.
The basis of the subsequent "mechanistic analysis" [34] of the NCDS database was the relatively new concept of dialysis dose, urea Kt/V. In this analysis, the investigators (Gotch and Sargent [34]) reported an inverse relationship between Kt/V and morbidity, defined as dropout from the study ("percent failure"). They argued it was more appropriate to characterize the relationship between percent failure and Kt/V with a step function than a more typical exponential. Therefore, percent failure for Kt/V values of 0.8 and less was expressed at a high constant value (approximately 55%) while that for Kt/V values of 0.9 and greater was expressed at a low constant value (approximately 10%). For patients dialyzed thrice-weekly, Gotch and Sargent [34] concluded that adequate dialysis was defined by a delivered Kt/V of 1.0 per treatment and a normalized protein catabolic rate, a proposed surrogate for dietary protein intake [21], of 1.0 g/kg/day. Another conclusion was that the delivery of dialysis at a Kt/V level of > 1.0 per treatment was "of no apparent clinical value with the cellulosic dialyzers in current use." On the other hand, based on a follow-up analysis that incorporated edited data from the NCDS, Keshaviah [35] concluded that delivered Kt/V values > 1.0 per treatment were, in fact, associated with improved survival (Fig. 2).
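The Kt/V arithmetic can be illustrated with the widely used second-generation Daugirdas estimate of single-pool Kt/V from pre- and postdialysis BUN. This formula postdates the NCDS era and is shown here only as a sketch of how the dose concept is computed in practice; the input values in the example are hypothetical.

```python
from math import log

def sp_ktv(pre_bun, post_bun, hours, uf_liters, weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W
    where R = post/pre BUN ratio, t = session length (h),
    UF = ultrafiltration volume (L), W = postdialysis weight (kg).
    An estimate, not the full iterative kinetic model."""
    r = post_bun / pre_bun
    return -log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / weight_kg

# e.g., BUN falling from 80 to 30 mg/dL over a 4 h session,
# with 2 L removed, in a 70 kg patient
print(round(sp_ktv(80, 30, 4.0, 2.0, 70.0), 2))
```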
The NCDS' underlying tenet that uremic pathophysiology is primarily mediated by small solute retention continued to be debated and challenged vigorously by clinical researchers after the trial. Moreover, the mechanistic analysis, especially the suggestion that a delivered Kt/V of 1.0 constituted "adequate" treatment, created substantial controversy in the dialysis community, especially when reports suggesting systemic underdialysis began to occur in the United States soon after publication of the study [36,37]. A consensus developed in the US nephrology community that another randomized controlled trial was urgently needed to address these uncertainties pertaining to uremic toxin control in ESRD patients treated with HD.
After completion of a pilot trial [38], 1,846 patients were enrolled between 1995 and 2000 in the full-scale HEMO Trial [39]. The trial employed a 2 × 2 factorial design comprised of 2 small solute doses (surrogate: urea Kt/V with single-pool values of approximately 1.25 and 1.65) and 2 levels of middle molecule dose (surrogate: membrane flux providing different β2-microglobulin clearances). Although neither small solute nor middle molecule dose was found to have a significant effect on all-cause mortality, the study was widely debated. A major weakness prominently highlighted was insufficient separation between the 2 small solute doses (estimated GFR difference of only ∼10 mL/min). It is worthwhile noting that the ADEMEX Trial, another large randomized trial published in 2002, failed to show a benefit for high versus conventional dose (urea Kt/V) in peritoneal dialysis patients [40].
Finally, the Frequent Hemodialysis Network trial was a randomized trial involving 245 patients treated with either conventional (thrice-weekly) HD or frequent HD over a 12-month period [41]. For the latter group, mean treatment frequency (per week) was 5.2 and mean weekly treatment time was 23% higher than the conventional group. Both coprimary composite outcomes (death or change in left ventricular mass; death or change in the physical-health composite score of the RAND 36-item health survey) were significantly impacted in a favorable way by frequent treatment [42] (Fig. 3). Moreover, frequent HD also was associated with a 42% increase in the mean dialysis-specific weekly standard Kt/V and a 20% reduction in the mean pretreatment BUN relative to the conventional regimen. Consistent with modeling predictions, frequent HD also led to a significant decrease in the mean pretreatment concentration of phosphate, a relatively small solute with kinetic characteristics quite different from those of urea [43]. Thus, from the perspective of clearance, it is impossible to attribute the outcome benefits achieved with frequent HD specifically to enhanced urea removal. Moreover, while mean ultrafiltration rate did not appear to be different between the 2 groups, mean weekly ultrafiltration volume was significantly higher in the frequent HD group. Thus, after > 30 years of controlled trials in which regimens designed to assess different strategies for urea clearance have been utilized, legitimate questions exist about the status of urea as a uremic retention molecule and the organic toxins for which it is a putative surrogate.
Nevertheless, clinical practice guidelines continue to rely heavily upon urea-based measurements for the assessment of HD adequacy [44].
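The effect of treatment frequency on time-averaged urea concentration, noted above for the Frequent Hemodialysis Network Trial, can be sketched with a crude fixed-volume single-pool simulation. All parameter values (clearance, distribution volume, generation rate) are invented round numbers, and the model deliberately ignores ultrafiltration, volume changes, and residual renal function; it is a sketch of the saw-tooth concentration profile, not a clinical kinetic model.

```python
def simulate_week(sessions_per_week, session_hours, k=200.0,
                  v=35000.0, g=7.0, weeks=4, dt=1.0):
    """Crude single-pool urea simulation with fixed volume and no
    ultrafiltration. k = dialyzer clearance (mL/min), v = urea
    distribution volume (mL), g = urea generation rate (mg/min).
    Sessions are spaced evenly. Returns the time-averaged urea
    concentration (mg/dL) over the final simulated week."""
    minutes_per_week = 7 * 24 * 60
    gap = minutes_per_week / sessions_per_week
    c = 1.0  # starting concentration, mg/mL
    samples = []
    t = 0.0
    total = weeks * minutes_per_week
    while t < total:
        on_dialysis = (t % gap) < session_hours * 60
        clearance = k if on_dialysis else 0.0
        c += dt * (g - clearance * c) / v   # forward Euler step
        if t >= (weeks - 1) * minutes_per_week:
            samples.append(c)
        t += dt
    return 100.0 * sum(samples) / len(samples)  # mg/mL -> mg/dL

# The same 12 h of weekly treatment split over more, shorter sessions
# yields a lower time-averaged urea concentration and lower peaks.
print(simulate_week(3, 4.0), simulate_week(6, 2.0))
```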

Revisiting Old Questions (1): Is Urea Toxic and Is it a Good Surrogate for Small Uremic Toxins?
Based on several lines of evidence, most experts in the field of uremia research traditionally have not considered urea to be a particularly toxic molecule [45]. Instead, as detailed above, the molecule's other characteristics account for its continued use as a uremic surrogate. Nevertheless, several recent reviews highlight new evidence suggesting this long-standing conventional wisdom requires reconsideration. In a very comprehensive review, Vanholder et al. [46] detailed several recent experimental (in vitro and animal) studies suggesting urea has either direct or indirect toxic effects when studied at concentrations relevant to CKD. The direct toxic effects include insulin resistance, intestinal epithelial cell dysfunction, and accelerated apoptosis, atherogenesis, and proinflammation. Studies demonstrating indirect effects focused on compounds either modified by urea (carbamylated proteins) or derived from urea in other ways (cyanate, ammonia). Vanholder et al. [46] proposed that the putative pathways activated by these direct and indirect effects unite to induce toxicity at the molecular level and promote cardiovascular disease and CKD progression. Of note, Duranton et al. [47] also have recently highlighted information suggesting a toxic role for urea in CKD.
Finally, as suggested previously, questions have been raised whether urea's kinetic behavior during dialysis is representative of the overall class of small uremic toxins. Utilizing a 2-compartment kinetic model, Eloot et al. [48] performed a comprehensive kinetic comparison of urea and several guanidino uremic toxins in patients treated with low-flux HD. In this context, the authors noted that guanidino compounds are similar to urea in several respects, including their metabolic source (proteins and amino acids), molecular weights, water solubility, and dialyzer clearances. This rigorous analysis demonstrated that, in general, the lower effective removal rates for guanidino compounds are mainly due to larger distribution volumes relative to urea. The authors concluded that urea is not a generally representative surrogate for the small uremic toxin class. As such, conclusions based on results of large prospective trials (e.g., HEMO) that small solute clearance does not significantly influence patient outcome may be confounded by the use of urea rather than another surrogate molecule.
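The compartment effect Eloot et al. describe can be illustrated with a minimal two-compartment intradialytic model. The volumes and intercompartmental clearances below are invented round numbers chosen only to contrast a rapidly equilibrating urea-like solute with a solute having a larger, slowly equilibrating distribution volume at identical dialyzer clearance; they are not fitted values from the study.

```python
def removed_fraction(k, v1, v2, kc, minutes=240, dt=0.5):
    """Two-compartment intradialytic kinetics (no generation or
    ultrafiltration): the central volume v1 is cleared by the
    dialyzer (clearance k) while kc exchanges solute with the
    peripheral volume v2. All volumes in mL, clearances in mL/min.
    Returns the fraction of initial whole-body mass removed."""
    c1 = c2 = 1.0  # start fully equilibrated
    t = 0.0
    while t < minutes:
        j = kc * (c2 - c1)            # flux into central compartment
        c1 += dt * (j - k * c1) / v1  # forward Euler steps
        c2 += dt * (-j) / v2
        t += dt
    m0 = v1 + v2                      # initial mass at c = 1
    return 1 - (c1 * v1 + c2 * v2) / m0

# Same dialyzer clearance, but a larger and more slowly equilibrating
# distribution volume sharply lowers effective removal -- the pattern
# described for several guanidino compounds relative to urea.
print(removed_fraction(200, 12000, 30000, 800),   # urea-like
      removed_fraction(200, 12000, 90000, 300))   # larger V, slower transfer
```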

Revisiting Old Questions (2): What About Inorganic Uremic Toxins and Other Aspects of Uremia Not Related to Toxin Accumulation?
As suggested above, treatment time and frequency likely played a critical role in mediating the differences in both phosphate concentration and ultrafiltration volume between the 2 groups in the Frequent Hemodialysis Network Trial. Based on the improved outcomes for patients treated with the intensive HD regimen, this trial and others raise important questions about the importance of non-urea small solute control by different treatment schedules. Indeed, some investigators believe the extent to which a particular HD regimen achieves some of the most basic goals of treatment (namely, adequate control of fluid, sodium, phosphorus, and metabolic acidosis) is the most appropriate adequacy criterion [49-54]. Thus, although survival per se has not been shown indisputably to be improved by intensive HD schedules, a consensus exists that such regimens are superior to conventional (thrice-weekly) schedules in the control of non-urea inorganic uremic toxins.
As for specific intensive HD schedules, debate continues about the relative benefits of longer treatment time versus increased frequency. While conclusive evidence does not exist, a thoughtful analysis of this question, published recently by Hakim and Saha [55], concluded the following: "The weight of current evidence, particularly when considering patient acceptance, compliance, and maintenance of vascular access, is in favor of increased dialysis time to at least 4 h per session, and preferably as nocturnal dialysis with each session length of 6-8 h, with a minimum frequency of 3 times per week."

Background
In a 1994 review, Vanholder et al. [56] defined small solutes as having a MW < 300 and proposed the MW of middle molecules to fall between 300 and 12,000 daltons, noting that these ranges are arbitrary. While a number of putative toxins falling in the latter category were discussed, Vanholder et al. [56] emphasized that the actual uremic toxicity of these compounds was largely undocumented. Falling in the latter category were several nonpeptidic molecules having kinetic behavior that resembled that of a large MW compound due to hydrophobicity, protein binding, and/or multicompartmental distribution. This original broad application of the term "middle molecules" essentially encompassed all uremic toxins beyond the small solute class [27]. However, based on the pioneering work of Vanholder et al. [3,57], the terminology has been refined over time, with middle molecules now largely defined by peptidic compounds and a separate category for protein-bound toxins (see below).
As discussed recently by Chmielewski et al. [6], the updated MW range for middle molecules is approximately 500-60,000 daltons. These investigators classified 40 different middle molecules, ranging in molecular weight from 3,100 to 55,000 daltons, into the following molecules/categories: adipokines; advanced glycation end products; appetite regulators; natriuretic peptides; complement factor D; cystatin C; endothelin-1; free immunoglobulin light chains; interleukins; parathyroid hormone and fibroblast growth factor-23; pentraxin-3; prolactin; and retinol-binding protein. These compounds were suggested to have multifactorial clinical effects in uremic patients. While the majority of these solutes have a molecular weight between approximately 10,000 and 30,000 daltons, increasing attention is being applied to larger compounds in the middle molecule class, especially as membranes and filters with extended removal capabilities have been developed. This topic has recently been discussed by Ronco and colleagues [58-61] and will be addressed in a subsequent installment in this series. Chmielewski et al. [6], nevertheless, also seemed to caution against the rigid use of molecular weight to characterize solutes in this category, arguing that some compounds in the middle molecule range may have salutary effects.

Major Trials
Hemodialysis
As suggested previously, the NCDS protocol applied the square meter-hour hypothesis to treatment prescription with the aim of achieving differing degrees of middle molecule clearance in the various arms. However, the approach used in the study was fraught with limitations. First, the study occurred at a time when the understanding of uremic toxicity, especially with respect to larger solutes, was rudimentary. This was due primarily to the limited analytical techniques available at the time for the identification of uremic toxins and their concentrations in biologic fluids. Prior to the study, which was performed a full decade before β2-microglobulin's definitive identification as a uremic toxin [62], the only putative middle molecules that had been characterized chromatographically possessed molecular weights in the approximate 500-2,000 dalton range [27]. In retrospect, most of these molecules were likely PBUTs; it is highly unlikely that the concentrations of these compounds were influenced by different NCDS treatment conditions. Second, unlike urea, these putative middle molecules could not be measured easily in clinical practice. Although clearance of vitamin B12 (based on aqueous in vitro dialyzer KoA values) was felt to be representative of clinical middle molecule clearances for the different regimens, this molecule has little relevance as a viable clinical surrogate due to its substantial protein binding in blood. On the other hand, as described above, both urea concentrations (in the original NCDS) and urea clearance (Kt/V from the subsequent mechanistic study) were analyzed to provide important insights, at least for the treatment regimens used in the trial.
Third, even though higher permeability synthetic (AN69) dialyzers had been introduced for clinical use by the mid-1970s (at least in Europe) [63], the strict use of low-flux unsubstituted cellulosic filters in the NCDS most likely precluded the attainment of any clinically significant differences in middle molecule concentrations, irrespective of treatment time. In this regard, Lindsay and Spanner [64,65] suggested the enhanced clearance of middle molecules with higher permeability dialyzers alters the relationship between normalized protein catabolic rate and urea Kt/V in a favorable way (i.e., higher slope versus unsubstituted cellulosic), further calling into question the validity of the NCDS results.
In the late 1980s and the early 1990s, 2 significant developments intensifying interest in middle molecule removal occurred. One was the market introduction of synthetic high-flux dialyzers on a widespread basis in the United States and elsewhere. The second development, identification of β2-microglobulin as the precursor molecule in the dialysis-related amyloid syndrome [62], for the first time definitively identified a larger molecular weight uremic toxin. Several retrospective studies, published between 1997 and 2001, provided suggestive evidence that increased middle molecule removal improves ESRD patient outcomes [66-69]. In these studies, survival was significantly better in patients treated with high permeability dialyzers, relative to that in patients treated with dialyzers of lower permeability. These results seemed to confirm a widespread clinical impression that high-flux dialyzers have beneficial patient outcome effects, and a widely held belief was that the HEMO Trial would provide corroboration.
As discussed above, the surrogate for middle molecule clearance in the HEMO Trial was membrane flux, with β2-microglobulin clearance as the quantified parameter. The study protocol defined a low-flux dialyzer as having a first-use β2-microglobulin clearance of < 10 mL/min and a high-flux dialyzer as one with a first-use β2-microglobulin clearance of > 20 mL/min (along with an ultrafiltration coefficient of > 14 mL/h/mm Hg). Because the study was performed before the prescription of single-use high-flux dialyzers was common, both peracetic acid-based and bleach-based reuse was employed. Corroborating data published prior to the completion of the trial, β2-microglobulin clearance significantly decreased as a function of the number of dialyzer uses under conditions of peracetic acid-based reuse [70,71]. Conversely, β2-microglobulin clearance significantly increased over time with bleach-based reuse, also corroborating prior studies [72-74]. This confounding of β2-microglobulin clearances was further complicated by the fact that most peracetic acid reuse centers prescribed cellulose triacetate dialyzers while most bleach reuse centers prescribed polysulfone dialyzers. Finally, irrespective of reuse and dialyzer type, the β2-microglobulin clearances actually achieved in the study appeared to be modest relative to other published trials in which similar types of dialyzers were used [75,76]. While no significant effect of membrane flux overall on the primary outcome was reported in the HEMO Trial, the above confounding factors rendered interpretation of the results difficult.
A prespecified secondary analysis of the HEMO study provided more detailed information regarding the effect of flux on outcome. Cheung et al. [77] reported a nonsignificant 8% reduction in all-cause mortality in patients treated with high-flux dialyzers, relative to those treated with low-flux dialyzers. However, in the subgroup of patients treated with dialysis for > 3.7 years (the mean vintage of patients in the trial), all-cause mortality was significantly lower in the high-flux group (Fig. 4). One interpretation of this finding was that longer vintage (and the associated loss of residual renal function [RRF] in most patients) "unmasked" the beneficial effect of dialytic middle molecule removal. One of the concluding statements made by the authors was: "This study also does not exclude the possibility that greater benefits could be accrued from other modalities that are associated with higher clearances of β2-microglobulin." Another secondary analysis of the HEMO Study [78] involved an assessment of the potential association between patient outcomes and both serum β2-microglobulin clearance and concentration. In the 1,704 patients analyzed, multivariate regression analysis revealed that both baseline (urea) residual renal clearance and dialyzer β2-microglobulin clearance were, unsurprisingly, significant predictors of predialysis β2-microglobulin concentration, as was dialysis vintage. Mean cumulative predialysis β2-microglobulin concentration (but not clearance) was significantly associated with all-cause mortality: a 10 mg/L increase in concentration was associated with an 11% increase in mortality risk. Additional HEMO secondary analyses suggested benefits for high-flux HD with respect to infectious mortality [79] and in patients with cerebrovascular disease [80]. As discussed below, these data demonstrate the critical influence of RRF on middle molecule concentrations in ESRD.
The Membrane Permeability Outcome (MPO) Trial [81], performed in 59 European centers, randomized 738 incident patients (stratified by serum albumin concentration) to low-flux or high-flux hemodialysis. After randomization, the mean follow-up period was 3.0 years, during which the gross mortality rate was only 8%. When all patients were considered, no significant effect of membrane flux was found. However, a survival benefit was reported for high-flux patients with serum albumin concentrations < 4.0 g/dL, the pre-specified value used for stratification. Moreover, survival in diabetic patients was also significantly improved by high-flux hemodialysis, although this finding was based on a post hoc analysis. Finally, a significant interaction between serum albumin concentration and diabetes was observed, suggesting the benefit of high-flux hemodialysis in this group may have been mediated through an effect on serum albumin.
It is difficult to make direct comparisons of the HEMO and MPO Trials. One key difference between the 2 studies was the routine practice of dialyzer reuse in HEMO versus exclusively single-use dialyzers in MPO. As noted above, reuse practices in HEMO had substantial effects on membrane permeability over the life of an individual dialyzer. Differences between the 2 patient populations in treatment time, fluid management, and blood pressure control were also reported. Thus, any attempt to isolate the effect of membrane permeability itself across the 2 trials is confounded in several ways.
It should be noted that the overall negative HEMO and MPO results did not diminish the increasing clinical use of high-flux dialyzers after these studies, especially in the United States, where single-use treatments grew rapidly in popularity. Moreover, even long before the negative HEMO findings were reported, many in the dialysis community had already concluded that only convective therapies (specifically hemodiafiltration [HDF]) could provide sufficient middle molecule clearances to improve patient outcome (relative to diffusive therapies).
Hemodiafiltration
A considerable volume of literature pertaining to online HDF (OLHDF) has been produced over the past several years, and the clinical benefits of this modality continue to be debated [82][83][84][85]. The intent of this section is not to provide a comprehensive review of major studies in which OLHDF has been compared to other modalities. Instead, only selected points of emphasis are made. For more comprehensive papers on this topic, the reader can access several recent excellent reviews, especially a concise but thorough analysis by Canaud et al. [85].
It is worthwhile to comment briefly on OLHDF's precursor, hemofiltration, as the development of higher-permeability membranes also enabled the clinical application of this modality. In a seminal 1975 report, Henderson et al. [86] described their experience with this therapy. Other investigators, notably Quellhorst et al. [87] and Baldamus et al. [88], also reported their clinical experience with hemofiltration over the subsequent few years. Relative to hemodialysis, the major clinical advantages of hemofiltration were consistently reported to be enhanced middle molecule removal [89] and hemodynamic stability [90]. However, concerns were raised about inadequate small solute clearances in hemofiltration due to the relative inefficiency of convection as a removal mechanism for such compounds [91]. It is interesting to note that these concerns coincided with an increasing focus on urea clearance and Kt/V due to the NCDS results. Consequently, the origin of HDF can really be traced to a recognition of the need to supplement convective mass transfer with a diffusive element for adequate therapy delivery [92].
The first large prospective trial incorporating HDF was the Italian Cooperative Dialysis Study, which enrolled patients in 1991/1992 [93]. In the arm of the study in which flux was the key parameter, approximately 200 patients were randomly assigned to 1 of the following 4 groups: HD with a low-flux unsubstituted cellulosic dialyzer; HD with a low-flux polysulfone dialyzer; HD with a high-flux polysulfone dialyzer; and post-dilution "soft" HDF (8-12 L per session). Over a 24-month follow-up period, parameters related to treatment tolerance and nutrition did not differ among the 4 groups. However, a significant decrease in pretreatment serum β2-microglobulin concentration from baseline was reported in the high-flux HD and HDF groups (versus no significant change over time in the low-flux HD groups).
With respect to the effect of OLHDF on patient outcomes, 4 randomized controlled trials have now been published [94][95][96][97]. As summarized by Canaud et al. [85], the primary analyses of these trials have not consistently demonstrated mortality improvements for OLHDF relative to the control treatments. However, subsequent analyses of the data from these studies strongly suggest convective dose (volume) is a critical determinant, with aggregated data suggesting a convective volume cut point of 23 L/1.73 m² of body surface area per session [98]. With respect to uremic toxin removal, Canaud et al. [85] also cited findings, both from some of the aforementioned randomized controlled trials and from observational studies, that appear to validate the superior large molecular weight solute removal capabilities of HDF.

Middle Molecules: Additional Perspectives
While both high-flux hemodialysis and HDF can achieve clinically significant middle molecule removal, an important consideration is the fundamental kinetic behavior of such toxins, especially when these modalities are prescribed with conventional duration and frequency. Due to their relatively large molecular weights, peptide and protein removal rates during diffusion-based hemodialysis are limited [99]. In fact, such molecules are primarily eliminated by internal filtration (a convective mechanism) during high-flux hemodialysis sessions having standard treatment times (4 h or less) [100]. On the other hand, the slow extracorporeal diffusivity of large toxins can be counteracted by extending treatment time, as demonstrated in early studies of nocturnal hemodialysis by Pierratos et al. [101,102].
Another factor constraining large molecule removal during high efficiency/short duration therapies (both high-flux HD and OLHDF) is related to slow "intracorporeal" mass transfer between different body compartments [103][104][105]. As demonstrated by Clark et al. [103], high extracorporeal clearance rates do not necessarily translate into effective solute removal from the body for these compounds because of the difference between extracorporeal and intracorporeal mass transfer rates, the latter of which are rate limiting. On the other hand, the use of lower efficiency/longer-duration therapies narrows this difference between the respective mass transfer rates, allowing for more effective large solute removal.
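The interplay between extracorporeal and intracorporeal mass transfer rates can be illustrated with a minimal two-compartment simulation. This is a sketch only: the compartment volumes, intercompartmental clearance, and dialyzer clearances below are assumed values chosen for illustration, not parameters taken from the cited modeling studies.

```python
# Minimal two-compartment sketch of middle molecule removal during one treatment.
# All parameter values are illustrative assumptions, not values from the cited studies.

def simulate_removal(k_dialyzer, k_ic=50.0, v1=3000.0, v2=12000.0,
                     c0=1.0, t_end=240.0, dt=0.5):
    """Euler integration of a two-compartment model over a 4-h session.

    k_dialyzer : extracorporeal clearance acting on compartment 1 (mL/min)
    k_ic       : intercompartmental clearance (mL/min), the rate-limiting step
    v1, v2     : perfused and remote compartment volumes (mL)
    Returns the fraction of the initial solute mass removed.
    """
    c1, c2 = c0, c0
    m0 = c0 * (v1 + v2)
    t = 0.0
    while t < t_end:
        dc1 = (k_ic * (c2 - c1) - k_dialyzer * c1) / v1  # dialyzer sees compartment 1 only
        dc2 = k_ic * (c1 - c2) / v2                      # remote pool refills compartment 1
        c1 += dc1 * dt
        c2 += dc2 * dt
        t += dt
    return 1.0 - (c1 * v1 + c2 * v2) / m0

# Doubling the dialyzer clearance yields only a modest gain in whole-body removal,
# because transfer into the perfused compartment is rate limiting.
low = simulate_removal(k_dialyzer=80.0)
high = simulate_removal(k_dialyzer=160.0)
print(f"fraction removed: {low:.2f} vs {high:.2f}")
```

With these assumptions, doubling the dialyzer clearance increases removal during a single session only modestly, mirroring the point that high extracorporeal clearances do not necessarily translate into effective whole-body removal.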
A final consideration is the importance of RRF or other nondialytic sources of middle molecule elimination [106,107]. While obviously influencing the removal of nearly all substances eliminated by the kidney, RRF has a particularly substantial impact on serum middle molecule concentrations due to the powerful effect of continuous clearance [5,108]. In the aforementioned modeling work performed by Clark et al. [103], although simulated patients were anephric, a β2-microglobulin nonrenal clearance of 3 mL/min was assumed; for the purpose of this discussion, this can be considered analogous to the continuous clearance provided by RRF. In that study, effective (continuous-equivalent) β2-microglobulin removal, estimated by equivalent renal clearance (EKR; mL/min), in a conventional thrice-weekly (4 h duration) schedule (regimen A) was compared to a short daily regimen (B) and to long-duration (8 h) treatments with frequencies of 3, 5, and 7 per week (regimens C, D, and E, respectively). For the baseline regimen A, a continuous nondialytic clearance of only 3 mL/min accounted for approximately two-thirds of the effective β2-microglobulin removal per week. Moreover, even for regimen E (56 h of treatment per week), this nonrenal clearance still accounted for nearly 30% of total β2-microglobulin removal. These data can be applied to the analogous situation of middle molecule removal by RRF, highlighting the relative importance of this mechanism in the elimination of large uremic toxins.
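The dominance of continuous clearance in this setting can be appreciated from a simple clearance-time product calculation. This is a rough sketch only: the effective dialytic β2-microglobulin clearance of 30 mL/min is an assumed value, not one reported in the cited work, and the clearance-time product ignores changes in concentration over the week.

```python
# Back-of-the-envelope comparison of continuous versus intermittent clearance,
# using the clearance-time product as a crude index of weekly solute removal.
# The effective dialytic beta2-microglobulin clearance of 30 mL/min is an
# illustrative assumption, not a value taken from the cited studies.

MIN_PER_WEEK = 7 * 24 * 60   # continuous clearance acts over all 10,080 min/week
K_CONTINUOUS = 3.0           # mL/min, nonrenal (or residual renal) clearance
K_DIALYTIC = 30.0            # mL/min, assumed effective beta2-m dialytic clearance
SESSIONS = 3                 # conventional thrice-weekly schedule
SESSION_MIN = 240            # 4-h treatments

continuous = K_CONTINUOUS * MIN_PER_WEEK
dialytic = K_DIALYTIC * SESSIONS * SESSION_MIN
fraction = continuous / (continuous + dialytic)
print(f"continuous clearance share of weekly removal: {fraction:.0%}")
```

Even with these crude assumptions, a continuous clearance of only 3 mL/min accounts for well over half of the weekly clearance-time product, qualitatively consistent with the modeling results described above.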

Protein-Bound Uremic Toxins
PBUTs are the most recently identified class of uremic toxins, and an understanding of the mechanisms by which such toxins are eliminated by the kidney and the role they play in uremic pathophysiology is still being gained [109]. Moreover, this toxin class has not been the subject of the same type of large clinical outcome studies that have been performed to assess the other 2 uremic toxin classes. Thus, for this section, the focus will be on the chemical aspects of PBUT protein binding in the blood along with renal mechanisms of PBUT elimination. Comprehensive reviews on this class of uremic toxins, both with respect to their putative pathophysiology and their kinetic behavior during dialysis, can be found elsewhere. Moreover, dialytic removal of PBUTs will be addressed in this series' final installment concerning future directions of dialytic therapy.

Important PBUTs and their Putative Toxicity
Although investigators have studied a host of PBUTs [110], produced by different mechanisms and possessing varying chemical characteristics, it is possible to highlight several common features, some of which are discussed further below. First, for many PBUTs, the first step in their production involves gut metabolism. Using metabolomic techniques, Mair et al. [14] identified 33 different uremic compounds of colonic origin, including indoxyl sulfate (IS), p-cresol sulfate, and indole acetate. Second, while PBUTs have large effective molecular weights (MWs) from a kinetic perspective (in the bound form), they have relatively low intrinsic MWs (< 500 Da) [111]. Third, most PBUTs in the unbound form are anionic compounds having a ringed structure [13], including IS, p-cresol sulfate, indole acetate, and hippurate. It is worthwhile to note that some identified PBUTs are also cationic, including some guanidino compounds [112]. Fourth, uremic toxins bind primarily to albumin in plasma and compete with other endogenous and exogenous compounds for binding sites [113]. Moreover, uremia-induced posttranslational modification of albumin may also influence toxin binding capacity [114]. For a given PBUT, while total plasma concentration is derived from both bound and unbound components, uremic toxicity is mediated only through the latter. Finally, due to their large effective MWs, PBUTs do not undergo glomerular filtration; instead, renal tubular secretion is felt to be responsible for their elimination (see below) [9].
Based on numerous investigations over the past several years, a consensus has developed that PBUTs play an important pathophysiologic role in uremia. These studies have led to a construct whose foundation is increased free plasma concentrations of PBUTs, resulting primarily from solute retention, saturation of protein binding sites, and altered binding. While very low under conditions of normal kidney function, these free plasma concentrations rise in kidney failure and provide greater substrate for membrane-bound transporters of cells comprising many different organ systems, including the heart, vasculature, bone, brain, and kidneys [10]. (See below for more specific details about protein binding and cellular transport.) Progression of CKD and its other complications has been associated with oxidative stress produced by intracellular uremic toxins, which leads to inflammation and tissue injury [115].

PBUT Binding in Plasma
In human plasma, albumin constitutes more than 50% of the total plasma protein by mass and can bind a variety of endogenous and exogenous ligands [116]. The fundamental binding process can occur in several ways, including charge-based mechanisms along with hydrophobic (dipole-dipole) or hydrophilic (acid-base) interactions [117]. Irrespective of the mechanism, a chemical equilibrium between bound and free states is created [118]. Most compounds bind primarily to albumin via 2 domains that are classified as Sudlow's binding sites [119]. The prototypical molecules binding to Sudlow's sites I and II are heterocyclic compounds and aromatic carboxylates, respectively: warfarin is a representative drug binding to Sudlow's site I, while diazepam is a prototypical Sudlow site II drug. As discussed below, the defining chemical characteristics of warfarin, including its weakly acidic nature (pKa 5.8) along with its high degree of protein binding and the negative charge of the unbound drug at physiologic pH, are similar to those of many PBUTs. On the other hand, some drugs, including furosemide and ibuprofen, appear to be capable of binding to both sites. Indeed, an extensive body of literature describes albumin's critical role in the binding of poorly soluble drugs of many types.
Two other aspects of albumin binding are worth noting. First, several drug studies have demonstrated that binding of some compounds at sites distinct from a drug's primary binding site causes allosteric modification of albumin through conformational changes that influence binding at the primary site [120,121]. The simultaneous binding of sulfisoxazole and heme provides an example of this phenomenon [122]. While albumin binds heme in a domain ("cleft") that is spatially distinct from the Sudlow I site, the albumin-heme interaction can allosterically modulate binding of drugs to this site. For example, in vitro experiments have demonstrated a substantial reduction in both the rate and capacity of sulfisoxazole binding to albumin in the presence of heme. While the implications of allosteric modulation for PBUTs are much less clear at present, it is reasonable to assume this mechanism, along with direct competitive effects, is an important consideration.
Uremia-induced changes in the concentration and structure of albumin in the plasma may also influence PBUT binding from both a quantitative and a qualitative perspective. First, hypoalbuminemia is a very common occurrence in CKD, and multiple factors may be contributory, including inflammation, malnutrition, and dialytic losses [123]. The lower number of total binding sites in the plasma caused by hypoalbuminemia may intensify competitive binding and lead to higher free concentrations of normally bound solutes [124]. Second, multiple investigations have demonstrated that uremia is associated with post-translational modifications of albumin, including oxidation, glycation, and carbamylation [125]. These modifications may alter the chemical nature of binding sites, resulting in impaired binding and higher free solute concentrations. Klammt et al. [126] have reported a significant inverse relationship between albumin binding capacity (measured by a fluorescence-based technique specific to Sudlow site II) and both CKD stage and the plasma concentration of IS. Likewise, a daily urine output of < 500 mL was reported to be associated with lower albumin binding capacity. The findings of Klammt et al. [126] have recently been corroborated by Deltombe et al. [114].
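The equilibrium between bound and free solute, and the effect of hypoalbuminemia on the free fraction, can be sketched with a single-site law-of-mass-action model. The dissociation constant and concentrations below are assumed, illustrative values, not measured parameters for any specific PBUT; one binding site per albumin molecule is also assumed.

```python
# Sketch of the bound/free equilibrium for a protein-bound solute with a single
# albumin binding site, solved from the law of mass action:
#   T = F + B,  B = B_max * F / (Kd + F)  =>  F^2 + (Kd + B_max - T)*F - T*Kd = 0
# Dissociation constant and concentrations are illustrative assumptions only.
import math

def free_concentration(total_toxin_uM, albumin_uM, kd_uM=10.0):
    """Return the free (unbound) solute concentration F in micromolar."""
    b = kd_uM + albumin_uM - total_toxin_uM
    return (-b + math.sqrt(b * b + 4.0 * total_toxin_uM * kd_uM)) / 2.0

TOTAL = 100.0                                           # µM total toxin, assumed
normal = free_concentration(TOTAL, albumin_uM=600.0)    # ~4.0 g/dL albumin
low_alb = free_concentration(TOTAL, albumin_uM=375.0)   # ~2.5 g/dL albumin

print(f"free fraction, normal albumin:  {normal / TOTAL:.1%}")
print(f"free fraction, hypoalbuminemia: {low_alb / TOTAL:.1%}")
```

With these assumptions, reducing the albumin concentration from 4.0 to 2.5 g/dL raises the free fraction substantially, consistent with the qualitative argument that fewer binding sites lead to higher free concentrations of normally bound solutes.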

PBUT Elimination from the Body
Fundamental studies have conclusively demonstrated that the final common pathway for elimination of protein-bound compounds, including drugs and endogenous toxins, is secretion by renal proximal tubule cells (Fig. 5) [127,128]. Most of these studies have assessed protein-bound drugs, with findings adapted in many instances to the understanding of PBUT elimination mechanisms. As is the case for cellular transport in other organs, the first step in this process is transport of unbound toxins from the blood across a cellular membrane to the intracellular space. For both nonrenal cells and proximal tubular cells, specific organic anion transporters (OATs) drive this process. Because only unbound solutes (drugs, uremic toxins, etc.) are transported by this mechanism, the equilibrium established between bound and unbound solute forms is critical.
Specifically in the kidney, unbound uremic toxins are taken up from the peritubular capillary into the proximal tubule cell by a specific basolateral OAT, which may also play a role in the initial dissociation of a bound solute from its binding protein [13,129]. In this step, despite an unfavorable electrochemical gradient, the unbound solutes are transported in exchange for intracellular dicarboxylates. Subsequent transport from the proximal tubular cell to the tubule lumen is achieved by a second membrane-based protein (MATE1) residing in the apical membrane [130]. Since the interior of the proximal tubular cell is at a lower electrical potential than the lumen, this pathway is potential driven and energetically favorable [131]. Additionally, cell-to-lumen efflux can occur via an electrically neutral anion/anion exchanger at the apical membrane of the proximal tubule [132]. In addition to the organic anion transporter system, renal tubule cells are also equipped with organic cation transporters [133].

PBUT Elimination for Different Dialysis Modalities
As mentioned previously, large prospective trials have not assessed the effect of different renal replacement strategies on PBUT removal with respect to patient outcomes. Nevertheless, numerous studies over the past decade have evaluated the influence of changes in various dialytic parameters on PBUT removal. The dialytic parameters evaluated include treatment mode (diffusion versus convection), treatment time, and blood/dialysate flow rates, along with membrane surface area. In addition, adsorption-based techniques and interventions coupling displacement of PBUTs by competitive binding inhibitors with dialysis have been explored. The more promising of these approaches, along with other novel techniques, will be discussed further in this series' final installment addressing future directions of dialytic therapy.

Fig. 5. Production, transport, and elimination of the uremic toxin indoxyl sulfate (IS) in the body. First, indole is produced through proteolysis of dietary proteins by bacteria in the gut. It is subsequently transported across the gut wall to enter the (primarily enterohepatic) circulation and subsequently the liver. In the liver, indoxyl is produced via hydroxylation of indole, and finally indoxyl is converted to IS. This unbound toxin is secreted into the systemic circulation, in which it competes with other free solutes such as drugs (e.g., warfarin and diazepam) for binding primarily to 1 of 2 sites (Sudlow's sites I and II) of albumin. This leads to a dynamic equilibrium between bound and unbound forms of these solutes. Both free and protein-bound forms are transported to the kidney, at which the unbound solutes are secreted from the peritubular capillary.

Blood Purif 2019;48:299-314 DOI: 10.1159/000502331

Conclusions and Questions
The goal of this work has been to provide an overview of uremic toxicity as a prelude to subsequent reviews regarding the current state of the art for extracorporeal treatment of ESRD. The discussion has focused largely on prospective clinical trials, even with the knowledge that data from "real world" clinical practice are also very valuable and may vary considerably across centers [134]. While significant progress over the past 2 decades has occurred, elucidation of the retention solutes responsible for the uremic syndrome remains a daunting task and many questions remain:
1. Has urea made a "comeback," i.e., is it a toxic uremic compound, a surrogate for small solutes, neither, or both?
2. Will urea continue to be used extensively in future clinical practice to guide the prescription and delivery of dialysis?
3. Are non-urea small uremic toxins more representative of the toxicity of the overall solute class than urea?
4. Does β2-microglobulin deserve its status as the preeminent middle molecule, or do more clinically relevant surrogates exist?
5. Is the evidence compelling that OLHDF significantly increases effective middle molecule removal (relative to high-flux HD)?
6. What further evidence demonstrating patient outcome benefits for OLHDF over high-flux HD is needed?
7. What role does the removal of organic small solutes, inorganic small solutes, and middle molecules play in the apparent clinical benefits provided by intensive hemodialysis regimens?
8. Will the need for improved removal of PBUTs influence the innovation evolution for dialysis therapy in the future?
Disclosure Statement