The outcomes of adult living donor liver transplantation (ALDLT) at experienced centers are equivalent to those of deceased donor liver transplantation in similar patient populations.1 Because of the organ shortage, most transplant centers are confronted with the need to perform transplantation for patients on the waiting list before they die or become too ill. Ultimately, it is the combined mortality of patients on the waiting list and posttransplant mortality that determines the efficacy of liver replacement therapy. With the current ALDLT crisis in Western countries, it is imperative to develop reliable strategies that lead to successful outcomes and, at the same time, reduce the risks for living donors. On the basis of donor morbidity data for ALDLT from American and Asian surveys (1957 patients), we can calculate a lower morbidity rate for left lobe donors versus right lobe donors (11.5% versus 21%).2, 3 Data from the European Liver Transplant Registry show similar results.4

The widespread use of left lobes has been limited because they are smaller and are potentially insufficient for the metabolic demands of recipients. Currently, a donor liver is considered a small-for-size graft (SFSG) when the graft-to-recipient weight ratio (GRWR) is less than 0.8% or the ratio of the graft volume to the standard liver volume is less than 40%. Nevertheless, the graft size is only one of the factors required for successful ALDLT. Because of the vast potential pool of living donors and the lower risk associated with left liver donation, a strategy for transplanting SFSGs safely into adult recipients could effectively address the donor shortage and potentially reduce mortality on the waiting list.

Despite the statistically significant negative impact of SFSGs on graft prognosis and patient outcomes, success with these grafts was observed in early reports. In 2001, Nishizaki et al.5 recorded no graft losses in 5 patients receiving SFSGs with graft volume to standard liver volume ratios of 26% to 29%. Likewise, in 2003, Kiuchi et al.6 reported 4 successful instances in which SFSGs were used in recipients with various pretransplant statuses; the lowest GRWR was 0.59%. These findings suggest that although recipients of SFSGs are at higher risk for worse outcomes, the small size of the grafts is not always solely responsible.7 Factors other than the graft volume can potentially influence the outcome; these include recipient-related factors (disease clinical status and portal hypertension), graft-related factors (donor age, steatosis, cold and warm ischemia times, ischemia/reperfusion injury, and immunological factors), and technical factors (vascular reconstruction and adequate outflow, vascular inflow, and pressure gradients). All these factors contribute to the concept of functional graft size, and even a larger graft can fail when multiple risk factors are present.

During the last decade, controversial results have been reported for outcomes with SFSGs. The lowest 1-year graft survival rates have ranged from 33% to 65%,8-15 and the highest rates have ranged from 75% to 100%.16-24 Graft failure due to the use of SFSGs has been described as small-for-size syndrome (SFSS). The reported incidence of SFSS has varied greatly, ranging from 13.6% to 100%.
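For clarity, the size criteria cited above can be written explicitly. In the sketch below, the standard liver volume is estimated with the commonly used Urata formula; the worked figures (a 450 g graft, a 70 kg recipient) are illustrative assumptions, not data from any study discussed here:

\[ \mathrm{GRWR} = \frac{\text{graft weight (g)}}{\text{recipient body weight (g)}} \times 100\% < 0.8\%, \qquad \frac{\mathrm{GV}}{\mathrm{SLV}} < 40\% \]

\[ \mathrm{SLV\ (mL)} = 706.2 \times \mathrm{BSA\ (m^2)} + 2.4 \]

For example, a 450 g graft in a 70 kg recipient gives \( \mathrm{GRWR} = 450/70{,}000 \times 100\% \approx 0.64\% \), which falls below the 0.8% threshold and would therefore be classified as an SFSG.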
These discrepancies may be adequately explained by the different criteria used to define SFSS and by the lack of uniformity in reporting recipient-related, graft-related, and technical factors.

Experimental studies have shown that portal hyperperfusion is detrimental, especially for SFSGs.25-32 This excessive portal inflow is thought to be a primary factor in the dysfunction and failure of SFSGs. The reduction of the intrahepatic vascular bed results in higher portal flow per 100 g of liver tissue, a rise in the portal pressure, and stress at the hepatic sinusoid.10, 11, 21, 33, 34 Reduced hepatic artery flow after increased portal flow has also been described.35, 36 This imbalance, which has been observed with all graft types, is markedly increased in partial grafts.37 To reduce the risk from these stress factors, different graft inflow modulation (GIM) techniques are currently being used.12, 21, 38-42 When SFSGs are used without any GIM technique, the price in graft survival can be steep, with reported rates ranging from 38% to 100%.10, 12, 17, 18, 20, 21, 23, 43, 44

In this issue of Liver Transplantation, Ishizaki et al.45 present their successful single-center experience with ALDLT and left livers, allegedly without GIM. This raises the question of whether GIM is necessary. The authors describe 17 patients (40% of their study population) with SFSGs; for only 3 of these patients (18%) was the Model for End-Stage Liver Disease (MELD) score greater than 20. The highest reported flow (833 mL/minute/100 g of liver) represents a value that puts the graft at risk. Because the flow is indexed by the graft weight, this value probably corresponds to recipients of the smallest grafts; 4 of these patients underwent GIM by means of splenic artery ligation (SAL). Regardless of the indication, SAL is a GIM technique that potentially influences the outcome. Unfortunately, no data are provided about flow or gradient measurements before and after SAL to resolve this doubt. Also, the fact that SAL patients did not show any significant differences in pressure gradients in comparison with non-SAL patients merits further discussion: the gradients and the pressures were 10 mm Hg higher in SAL patients (as could be expected in patients with marked splenomegaly). Therefore, an effect on the outcome due to a reduction of hyperflow and/or portal hypertension cannot be ruled out unless flows and gradients are measured before and after SAL. Similarly, the percentage of patients with intractable ascites was twice as high as the percentage of patients with severe portal hypertension, and the lack of statistical significance could be attributed to the limited patient population. Fortunately, all the patients recovered. As previously discussed, disease severity can worsen the results with SFSGs, which are less likely to fail in patients with low MELD scores.46, 47

Do we need GIM when we are transplanting SFSGs? Good results have indeed been described without the use of GIM. In addition to technical and graft-related factors, a patient's metabolic requirements due to the disease stage, the presence of portal hypertension, and the hyperdynamic status play major roles in the stress that an SFSG can tolerate. Accordingly, the size of the graft is not the only factor determining the outcome, and the functional size has to be taken into account. If hemodynamic stress is identified, GIM could be applied to decrease it.
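To see why weight-indexed flow singles out the smallest grafts, consider a back-of-the-envelope illustration; the total portal flow and graft weights below are assumptions for the example, not figures from the study:

\[ \text{indexed flow} = \frac{\text{total portal flow (mL/min)}}{\text{graft weight (g)}} \times 100 \]

With an assumed total portal flow of 2500 mL/min, a 300 g graft receives \( 2500/300 \times 100 \approx 833 \) mL/minute/100 g of liver, whereas an 800 g graft receives only \( 2500/800 \times 100 \approx 313 \) mL/minute/100 g. The same systemic inflow thus imposes a far greater per-gram load on the smallest grafts.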
Nevertheless, GIM does not guarantee adequate regeneration and function in the sickest patients. New combined strategies of pharmacological flow and pressure gradient modulation, together with pharmacological protection against ischemia/reperfusion injury, could be extremely important for the regeneration and function of SFSGs in sick patients.48-50 Such approaches could increase the use of left liver grafts, lead to the expected good outcomes, and reduce living donor morbidity.
Introduction: Tacrolimus-induced optic neuropathy (TION) is a rare condition seen in transplant patients that leads to severe vision loss caused by damage to the optic pathway. The underlying pathophysiology is thought to be a combination of ischemic damage due to vasoconstriction of the cerebral microvasculature and direct neurotoxicity. Method: We describe the case of a 51-year-old male, a combined multivisceral and renal transplant recipient, who developed severe bilateral TION 3 years after transplantation. Furthermore, a literature search was performed for all published cases describing TION after organ transplantation. Results: Optic tract inflammation was clearly detected on MRI (Figure). Treatment with intravenous corticosteroids and immunoglobulins was started. Tacrolimus was reduced but not withdrawn completely to avoid rejection, especially of the intestinal component of the graft. Everolimus was added to maintain sufficient immunosuppression. After three months, vision had recovered completely. The patient experienced no signs of rejection in any transplanted organ during this period, and organ function remained stable. Seven other reports in recipients of various organs were found in the literature (Table). In most, tacrolimus was discontinued completely, and outcomes were poor. Conclusion: Our report demonstrates the importance of swift treatment to reverse optic tract inflammation and highlights the possibility of adding everolimus to the immunosuppressive regimen to allow safe reduction of tacrolimus exposure in intestinal transplant patients. By contrast, results from the literature show sporadic use of anti-inflammatory medication and poor long-term vision outcomes, often related to delayed diagnosis and treatment (Table).
P832 Aims: The role of antifungal prophylaxis in liver transplantation (OLTx) is accepted because of the high risk of fungal infections due to Candida spp. in OLTx recipients. However, the extensive use of fluconazole, still considered the gold standard drug, can lead to microbiological drug resistance or a shift to non-albicans Candida species. Low drug exposure, in the sense of both an individual patient's underexposure and population-level overuse, is one of the principal factors in the genesis of fungal resistance or a shift to fluconazole-resistant strains of Candida spp. Considering that the pharmacodynamic characteristics of the azoles determine their time-dependent action, optimizing the use of fluconazole requires maintaining a pre-dose plasma concentration of the drug (Cmin) that is always above the MIC of susceptible strains. Methods: All OLTx recipients receiving prophylaxis with fluconazole at a dosage of 200-400 mg/day underwent therapeutic drug monitoring (TDM) of plasma concentrations from post-treatment day 2 onward, with the aim of obtaining Cmin levels ≥ 8 mg/L (the in vitro susceptibility breakpoint for Candida spp.). HPLC was used for the determination of Cmin levels. We retrospectively analyzed the exposure levels obtained by fluconazole dose and by renal function. Results: 14 patients were studied: 9 with normal renal function (group 1) (median CLCr 0.52-0.76 mL/min/kg) and 5 requiring CVVH (group 2) (median CLCr 0.40-0.56 mL/min/kg). In group 1, the optimal dose of fluconazole (Cmin range 9.33-12.48 mg/L) during prophylaxis (range 5-24 days) was 4.07-5.01 mg/kg during the first 48-72 hours and 1.76-3.45 mg/kg subsequently. In group 2, similar average doses of fluconazole (3.54-4.25 mg/kg during the first 48-72 hours and 2.62-3.90 mg/kg thereafter) did not maintain a stable plasma concentration range (Cmin 5.10-12.54 mg/L) during the study period (6-14 days). Conclusions: Our preliminary data, although limited to a small group of patients, suggest that an initial dose of 400 mg of fluconazole, followed by 200 mg/day, could be adequate to prevent Candida spp. infection in patients with normal renal function after OLTx. Moreover, in epidemiological settings with a low incidence of resistant or dose-dependent non-albicans species, it does not seem useful to administer higher dosages. In patients with impaired renal function requiring CVVH, there was marked variability in fluconazole plasma concentrations, and a higher dosage of fluconazole is needed. These data are of particular clinical interest because CVVH is frequently equated with hemodialysis, with a consequent tendency to reduce the drug dosage, whereas in this condition an increase in the dose is almost always necessary.
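As a rough check of how the proposed fixed regimen maps onto the weight-based optima reported above (the 80 kg body weight is an assumed example, not a patient from this series):

\[ \frac{400\ \mathrm{mg}}{80\ \mathrm{kg}} = 5.0\ \mathrm{mg/kg} \quad \text{(first 48-72 hours; reported optimum 4.07-5.01 mg/kg)} \]

\[ \frac{200\ \mathrm{mg}}{80\ \mathrm{kg}} = 2.5\ \mathrm{mg/kg} \quad \text{(subsequent days; reported optimum 1.76-3.45 mg/kg)} \]

Both values fall within the optimal ranges observed in group 1, which is consistent with the authors' conclusion for patients with normal renal function; heavier or lighter patients would drift toward the edges of these ranges.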