Abstract Background: Noncontrast cardiac T1 times are increased in dialysis patients, which might indicate fibrotic alterations in uremic cardiomyopathy. Purpose: To explore the application of texture analysis (TA) of T1 images in the assessment of myocardial alterations in dialysis patients. Study Type: Case–control study. Population: A total of 117 subjects, including 22 on hemodialysis, 44 on peritoneal dialysis, and 51 healthy controls. Field Strength: 3 T; steady-state free precession (SSFP) sequence; modified Look–Locker imaging (MOLLI). Assessment: Two independent, blinded researchers manually delineated endocardial and epicardial borders of the left ventricle (LV) on midventricular T1 maps for TA. Statistical Tests: Texture feature selection was performed, incorporating reproducibility verification, machine learning, and collinearity analysis. Multivariate linear regressions were performed to examine the independent associations between the selected texture features and left ventricular function in dialysis patients. The discriminative performance of the texture features was evaluated by sensitivity and specificity. Reproducibility was estimated by the intraclass correlation coefficient (ICC). Results: Dialysis patients had greater T1 values than healthy controls (P < 0.05). Five texture features were retained after feature selection, and four showed a statistically significant difference between dialysis patients and healthy controls. Among the four, vertical run-length nonuniformity (VRLN) showed the largest difference between the control and dialysis groups (144 ± 40 vs. 257 ± 74, P < 0.05), with far less overlap between groups than global T1 times (1268 ± 38 vs. 1308 ± 46 msec, P < 0.05). At a cutoff of 170, elevated VRLN identified dialysis patients with a specificity of 97% and a sensitivity of 88%, compared with T1 times (specificity = 76%, sensitivity = 60%). In dialysis patients, VRLN was significantly and independently associated with left ventricular ejection fraction (P < 0.05), global longitudinal strain (P < 0.05), radial strain (P < 0.05), and circumferential strain (P < 0.05), whereas T1 was not. Data Conclusion: VRLN, a texture feature obtained by TA of T1 images, may be a better parameter than T1 times for assessing myocardial alterations. Level of Evidence: 4. Technical Efficacy: Stage 3.
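As an illustration of the threshold-based discrimination reported above (a VRLN cutoff of 170 separating dialysis patients from controls), the following is a minimal sketch of how sensitivity and specificity can be computed at a fixed cutoff. The VRLN values, labels, and function name are hypothetical and are not drawn from the study data.

```python
# Hedged sketch: sensitivity/specificity of a single-feature cutoff
# (e.g., VRLN >= 170 classifies a subject as a dialysis patient).
# The values below are made up for illustration only.

def sensitivity_specificity(values, labels, cutoff):
    """labels: 1 = dialysis patient, 0 = healthy control."""
    tp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 1)
    fn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 1)
    tn = sum(1 for v, y in zip(values, labels) if v < cutoff and y == 0)
    fp = sum(1 for v, y in zip(values, labels) if v >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

vrln   = [120, 150, 165, 180, 230, 260, 290, 140, 310, 200]  # hypothetical VRLN values
labels = [0,   0,   0,   1,   1,   1,   1,   0,   1,   1]    # hypothetical group labels
sens, spec = sensitivity_specificity(vrln, labels, cutoff=170)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```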
Chronic kidney disease (CKD) is a common complication after liver transplantation and is traditionally considered to be secondary to calcineurin inhibitors (CNIs). However, several studies have reported that the etiology of CKD after liver transplantation is broad and may only be assessed accurately by renal biopsy. The current study aimed to explore the usefulness of renal biopsies in managing CKD after liver transplantation in daily clinical practice. This retrospective analysis enrolled all post-liver transplantation patients who had a renal biopsy in a single center from July 2018 to February 2021. Fourteen renal biopsies from 14 patients were retrieved for review, performed at a median of 35.7 (range: 2.80-134.73) months following liver transplantation. The male-to-female ratio was 13:1 (age range, 31-75 years). The histomorphological alterations were varied. The predominant glomerular histomorphological changes included focal segmental glomerular sclerosis (FSGS) (n = 4), diabetic glomerulopathy (n = 4), and membranoproliferative glomerulonephritis (n = 4). Thirteen (92.9%) patients had renal arteriolar sclerosis. Immune complex nephritis was present in six patients, of whom only two had abnormal serum immunological indicators. Although interstitial fibrosis and tubular atrophy were present in all patients, only six (42.9%) presented with severe interstitial injury. No major renal biopsy-related complications occurred. After a mean follow-up of 11.8 months (range: 1.2-29.8), three patients progressed to end-stage renal disease (ESRD). The etiology of CKD after liver transplantation might be more complex than originally thought and should not be diagnosed simply as CNI-related nephropathy. Renal biopsy plays a potentially important role in the diagnosis and treatment of CKD after liver transplantation and might not be fully substitutable by urine or blood tests. It may help avoid unnecessary changes to immunosuppressants and inadequate treatment of the primary disease.
Abstract Background: Posttransplantation diabetes mellitus (PTDM) is one of the most important complications of kidney transplantation and is associated with significant morbidity and mortality. Methods: This was a single-center prospective observational study that included 310 consecutive renal transplant recipients. The primary end point was graft failure, including death-censored graft failure and mortality. The secondary end points included estimated glomerular filtration rate (eGFR) at 12 months and adverse events after transplantation. The prevalence of PTDM and relevant risk factors for PTDM were also explored. Results: The incidence of PTDM was 16.4% within one year. The death-censored graft loss rate differed significantly between recipients without PTDM and those with PTDM (0.77% versus 12%, P<0.001). Compared with the non-PTDM group, the mean eGFR was significantly lower in the PTDM group (70.55±20.54 versus 63.04±21.92 mL/min/1.73 m², P=0.03). In addition, the rate of bacterial infection was higher in the PTDM group than in the non-PTDM group (40% versus 16.2%, P<0.001). Multivariable analysis indicated that higher preoperative fasting plasma glucose (FPG), older age, and use of tacrolimus after transplantation were independent risk factors for PTDM. Conclusion: The incidence of PTDM was 16.4% one year after surgery. Our study suggests that patients with PTDM are at higher risk of death-censored graft loss and bacterial infection and have worse kidney function. Independent risk factors for PTDM include preoperative FPG level, older age, and tacrolimus use. The PTDM group is more vulnerable to worse graft function, postoperative graft loss, and bacterial infection.
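The multivariable risk-factor analysis described above is, in spirit, a logistic regression of PTDM status on preoperative FPG, age, and tacrolimus use. Below is a minimal sketch of such an analysis; the simulated data, coefficients, and variable names are assumptions for illustration and are not the study's data.

```python
# Hedged sketch of a multivariable logistic regression for PTDM risk factors
# (preoperative fasting plasma glucose, age, tacrolimus use).
# All data below are simulated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 310
age = rng.normal(45, 12, n)                  # years, simulated
fpg = rng.normal(5.2, 0.8, n)                # mmol/L, simulated
tacrolimus = rng.integers(0, 2, n)           # 1 = tacrolimus-based regimen, simulated
# Simulated outcome loosely tied to the three covariates
logit = -9 + 0.05 * age + 0.8 * fpg + 0.7 * tacrolimus
ptdm = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, fpg, tacrolimus]))
fit = sm.Logit(ptdm, X).fit(disp=False)
print("odds ratios (const, age, FPG, tacrolimus):", np.exp(fit.params).round(2))
```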
The stimulator of interferon genes (STING) plays a critical role in innate immunity. Emerging evidence suggests that STING is important for DNA- or cGAMP-induced non-canonical autophagy, which is independent of much of the canonical autophagy machinery. Here, we report that, in the absence of STING, energy stress-induced autophagy is upregulated rather than downregulated. Depletion of STING in Drosophila fat cells enhances basal and starvation-induced autophagic flux. During acute exercise, STING knockout mice show increased autophagic flux, exercise endurance, and altered glucose metabolism. Mechanistically, these observations could be explained by the STING-STX17 interaction. STING physically interacts with STX17, a SNARE that is essential for autophagosome biogenesis and autophagosome-lysosome fusion. Energy stress and TBK1-mediated phosphorylation both disrupt the STING-STX17 interaction, allowing different pools of STX17 to translocate to phagophores and mature autophagosomes and promoting autophagic flux. Taken together, we demonstrate a heretofore unexpected function of STING in energy stress-induced autophagy through spatial regulation of the autophagic SNARE STX17.
Background: Diastolic dysfunction (DD) frequently occurs in dialysis patients; however, the risk factors for DD remain to be further explored in this population. Epicardial adipose tissue (EAT) volume has proven to be an independent clinical risk factor for multiple cardiac disorders. Purpose: To assess whether EAT volume is an independent risk factor for DD in dialysis patients. Study Type: Case–control study. Population: A total of 113 patients (mean age: 54.5 ± 14.4 years; 41 women) who had undergone dialysis for at least 3 months due to uremia. Field Strength: 3 T; steady-state free precession (SSFP) sequence for cine imaging, modified Look–Locker imaging (MOLLI) for T1 mapping, and gradient-recalled echo for T2*. Assessment: All participants underwent cardiac magnetic resonance imaging (MRI) and echocardiography. For MRI image analysis, the borders of the EAT, pericardial adipose tissue (PeAT), and paracardial adipose tissue (PaAT) were manually delineated, and T1 mapping, T2* mapping, global longitudinal strain (GLS), and left atrial strain were assessed. For the echocardiographic assessment, the thickness of the PaAT, e' velocity, E velocity, E/e' ratio, A velocity, and deceleration time were measured. Statistical Tests: Univariate and multivariate logistic regressions were performed to explore independent risk factors for DD. A P value less than 0.05 was considered significant. Results: Compared with the DD(−) group, the DD(+) group had a significantly greater EAT volume (18.5 ± 1.3 vs. 30.9 ± 2.3). In addition, EAT volume increased significantly with the grade of DD (grade 1 vs. grades 2 and 3: 27.9 ± 15.9 vs. 35.4 ± 13.1). Moreover, EAT showed significant correlations with T1 mapping, T2* mapping, GLS, left atrial strain, e' velocity, and E/e' ratio. EAT accumulation conferred an independent risk for DD (odds ratio = 1.03) over conventional clinical risk factors, including age, diabetes mellitus, and hemodialysis. Data Conclusion: EAT was associated with diastolic function, and its accumulation may be an independent risk factor for DD among dialysis patients. Evidence Level: 2. Technical Efficacy: Stage 2.
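A brief note on interpreting the per-unit odds ratio quoted above: an OR of 1.03 per unit of EAT volume compounds multiplicatively over larger differences. The sketch below shows that arithmetic; the assumption that the OR is expressed per 1-unit (e.g., 1-mL) increase in EAT volume is mine, as the abstract does not state the unit.

```python
# Hedged sketch: compounding a per-unit odds ratio (e.g., the EAT OR of 1.03)
# over larger differences in EAT volume. Unit of increase is assumed, not stated.
or_per_unit = 1.03
for delta in (1, 5, 10, 20):
    print(f"EAT +{delta} units -> OR = {or_per_unit ** delta:.2f}")
```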
One of the most critical axes of cell fate determination is how cells respond to excessive reactive oxygen species (ROS), that is, oxidative stress. Extensive lipid peroxidation commits cells to death via a distinct cell death paradigm termed ferroptosis. However, the molecular mechanisms that set cellular fate in response to distinct ROS insults remain incompletely understood. Through siRNA knockdown of human receptor-interacting protein kinase (RIPK) family members, we found that RIPK4 is crucial for oxidative stress-induced and ferroptotic cell death. Upon ROS induction, RIPK4 is rapidly activated, and its kinase activity is indispensable for cell death induction. Specific ablation of RIPK4 in kidney proximal tubules protects mice from acute kidney injury induced by cisplatin and renal ischemia/reperfusion. RNA sequencing revealed that cisplatin treatment dramatically decreases the expression of acyl-CoA synthetase medium-chain (ACSM) family members, an effect that is compromised in RIPK4-deficient mice. Among these ACSM family members, suppression of ACSM1 strongly augments oxidative stress and ferroptotic cell death, with induced expression of acyl-CoA synthetase long-chain family member 4 (ACSL4), an important component of ferroptosis execution. Our lipidome analysis revealed that overexpression of ACSM1 leads to the accumulation of monounsaturated fatty acids, attenuation of polyunsaturated fatty acid (PUFA) production, and thereby cellular resistance to ferroptosis. Accordingly, knockdown of ACSM1 resensitizes RIPK4 knockout cells to oxidative stress and ferroptotic death. In conclusion, RIPK4 is a key player in oxidative stress-induced and ferroptotic death, which is potentially important for a broad spectrum of human pathologies. The link between the RIPK4-ACSM1 axis, PUFAs, and ferroptosis reveals a unique mechanism underlying oxidative stress-induced necrosis and ferroptosis.
Monitoring allograft function during the early posttransplantation period is crucial and requires biomarkers more sensitive than serum creatinine (Scr). Kidney injury molecule-1 (KIM-1) is a promising biomarker; however, disparities exist in the literature concerning its predictive value for allograft function. Therefore, this study aimed to evaluate its predictive value for the long-term prognosis of kidney transplantation patients. A prospective study of a cohort comprising 160 patients scheduled for kidney transplantation was conducted to evaluate the predictive power of urinary KIM-1 (uKIM-1) and other renal ischemia-reperfusion biomarkers, including urinary L-type fatty acid-binding protein (uL-FABP), urinary N-acetyl-β-D-glucosaminidase (uNAG), and urinary neutrophil gelatinase-associated lipocalin (uNGAL), for allograft prognosis. One hundred and forty kidney recipients who were admitted to our hospital between September 2014 and December 2017, with a median follow-up of 30.3 months, were included. Thirty-seven recipients had functional delayed graft function (fDGF) in the first week post transplantation, 42 recipients had progressed to allograft dysfunction [estimated glomerular filtration rate (eGFR) <60 mL/min/1.73 m²] by the end of the study, and nine recipients deteriorated to allograft loss (defined by the initiation of dialysis). The levels of uKIM-1 in the fDGF group were higher than those in the immediate graft function (IGF) group (P<0.05) at 0 hours post transplantation [5.885 (4.420-7.913) vs. 4.605 (3.417-5.653) ng/mmol] and on the first day post transplantation [5.569 (4.181-6.722) vs. 4.002 (3.222-6.488) ng/mmol]. The levels of uL-FABP in the fDGF group were also higher than those in the IGF group at 0 hours post transplantation (89.818±39.332 vs. 69.187±37.926 µg/mmol) and on the third day post transplantation [77.835 (60.368-100.678) vs. 66.841 (28.815-89.783) µg/mmol]. Multivariate Cox regression analysis demonstrated that recipients with higher uKIM-1 levels on the first day post transplantation had a 23.5% increase in the risk of developing fDGF and a 27.3% increase in the risk of prolonged renal allograft dysfunction. uKIM-1 on the first day post transplantation can predict short-term graft function and is a potent biomarker for the long-term prognosis of graft function.
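The prognostic analysis described above relies on a multivariate Cox proportional-hazards model relating day-1 uKIM-1 to time to allograft dysfunction. The following is a minimal sketch of such a model on simulated data; the cohort size, follow-up, covariate values, and column names are assumptions for illustration only and do not reproduce the study's analysis.

```python
# Hedged sketch of a Cox proportional-hazards model relating day-1 uKIM-1 to
# time to allograft dysfunction. All data below are simulated for illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 140
ukim1_day1 = rng.lognormal(mean=1.6, sigma=0.4, size=n)    # ng/mmol, simulated
# Simulated follow-up: higher uKIM-1 shortens time to dysfunction
baseline_time = rng.exponential(scale=60, size=n)
time_months = baseline_time * np.exp(-0.15 * (ukim1_day1 - ukim1_day1.mean()))
event = (time_months < 30).astype(int)                      # dysfunction observed
time_months = np.minimum(time_months, 30)                   # administrative censoring at 30 months

df = pd.DataFrame({"time": time_months, "event": event, "uKIM1_day1": ukim1_day1})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.hazard_ratios_)   # hazard ratio per unit increase in day-1 uKIM-1
```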