ABSTRACT The traditional posterior midline laminectomy is widely used for lumbar decompression. However, the bilateral dissection of the paraspinal muscles during this procedure often leads to postoperative muscle atrophy, chronic low back pain, and other complications. The posterior midline spinous process‐splitting approach (SPSA) offers a significant advantage over the traditional approach by minimizing damage to the paraspinal muscles: it reduces the incidence of muscle atrophy and chronic low back pain while maintaining the integrity of the posterior spinal structures. The technique involves longitudinal splitting of the spinous process, which provides adequate access to the lamina for decompression without detaching the paraspinal muscles. As a result, it offers a clear surgical field and preserves the musculature, reducing the risk of postoperative complications. Additionally, SPSA requires only standard surgical instruments, making it accessible in most surgical settings. This paper reviews the anatomical considerations, surgical technique, and clinical applications of the SPSA, highlighting its effectiveness in reducing muscle atrophy and improving recovery outcomes. It also discusses the approach's potential in treating conditions such as lumbar spinal stenosis, disc herniation, and spondylolisthesis, and emphasizes the need for future research to establish the long‐term benefits of SPSA and to refine the surgical technique. The results suggest that SPSA is a promising alternative to traditional approaches, with better outcomes in terms of muscle preservation and overall recovery.
Postoperative spinal epidural hematoma (SEH) is a rare but serious complication following lumbar surgery, with cauda equina syndrome (CES) being one of its most devastating outcomes. While CES typically presents with a combination of bladder and/or bowel dysfunction, diminished sensation in the saddle area, and motor or sensory changes in the lower limbs, atypical cases with isolated urinary symptoms are less recognized and pose significant diagnostic challenges. We report the case of a 46-year-old male who developed CES following lumbar microdiscectomy, presenting solely with urinary retention, without the classic signs of lower limb weakness or perineal sensory loss. Initial symptoms were attributed to postoperative urinary issues, delaying the diagnosis of CES. On postoperative day 7, magnetic resonance imaging (MRI) revealed SEH, and emergency hematoma evacuation was performed. Despite the delayed intervention, the patient made a full neurological recovery, with bladder and bowel functions restored by 3 months postoperatively. This case highlights the importance of recognizing CES in patients with isolated urinary dysfunction after lumbar surgery, even when typical neurological symptoms such as lower limb weakness or perineal sensory loss are absent. Early detection and prompt surgical intervention are critical, as delayed diagnosis may result in permanent neurological deficits. Moreover, this case underscores the need for vigilant postoperative monitoring, especially of urinary function, as isolated urinary symptoms may signal early CES. Maintaining a high index of suspicion for CES, even in atypical presentations, can facilitate timely diagnosis and improve patient outcomes. Furthermore, this case highlights the need for continued research into the prevention of SEH and the development of more robust diagnostic criteria for CES in postoperative patients. 
Future studies should focus on developing more comprehensive guidelines for monitoring postoperative patients, especially regarding urinary function, to aid in the early detection of CES.
Objectives To determine whether microbial contamination of the preservation solution (PS) in kidney transplantation is associated with donor-derived infections (DDIs). Methods We retrospectively analysed data from 1077 deceased-donor kidney transplant recipients of 560 donors. In all, 1002 PS samples were collected for microbiological assessment to establish the incidence and distribution of contamination. Patients with contaminated PS were compared with those with sterile PS to assess the impact of microbial contamination of the perfusate on probable donor-derived infections (p-DDIs), and potential risk factors for p-DDIs were examined. Results The contamination rate of PS was 77.8% (402/517). Bacterial species accounted for 85.6% (887/1036) of the 1036 isolated microorganisms, and 26.5% (275/1002) of the recipients' PS samples were contaminated by ESKAPE pathogens (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter spp.). Enterococcus species predominated in the microbiological pattern. The incidence of infection was significantly higher in patients with microbial contamination than in patients with sterile PS (13.8% (107/776) versus 7.1% (16/226), p 0.006). The prevalence of p-DDIs was significantly higher in patients with ESKAPE contamination than in patients with other bacterial contamination of the PS (7.2% (18/251) versus 1.0% (4/405), p < 0.001). Univariate analysis indicated that ESKAPE contamination increased the risk of p-DDIs (p 0.001, OR 3.610, 95% CI 1.678–7.764), and multivariate analysis identified ESKAPE contamination as the only independent risk factor associated with p-DDIs (OR 3.418, 95% CI 1.580–7.393). Conclusions The high rate of microbial contamination of PS is unusual and is probably due to flaws in the surgical procedures. Patients whose PS is contaminated by ESKAPE pathogens may have a significantly increased risk of p-DDIs in the early post-transplantation period.
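The univariate result above is a crude odds ratio with a Wald confidence interval, the standard arithmetic for a 2×2 contingency table. As an illustrative sketch only, the code below applies that formula to the p-DDI counts reported for ESKAPE-contaminated versus other-bacterial-contaminated recipients (18/251 vs. 4/405); note that this simple two-group OR differs from the paper's univariate OR of 3.610, which was presumably estimated against a different reference group, so the sketch shows the method rather than reproducing the paper's analysis.

```python
import math

def odds_ratio_wald_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
       exposed:   a events, b non-events
       unexposed: c events, d non-events
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Counts from the abstract: 18/251 p-DDIs with ESKAPE contamination,
# 4/405 with other bacterial contamination (cells: 18, 233, 4, 401).
or_, lo, hi = odds_ratio_wald_ci(18, 233, 4, 401)
print(f"crude OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

A confidence interval that excludes 1 in this crude comparison is consistent with the direction of the association the paper reports, even though the adjusted estimate (OR 3.418) comes from a multivariate model.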
Aims The aim of this study was to compare the efficacy and safety of induction therapy using an interleukin-2 receptor antagonist (IL-2RA) with antithymocyte globulin (ATG) under a tacrolimus-based immunosuppression regimen in kidney transplantation from donors after cardiac death (DCD). Methods This was a single-centre, retrospective cohort study evaluating the efficacy and safety of IL-2RA versus ATG induction therapy in adult renal transplant recipients of DCD kidneys. The primary end-point was the incidence of biopsy-proven acute rejection (BPAR) at 6 months; secondary end-points included the incidence of delayed graft function (DGF), renal function, and patient and graft survival at 6 months. The safety end-point was the incidence of infectious complications. Results A total of 132 patients (n = 37 in the IL-2RA group and n = 95 in the ATG group) were enrolled from March 2013 to April 2014. BPAR at 6 months was similar between the two groups (IL-2RA vs. ATG, 5.4% vs. 12.6%, respectively, p = 0.228). There were no differences between the groups in DGF, renal function at 1 and 3 months, or patient and graft survival at 6 months, but renal function at 6 months in the IL-2RA group was superior to that in the ATG group (p = 0.02). The IL-2RA group experienced fewer infections than the ATG group (p = 0.025). Conclusions The efficacy of IL-2RA and ATG induction under a tacrolimus-based immunosuppression regimen in low-risk DCD transplantation did not differ, but the safety profile of IL-2RA induction was better than that of ATG induction.
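The BPAR comparison above (5.4% of 37 vs. 12.6% of 95) is a small-sample comparison of two proportions, for which Fisher's exact test is a common choice. The abstract does not state which test produced p = 0.228, so the sketch below is an assumption-laden illustration: it runs a two-sided Fisher's exact test on the counts implied by the percentages (2/37 vs. 12/95), and its p-value need not match the paper's exactly.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    Sums hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def p_table(k):
        # probability of k events in row 1 given fixed margins
        return comb(row1, k) * comb(n - row1, col1 - k) / denom

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs * (1 + 1e-12))

# Implied counts: IL-2RA 2/37 with BPAR, ATG 12/95 with BPAR.
p = fisher_exact_two_sided(2, 35, 12, 83)
print(f"two-sided Fisher p = {p:.3f}")
```

Whatever the exact test used, a non-significant p-value here matches the abstract's conclusion that BPAR at 6 months did not differ between the groups.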