Typhoid fever, caused by the human-restricted pathogen Salmonella enterica serovar Typhi (S. Typhi), constitutes a major global health problem. The development of improved attenuated vaccines is pressing, but is delayed by the lack of appropriate preclinical models. Herein we report that high levels of S. Typhi-responsive CD8+ T cells at baseline correlate significantly with an increased risk of disease in humans challenged with a high dose (~10⁴ CFU) of wild-type S. Typhi. Development of typhoid fever was associated with higher levels of multifunctional S. Typhi-responsive CD8+ T effector/memory (TEM) cells at baseline. Early decreases of these cells in circulation following challenge were observed in both integrin α4β7− and integrin α4β7+ S. Typhi-responsive CD8+ TEM cells, suggesting their potential to home to both mucosal and extra-intestinal sites. Participants with higher baseline levels of S. Typhi-responsive CD8+ T memory cells had a higher risk of acquiring disease; however, among those who acquired disease, those with higher baseline responses took longer to develop it. In contrast, protection against disease was associated with low or absent S. Typhi-responsive T cells at baseline and with no changes in circulation following challenge. These data highlight the importance of pre-existing S. Typhi-responsive immunity in predicting clinical outcome following infection with wild-type S. Typhi and provide novel insights into the complex mechanisms of protective immunity to natural infection in a stringent human model with a high challenge dose. They also contribute important information on the immunological responses to be assessed in the appraisal and selection of new-generation typhoid vaccines.
Morbidity and damage due to prednisone use in the treatment of SLE are recognized, but prednisone has remained a requisite of lupus nephritis induction regimens. We examined, by calendar year, prednisone exposure and urine protein in lupus nephritis patients in a large single-center cohort.
Methods
We identified 76 SLE patients who had: 1) biopsy-proven Class III or IV lupus nephritis; 2) a cohort visit prior to their biopsy with elevated urine protein (dipstick of 2+ to 4+); and 3) at least 4 cohort visits in the year following their biopsy. For each patient, we calculated the average daily prednisone dose, urine dipstick protein, serum cholesterol, and systolic blood pressure over the year following the biopsy, as sketched below.
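A minimal sketch of this per-patient averaging step, assuming a hypothetical long-format visit table; the column names and the numeric coding of the dipstick grade are illustrative assumptions, not the cohort's actual schema.

```python
# Sketch of the per-patient averaging described in Methods.
# `visits` and `biopsies` are assumed DataFrames, not the cohort's real schema:
#   visits:   patient_id, visit_date, prednisone_mg, urine_dipstick (coded 0-4),
#             cholesterol_mg_dl, systolic_bp_mmhg
#   biopsies: patient_id, biopsy_date
import pandas as pd

def yearly_post_biopsy_averages(visits: pd.DataFrame,
                                biopsies: pd.DataFrame) -> pd.DataFrame:
    df = visits.merge(biopsies, on="patient_id")
    # Keep only visits within one year after the biopsy.
    in_year = (df["visit_date"] > df["biopsy_date"]) & (
        df["visit_date"] <= df["biopsy_date"] + pd.DateOffset(years=1)
    )
    df = df[in_year]
    # Inclusion criterion: at least 4 cohort visits in the post-biopsy year.
    counts = df.groupby("patient_id").size()
    df = df[df["patient_id"].isin(counts[counts >= 4].index)]
    # Average each measure per patient over that year.
    return df.groupby("patient_id")[
        ["prednisone_mg", "urine_dipstick", "cholesterol_mg_dl", "systolic_bp_mmhg"]
    ].mean()
```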
Results
The average daily dose of prednisone was lower in more recent calendar years, while the average urine protein improved.
Conclusions
Prednisone dose in Class III-IV lupus nephritis has been reduced in recent years, with no deleterious effect on urine protein; in fact, control of urine dipstick protein improved. The effect of prednisone on traditional cardiovascular risk factors was surprising: patients receiving more than 20 mg/day of prednisone had a marked increase in serum cholesterol, whereas those receiving 10-19 mg/day had a decrease in both serum cholesterol and systolic blood pressure.
ABSTRACT IMPORTANCE Pericarditis is the most common cardiac manifestation of systemic lupus erythematosus (SLE) and is known to recur. Yet the prevalence of and risk factors for recurrent pericarditis in SLE patients are unknown. OBJECTIVE To determine the frequency of and risk factors for recurrence of pericarditis in patients with SLE. DESIGN Retrospective analysis of a well-characterized, prospective cohort of SLE patients enrolled between 1988 and 2023. SETTING A single-center cohort study of a diverse group of SLE patients treated at a tertiary medical center. PARTICIPANTS Patients diagnosed with pericarditis (n=590) among those enrolled in the Hopkins Lupus Cohort (n=2931). MAIN OUTCOME Recurrence of pericarditis. The SELENA revision of the SLE Disease Activity Index (SLEDAI) was used to define pericarditis. Clinical information was examined for all follow-up encounters after the first episode of pericarditis; pericarditis occurring at least six weeks after the first recorded episode was defined as recurrent. RESULTS Of 2931 patients in the cohort, 590 had a history of pericarditis. In 3.4% of patients, the diagnosis was confirmed by electrocardiogram (EKG) or dedicated imaging, with 100% concordance between clinical and data-based diagnoses. During a median follow-up of 7 years (IQR 3-14), 20% (n=120) of patients experienced recurrent pericarditis (recurrence rate ≈ 0.05 per person-year of follow-up). A slight majority (51%) experienced only one recurrence, whereas 49% had ≥2 recurrences. In multivariate analysis, predictors of recurrence included younger age (≥60 vs. <40 years, RR 0.11 (95% CI 0.04, 0.32), P < .001), treatment with prednisone (≥20 mg vs. 0, RR 1.99 (1.17, 3.40), P = .012), active SLE disease (SLEDAI ≥3 vs. 0, RR 1.55 (1.21, 2.00), P < .001), and shorter time since the initial episode (3-10 years vs. <1, RR 0.32 (0.20, 0.52), P < .001). CONCLUSIONS AND RELEVANCE Recurrence is most likely within one year of the onset of pericarditis, and younger patients and those with uncontrolled disease are at greater risk of recurrence. As in the general population, oral prednisone therapy is associated with a higher chance of recurrence in SLE patients, with a dose-dependent effect. These findings set the basis for future studies to define optimal treatment of recurrent pericarditis in SLE and suggest that oral corticosteroids should be avoided when treating pericarditis.
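The six-week rule above is straightforward to operationalize; below is a hypothetical sketch, assuming a long-format table of pericarditis episode dates per patient (the column names are illustrative, not the cohort's actual data dictionary).

```python
# Hypothetical sketch of the recurrence definition above: any pericarditis
# episode at least six weeks after a patient's first recorded episode is
# flagged as recurrent. `episodes` (patient_id, episode_date) is an assumed schema.
import pandas as pd

def flag_recurrences(episodes: pd.DataFrame) -> pd.DataFrame:
    df = episodes.sort_values(["patient_id", "episode_date"]).copy()
    # Date of each patient's first recorded episode.
    first = df.groupby("patient_id")["episode_date"].transform("min")
    # Recurrent = at least six weeks after the first episode.
    df["recurrent"] = (df["episode_date"] - first) >= pd.Timedelta(weeks=6)
    return df
```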
We evaluated which antiphospholipid antibody (aPL) combinations increase the risk of future thrombosis in patients with SLE. This prospective cohort study consisted of SLE patients who had been tested for all seven aPL (lupus anticoagulant (LA), anticardiolipin (aCL) isotypes IgM, IgG and IgA, and anti-β2-glycoprotein I isotypes IgM, IgG and IgA). Pooled logistic regression was used to assess the relationship between aPL and thrombosis. There were 821 SLE patients with a total of 75 048 person-months of follow-up. During the follow-up we observed 88 incident cases of thrombosis: 48 patients with arterial, 37 with venous and 3 with both arterial and venous thrombosis. In individual models, LA was the most predictive of any [age-adjusted rate ratio 3.56 (95% CI 2.01, 6.30), P < 0.0001], venous [4.89 (2.25, 10.64), P < 0.0001] and arterial [3.14 (1.41, 6.97), P = 0.005] thrombosis. Anti-β2-glycoprotein I IgA positivity was a significant risk factor for any [2.00 (1.22, 3.3), P = 0.0065] and venous [2.8 (1.42, 5.51), P = 0.0029] thrombosis. Only anti-β2-glycoprotein I IgA appeared to add significant risk to any [1.73 (1.04, 2.88), P = 0.0362] and venous [2.27 (1.13, 4.59), P = 0.0218] thrombosis among those with LA. We created an interaction model with four categories based on combinations of LA and other aPL to examine the relationships between combinations and the risk of thrombosis. In this model LA remained the best predictor of thrombosis. Our study demonstrated that in SLE, LA remained the best predictor of thrombosis, and adding additional aPL did not add to the risk, with the exception of anti-β2-glycoprotein I IgA.
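As a rough illustration of the pooled logistic regression approach named above: follow-up is split into person-month records with a 0/1 event indicator, and a logistic model fit to those records approximates a rate model when events are rare. The sketch below is a generic minimal version; the data layout and variable names are assumptions, not the study's actual analysis code.

```python
# Minimal sketch of pooled logistic regression on person-period data.
# `person_months` is an assumed DataFrame with one row per patient-month:
#   thrombosis (0/1 in that month), lupus_anticoagulant (0/1), age (years).
import statsmodels.formula.api as smf

def fit_pooled_logistic(person_months):
    # With rare events, exponentiated coefficients approximate
    # age-adjusted rate ratios like those reported above.
    model = smf.logit("thrombosis ~ lupus_anticoagulant + age",
                      data=person_months)
    return model.fit()

# Usage (illustrative):
#   result = fit_pooled_logistic(person_months)
#   import numpy as np
#   rate_ratio = np.exp(result.params["lupus_anticoagulant"])
```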
Cluster randomized controlled trials (CRCTs) are used frequently in infection control and antimicrobial stewardship because randomization at the patient level is often not feasible due to contamination, ethical, or logistical issues. The correlation, and thus non-independence, among individual patients in a cluster must be accounted for when estimating sample size for such trials, yet many studies neglect to consider or report the intracluster correlation coefficient (ICC) and the resulting coefficient of variation (CV) in rates between hospitals. The aim of this study was to estimate the sample sizes needed to adequately power studies of hospital-level interventions to reduce rates of healthcare-associated infections. We calculated the minimum number of clusters (hospitals) that would need to be included in a study to have good power to detect an impact of the intervention under a range of assumptions. We estimated the required parameters using national rates from the National Healthcare Safety Network (NHSN) for methicillin-resistant Staphylococcus aureus (MRSA) bacteremia, central-line-associated bloodstream infections (CLABSI), catheter-associated urinary tract infections (CAUTI), and C. difficile infections (CDI), along with the variation in these rates between hospitals. These calculations assumed hospitals of uniform, moderate size studied for 1 year. To detect a 50% decrease in daily rates, using the CVs calculated from NHSN data, 22 average-sized hospitals are needed for MRSA bacteremia, 34 for CAUTI, 9 for CDI, and 27 for CLABSI, at a type I error rate of 0.05 and power of 0.8. If a 10% decrease in rates is expected instead, 709, 1205, 279, and 866 hospitals, respectively, are needed. Sample size estimates for CRCTs are most influenced by the CV and the expected effect size. Given the large sample size requirements, it is likely that many CRCTs in hospital epidemiology are underpowered. We hope these findings lead to properly powered, more definitive CRCTs in the field of hospital epidemiology and to more studies reporting their ICC or CV.
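For readers wanting to reproduce calculations of this kind, the sketch below implements the standard between-cluster CV sample-size formula for rate outcomes (Hayes & Bennett, Int J Epidemiol 1999), which this style of calculation typically relies on. It is a generic illustration, and every numeric input in the usage example is made up rather than taken from the study's NHSN-derived estimates.

```python
# Sketch of the Hayes & Bennett (1999) sample-size formula for cluster
# trials with rate outcomes:
#   c = 1 + (z_a + z_b)^2 * [ (r0 + r1)/t + CV^2 (r0^2 + r1^2) ] / (r0 - r1)^2
# where c is clusters per arm, t is person-time per cluster, and CV is the
# between-cluster coefficient of variation of the true rates.
import math
from scipy.stats import norm

def clusters_per_arm(rate0, rate1, person_time, cv, alpha=0.05, power=0.80):
    """Hospitals needed per arm to detect rate1 vs. rate0.

    rate0, rate1 : event rates per unit person-time (control, intervention)
    person_time  : person-time of follow-up per hospital
    cv           : between-hospital coefficient of variation of true rates
    """
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    c = 1 + z**2 * ((rate0 + rate1) / person_time
                    + cv**2 * (rate0**2 + rate1**2)) / (rate0 - rate1) ** 2
    return math.ceil(c)

# Usage with made-up numbers: baseline rate of 1 event per 1,000 patient-days,
# a 50% reduction, 100,000 patient-days per hospital-year, CV = 0.6:
#   clusters_per_arm(0.001, 0.0005, 100_000, 0.6)   # -> 16 hospitals per arm
```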