Background Nosocomial norovirus infections and their control measures disrupt patient care, increase staff workload and raise healthcare costs. Objective To determine the impact of a multidimensional quality improvement (QI) initiative, focused on education, improved patient surveillance, early automated recognition and notification of infection in index patients, and proactive care and control measures, on outbreaks of nosocomial viral gastroenteritis, the numbers of staff and patients affected, and bed closures. Methods In a pragmatic, retrospective, observational study, we compared numbers of suspected/confirmed norovirus outbreaks at Portsmouth Hospitals National Health Service Trust (PHT) with regional and national data, before and after a multidimensional QI initiative. We also compared mean daily bed closures due to norovirus-like symptoms. At PHT only, we recorded patient and staff numbers with norovirus-like symptoms, and days of disruption due to outbreaks. Results Annual outbreak numbers fell between 2009–2010 and 2010–2014 by 91% at PHT compared with 15% and 28% for Wessex and England, respectively. After April 2010, recorded outbreaks were 8 (PHT), 383 (Wessex) and 5063 (England). For the winter periods from 2010/2011 to 2013/2014, total bed closures due to norovirus were 38 (PHT; mean 0.5 per week), 3565 (Wessex hospitals; mean 48.8 per hospital per week) and 2730 (England; mean 37.4 per hospital per week). At PHT, patients affected by norovirus-like symptoms fell by 92%, affected staff by 81% and days of disruption by 88%. Conclusions A multiyear QI programme, including use of real-time electronic identification of patients with norovirus-like symptoms, and an early robust response to suspected infection, resulted in virtual elimination of outbreaks. The ability to identify index cases of infection early facilitates prompt action to prevent ongoing transmission and appears to be a crucial intervention.
Objective To advance methods for the estimation of hospital performance based upon mortality ratios. Design Observational study estimating trust performance in a single year against comparative standards derived from a 3-year period, accounting for patient-level case-mix and overdispersion (unexplained variability). Participants 23 363 630 admissions to the English National Health Service (NHS) by NHS Trust. Main outcome measures Number of SDs from mean performance (QUality and Outcomes Research Unit Measure, QUORUM, banding) and comparative odds of hospital mortality by trust, compared across 2010/2011, 2008/2009 and 2009/2010, accounting for patient-level case-mix. Results The model was highly predictive of mortality (C statistic=0.93), and well calibrated by risk stratum. There was substantial overdispersion. No trusts were more than 3 SDs above the mean, and only one trust was more than 2 SDs above the mean for 2010/2011. Conclusions QUORUM is highly predictive of patient mortality in hospital or up to 30 days after admission. However, like the Summary Hospital Mortality Indicator (SHMI), QUORUM is subject to considerable residual variation that is legitimate but unexplained. It is unlikely that measures like QUORUM and SHMI will be useful beyond identifying a very small number of trusts as potential outliers.
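The banding logic described above can be illustrated with a minimal sketch of one common approach to overdispersion (not the QUORUM model itself): naive Poisson z-scores for observed versus expected deaths are rescaled by an estimated multiplicative overdispersion factor before applying 2-SD or 3-SD bands. All function names and the example data are hypothetical.

```python
import math

def smr_zscores(observed, expected):
    """Naive z-scores assuming Poisson variation: z = (O - E) / sqrt(E)."""
    return [(o - e) / math.sqrt(e) for o, e in zip(observed, expected)]

def overdispersion_factor(z):
    """Estimate multiplicative overdispersion as the mean squared z-score."""
    return sum(v * v for v in z) / len(z)

def banded_outliers(observed, expected, sd_band=2.0):
    """Indices of units whose overdispersion-adjusted z exceeds the band."""
    z = smr_zscores(observed, expected)
    phi = overdispersion_factor(z)
    scale = math.sqrt(max(phi, 1.0))  # never shrink limits below Poisson
    return [i for i, v in enumerate(z) if abs(v) / scale > sd_band]
```

With substantial overdispersion the control limits widen, so only extreme units are flagged — consistent with the finding that very few trusts stand out as potential outliers.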
Sequencing reads were aligned to the Amel_HAv3.1 reference genome using BWA-MEM v0.7.17. Reads were sorted with SAMtools v1.9 and duplicates marked (MarkDuplicates) with GATK v4.0.11.0. Variants for each sample were called using GATK’s HaplotypeCaller with the following non-default parameters: --ERC GVCF, --sample-ploidy 1 and -A AlleleFraction. Joint variant calling was performed across all samples collated for AmelHap using GATK’s GenomicsDBImport and GenotypeGVCFs with --sample-ploidy 1 and a window size of 10 Mb. This dataset comprises the raw variant calls only for samples belonging to project accession: PRJNA363032.
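As a rough sketch, the per-sample steps above could be scripted along these lines; the Python helper merely assembles illustrative command lines (file names and the exact flag layout are assumptions, not the authors' actual scripts):

```python
# Illustrative only: builds the command lines for the alignment and
# per-sample calling steps described above; paths are hypothetical.
def variant_pipeline_cmds(sample, ref="Amel_HAv3.1.fasta"):
    bam = f"{sample}.sorted.bam"
    dedup = f"{sample}.dedup.bam"
    gvcf = f"{sample}.g.vcf.gz"
    return [
        # Align reads with BWA-MEM and coordinate-sort with SAMtools
        f"bwa mem {ref} {sample}_1.fastq.gz {sample}_2.fastq.gz"
        f" | samtools sort -o {bam} -",
        # Mark duplicates with GATK MarkDuplicates
        f"gatk MarkDuplicates -I {bam} -O {dedup} -M {sample}.metrics.txt",
        # Haploid per-sample calling with the non-default flags from the text
        f"gatk HaplotypeCaller -R {ref} -I {dedup} -O {gvcf}"
        f" --ERC GVCF --sample-ploidy 1 -A AlleleFraction",
    ]
```

Joint genotyping would then follow with GenomicsDBImport and GenotypeGVCFs over the collected GVCFs, again with --sample-ploidy 1.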
INTRODUCTION During apnea at sea level, a contraction of the spleen occurs in humans (Hurford et al. 1990), causing a transient increase in hemoglobin concentration (Hb) and hematocrit (Schagatay et al. 2001). These increases develop progressively across 3 serial apneas, typically producing Hb increases of 2% (Richardson et al. 2003), with recovery to pre-apneic values within 8-9 minutes (Schagatay et al. 2005). The spleen contraction-associated Hb increase is in part triggered by the hypoxia occurring during apnea (Richardson et al. 2005). The Hb increase raises blood gas storage capacity, which facilitates prolonged apnea in humans and may be responsible for the known prolongation of serial apneas (Schagatay et al. 2001). At altitude, we suggest that chronic hypoxia could induce splenic contraction during eupnea in humans, as previously observed in mice (Cook and Alafi 1956). Our aim was to determine whether a spleen-related Hb increase occurs during eupneic altitude exposure in humans, which would abolish the Hb increase normally seen during apnea.
Rationale: Shared symptoms and genetic architecture between coronavirus disease (COVID-19) and lung fibrosis suggest severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection may lead to progressive lung damage. Objectives: The UK Interstitial Lung Disease Consortium (UKILD) post–COVID-19 study interim analysis was planned to estimate the prevalence of residual lung abnormalities in people hospitalized with COVID-19 on the basis of risk strata. Methods: The PHOSP–COVID-19 (Post-Hospitalization COVID-19) study was used to capture routine and research follow-up within 240 days from discharge. Thoracic computed tomography linked by PHOSP–COVID-19 identifiers was scored for the percentage of residual lung abnormalities (ground-glass opacities and reticulations). Risk factors for residual abnormalities were estimated among those with linked computed tomography using Bayesian binomial regression, and risk strata were generated. Numbers within strata were used to estimate posthospitalization prevalence using Bayesian binomial distributions. Sensitivity analysis was restricted to participants with protocol-driven research follow-up. Measurements and Main Results: The interim cohort comprised 3,700 people. Of 209 subjects with linked computed tomography (median, 119 d; interquartile range, 83–155), 166 people (79.4%) had more than 10% involvement of residual lung abnormalities. Risk factors included abnormal chest X-ray (risk ratio [RR], 1.21; 95% credible interval [CrI], 1.05–1.40), percent predicted DlCO less than 80% (RR, 1.25; 95% CrI, 1.00–1.56), and severe admission requiring ventilation support (RR, 1.27; 95% CrI, 1.07–1.55). In the remaining 3,491 people, moderate to very high risk of residual lung abnormalities was classified at 7.8%, and posthospitalization prevalence was estimated at 8.5% (95% CrI, 7.6–9.5), rising to 11.7% (95% CrI, 10.3–13.1) in the sensitivity analysis.
Conclusions: Residual lung abnormalities were estimated in up to 11% of people discharged after COVID-19–related hospitalization. Health services should monitor at-risk individuals to elucidate long-term functional implications.
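As an illustration of the kind of Bayesian binomial estimate described above, a minimal conjugate Beta-binomial sketch (not the study's actual regression model; prior, function name and draw count are assumptions) can recover a posterior mean and credible interval for a proportion such as 166 of 209:

```python
import random
import statistics

def beta_posterior_ci(successes, n, draws=20000, alpha0=1.0, beta0=1.0, seed=0):
    """Monte Carlo summary of a Beta(alpha0, beta0)-binomial posterior:
    returns (posterior mean, 95% credible interval) for the proportion."""
    rng = random.Random(seed)
    samples = sorted(
        rng.betavariate(alpha0 + successes, beta0 + n - successes)
        for _ in range(draws)
    )
    mean = statistics.fmean(samples)
    lo = samples[int(0.025 * draws)]
    hi = samples[int(0.975 * draws)]
    return mean, (lo, hi)
```

With a flat Beta(1, 1) prior, the posterior mean for 166/209 sits near 0.79, in line with the 79.4% figure reported for the linked-CT subgroup.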
Introduction Innovation is needed for the growing number of patients with chronic obstructive pulmonary disease (COPD). Pulmonary rehabilitation (PR) is effective in improving exercise tolerance and quality of life, but these benefits do not appear to be sustained. This highlights the need for cost effective methods to maintain benefits on completion of therapy. The findings of a large trial from the UK are reported. Methods A two-center randomized controlled trial of patients discharged from PR compared the costs and benefits of PR maintenance with standard care. National Health Service (NHS) resource use, personal expenditure, and societal costs were recorded over one year, and bottom-up costing was undertaken for the PR maintenance program. Changes in health-related quality of life were recorded using the EQ-5D-5L, and differences were compared with the level identified as clinically significant for COPD. A cost utility analysis was undertaken from an NHS perspective; uncertainties in cost and outcome data were incorporated into a sensitivity analysis. Cost-effectiveness ratios and cost-effectiveness acceptability curves (CEACs) were computed. Results The study included 116 patients who had finished PR within the previous four weeks. The economic analysis showed that mean healthcare costs per patient for PR maintenance were approximately GBP139.72 (EUR165.57) lower than for usual care. The observed 0.118 advantage in mean quality-adjusted life-years (QALYs) (p<0.05) exceeded the threshold identified as clinically significant for COPD (0.051). CEACs indicated a 97 percent probability of cost effectiveness at GBP20,000 (EUR23,699.80) per QALY (NICE acceptance level ≤GBP30,000 (EUR35,549.70)). Including patient and societal costs increased this probability. It was estimated that if patients with COPD completed a maintenance program following PR, the NHS could save up to GBP28.6 million (EUR33.89 million).
Conclusions Our findings confirm that a structured PR maintenance program is highly cost effective in extending the benefits of short-term PR. The trial, undertaken during COVID, also signals the potential for emerging digital innovations to provide future transformative change in delivering self-management programs to sustain health and reduce NHS costs for people living with chronic conditions.
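The CEAC logic referred to above can be illustrated with a minimal sketch: given bootstrap or simulated samples of incremental costs and QALYs, the curve at each willingness-to-pay threshold is the share of samples with a positive net monetary benefit. The function name and example numbers are hypothetical, not the trial's data.

```python
def ceac(delta_costs, delta_qalys, thresholds):
    """Cost-effectiveness acceptability curve: for each willingness-to-pay
    threshold, the share of (cost, QALY) samples with positive net monetary
    benefit, NMB = threshold * dQALY - dCost."""
    pairs = list(zip(delta_costs, delta_qalys))
    return {
        wtp: sum(1 for dc, dq in pairs if wtp * dq - dc > 0) / len(pairs)
        for wtp in thresholds
    }
```

An intervention that both lowers costs and gains QALYs in most samples — as reported here — yields a high probability of cost effectiveness even at modest thresholds.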
Objective The prevalence of metabolic syndrome (MetS) has been reported to be higher in selected populations of people with COPD. The impact of MetS on mortality in COPD is unknown. We used routinely collected healthcare data to estimate the prevalence of MetS in people with COPD managed in primary care and determine its impact on 5-year mortality. Methods Records from 103 955 patients with COPD from the Clinical Practice Research Datalink (CPRD-GOLD) between 2009 and 2017 were scrutinised. MetS was defined as the presence of three or more of: obesity, hypertension, lowered high-density lipoprotein cholesterol, elevated triglycerides or type 2 diabetes mellitus (T2DM). Univariate and multivariable Cox regression models were constructed to determine the prognostic impact of MetS on 5-year mortality. Similar univariate models were constructed for individual components of the definition of MetS. Results The prevalence of MetS in the COPD cohort was 10.1%. Univariate analyses showed the presence of MetS increased mortality (hazard ratio (HR) 1.19, 95% CI: 1.12–1.27, p<0.001), but this risk was substantially attenuated in the multivariable analysis (HR 1.06, 95% CI: 0.99–1.13, p = 0.085). The presence of hypertension (HR 1.70, 95% CI: 1.63–1.77, p<0.001) and T2DM (HR 1.41, 95% CI: 1.34–1.48, p<0.001) increased and obesity (HR 0.74, 95% CI: 0.71–0.78, p<0.001) reduced mortality risk. Conclusion MetS in patients with COPD was associated with higher 5-year mortality, but this impact was minimal when adjusted for indices of COPD disease severity and other comorbidities. Individual components of the MetS definition exerted differential impacts on mortality, suggesting limitations to the use of MetS as a multicomponent condition in predicting outcome in COPD.
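The MetS definition used in the study (three or more of the five components) reduces to a simple count, sketched here with hypothetical boolean inputs:

```python
def has_metabolic_syndrome(obesity, hypertension, low_hdl,
                           high_triglycerides, t2dm):
    """MetS as defined in the study: three or more of the five components
    (obesity, hypertension, low HDL cholesterol, elevated triglycerides,
    type 2 diabetes mellitus) are present."""
    return sum([obesity, hypertension, low_hdl,
                high_triglycerides, t2dm]) >= 3
```

Because each component carried a different (and in the case of obesity, opposite) mortality association, collapsing them into a single flag like this discards prognostic information — the limitation the authors note.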
Sequencing reads were aligned to the Amel_HAv3.1 reference genome using BWA-MEM v0.7.17. Reads were sorted with SAMtools v1.9 and duplicates marked (MarkDuplicates) with GATK v4.0.11.0. Variants for each sample were called using GATK’s HaplotypeCaller with the following non-default parameters: --ERC GVCF, --sample-ploidy 1 and -A AlleleFraction. Joint variant calling was performed across all samples collated for AmelHap using GATK’s GenomicsDBImport and GenotypeGVCFs with --sample-ploidy 1 and a window size of 10 Mb. This dataset comprises the raw variant calls only for samples belonging to project accession: PRJEB16533.