Background: The extent of error introduced between collection and processing when measuring PO2, PCO2 and pH in arterial blood samples drawn from critically ill patients with sepsis and leucocytosis is unknown. Methods: Twenty-nine patients with sepsis and a leucocyte count > 12 000/mm3 who had routine arterial blood gas analysis were included in the study. Blood was drawn into two 1 ml heparinised glass syringes. One syringe was cooled on ice and tested at 60 minutes; the other was analysed at 0, 10, 30 and 60 minutes at ambient temperature. Differences in measurements from the Time-0 results were described. For PO2, linear mixed models estimated the impact of time to processing, controlling for the potentially confounding and moderating effects of Time-0 leucocyte count and fractional inspired oxygen concentration, respectively. Results: PO2 exhibited the most pronounced changes over time at ambient temperature: the mean (SD) relative differences at 10, 30 and 60 minutes were -4.72 (8.82)%, -13.66 (10.25)% and -25.12 (15.55)%, respectively, and the mean (SD) absolute differences -0.88 (1.49), -2.37 (1.89) and -4.32 (3.06) kPa. For pH at 60 minutes, the mean (SD) relative and absolute differences were -0.27 (0.45)% and -0.02 (0.03), respectively; for PCO2, 6.16 (7.80)% and 0.25 (0.35) kPa. The median differences for the on-ice 60-minute sample were 0.019 for pH and -0.12 kPa for PCO2 (both P < 0.001), and 0.100 kPa for PO2 (P = 0.216). The model estimated that average PO2 decreased by 5% per 10-minute delay in processing (95% CI for the per-10-minute ratio: 0.94 to 0.96; P < 0.001) at the average leucocyte count, with more rapid declines at higher counts, though with substantial inter-patient variation. Conclusion: Delayed blood gas analysis in samples stored at ambient temperature results in a statistically and clinically significant progressive decrease in arterial PO2, which may alter clinical decision-making in septic patients.
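The headline effect can be illustrated with a small sketch (an illustration of the model's multiplicative interpretation, not the authors' analysis code): a 5% drop per 10 minutes compounds to roughly the observed 25% loss at 60 minutes. The 12 kPa Time-0 value below is hypothetical, not from the study.

```python
# Sketch: a constant per-10-minute ratio of 0.95 (the model's point estimate
# at the average leucocyte count) implies PO2(t) = PO2(0) * 0.95**(t / 10).

def expected_po2(po2_time0_kpa: float, delay_min: float,
                 ratio_per_10min: float = 0.95) -> float:
    """Expected PO2 after a processing delay, under a constant per-10-min ratio."""
    return po2_time0_kpa * ratio_per_10min ** (delay_min / 10)

# Hypothetical Time-0 value of 12 kPa:
po2_60 = expected_po2(12.0, 60)
relative_change = po2_60 / 12.0 - 1   # ~ -26.5%, close to the observed -25.1%
print(round(po2_60, 2), round(100 * relative_change, 1))
```

The compounded decline (about -26.5% at 60 minutes) agrees closely with the observed mean relative difference of -25.12%.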
Abstract Introduction While a large proportion of people with HIV (PWH) have experienced SARS‐CoV‐2 infections, there is uncertainty about the role of HIV disease severity in COVID‐19 outcomes, especially in lower‐income settings. We studied the association of mortality with characteristics of HIV severity and management, and with vaccination, among adult PWH. Methods We analysed observational cohort data on all PWH aged ≥15 years experiencing a diagnosed SARS‐CoV‐2 infection (until March 2022) who accessed public sector healthcare in the Western Cape province of South Africa. Logistic regression was used to study the association of mortality with evidence of antiretroviral therapy (ART) collection, time since first HIV evidence, CD4 cell count, viral load (among those with evidence of ART collection) and COVID‐19 vaccination, adjusting for demographic characteristics, comorbidities, admission pressure, location and time period. Results Mortality occurred in 5.7% (95% CI: 5.3–6.0) of 17,831 first‐diagnosed infections. Higher mortality was associated with lower recent CD4, no evidence of ART collection, high or unknown recent viral load and recent first HIV evidence, differentially by age. Vaccination was protective. The burden of comorbidities was high, and tuberculosis (especially more recent episodes of tuberculosis), chronic kidney disease, diabetes and hypertension were associated with higher mortality, more strongly in younger adults. Conclusions Mortality was strongly associated with suboptimal HIV control, and the prevalence of these risk factors increased in later COVID‐19 waves. It remains a public health priority to ensure that PWH are on suppressive ART and vaccinated, and to manage any disruptions in care that occurred during the pandemic. The diagnosis and management of comorbidities, including tuberculosis, should be optimized.
Abstract Introduction There is an urgent need for more efficient models of differentiated antiretroviral therapy (ART) delivery for people living with HIV (PLHIV), with the World Health Organization calling for evidence to guide whether annual ART prescriptions and consultations (12M scripts) should be recommended in global guidelines. We assessed the association between 12M scripts (allowed temporarily during the COVID-19 pandemic), versus standard 6-month prescriptions and clinical review (6M scripts), and clinical outcomes. Methods We performed a retrospective cohort study using routine, de-identified data from 59 public clinics in KwaZulu-Natal, South Africa. We included PLHIV aged >18 years with a recent suppressed viral load (VL) who had been referred for community ART delivery with 6M or 12M scripts. We used modified Poisson regression to compare 12-month retention in care (not >90 days late for any visit) and viral suppression (<50 copies/mL) between the prescription groups. Results Among 27,148 PLHIV referred for community ART between June and December 2020, 42.6% received 6M scripts and 57.4% 12M scripts. The median age was 39 years (interquartile range [IQR] 33-46) and 69.4% were women. Age, gender, prior community ART use and time on ART were similar in the two groups. However, more of the 12M script group had a dolutegravir-based regimen (60.0% versus 46.3%). The median (IQR) number of clinic visits in the 12 months of follow-up was 1 (1-1) in the 12M group and 2 (2-3) in the 6M group. Retention at 12 months was 94.6% (95% confidence interval [CI] 94.2%-94.9%) among those receiving 12M scripts and 91.8% (95% CI 91.3%-92.3%) among those with 6M scripts. Follow-up VL data were missing for 17.1% and 16.9% of clients in the 12M and 6M groups, respectively. Among those with VLs, 91.0% (95% CI 90.5%-91.5%) in the 12M group and 89.7% (95% CI 89.0%-90.3%) in the 6M group were suppressed.
After adjusting for age, gender, ART regimen, time on ART, prior community ART use and calendar month, retention (adjusted risk ratio [aRR]: 1.03, 95% CI 1.01-1.05) and suppression (aRR: 1.01, 95% CI 1.00-1.02) were similar in the prescription groups. Conclusions Wider use of 12M scripts could reduce clinic visits without impacting short-term clinical outcomes.
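As a rough consistency check (a sketch only, not the study's adjusted modified Poisson analysis, and without confidence intervals), the crude risk ratios implied by the reported proportions can be computed directly:

```python
# Crude risk ratios from the reported proportions (illustration only;
# the published aRRs additionally adjust for age, gender, regimen, etc.).

def risk_ratio(p_exposed: float, p_reference: float) -> float:
    """Ratio of outcome proportions: exposed (12M) vs reference (6M) group."""
    return p_exposed / p_reference

rr_retention = risk_ratio(0.946, 0.918)    # retention: 94.6% vs 91.8%
rr_suppression = risk_ratio(0.910, 0.897)  # suppression: 91.0% vs 89.7%
print(round(rr_retention, 2), round(rr_suppression, 2))  # 1.03 1.01
```

Both crude ratios match the adjusted estimates (aRR 1.03 and 1.01) to two decimal places, consistent with the groups being well balanced on measured covariates.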
Introduction: Temporal trends in CF survival in low-middle-income settings are poorly reported. We describe changes in CF survival after diagnosis over 40 years at a South African (SA) CF center. Methods: An observational cohort study of people diagnosed with CF from 1974 to 2019. Changes in age-specific mortality rates from the year 2000 (versus before 2000) were estimated using multivariable Poisson regression. Data were stratified by current age < or ≥ 10 years, and models controlled for diagnosis age, sex, ethnicity, genotype, and P. aeruginosa (PA) infection. A second analysis explored the association of mortality with weight and FEV1 z-scores at age 5-8 years. Results: 288 people (52% male; 57% Caucasian; 44% p.Phe508del homozygous) were included (median diagnosis age 0.5 years; Q1, Q3: 0.2, 2.5); 58 (35%) died and 30 (10%) were lost to follow-up. Among those aged ≥10 years, age-specific mortality from year 2000 was significantly lower (adjusted hazard ratio [aHR] 0.14; 95% CI: 0.06, 0.29; p<0.001), but not among those aged <10 years (aHR 0.67; 95% CI: 0.28, 1.64; p=0.383). In children <10 years, Caucasian ethnicity was associated with lower mortality (aHR 0.17; 95% CI 0.05, 0.63), and time since first PA infection with higher mortality (aHR 1.31; 95% CI 1.01, 1.68). Mortality was 7-fold higher if FEV1z was < -2.0 at age 5-8 years (aHR 7.64; 95% CI 2.58, 22.59). Conclusion: Overall, CF survival has significantly improved in SA from year 2000 in people older than 10 years. However, increased risk of mortality persists in young non-Caucasian children, and with FEV1z < -2.0 at age 5-8 years.
Assays for classifying HIV infections as 'recent' or 'nonrecent' for incidence surveillance fail to simultaneously achieve large mean durations of recent infection (MDRIs) and low false-recent rates (FRRs), particularly in virally suppressed persons. We explored the potential for optimizing recent infection testing algorithms (RITAs) by introducing viral load criteria and tuning the thresholds used to dichotomize quantitative measures. The Consortium for the Evaluation and Performance of HIV Incidence Assays characterized over 2000 possible RITAs constructed from seven assays (Limiting Antigen, BED, Less-sensitive Vitros, Vitros Avidity, BioRad Avidity, Architect Avidity and Geenius) applied to 2500 diverse specimens. MDRIs were estimated using regression, and FRRs as observed 'recent' proportions, in various specimen sets. Context-specific FRRs were estimated for hypothetical scenarios. FRRs were made directly comparable by constructing RITAs with the same MDRI through the tuning of thresholds. RITA utility was summarized by the precision of incidence estimation. All assays produced high FRRs among treated patients and elite controllers (10-80%). Viral load testing reduced FRRs but diminished MDRIs. Context-specific FRRs varied substantially by scenario; BioRad Avidity and Limiting Antigen provided the lowest FRRs and highest incidence precision in the scenarios considered. The introduction of a low viral load threshold provides crucial improvements in RITAs. However, it does not eliminate nonzero FRRs, and MDRIs must be consistently estimated. The tuning of thresholds is essential for comparing and optimizing the use of assays. The translation of directly measured FRRs into context-specific FRRs critically affects their magnitudes and our understanding of the utility of assays.
Background Mean duration of recent infection (MDRI) and misclassification of long-term HIV-1 infections, expressed as the proportion false recent (PFR), are critical parameters for laboratory-based assays used to estimate HIV-1 incidence. A recent review of the data by us and others indicated that the previously estimated MDRI of the LAg-Avidity EIA required recalibration. We present here the results of recalibration efforts using >250 seroconversion panels and multiple statistical methods to ensure accuracy and consensus. Methods A total of 2737 longitudinal specimens collected from 259 seroconverting individuals infected with diverse HIV-1 subtypes were tested with the LAg-Avidity EIA as previously described. Data were analyzed to determine the MDRI at ODn cutoffs of 1.0 to 2.0 using 7 statistical approaches, and sub-analyzed by HIV-1 subtype. In addition, 3740 specimens from individuals infected for >1 year, including 488 from patients with AIDS, were tested to determine the PFR at varying cutoffs. Results Across the statistical methods, MDRI values ranged from 88-94 days at cutoff ODn = 1.0 to 177-183 days at ODn = 2.0. The MDRI values were similar across methods, suggesting coherence of the different approaches. Testing for misclassification among long-term infections indicated that overall PFRs were 0.6% to 2.5% at increasing cutoffs of 1.0 to 2.0, respectively. Balancing the need for a longer MDRI against a smaller PFR (<2.0%) suggests that a cutoff ODn = 1.5, corresponding to an MDRI of 130 days, should be used for cross-sectional application. The MDRI varied among subtypes, from 109 days (subtypes A and D) to 152 days (subtype C). Conclusions Based on the new data and revised analysis, we recommend an ODn cutoff of 1.5 to classify recent and long-term infections, corresponding to an MDRI of 130 days (118-142). Determination of revised parameters for estimation of HIV-1 incidence should facilitate application of the LAg-Avidity EIA for worldwide use.
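For context, MDRI and false-recent parameters such as these are the inputs to the standard adjusted cross-sectional incidence estimator. The sketch below shows that estimator's general form; the survey counts used are hypothetical, not from this study.

```python
# Sketch of the adjusted cross-sectional incidence estimator that links
# MDRI (Omega_T) and the false-recent rate (beta_T) to annualized incidence.
# All survey counts below are hypothetical.

def incidence_estimate(n_recent: int, n_positive: int, n_negative: int,
                       mdri_years: float, frr: float,
                       cutoff_years: float = 2.0) -> float:
    """Annualized incidence from a single cross-sectional survey.

    mdri_years:   mean duration of recent infection, in years.
    frr:          false-recent rate among long-infected persons.
    cutoff_years: the post-infection time cutoff T defining 'recent'.
    """
    return (n_recent - frr * n_positive) / (
        n_negative * (mdri_years - frr * cutoff_years))

# Hypothetical survey: 50 'recent' results among 1000 HIV-positive and
# 9000 HIV-negative participants, MDRI 130 days, FRR 1.5%:
incidence = incidence_estimate(50, 1000, 9000, 130 / 365.25, 0.015)
print(round(100 * incidence, 2))  # incidence in % per year
```

The form makes the trade-off in the abstract concrete: a larger MDRI shrinks the denominator's sensitivity to noise, while a nonzero FRR must be subtracted from both numerator and denominator, so small FRR errors can shift the estimate substantially.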
Aim and objectives: Temporal trends in CF survival in low-middle-income (LMIC) settings are poorly reported. We describe changes in CF survival over 40 years at a South African (SA) CF center. Methods: A cohort study of people with CF from 1974 to 2019. Changes in age-specific mortality rates (MR) before/after year 2000 were estimated using multivariable Poisson regression. Adjusted models were stratified by age 10 years and controlled for diagnosis age, sex, ancestry, genotype, and P. aeruginosa (PA) infection. An adjusted sub-analysis explored the association between the MR ratio (aMRr) and FEV1 z-scores at age 5-8 years. Results: 288 people (52% male; 57% Caucasian; 44% p.Phe508del homozygous) were included (median diagnosis age 0.5 years, IQR 0.2-2.5); 58 (35%) died and 30 (10%) were lost to follow-up. Age-specific MR reduced by 64% (figure 1) after year 2000, with a significant reduction at age >10 years (aMRr 0.25; 95% CI 0.14, 0.46; p<0.01) compared to <10 years (aMRr 0.67; 95% CI 0.28, 1.64). In children <10 years, Caucasian ancestry was associated with lower mortality (aMRr 0.17; 95% CI 0.04, 0.63), and time since first PA infection with higher mortality (aMRr 1.30; 95% CI 1.01, 1.68). Mortality was 6-fold higher if FEV1z was < -2.0 at age 5-8 years (aMRr 6.56; 95% CI 2.40, 17.9). Conclusion: Overall, CF survival has significantly improved in SA after year 2000. However, increased risk of mortality persists in non-Caucasians and children with FEV1z < -2.0 at age 5-8 years.
Abstract Introduction There are few data on long‐term implementation and outcomes for people living with HIV (PLHIV) in differentiated antiretroviral therapy (ART) delivery programmes. We aimed to analyse usage patterns of, and associated treatment outcomes in, a community ART programme within the Centralized Chronic Medicines Dispensing and Distribution programme in South Africa over 3.5 years. Methods We performed a retrospective cohort study among PLHIV on first‐line ART who were eligible for community ART delivery between October 2016 and March 2019, from 56 urban clinics in KwaZulu‐Natal, South Africa. Follow‐up ended in March 2020. We measured referral rates and, among those referred, we characterized patterns of community ART usage following referral using group‐based trajectory modelling. We used survival analysis to measure the association between community ART usage and loss to care (no visit for ≥365 days), and logistic regression to measure the association between community ART usage and viraemia (≥50 copies/ml). Results Among the 80,801 patients eligible for community ART, the median age was 36 years, 69.8% were female and the median (interquartile range [IQR]) follow‐up time was 22 (13-31) months. In total, 49,961 (61.8%) were referred after a median of 6 (IQR 2-13) months from first eligibility. After referral, time spent in community ART varied: 42% remained consistently in community ART, 15% returned to consistent clinic‐based care and the remaining 43% oscillated between community ART and clinic‐based care. Following referral, the incidence of loss to care was 3.93 (95% confidence interval [CI]: 3.71-4.15) per 100 person‐years during periods of community ART usage compared to 5.75 (95% CI: 5.28-6.25) during clinic‐based care. In multivariable models, community ART usage was associated with a 36% reduction in the hazard of loss to care (adjusted hazard ratio: 0.64 [95% CI: 0.57-0.72]).
The proportion of patients who became viraemic after first community ART referral was 5.2%, and a 10% increase in time in community ART was associated with a 3% reduction in the odds of viraemia (adjusted odds ratio: 0.97 [95% CI: 0.95-0.99]). Conclusions Community ART usage patterns varied considerably, while clinical outcomes were good. Promoting consistent community ART usage may reduce clinic burden and the likelihood of patients being lost to care, while sustaining viral suppression.
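The incidence figures quoted above are simple event-per-person-time rates. A minimal sketch (with illustrative counts, not the cohort's actual event data):

```python
# Sketch: an incidence of 3.93 per 100 person-years, as reported for
# loss to care during community ART usage, is events / person-time
# scaled to 100 person-years. The counts below are illustrative only.

def rate_per_100py(events: int, person_years: float) -> float:
    """Incidence rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# e.g. 393 losses observed over 10,000 person-years:
print(rate_per_100py(393, 10_000))  # 3.93
```

Comparing two such rates (3.93 vs 5.75 per 100 person-years) gives a crude rate ratio of about 0.68, in the same direction as the adjusted hazard ratio of 0.64 reported from the multivariable survival model.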