Acute aortic dissection in pregnancy is a rare event and rarer still in healthy young women; however, women with a bicuspid aortic valve or Marfan syndrome are at higher risk of dissection. The relationship between pregnancy and aortic dissection remains unclear. We describe two women with no history of cardiovascular disease who developed an acute type A aortic dissection within a few days after term delivery. Surgical repair was performed with replacement of the ascending aorta and sparing of the aortic valve. In both cases, the dissection was diagnosed within a few days after a cesarean section performed for reasons unrelated to fetal or maternal distress. To date, only one case of type A and two cases of type B aortic dissection following cesarean section have been reported. Compared with spontaneous delivery, a scheduled cesarean section, as in our cases, allows better control of hemodynamic parameters and would be expected to protect against aortic dissection. Postoperative screening for inherited connective tissue disorders detected no mutations in the fibrillin or collagen genes in either patient. Postoperative recovery was uneventful, and the patients were discharged on postoperative days 7 and 8, respectively.
Importance
Alteration in lung microbes is associated with disease progression in idiopathic pulmonary fibrosis.
Objective
To assess the effect of antimicrobial therapy on clinical outcomes in patients with idiopathic pulmonary fibrosis.
Design, Setting, and Participants
Pragmatic, randomized, unblinded clinical trial conducted across 35 US sites. A total of 513 patients older than 40 years were randomized from August 2017 to June 2019 (final follow-up was January 2020).
Interventions
Patients were randomized in a 1:1 allocation ratio to receive antimicrobials (n = 254) or usual care alone (n = 259). Antimicrobials included co-trimoxazole (trimethoprim 160 mg/sulfamethoxazole 800 mg twice daily plus folic acid 5 mg daily, n = 128) or doxycycline (100 mg once daily if body weight <50 kg or 100 mg twice daily if ≥50 kg, n = 126). No placebo was administered in the usual care alone group.
Main Outcomes and Measures
The primary end point was time to first nonelective respiratory hospitalization or all-cause mortality.
Results
Among the 513 patients who were randomized (mean age, 71 years; 23.6% women), all (100%) were included in the analysis. The study was terminated for futility on December 18, 2019. After a mean follow-up of 13.1 months (median, 12.7 months), a total of 108 primary end point events had occurred: 52 events (20.4 events per 100 patient-years [95% CI, 14.8-25.9]) in the usual care plus antimicrobial therapy group and 56 events (18.4 events per 100 patient-years [95% CI, 13.2-23.6]) in the usual care group, with no significant difference between groups (adjusted HR, 1.04 [95% CI, 0.71-1.53]; P = .83). The treatment effect on the primary end point did not differ significantly by prespecified antimicrobial agent (adjusted HR, 1.15 [95% CI, 0.68-1.95] in the co-trimoxazole group vs 0.82 [95% CI, 0.46-1.47] in the doxycycline group; P = .66 for interaction). Serious adverse events occurring at a frequency of 5% or greater among those treated with usual care plus antimicrobials vs usual care alone included respiratory events (16.5% vs 10.0%) and infections (2.8% vs 6.6%); adverse events of special interest included diarrhea (10.2% vs 3.1%) and rash (6.7% vs 0%).
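As a brief illustration of how the rates above relate to the raw event counts (a rough sketch, assuming the reported figures are simple crude rates of events divided by total follow-up time, which the abstract does not state explicitly):

\[
\text{rate per 100 patient-years} = \frac{\text{number of events}}{\text{total patient-years of follow-up}} \times 100,
\qquad
\frac{52}{20.4/100} \approx 255 \ \text{patient-years}, \quad
\frac{56}{18.4/100} \approx 304 \ \text{patient-years}.
\]

Under that assumption, the antimicrobial and usual care groups would have accrued roughly 255 and 304 patient-years of follow-up, respectively; the abstract itself does not report these denominators.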
Conclusions and Relevance
Among adults with idiopathic pulmonary fibrosis, the addition of co-trimoxazole or doxycycline to usual care, compared with usual care alone, did not significantly improve time to nonelective respiratory hospitalization or death. These findings do not support treatment with these antibiotics for the underlying disease.
Circulatory shock, and endotoxin shock in particular, is characterized by the release of a large number of mediators, among which proteases play a key role. The production of oxygen free radicals in the extracellular space and the resulting increase in capillary permeability are among the most important consequences of this phenomenon. To evaluate the efficacy of gabexate mesilate (Foy) in preventing this increase in microvascular permeability, an experimental model of endotoxin shock was used. Experiments were performed on the mesocecum of male Wistar rats; fluorescently labeled bovine albumin was injected intra-arterially to assess capillary permeability, and the mesocecal microcirculation was observed under fluorescent light. The control group received saline i.v.; group II received an LD100 of E. coli endotoxin (Difco O111:B4); groups III and IV received a continuous infusion or a topical application of gabexate mesilate, respectively, before the administration of endotoxin. To evaluate capillary permeability and quantify the degree of extravasation by counting the number of leaky sites, fluorescently labeled bovine albumin was injected i.v. and the mesocecum was observed by fluorescence microscopy for 2 hours. Capillary permeability did not increase in control rats; it increased markedly in rats receiving endotoxin i.v., but showed almost no increase in rats receiving gabexate mesilate (Foy), which prevented the increase in capillary permeability observed in the group treated with endotoxin alone.
Background and aim: To analyze the quality of the data and the preliminary results for the first 100 patients enrolled in the RATIO trial (low-dose [LD] 150 U/kg versus high-dose [HD] 300 U/kg heparinization, with target ACT of 250 and 400 seconds, respectively). Methods: ACT was recorded before anesthesia, 5 minutes after heparinization, every 30 minutes thereafter, and 10 minutes after full protamine reversal. Primary endpoints were the occurrence, in the first 30 postoperative days, of vascular events (death from vascular causes, perioperative myocardial infarction [PMI], stroke) or major bleeding events (reoperation for excessive bleeding, cardiac tamponade, transfusion of ≥3 U of red cells or platelets). Results: Preoperative mean age, male-to-female ratio, LVEF, left main disease, and extent of coronary artery disease did not differ significantly between the LD (n = 51) and HD (n = 49) groups. DAPT was stopped in 24/51 and 16/49 patients of the LD and HD groups, respectively (p = 0.3). Aspirin was continued throughout surgery in 50/51 and 43/49 patients of the LD and HD groups, respectively (p = 0.5). The table summarizes intraoperative ACT values. No statistically significant difference was observed in the primary cumulative endpoints: no strokes; 1 PMI in the HD group; 2 vs 1 deaths, 2 vs 1 reoperations for bleeding, and 9 vs 10 transfusions in the LD and HD groups, respectively. DAPT was prescribed postoperatively in 42/100 patients. Conclusions: This preliminary analysis seems to confirm the null hypothesis (LD and HD equally safe). LD does not appear to confer any advantage with respect to bleeding. These results could be due to the ACT curve in LD patients tending to exceed the target of 250 seconds and to the LD and HD curves almost overlapping.
Respiratory viral infection (RVI) in lung transplant recipients (LTRs) is a risk factor for chronic lung allograft dysfunction (CLAD). We hypothesized that the donor-derived cell-free DNA fraction (%ddcfDNA) at the time of RVI predicts CLAD progression. We followed 39 LTRs with RVI enrolled in the Genomic Research Alliance for Transplantation for 1 year. Plasma %ddcfDNA was measured by shotgun sequencing, with high %ddcfDNA defined as ≥1% within 7 days of RVI. We examined %ddcfDNA, spirometry, and a composite outcome (progression/failure) of CLAD stage progression, re-transplantation, and death from respiratory failure. Fifty-nine RVI episodes (38 with low and 21 with high %ddcfDNA) were analyzed. Subjects with high %ddcfDNA had a greater median %FEV1 decline at RVI (−13.83 vs. −1.83, p = .007), day 90 (−7.97 vs. 0.91, p = .04), and day 365 (−20.05 vs. 1.09, p = .047) compared with those with low %ddcfDNA, and experienced greater progression/failure within 365 days (52.4% vs. 21.6%, p = .01). Elevated %ddcfDNA at RVI was associated with an increased risk of progression/failure after adjusting for symptoms and days post-transplant (HR = 1.11, p = .04). No difference in %FEV1 decline was seen at any time point when RVIs were grouped by histopathology result at RVI. %ddcfDNA distinguishes LTRs with RVI who will recover lung function from those who will experience sustained decline, a utility not seen with histopathology.
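For readers unfamiliar with the metric, a brief note on how %ddcfDNA is conventionally defined (a hedged clarification; the abstract itself only states the ≥1% threshold, not the formula):

\[
\%\text{ddcfDNA} = \frac{\text{donor-derived cfDNA}}{\text{total plasma cfDNA (donor + recipient)}} \times 100,
\qquad
\text{high} \iff \%\text{ddcfDNA} \ge 1\% \ \text{within 7 days of RVI}.
\]

Under this convention, the ≥1% threshold means that at least one in every hundred cell-free DNA fragments in the recipient's plasma originates from the allograft.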