Abstract
Background: Mixed response (MR), a scenario featuring discordant tumor changes, has been reported primarily with targeted therapies or immunotherapy. We determined the incidence and prognostic significance of MR in advanced non–small cell lung cancer (NSCLC) treated with cytotoxic chemotherapy.
Patients and Methods: We analyzed patient-level data from ECOG-ACRIN E5508 (carboplatin-paclitaxel + bevacizumab induction followed by randomization to maintenance therapy regimens). For patients with at least 2 target lesions and available measurements after cycle 2, we characterized response as homogeneous response (HR, similar behavior of all lesions), MR (similar behavior but >30% difference in magnitude of best and least responding lesions), or true mixed response (TMR, best and least responding lesions showing different behavior: ≥10% growth versus ≥10% shrinkage). We compared category characteristics using Mann-Whitney U and Chi-square tests, and overall survival (OS) using the log-rank test and Cox models.
Results: Among 965 evaluable patients, HR occurred in 609 patients (63%), MR in 208 (22%), and TMR in 148 (15%). Median OS was 13.6 months for HR, 12.0 months for MR, and 7.6 months for TMR (P < .001). Compared with HR, TMR had inferior OS among stable disease cases (HR 1.62; 95% CI, 1.23-2.12; P < .001) and a trend toward inferior OS among progressive disease cases (HR 1.39; 95% CI, 0.83-2.33; P = .2). In multivariate analysis, TMR was associated with worse OS (HR 1.48; 95% CI, 1.22-1.79; P < .001).
Conclusion: True mixed response occurs in a substantial minority of lung cancer cases treated with chemotherapy and independently confers poor prognosis.
Immune checkpoint inhibitors (ICIs) have revolutionized the treatment paradigm for advanced non-small cell lung cancer (NSCLC). Although certain patients achieve deep, durable responses to checkpoint blockade, most patients with NSCLC do not and may be unnecessarily exposed to ineffective therapy and immune-related toxicities. There is therefore a critical need to identify biomarkers predictive of immunotherapy response. While tumor and immune cell expression of programmed death-ligand 1 (PD-L1) and, more recently, tumor mutational burden (TMB) are used in clinical practice and may correlate with immunotherapy response in selected circumstances, neither consistently predicts an individual patient's likelihood of clinical benefit from ICI therapy. Innovative approaches such as blood-based assays and combination biomarker strategies are under active investigation. This review focuses on the current role and challenges of PD-L1 and TMB as predictive biomarkers for immunotherapy response in advanced NSCLC and explores promising novel biomarker strategies.
Abstract
Background: Neoadjuvant chemotherapy (NAC) is frequently used in gastrointestinal cancers (GIC), and pathological, radiological, and tumor marker responses are assessed during and after NAC.
Aim: To evaluate the relationship between pathologic, radiologic, and tumor marker responses and recurrence‐free survival (RFS), overall survival (OS), and adjuvant chemotherapy (AC) decisions, and to assess the impact of changing to a different AC regimen after a poor response to NAC.
Methods and results: Medical records of GIC patients treated with NAC at Mount Sinai between 1/2012 and 12/2018 were reviewed. One hundred fifty‐six patients (58.3% male; mean age, 63 years) were identified. Primary tumor sites were pancreas in 43 (27.7%), gastroesophageal in 62 (39.7%), and colorectal in 51 (32.7%). After NAC, 31 (19.9%) patients had a favorable pathologic response (FPR; defined as College of American Pathologists [CAP] score 0–1). Of 107 patients with radiological data, 59 (55.1%) had an objective response, and of 113 patients with tumor marker data, 61 (54.0%) had a ≥50% reduction after NAC. FPR, but not radiographic or serological response, was associated with improved RFS (HR 0.28; 95% CI 0.11–0.72) and OS (HR 0.13; 95% CI 0.02–0.94). Changing to an AC regimen different from the initial NAC regimen was not associated with improved RFS or OS, either among all patients or specifically among those with an unfavorable pathological response (UPR; defined as CAP score 2–3) after NAC.
Conclusions: GIC patients with FPR after NAC experienced significant improvements in RFS and OS. Patients with UPR did not benefit from changing AC. Prospective studies are needed to better understand the role of pathological response in AC decisions and outcomes in GIC patients.
Background: Recent modifications to low-dose CT (LDCT)–based lung cancer screening guidelines increase the number of eligible individuals, particularly among racial and ethnic minorities. Because these populations disproportionately live in metropolitan areas, we analyzed the association between travel time and initial LDCT completion within an integrated, urban safety-net health care system.
Methods: Using Esri’s StreetMap Premium, OpenStreetMap, and the r5r package in R, we determined projected private vehicle and public transportation travel times between patient residence and the screening facility for LDCT ordered from March 2017 through December 2022 at Parkland Memorial Hospital in Dallas, Texas. We characterized associations between travel time and LDCT completion in univariable and multivariable analyses. We tested these associations in a simulation of 10,000 permutations of private vehicle and public transportation distribution.
Results: A total of 2,287 patients were included in the analysis, of whom 1,553 (68%) completed the initial ordered LDCT. Mean age was 63 years, and 73% were underrepresented minorities. Median travel time from patient residence to the LDCT screening facility was 17 minutes by private vehicle and 67 minutes by public transportation. There was a small difference in travel time to the LDCT screening facility by public transportation for patients who completed LDCT versus those who did not (67 vs 66 min, respectively; P = .04), but no difference in travel time by private vehicle (17 min for both; P = .67). In multivariable analysis, LDCT completion was not associated with projected travel time to the LDCT facility by private vehicle (odds ratio, 1.01; 95% CI, 0.82–1.25) or public transportation (odds ratio, 1.14; 95% CI, 0.89–1.44). Similar results were noted across travel-type permutations. Black individuals were 29% less likely to complete LDCT screening compared with White individuals.
Conclusions: In an urban population comprising predominantly underrepresented minorities, projected travel time is not associated with initial LDCT completion in an integrated health care system. Other reasons for differences in LDCT completion warrant investigation.
Purpose
The purpose of this study was to determine the long-term efficacy and safety of valproic acid (VPA) treatment in patients with pigmentary retinal dystrophies.
Methods
A retrospective chart review was conducted on 31 patients with a diagnosis of pigmentary retinal dystrophy prescribed VPA at a single centre. Visual field (VF), visual acuity (VA), length of treatment, liver enzymes and side effects were analysed. VF areas were defined using Goldmann VF (GVF) tracings recorded before, during and after VPA treatment using the V4e isopter for each eye. Using custom software, planimetric areas of VF were calculated.
Results
Five of the patients (10 eyes) had two Goldmann VF tracings, allowing comparison between baseline and follow-up VF. After 9.8 months of VPA, VF area decreased by 0.145 cm² (26.5%; p=0.432). For 22 of the patients (41 eyes), VA data were available, and the logarithm of the minimum angle of resolution (logMAR) score worsened by 0.056 log units (representing a decline in VA) after 14.9 months on VPA (p=0.002). Twelve patients (38.7%) reported negative side effects related to VPA use.
Conclusions
VPA plays a complex role in patients with pigmentary retinal dystrophies and may be associated with declines in VA and visual field as well as adverse side effects. Physicians should exercise caution when prescribing VPA for pigmentary retinal dystrophies.