The National Registry for Radiation Workers (NRRW) is the largest epidemiological study of UK radiation workers. Following the first analysis published in 1992, a second analysis has been conducted using an enlarged cohort of 124 743 workers, updated dosimetry and personal data for some workers, and a longer follow-up. Overall levels of mortality were found to be less than those expected from national rates; the standardised mortality ratio for all causes was 82, increasing to 89 after adjusting for social class. This 'healthy worker effect' was particularly strong for lung cancer and for some smoking-related non-malignant diseases. Analysis of potential radiation effects involved testing for any trend in mortality risk with external dose, after adjusting for likely confounding factors. For leukaemia, excluding chronic lymphatic leukaemia (CLL), the central estimate of excess relative risk (ERR) per Sv was similar to that estimated for the Japanese atomic bomb survivors at low doses (without the incorporation of a dose-rate correction factor); the corresponding 90% confidence limits for this trend were tighter than in the first analysis, ranging from just under four times the risk estimated at low doses from the Japanese atomic bomb survivors to about zero. For the grouping of all malignancies other than leukaemia, the central estimate of the trend in risk with dose was closer to zero than in the first analysis; also, the 90% confidence limits were tighter than before and included zero. Since results for lung cancer and non-malignant smoking-related diseases suggested the possibility of confounding by smoking, an examination was made, as in the first analysis, of all malignancies other than leukaemia and lung cancer. In this instance the central estimate of the ERR per Sv was similar to that from the A-bomb data (without the incorporation of a dose-rate correction factor), with a 90% confidence interval ranging from about four times the A-bomb value to less than zero. For multiple myeloma there was an indication of an increasing trend in risk with external dose (p = 0.06), although the evidence for this trend disappeared after omitting workers monitored for exposure to internal emitters. The second NRRW analysis provides stronger inferences than the first on occupational radiation exposure and cancer mortality; the 90% confidence intervals for the risk per unit dose are tighter than before, and now exclude values which are greater than four times those seen among the Japanese A-bomb survivors, although they are also generally consistent with an observation of no raised risk. Furthermore, there is evidence, of borderline statistical significance, of an increasing risk for leukaemia excluding CLL, and, as with solid cancers, the data are consistent with the A-bomb findings.
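The two quantities on which this abstract turns, the standardised mortality ratio (SMR) and a linear trend in excess relative risk (ERR) with external dose, reduce to simple arithmetic. The Python sketch below shows how each is formed; every number in it is illustrative only and is not taken from the NRRW data.

```python
# Minimal sketch (hypothetical numbers, not NRRW data) of the two quantities
# used in the abstract: the standardised mortality ratio (SMR) and a linear
# excess-relative-risk (ERR) trend with dose.

def smr(observed_deaths: float, expected_deaths: float) -> float:
    """SMR on the scale used in the abstract, where 100 = national rates."""
    return 100.0 * observed_deaths / expected_deaths

def relative_risk(dose_sv: float, err_per_sv: float) -> float:
    """Linear ERR trend model: RR(D) = 1 + ERR_per_Sv * D."""
    return 1.0 + err_per_sv * dose_sv

if __name__ == "__main__":
    # Hypothetical worker group: 820 deaths observed against 1000 expected
    # from national rates gives SMR = 82, the kind of deficit described as
    # the 'healthy worker effect'.
    print(f"SMR = {smr(820, 1000):.0f}")

    # Hypothetical ERR of 2 per Sv evaluated at a cumulative occupational
    # dose of 50 mSv (0.05 Sv).
    print(f"RR at 50 mSv = {relative_risk(0.05, 2.0):.3f}")
```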
The dose-response relationship for radiation-induced leukemia was examined in a pooled analysis of three exposed populations: Japanese atomic bomb survivors, women treated for cervical cancer, and patients irradiated for ankylosing spondylitis. A total of 383 leukemias were observed among 283,139 study subjects. Considering all leukemias apart from chronic lymphocytic leukemia, the optimal relative risk model had a dose response with a purely quadratic term representing induction and an exponential term consistent with cell sterilization at high doses; the addition of a linear induction term did not improve the fit of the model. The relative risk decreased with increasing time since exposure and increasing attained age, and there were significant (P < 0.00001) differences in the parameters of the model between datasets. These differences were related in part to the significant differences (P = 0.003) between the models fitted to the three main radiogenic leukemia subtypes (acute myeloid leukemia, acute lymphocytic leukemia, chronic myeloid leukemia). When the three datasets were considered together but the analysis was repeated separately for the three leukemia subtypes, for each subtype the optimal model included quadratic and exponential terms in dose. For acute myeloid leukemia and chronic myeloid leukemia, there were reductions of relative risk with increasing time after exposure, whereas for acute lymphocytic leukemia the relative risk decreased with increasing attained age. For each leukemia subtype considered separately, there was no indication of a difference between the studies in the relative risk and its distribution as a function of dose, age and time (P > 0.10 for all three subtypes). The nonsignificant indications of differences between the three datasets when leukemia subtypes were considered separately may be explained by random variation, although a contribution from differences in exposure dose-rate regimens, inhomogeneous dose distribution within the bone marrow, inadequate adjustment for cell sterilization effects, or errors in dosimetry could have played a role.
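The model form described above, a quadratic induction term damped by an exponential cell-sterilization term, with the relative risk declining with time since exposure and attained age, can be written down directly. The Python sketch below illustrates the shape of such a model; the parameter values and centring constants are purely hypothetical and are not the fitted values from the pooled analysis.

```python
import math

def relative_risk(dose_gy: float,
                  years_since_exposure: float,
                  attained_age: float,
                  alpha: float = 1.0,    # hypothetical quadratic induction coefficient (Gy^-2)
                  beta: float = 0.5,     # hypothetical cell-sterilization coefficient (Gy^-1)
                  phi: float = -0.05,    # hypothetical decline with time since exposure (y^-1)
                  psi: float = -0.02) -> float:  # hypothetical decline with attained age (y^-1)
    """Quadratic-exponential relative risk model of the general form described
    in the abstract:

        RR = 1 + alpha * d^2 * exp(-beta * d)
               * exp(phi * (t - 25)) * exp(psi * (a - 50))

    centred, arbitrarily for this sketch, at 25 years since exposure and
    attained age 50."""
    excess = (alpha * dose_gy ** 2 * math.exp(-beta * dose_gy)
              * math.exp(phi * (years_since_exposure - 25.0))
              * math.exp(psi * (attained_age - 50.0)))
    return 1.0 + excess

if __name__ == "__main__":
    # Illustration: the excess rises roughly quadratically at low doses and is
    # damped by the sterilization term at high doses.
    for d in (0.5, 1.0, 2.0, 4.0, 8.0):
        print(d, round(relative_risk(d, years_since_exposure=15, attained_age=45), 3))
```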
Distinguishing an outlier in a time series arising through measurement error from one arising through a perturbation of the underlying system can be of use in data validation. In this paper a method of testing for the presence of an outlier of unknown type is proposed. Then the properties of a rule based on the likelihood ratio which attempts to distinguish the two types of outlier are examined and compared with those of the corresponding Bayes rules. An example involving data from an industrial production process is studied.
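To illustrate the distinction drawn above: in an autoregressive series, a measurement-error (additive) outlier disturbs a single observation, whereas a perturbation of the underlying system (an innovational outlier) enters through the shock and then decays through the dynamics. The Python sketch below is not the paper's procedure; it is a simplified classification of a single outlier at a known position under a known AR(1) model, comparing standardised estimates of the two outlier magnitudes in the spirit of a likelihood ratio rule. All parameter values are illustrative.

```python
import math
import random

# Known (assumed) AR(1) parameters, series length, and outlier position.
PHI, SIGMA, N, T_OUT = 0.7, 1.0, 200, 100

def simulate(outlier_type: str, omega: float = 5.0, seed: int = 1) -> list[float]:
    """Simulate an AR(1) series with one outlier of the given type at T_OUT."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for t in range(N):
        shock = rng.gauss(0.0, SIGMA)
        if outlier_type == "innovational" and t == T_OUT:
            shock += omega                      # perturbation of the system
        x = PHI * x + shock
        y = x
        if outlier_type == "additive" and t == T_OUT:
            y += omega                          # one-off measurement error
        series.append(y)
    return series

def classify(y: list[float]) -> str:
    """Compare standardised magnitude estimates under the two outlier models."""
    # One-step prediction errors under the known AR(1) model; e[i] is the
    # error at time i + 1.
    e = [y[t] - PHI * y[t - 1] for t in range(1, N)]
    e_t, e_t1 = e[T_OUT - 1], e[T_OUT]
    # Innovational outlier affects only the error at T_OUT.
    lam_io = e_t / SIGMA
    # Additive outlier affects the errors at T_OUT and T_OUT + 1 as (+omega, -phi*omega).
    omega_ao = (e_t - PHI * e_t1) / (1.0 + PHI ** 2)
    lam_ao = omega_ao * math.sqrt(1.0 + PHI ** 2) / SIGMA
    return "additive" if abs(lam_ao) > abs(lam_io) else "innovational"

if __name__ == "__main__":
    for kind in ("additive", "innovational"):
        print(kind, "->", classify(simulate(kind)))
```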
An excess of leukemias in children has been observed between 1950 and 1980 in the village of Seascale (population about 3,000), which is situated approximately 3 km to the south of the Sellafield nuclear fuel reprocessing plant in West Cumbria, England. Radiation doses from all the main sources of radiation exposure of the population, and the risks of radiation-induced leukemia, have been calculated for children born and living in Seascale during the period of operation of the plant. For the Seascale study population of 1225 children and young persons, followed to age 20 y, or until 1980 for those born after 1960, 0.016 radiation-induced leukemias are predicted from the Sellafield discharges. This corresponds to an average risk to children in the population of about one in 75,000. For the four fatal leukemias observed in the study population (0.5 expected from United Kingdom statistics) to be attributed to the operations at Sellafield, the average risk would have to be increased by a factor of about 250, to one in 300. Although there is some uncertainty about the releases from the plant and the concentrations of radionuclides in environmental materials in the Sellafield area, particularly for the early years of its operation, it is very unlikely that the doses calculated and the risk coefficients used for radiation-induced leukemia could be so substantially in error. The number of radiation-induced leukemias from all radiation sources is calculated to be 0.1, which corresponds to a risk of about one in 12,250 for the average child in the study population. About two-thirds of this risk is from natural radiation, 16% from the Sellafield discharges, and nuclear weapons fallout and medical exposure each contribute about 9%. The models used for calculating radiation doses from intakes of radionuclides were based upon those recommended by the International Commission on Radiological Protection (ICRP). This presented a number of difficulties in the assessment, including the lack of any generally accepted age-related dosimetric models, particularly for bone-seeking radionuclides; limited information on gut transfer factors for radionuclides incorporated in foodstuffs; and the absence of dosimetric models for the fetus. These and other problems identified in the analysis, which require further information, are discussed.
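The risk figures quoted above follow from simple ratios of predicted cases to the size of the study population. The short Python check below reproduces that arithmetic using the numbers given in the abstract; the published 'one in N' figures are rounded.

```python
# Check of the arithmetic quoted in the abstract; all inputs are the
# abstract's own figures, and the quoted "one in N" risks are rounded.

population = 1225                      # Seascale children and young persons in the study
predicted_from_discharges = 0.016      # radiation-induced leukemias from Sellafield discharges
predicted_all_sources = 0.1            # radiation-induced leukemias from all radiation sources
observed_fatal = 4                     # fatal leukemias observed in the study population

risk_discharges = predicted_from_discharges / population
print(f"Average risk from discharges: about 1 in {round(1 / risk_discharges):,}")  # ~1 in 75,000

factor = observed_fatal / predicted_from_discharges
print(f"Factor needed to attribute the observed cases to Sellafield: about {factor:.0f}")  # ~250
print(f"Implied average risk: about 1 in {round(population / observed_fatal):,}")          # ~1 in 300

risk_all = predicted_all_sources / population
print(f"Average risk from all sources: about 1 in {round(1 / risk_all):,}")                # ~1 in 12,250
```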