The calibration of the Leksell Gamma Knife Perfexion (LGK PFX) is performed using a spherical polystyrene phantom 160 mm in diameter, provided by the manufacturer. This is the same phantom that has been used with LGK models U, B, C, and 4C. The polystyrene phantom is held in the irradiation position by an aluminum adaptor with stainless steel side-fixation screws. The phantom adaptor partially attenuates the beams from sectors 3 and 7 by 3.2% and 4.6%, respectively. This unintended attenuation introduces a systematic error into the dose calibration. The overall effect of phantom-adaptor attenuation on the output calibration of the LGK PFX unit is to underestimate the output by about 1.0%.
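The quoted 1.0% overall effect is consistent with a back-of-the-envelope average of the two sector attenuations over all eight PFX sectors. The sketch below assumes equal weighting of the sectors, which is a simplification, not a statement of the authors' method:

```python
# Sketch: averaged effect of adaptor attenuation over the eight PFX sectors.
# Per-sector attenuation values are from the abstract; the uniform sector
# weighting is an assumption for this rough check.
attenuation = {3: 0.032, 7: 0.046}  # fractional beam attenuation per affected sector
n_sectors = 8

mean_loss = sum(attenuation.values()) / n_sectors
print(f"approximate mean output loss: {mean_loss:.4f}")  # close to 1%
```

Averaging 3.2% and 4.6% over eight sectors gives roughly 0.98%, matching the stated ~1.0% calibration underestimate.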
Purpose: To improve the quality of the treatment planning CT volume for cancers of the upper abdomen. Method and Materials: Delineation of tumors in the liver or pancreas depends critically on image quality because of the weak radio-opacity difference between the tumor and the surrounding tissue. For cases with tumor motion > 0.5 cm, the phase 50% scan is used for treatment planning, and its quality is worse than that of its helical equivalent. Thus, to improve the signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) of the phase 50% scan, synchronized averaging of the entire 4DCT data set is applied to create a composite CT volume equivalent to the phase 50% scan. Four-dimensional CT scans of ten patients with liver or pancreas cancer were used retrospectively in this study. An in-house implementation of the Demons algorithm was used to deform the other CT phases onto the phase 50% scan before adding them. Results: Improved SNR was observed for all cases; the average improvement in the region of interest was by a factor of 2.8. The composite scans also looked better on visual inspection. Conclusion: Synchronized averaging of the 4DCT scan can be used to obtain better quality treatment planning scans. However, possible artifacts in the 4DCT phases might preclude effective use of the entire set of CT phases.
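The averaging step can be illustrated with a toy example. The sketch below does not reproduce the Demons registration; it assumes the ten phases have already been deformed onto the phase 50% grid, so that only independent noise differs between them, and shows why averaging then improves SNR toward the ideal factor of √10 ≈ 3.2 (the phantom image, ROI coordinates, and noise level are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical noise-free anatomy: a bright "tumor" block on a flat background.
truth = np.zeros((64, 64))
truth[20:40, 20:40] = 100.0

# Ten phases, assumed already deformed onto the phase 50% grid,
# so they differ only by independent measurement noise.
n_phases = 10
noise_sigma = 10.0
phases = [truth + rng.normal(0.0, noise_sigma, truth.shape) for _ in range(n_phases)]

def snr(img):
    """Simple SNR estimate: ROI mean over background noise std."""
    roi = img[20:40, 20:40]        # region of interest (hypothetical)
    background = img[0:15, 0:15]   # uniform background patch
    return roi.mean() / background.std()

composite = np.mean(phases, axis=0)      # synchronized average
gain = snr(composite) / snr(phases[0])   # approaches sqrt(10) for ideal registration
```

In practice residual registration error and 4DCT artifacts keep the measured gain (2.8 in the abstract) below the ideal √N.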
Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scanning with and without the immobilization cover. Methods: 20 pairs of respiratory traces recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC), and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for the inspiration time and its variability, both larger for scans without the cover. The same holds for the variability of the end of exhalation and inhalation. Other parameters show no difference. For both scans the respiratory signals show determinism and nonlinear stationarity, and statistical tests on surrogate data reveal their nonlinearity. The LLEs show the chaotic nature of the signals and their correlation with the breathing period and the embedding delay time. SE, LZC, and LLE measure respiratory signal complexity; these nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on the signal parameters. Analysis based on trajectories of delay vectors shows the nonlinear character of the respiratory system and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility, as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of the LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features.
The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear methodology, one that reflects the characteristics of respiration, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.
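Two of the tools named above, delay embedding and sample entropy, can be sketched compactly. The implementation below is a simplified illustration, not the authors' code: the embedding dimension, tolerance r = 0.2·std, and the sine-versus-noise comparison are all illustrative assumptions. A regular (predictable) signal should score a lower sample entropy than white noise:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def sample_entropy(x, m=2, r=None):
    """Simplified sample entropy: -ln(A/B), where B and A count pairs of
    length-m and length-(m+1) templates matching within tolerance r
    (Chebyshev distance), excluding self-matches."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def pair_count(dim):
        emb = delay_embed(x, dim, 1)
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        n = len(emb)
        return (np.sum(d <= r) - n) / 2  # drop diagonal self-matches
    B, A = pair_count(m), pair_count(m + 1)
    return -np.log(A / B)

# A regular sine wave is more predictable than noise, so its SampEn is lower.
t = np.linspace(0, 8 * np.pi, 400)
rng = np.random.default_rng(1)
se_sine = sample_entropy(np.sin(t))
se_noise = sample_entropy(rng.normal(size=400))
```

The same `delay_embed` output is what one would feed to surrogate-data tests or Lyapunov-exponent estimators such as those used in the study.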
Purpose: To study, using a computational simulation, the accuracy of rigid-body image registration of the brain, with special consideration of the spatial distribution of the homologous fiducial points used in the registration, and to determine how the registration affects the resulting accuracy at target points. Method and Materials: The condition numbers of all combinations of at least four of the 25 candidate fiducial points are examined. This yields the sets of points with the best and the worst condition numbers. Gaussian noise is imposed on the simulated points. The algorithms' accuracy for the sought rotation matrix and translation, the fiducial registration error (FRE), and the target registration error (TRE) are investigated as functions of the number of fiducial points used in the registration. Three different algorithms are used: an SVD-based method, a subspace method for the translation computation, and simultaneous optimization of the Euler angles and translation. Results and Conclusion: The robustness of all the algorithms is similar. For smaller numbers of fiducial points there is a risk of large TREs; the condition number thus reveals a poor prospective input for the algorithm. The number of points needed to reach 1.0 mm accuracy for a 0.5×0.5×1.0 mm³ voxel size is six or more. This examination of the condition number can be used to predetermine the accuracy of the registration for a given input.
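The SVD-based algorithm mentioned above is typically the Kabsch/Procrustes solution; the sketch below shows that variant together with the FRE and TRE definitions, under assumed simulation parameters (six fiducials, 0.3 mm localization noise, a 10° rotation), not the study's actual configuration or its subspace/Euler-angle variants:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform via SVD (Kabsch): dst ≈ R @ src + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

rng = np.random.default_rng(0)
fiducials = rng.uniform(-50, 50, (6, 3))     # six simulated fiducials (mm)

theta = np.deg2rad(10.0)                     # hypothetical true motion
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 2.0])
measured = fiducials @ R_true.T + t_true + rng.normal(0.0, 0.3, fiducials.shape)

R, t = rigid_register(fiducials, measured)

# FRE: RMS residual at the fiducials; TRE: error at a point NOT used in the fit.
fre = np.sqrt(np.mean(np.sum((fiducials @ R.T + t - measured) ** 2, axis=1)))
target = np.array([10.0, 20.0, 30.0])        # hypothetical target point (mm)
tre = np.linalg.norm((R @ target + t) - (R_true @ target + t_true))
```

The condition-number screening described in the abstract would correspond to checking `np.linalg.cond(fiducials - fiducials.mean(axis=0))` for each candidate point set before running the fit.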
Purpose: To examine regularization methods for the deformation vector field obtained with the “Demons” algorithm, and to study adaptive smoothing driven by the incremental variability of the similarity between the source and target images and its effect on the general characteristics of the transformation map. Method and Materials: Synthetic images are used for the numerical experiments. An adaptive iterative smoothing of the deformation field computed with the “Demons” algorithm is examined. The transformation map is examined with respect to the degree of dissimilarity between the matched images and the adaptive filtering. Standard Gaussian filtering with varying standard deviations (σ) is used for the adaptive smoothing. The magnitudes of the deformation vectors and the smoothness characteristics of the deformation maps are examined. Results: The deformation map, reflecting the degree of dissimilarity between the source and target images, becomes more faithful after application of the adaptive regularization. The real magnitude of the deformation between the registered objects affects the effectiveness of the filtering. Since this value is not known, an arbitrary a priori selection of the Gaussian filter is never optimal. The larger the filter's σ, the smaller the magnitude of the obtained deformation vectors; an inverse trend is observed for the magnitude of the map's standard deviation. The convergence rate of the algorithm is affected by the selection of a given σ. The mean squared sum of intensity differences is used to measure image similarity. Conclusion: Adaptive regularization of the deformation field reflects the varying scales of the real deformation and how the algorithm is parametrically allowed to accommodate these differences during the iterations. This idea might be extended to anisotropic adaptive filtering, to accommodate inhomogeneity of the real deformation at a given resolution scale.
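The reported trend, larger filter σ yielding smaller deformation-vector magnitudes, can be demonstrated on a toy field. The sketch below smooths a synthetic 2-D deformation field (a localized displacement bump plus noise, both invented for illustration) at three σ values; it is not the adaptive scheme of the abstract, which varies σ with the image dissimilarity during iteration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Toy 2-D deformation field: localized x-displacement bump plus noise (pixels).
y, x = np.mgrid[0:64, 0:64]
bump = 5.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50.0)
field = np.stack([bump + rng.normal(0.0, 0.5, (64, 64)),   # x-component
                  rng.normal(0.0, 0.5, (64, 64))])         # y-component

def mean_magnitude(f, sigma):
    """Mean deformation-vector magnitude after Gaussian smoothing
    of each vector component with the given sigma."""
    smoothed = np.stack([gaussian_filter(c, sigma) for c in f])
    return np.sqrt((smoothed ** 2).sum(axis=0)).mean()

# Increasing sigma shrinks the mean deformation magnitude, as in the abstract.
mags = [mean_magnitude(field, s) for s in (0.5, 2.0, 8.0)]
```

An adaptive version would recompute σ each iteration from the current image dissimilarity instead of using a fixed value from this list.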