Relevance Vector Machine for Survival Analysis

2016 
The accelerated failure time (AFT) model is widely used for the analysis of censored survival or failure-time data. However, the AFT model imposes a restrictive log-linear relationship between the survival time and the explanatory variables. In this paper, we introduce a relevance vector machine survival (RVMS) model based on the Weibull AFT model that uses a kernel framework to automatically learn possible nonlinear effects of the input explanatory variables on the target survival times. We take advantage of Bayesian inference to estimate the model parameters. We also introduce two approaches to accelerate RVMS training. In the first approach, an efficient smooth prior is employed that improves the degree of sparsity. In the second approach, a fast marginal likelihood maximization procedure obtains a sparse solution to the survival analysis task by sequential addition and deletion of candidate basis functions. These two approaches, denoted smooth RVMS and fast RVMS, typically use fewer basis functions than RVMS and reduce the training time, at the cost of a slight degradation in performance. We compare RVMS and the two accelerated approaches with a previous sparse kernel survival analysis method on a synthetic data set as well as six real-world data sets. The proposed kernel survival analysis models prove more accurate in prediction while also benefiting from extra sparsity. The main advantages of our proposed models are: 1) extra sparsity, which leads to better generalization and avoids overfitting; 2) automatic, data-driven determination of relevant samples, which provides higher accuracy, in particular for highly censored survival data; and 3) the flexibility to use an arbitrary number and variety of kernel functions (e.g., non-Mercer kernels and multikernel learning).
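To make the model concrete, below is a minimal sketch (ours, not the authors' code) of the censored Weibull AFT log-likelihood in which the usual linear predictor x'beta is replaced by a kernel expansion over the training samples, which is the core RVMS construction. The Gaussian RBF kernel, the gamma parameter, and all function names are illustrative assumptions; the full RVMS additionally places a zero-mean Gaussian prior with an individual precision hyperparameter on each weight and estimates these by Bayesian inference, pruning basis functions whose precisions diverge.

    import numpy as np

    def rbf_kernel(X1, X2, gamma=1.0):
        # Gaussian RBF kernel matrix; any kernel (even non-Mercer) could be swapped in.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def weibull_aft_neg_loglik(params, K, log_t, event):
        # params = [w0, w_1..w_N, log_sigma]; K = (N, N) kernel matrix;
        # log_t = log survival/censoring times; event = 1 observed, 0 right-censored.
        w0, w, sigma = params[0], params[1:-1], np.exp(params[-1])
        mu = w0 + K @ w                          # kernel predictor replaces x' * beta
        z = (log_t - mu) / sigma                 # standardized log-time residual
        log_f = -np.log(sigma) + z - np.exp(z)   # Gumbel log-density (events)
        log_S = -np.exp(z)                       # Gumbel log-survival (censored)
        return -np.sum(event * log_f + (1 - event) * log_S)

Observed events contribute the log-density term and right-censored observations the log-survival term; this is the standard AFT treatment of censoring, which the kernel expansion leaves untouched.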
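The fast RVMS variant described above is presumably an instance of the fast marginal likelihood maximization scheme of Tipping and Faul (2003), in which each candidate basis function is added, re-estimated, or deleted according to two per-basis statistics, a sparsity factor s_i and a quality factor q_i, computed from the current posterior (for survival data, e.g., via a Laplace approximation of the AFT likelihood). A hedged sketch of just that decision rule, with the computation of s and q omitted and the function name our own:

    import numpy as np

    def fast_rvm_step(alpha, s, q, i):
        # alpha: prior precisions, with np.inf meaning basis i is excluded;
        # s[i], q[i]: sparsity/quality factors of candidate basis i.
        theta = q[i] ** 2 - s[i]
        if theta > 0:
            alpha[i] = s[i] ** 2 / theta   # add basis i, or re-estimate its precision
        else:
            alpha[i] = np.inf              # delete basis i (relevance vector pruned)
        return alpha

Iterating this rule over the candidates until the marginal likelihood stops improving yields the sequential addition and deletion of basis functions mentioned in the abstract.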