Censoring Unbiased Regression Trees and Ensembles. Academic Article

Overview

abstract

  • This paper proposes a novel paradigm for building regression trees and ensemble learners in survival analysis. We introduce generalizations of the CART and Random Forests algorithms to general loss functions and, in the latter case, to more general bootstrap procedures. These results, in combination with an extension of the theory of censoring unbiased transformations applicable to loss functions, underpin the development of two new classes of algorithms for constructing survival trees and survival forests: Censoring Unbiased Regression Trees and Censoring Unbiased Regression Ensembles. For a certain "doubly robust" censoring unbiased transformation of squared error loss, we further show how these new algorithms can be implemented using existing software (e.g., CART, random forests). Comparisons of these methods to existing ensemble procedures for predicting survival probabilities are provided in both simulated settings and through applications to four datasets. It is shown that these new methods either improve upon, or remain competitive with, existing implementations of random survival forests, conditional inference forests, and recursively imputed survival trees.
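To make the idea concrete, here is a minimal sketch of the simplest censoring unbiased transformation, inverse-probability-of-censoring weighting (IPCW): transform the observed responses so their mean is unbiased for the uncensored mean, after which any off-the-shelf regression tree or forest can be fit to the transformed data. This is only a simplified special case of the family the paper discusses (the paper's preferred "doubly robust" transformation adds an augmentation term not shown here), and the simulation setup and all variable names are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Illustrative simulated survival data (assumed setup, not from the paper).
rng = np.random.default_rng(0)
n = 20000
t = rng.exponential(1.0, size=n)      # latent survival times, E[T] = 1
c = rng.exponential(2.0, size=n)      # independent censoring times
y = np.minimum(t, c)                  # observed follow-up time
delta = (t <= c).astype(float)        # 1 = event observed, 0 = censored

def censoring_km_left(y, delta):
    """Kaplan-Meier estimate of the censoring survival function evaluated
    as a left limit, G(t-) = P(C >= t), at each subject's own follow-up
    time. Censoring is the 'event' here, so the product-limit factors use
    the censored observations (delta == 0)."""
    order = np.argsort(y, kind="stable")
    ys, ds = y[order], delta[order]
    g_at_y = np.empty_like(y)
    g, i, n_total = 1.0, 0, len(y)
    while i < n_total:
        j = i
        while j < n_total and ys[j] == ys[i]:
            j += 1                        # group tied times
        g_at_y[order[i:j]] = g            # left limit: product over s < y_i
        n_risk = n_total - i              # subjects with y >= ys[i]
        n_cens = np.sum(1.0 - ds[i:j])    # censorings at this time
        g *= 1.0 - n_cens / n_risk
        i = j
    return g_at_y

g_hat = censoring_km_left(y, delta)
y_star = delta * y / g_hat            # IPCW-transformed response

print(f"naive mean of observed times: {y.mean():.3f}")         # biased low
print(f"mean of IPCW-transformed times: {y_star.mean():.3f}")  # near E[T]
```

In this sketch, `y_star` (and any covariates) could then be passed directly to a standard regression-tree implementation, which is the sense in which transformation-based survival trees reuse existing software; the doubly robust version replaces the weights above with a transformation that remains consistent if either the censoring model or the conditional survival model is correctly specified.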

publication date

  • July 9, 2018

Identity

PubMed Central ID

  • PMC6561730

Scopus Document Identifier

  • 85049629772

Digital Object Identifier (DOI)

  • 10.1080/01621459.2017.1407775

PubMed ID

  • 31190691

Additional Document Info

volume

  • 114

issue

  • 525