Survival regression with proper scoring rules and monotonic neural networks

We consider frequently used scoring rules for right-censored survival regression models such as time-dependent concordance, survival-CRPS, integrated Brier score and integrated binomial log-likelihood, and prove that none of them is a proper scoring rule. This means that the true survival distribution may be scored worse than incorrect distributions, leading to inaccurate estimation. We prove, in contrast to these scores, that the right-censored log-likelihood is a proper scoring rule, i.e. the highest expected score is achieved by the true distribution. Despite this, modern feed-forward neural-network-based survival regression models are unable to train and validate directly on right-censored log-likelihood, due to its intractability, and resort to the aforementioned alternatives, i.e. non-proper scoring rules. We therefore propose a simple novel survival regression method capable of directly optimizing log-likelihood using a monotonic restriction on the time-dependent weights, coined SurvivalMonotonic-net (SuMo-net). SuMo-net achieves state-of-the-art log-likelihood scores across several datasets with a 20–100x computational speedup on inference over existing state-of-the-art neural methods and is readily applicable to datasets with several million observations.
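
For context, the right-censored log-likelihood referred to above scores a predicted conditional density $f(\cdot \mid x)$ with survival function $S(\cdot \mid x)$ on an observation $(x, t, \delta)$, where $\delta = 1$ marks an observed event and $\delta = 0$ right censoring (standard survival-analysis notation, not taken from the paper):

\[
\ell(f;\, x, t, \delta) \;=\; \delta \,\log f(t \mid x) \;+\; (1 - \delta)\,\log S(t \mid x).
\]

Propriety means that, under independent censoring, the expectation of this score over the data-generating distribution is maximized when $f$ is the true conditional density of the event time.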

Bibliographic Details
Main Authors: Rindt, D, Hu, R, Steinsaltz, D, Sejdinovic, D
Format: Conference item
Language: English
Published: Journal of Machine Learning Research, 2022
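
The abstract only sketches the method. As a purely illustrative reading of "a monotonic restriction on the time-dependent weights", the snippet below (PyTorch; the class name, layer sizes and architectural choices are our own assumptions, not the authors' released SuMo-net code) models the conditional CDF F(t | x) with a network kept monotonically increasing in t by constraining every weight on the time path to be non-negative, recovers the density by differentiating through the network, and trains directly on the right-censored log-likelihood shown above.

```python
# Illustrative sketch only: a minimal partially-monotone survival network
# trained on the right-censored log-likelihood. Not the authors' SuMo-net code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MonotoneSurvivalNet(nn.Module):
    """Models the conditional CDF F(t | x) with a network that is
    monotonically increasing in t: every weight on the time path is
    reparameterised through softplus, so it stays non-negative."""

    def __init__(self, x_dim, hidden=32):
        super().__init__()
        self.x_layer = nn.Linear(x_dim, hidden)            # unconstrained covariate path
        self.t_weight_raw = nn.Parameter(torch.randn(hidden, 1) * 0.1)
        self.out_weight_raw = nn.Parameter(torch.randn(1, hidden) * 0.1)
        self.out_bias = nn.Parameter(torch.zeros(1))

    def cdf(self, x, t):
        # Non-negative weights on t keep the pre-activation increasing in t;
        # tanh and a non-negative output layer preserve that monotonicity.
        # (A real model would also anchor F(0 | x) = 0 and push F -> 1 as
        # t -> infinity; omitted here for brevity.)
        w_t = F.softplus(self.t_weight_raw)                 # (hidden, 1), >= 0
        w_out = F.softplus(self.out_weight_raw)             # (1, hidden), >= 0
        h = torch.tanh(self.x_layer(x) + t @ w_t.T)         # (batch, hidden)
        return torch.sigmoid(h @ w_out.T + self.out_bias)   # (batch, 1), in (0, 1)

    def loss(self, x, t, event):
        """Negative right-censored log-likelihood.
        event = 1 if the event was observed, 0 if right-censored."""
        t = t.requires_grad_(True)
        cdf = self.cdf(x, t)
        surv = 1.0 - cdf
        # Density f(t | x) = dF/dt, obtained by differentiating through the net;
        # monotonicity guarantees it is non-negative.
        dens = torch.autograd.grad(cdf.sum(), t, create_graph=True)[0]
        loglik = event * torch.log(dens + 1e-8) + (1 - event) * torch.log(surv + 1e-8)
        return -loglik.mean()


# Hypothetical usage on synthetic data (names and shapes are ours).
if __name__ == "__main__":
    torch.manual_seed(0)
    n, d = 256, 5
    x = torch.randn(n, d)
    t = torch.rand(n, 1) * 10
    event = (torch.rand(n, 1) < 0.7).float()                # ~30% right-censored

    model = MonotoneSurvivalNet(x_dim=d)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for step in range(200):
        opt.zero_grad()
        nll = model.loss(x, t, event)
        nll.backward()
        opt.step()
```

In this sketch the non-negativity constraint is what makes the likelihood tractable: dF/dt is guaranteed non-negative and comes from one forward pass plus one derivative, so the right-censored log-likelihood can be optimized directly instead of being replaced by a non-proper surrogate score.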