Assessing Discriminative Performance at External Validation of Clinical Prediction Models.
INTRODUCTION: External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures for judging changes in the c-statistic from the development setting to the external validation setting.

METHODS: We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in a validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set than in the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development sets. Furthermore, we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury.

RESULTS: The permutation test indicated that the validation and development sets were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictor correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2.

CONCLUSION: The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation populations. To correctly interpret the c-statistic found at external validation, it is crucial to disentangle case-mix differences from incorrect regression coefficients.
Main Authors: Daan Nieboer, Tjeerd van der Ploeg, Ewout W Steyerberg
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2016-01-01
Series: PLoS ONE, 11(2): e0148820
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0148820
Online Access: http://europepmc.org/articles/PMC4755533?pdf=render
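The permutation test discussed in the abstract judges whether the discrimination of a fixed model, quantified by the c-statistic, differs between the development and validation samples. This record does not spell out the exact procedure evaluated in the paper, so the sketch below only illustrates the general shape of such a test: pool the two samples, repeatedly re-assign patients at random to groups of the original sizes, and compare the observed difference in c-statistics with its permutation distribution. The function names, the pooling scheme, and the simulated data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a permutation test for a difference in c-statistics between
# a development and an external validation sample, for a model whose linear
# predictor is held fixed. Illustrative only; not the exact test from the paper.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)

def c_statistic(y, lp):
    """c-statistic (ROC area) of linear predictor lp against binary outcome y."""
    return roc_auc_score(y, lp)

def permutation_test_c(lp_dev, y_dev, lp_val, y_val, n_perm=1000):
    """Two-sided permutation p-value for the observed difference in c-statistics."""
    observed = c_statistic(y_dev, lp_dev) - c_statistic(y_val, lp_val)
    lp = np.concatenate([lp_dev, lp_val])
    y = np.concatenate([y_dev, y_val])
    n_dev = len(y_dev)
    extreme = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(y))            # re-assign patients to the two sets
        d, v = idx[:n_dev], idx[n_dev:]
        diff = c_statistic(y[d], lp[d]) - c_statistic(y[v], lp[v])
        if abs(diff) >= abs(observed):
            extreme += 1
    return (extreme + 1) / (n_perm + 1)

def simulate(n, sd_x, beta_true, beta_model):
    """Hypothetical single-predictor data: outcomes follow beta_true, while the
    fixed prediction model applies beta_model to form its linear predictor."""
    x = rng.normal(0.0, sd_x, n)
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-beta_true * x)))
    return beta_model * x, y

# Roughly in the spirit of scenario 1: a more heterogeneous case-mix and a weaker
# true predictor effect in the validation sample, coefficients from development.
lp_dev, y_dev = simulate(1000, sd_x=1.0, beta_true=1.0, beta_model=1.0)
lp_val, y_val = simulate(1000, sd_x=1.5, beta_true=0.7, beta_model=1.0)
print(f"p-value: {permutation_test_c(lp_dev, y_dev, lp_val, y_val):.3f}")
```

With a single predictor the model coefficient only scales the linear predictor and does not change the ranking, so any difference in c-statistics here is driven by case-mix spread and the true predictor effect; the sketch is meant to show the mechanics of the permutation scheme rather than to reproduce the paper's simulation scenarios.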
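The abstract also refers to benchmark values of the c-statistic and to the standard deviation of the linear predictor as tools for disentangling case-mix differences from incorrect coefficients. The sketch below illustrates one common way such a case-mix benchmark can be formed, by simulating outcomes from the model's own predicted risks within a given sample and asking what discrimination would be expected if the model were perfectly valid there. It is not the specific benchmark framework used in the paper, and the linear predictors are hypothetical.

```python
# Minimal sketch of two case-mix summaries: the SD of the linear predictor and a
# model-based benchmark c-statistic (the discrimination expected if the model
# were perfectly valid in the case-mix at hand). Illustrative only.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def benchmark_c(lp, n_sim=200):
    """Average c-statistic of lp against outcomes simulated from the model's own
    predicted probabilities, expit(lp), assuming a logistic model."""
    p = 1.0 / (1.0 + np.exp(-lp))
    aucs = []
    for _ in range(n_sim):
        y_sim = rng.binomial(1, p)
        if 0 < y_sim.sum() < len(y_sim):        # both outcome classes needed
            aucs.append(roc_auc_score(y_sim, lp))
    return float(np.mean(aucs))

# Hypothetical linear predictors: a development sample and a validation sample
# with a less heterogeneous case-mix (smaller spread), as in scenario 2.
lp_dev = rng.normal(0.0, 1.5, 2000)
lp_val = rng.normal(0.0, 0.8, 2000)

print(f"SD of linear predictor, development: {lp_dev.std():.2f}")
print(f"SD of linear predictor, validation:  {lp_val.std():.2f}")
print(f"Benchmark c-statistic, development:  {benchmark_c(lp_dev):.3f}")
print(f"Benchmark c-statistic, validation:   {benchmark_c(lp_val):.3f}")
```

A benchmark c-statistic that is lower in the validation sample than in the development sample then points at the narrower case-mix rather than at incorrect regression coefficients, which is the distinction the conclusion of the abstract emphasizes.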