Summary: The production of evaluated nuclear data consists not only in the determination of best-estimate values for the quantities of interest but also in the estimation of the related uncertainties and correlations. When nuclear data are evaluated with underlying nuclear reaction models, the model parameters are expected to synthesize all the information extracted from the experimental data to which they are adjusted. When dealing with models having few parameters compared to the number of experimental data points – e.g. in resonant cross section analysis – one sometimes faces evaluated uncertainties that are excessively small compared, for instance, with the level of agreement between model and experimental data. To solve this issue, one attempt was to propagate the uncertainty coming from the experimental parameters involved in the data reduction process onto the nuclear physics model parameters. This pushed experimentalists to supply random (statistical) and systematic uncertainties separately, and evaluators to include or mimic the data reduction process in the evaluation. In this way, experimental parameters – also called nuisance parameters – can be used to increase the evaluated parameter uncertainties through marginalization techniques. Two of these methods, Matrix and Bayesian marginalization – sometimes respectively called Analytical and Monte-Carlo marginalization – that are currently used for evaluation are discussed here, and some of their limitations are highlighted. A third, alternative method, also based on a Bayesian approach but using the spectral decomposition of the correlation matrix, is presented on a toy model and on a simple case of resonant cross section analysis.
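Since the third method relies on the spectral decomposition of a correlation matrix, a minimal NumPy sketch of that decomposition may help fix ideas. This is illustrative only: the 3x3 correlation matrix below is hypothetical and not taken from the evaluation work described above.

```python
import numpy as np

# Hypothetical correlation matrix between three nuisance parameters
# (illustrative values only).
C = np.array([[1.0, 0.6, 0.3],
              [0.6, 1.0, 0.5],
              [0.3, 0.5, 1.0]])

# Spectral (eigen) decomposition of the symmetric matrix: C = V diag(w) V^T.
w, V = np.linalg.eigh(C)

# Reconstruct C from its eigenpairs to check the decomposition.
C_rec = (V * w) @ V.T
assert np.allclose(C, C_rec)

# The spectral square root C^(1/2) = V diag(sqrt(w)) V^T can then be used
# to draw correlated samples from independent standard normal draws,
# as in a Monte-Carlo propagation of nuisance-parameter uncertainties.
sqrt_C = (V * np.sqrt(w)) @ V.T
rng = np.random.default_rng(0)
samples = rng.standard_normal((10000, 3)) @ sqrt_C

# The empirical correlation of the samples approximates C.
print(np.corrcoef(samples, rowvar=False))
```

Because `sqrt_C` is symmetric, multiplying independent standard normal draws by it yields samples whose covariance (and, here, correlation) matrix is `C`; the eigendecomposition also exposes near-zero eigenvalues, which signal nearly degenerate directions in the correlation matrix.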