Bounds for the normal approximation of the maximum likelihood estimator

Full description

<p>The asymptotic normality of the maximum likelihood estimator (MLE) under regularity conditions is a long-established and famous result. This is, however, a qualitative result, and assessing the accuracy of such a normal approximation is our main interest. For this task we partly use Stein's method, a probabilistic technique that can be used to measure explicitly the distributional distance between two distributions. Since its first appearance in 1972, the method has been developed for various distributions; here we use the results related to Stein's method for normal approximation.</p>

<p>In this thesis, we derive explicit upper bounds on the distributional distance between the distribution of the MLE and the normal distribution. First, the focus is on independent and identically distributed random variables from both discrete and continuous single-parameter distributions, with particular attention to exponential families. For discrete distributions, the case where the MLE can be on the boundary of the parameter space is treated through a perturbation approach, which allows us to obtain bounds on the distributional distance of interest. The bounds are of order n^(-0.5), where n is the number of observations. Simulation-based results are given to illustrate the power of the bound. Furthermore, the MLE often cannot be obtained analytically, and optimisation methods (such as the Newton-Raphson algorithm) are used; even in such cases, order n^(-0.5) bounds are given for the distributional distance related to the MLE.</p>

<p>The case of multi-parameter distributions follows smoothly after the detailed discussion of a scalar parameter. Apart from extending our approach to a multi-parameter setting, we also cover the case of independent but not necessarily identically distributed (i.n.i.d.) random vectors, with specific focus on the widely applicable linear regression models.</p>

<p>Returning to the single-parameter setting, a different approach to obtaining an upper bound on the distributional distance between the distribution of the MLE and the normal distribution, based on the Delta method, is also developed. The MLE for a Generalised Gamma distribution illustrates the results obtained through this Delta-method approach.</p>

<p>Finally, we relax the independence assumption and obtain results for locally dependent random variables. An example of correlated sums of normally distributed random variables illustrates the bounds. Again, results that do not require knowledge of an analytic expression of the MLE are given. We end this thesis with ideas currently in progress and further open research questions.</p>
Bibliographic Details
Main Author: Anastasiou, A
Other Authors: Reinert, G
Format: Thesis
Language: English
Published: 2016
Subjects: Statistics; Limit theorems (Probability theory); Parameter estimation
description <p>The asymptotic normality of the maximum likelihood estimator (MLE) under regularity conditions is a long-established and famous result. This is, however, a qualitative result, and assessing the accuracy of such a normal approximation is our main interest. For this task we partly use Stein's method, a probabilistic technique that can be used to measure explicitly the distributional distance between two distributions. Since its first appearance in 1972, the method has been developed for various distributions; here we use the results related to Stein's method for normal approximation.</p> <p>In this thesis, we derive explicit upper bounds on the distributional distance between the distribution of the MLE and the normal distribution. First, the focus is on independent and identically distributed random variables from both discrete and continuous single-parameter distributions, with particular attention to exponential families. For discrete distributions, the case where the MLE can be on the boundary of the parameter space is treated through a perturbation approach, which allows us to obtain bounds on the distributional distance of interest. The bounds are of order n^(-0.5), where n is the number of observations. Simulation-based results are given to illustrate the power of the bound. Furthermore, the MLE often cannot be obtained analytically, and optimisation methods (such as the Newton-Raphson algorithm) are used; even in such cases, order n^(-0.5) bounds are given for the distributional distance related to the MLE.</p> <p>The case of multi-parameter distributions follows smoothly after the detailed discussion of a scalar parameter. Apart from extending our approach to a multi-parameter setting, we also cover the case of independent but not necessarily identically distributed (i.n.i.d.) random vectors, with specific focus on the widely applicable linear regression models.</p> <p>Returning to the single-parameter setting, a different approach to obtaining an upper bound on the distributional distance between the distribution of the MLE and the normal distribution, based on the Delta method, is also developed. The MLE for a Generalised Gamma distribution illustrates the results obtained through this Delta-method approach.</p> <p>Finally, we relax the independence assumption and obtain results for locally dependent random variables. An example of correlated sums of normally distributed random variables illustrates the bounds. Again, results that do not require knowledge of an analytic expression of the MLE are given. We end this thesis with ideas currently in progress and further open research questions.</p>
format Thesis
id oxford-uuid:c078fc46-7ed7-4e02-9a68-4608acba4bd2
institution University of Oxford
language English
publishDate 2016
topic Statistics
Limit theorems (Probability theory)
Parameter estimation