Intrinsic Losses Based on Information Geometry and Their Applications

One main interest of information geometry is to study the properties of statistical models that do not depend on the coordinate system or model parametrization; thus, it may serve as an analytic tool for intrinsic inference in statistics. In this paper, under the framework of Riemannian geometry and dual geometry, we revisit two commonly used intrinsic losses, given by the squared Rao distance and the symmetrized Kullback–Leibler divergence (or Jeffreys divergence), respectively. For an exponential family endowed with the Fisher metric and α-connections, the two loss functions are uniformly described as the energy difference along an α-geodesic path, for some α ∈ {−1, 0, 1}. Subsequently, the two intrinsic losses are utilized to develop Bayesian analyses of covariance matrix estimation and range-spread target detection. We provide an intrinsically unbiased covariance estimator, which is verified to be asymptotically efficient in terms of the intrinsic mean square error. The decision rules deduced by the intrinsic Bayesian criterion provide a geometrical justification for the constant false alarm rate detector based on the generalized likelihood ratio principle.
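The two intrinsic losses named in the abstract have standard closed forms for Gaussian covariance models. The sketch below is not taken from the paper; it is a minimal NumPy illustration, assuming zero-mean multivariate Gaussians parametrized by their covariance matrices, of the Jeffreys divergence and the squared Rao distance (whose constant scale factor varies between references), with illustrative function names chosen here.

```python
import numpy as np

def jeffreys_divergence(S1, S2):
    """Symmetrized KL (Jeffreys) divergence between N(0, S1) and N(0, S2).

    J = KL(p||q) + KL(q||p) = 0.5 * (tr(S2^{-1} S1) + tr(S1^{-1} S2) - 2d);
    the log-determinant terms of the two KL divergences cancel.
    """
    d = S1.shape[0]
    return 0.5 * (np.trace(np.linalg.solve(S2, S1))
                  + np.trace(np.linalg.solve(S1, S2)) - 2 * d)

def squared_rao_distance(S1, S2):
    """Squared Rao distance between N(0, S1) and N(0, S2) under the Fisher metric.

    With the Fisher-metric normalization this is 0.5 * sum_i log(l_i)^2, where
    l_i are the (real, positive) eigenvalues of S1^{-1} S2; the affine-invariant
    SPD convention drops the factor 0.5.
    """
    lam = np.linalg.eigvals(np.linalg.solve(S1, S2)).real
    return 0.5 * np.sum(np.log(lam) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    S1 = A @ A.T + np.eye(3)   # two random SPD covariance matrices
    S2 = B @ B.T + np.eye(3)
    print("Jeffreys divergence :", jeffreys_divergence(S1, S2))
    print("Squared Rao distance:", squared_rao_distance(S1, S2))
```

Both quantities are invariant under a common congruence S ↦ A S Aᵀ for invertible A, which is the parametrization-invariance property that motivates calling such losses intrinsic.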

Bibliographic Details
Main Authors: Yao Rong, Mengjiao Tang, Jie Zhou
Format: Article
Language: English
Published: MDPI AG, 2017-08-01
Series: Entropy, Vol. 19, No. 8, Article 405
Affiliation: College of Mathematics, Sichuan University, Chengdu 610064, China
ISSN: 1099-4300
DOI: 10.3390/e19080405
Subjects: intrinsic loss; information geometry; exponential family; covariance matrix estimation; range-spread target detection
Online Access:https://www.mdpi.com/1099-4300/19/8/405