Minimising the Kullback–Leibler Divergence for Model Selection in Distributed Nonlinear Systems

The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this measure can be decomposed as a function of two information-theor...
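For context, the quantity named in the title and abstract is the standard Kullback–Leibler divergence; its usual definition for discrete distributions $p$ and $q$ is recalled below (a general reminder of the standard formula, not an expression taken from this record):

\[
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}
\]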

Bibliographic Details
Main Authors: Oliver M. Cliff, Mikhail Prokopenko, Robert Fitch
Format: Article
Language: English
Published: MDPI AG 2018-01-01
Series: Entropy
Subjects:
Online Access: http://www.mdpi.com/1099-4300/20/2/51