A Diffusion-Map-Based Algorithm for Gradient Computation on Manifolds and Applications

Bibliographic Details
Main Authors: Alvaro Almeida Gomez, Antonio J. Silva Neto, Jorge P. Zubelli
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10227282/
Description
Summary: We present a technique to estimate the Riemannian gradient of a function defined at interior points of a Riemannian submanifold of Euclidean space, based on a sample of function evaluations at points in the submanifold. It applies to cases where the only available information consists of sample points evaluated on an unknown manifold. The approach builds on the estimates of the Laplace-Beltrami operator proposed in diffusion-map theory. Analytical convergence results for the Riemannian gradient expansion are proved. The methodology provides a new algorithm to compute the gradient in cases where classical methods for numerical derivatives fail, for instance in classification problems and when the information lies on an unknown nonlinear lower-dimensional submanifold of a high-dimensional space. The results obtained in this article connect the theory of diffusion maps with the theory of learning gradients on manifolds, and we compare our methods with the results of the latter theory. We apply the Riemannian gradient estimate within a gradient-based algorithm, yielding a derivative-free optimization method. We test and validate it on several applications, including tomographic reconstruction from an unknown random angle distribution and the sphere packing problem in dimensions 2 and 3. In the latter problem we compare our methods against the PSO, Nelder-Mead, and Directgo methods.
ISSN:2169-3536
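
To illustrate the kind of estimator described in the summary, below is a minimal sketch (Python/NumPy) of a kernel-based, diffusion-map-style gradient approximation computed purely from sample points and function values. The function name diffusion_gradient, the Gaussian kernel, the bandwidth parameter eps, and the 2/eps normalization are assumptions made for illustration; they are one common diffusion-map-style construction and not necessarily the exact estimator or constants used in the article.

```python
import numpy as np

def diffusion_gradient(X, f, i, eps):
    """Estimate the Riemannian gradient of f at the sample point X[i],
    given sample points X (shape n x d) on an unknown submanifold and
    function values f (shape n,) at those points.

    Generic diffusion-map-style first-moment estimator (assumption);
    the article's exact weights and normalization may differ.
    """
    diffs = X - X[i]                                # displacements x_j - x_i
    w = np.exp(-np.sum(diffs**2, axis=1) / eps)     # Gaussian kernel weights
    w /= w.sum()                                    # normalize weights to sum to 1
    # Kernel-weighted average of (f(x_j) - f(x_i)) * (x_j - x_i),
    # rescaled by 2/eps (the usual Laplace-Beltrami-type scaling).
    return (2.0 / eps) * diffs.T @ (w * (f - f[i]))

# Toy usage: samples on the unit circle with f(x) = x_0.
# The estimate at a sample point should be close to the projection of
# e_0 onto the tangent line of the circle at that point.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 2000)
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)
f = X[:, 0]
g = diffusion_gradient(X, f, i=0, eps=0.01)
```

In a derivative-free optimization loop of the kind mentioned in the summary, such an estimate could be evaluated at (or near) the current iterate using only sampled function values, and the iterate moved against the estimated gradient; the article's actual algorithm and step-size rules may differ from this sketch.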