Bregman–Hausdorff Divergence: Strengthening the Connections Between Computational Geometry and Machine Learning
The purpose of this paper is twofold. On the technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman divergences, which includes the popular Kullback–Leibler divergence...
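To make the idea concrete, here is a minimal, illustrative sketch of a directed Hausdorff-style quantity built on the Kullback–Leibler divergence. This is not the paper's definition or implementation; the function names, the toy point sets, and the choice of a simple max–min construction are assumptions made purely for illustration.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) between two discrete
    probability vectors. KL is a Bregman divergence (generated by
    negative Shannon entropy) and is asymmetric in general."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def directed_bregman_hausdorff(A, B, div=kl_divergence):
    """One-sided (directed) Hausdorff-style quantity under an
    asymmetric divergence: for each point a in A, take the divergence
    to its closest point of B, then take the worst case over A.
    Hypothetical helper for illustration only."""
    return max(min(div(a, b) for b in B) for a in A)

# Toy point sets on the probability simplex.
A = [[0.2, 0.8], [0.5, 0.5]]
B = [[0.25, 0.75], [0.6, 0.4]]

print(directed_bregman_hausdorff(A, B))
print(directed_bregman_hausdorff(B, A))  # differs in general: KL is asymmetric
```

Because the underlying divergence is asymmetric, the two directed quantities need not agree, which is exactly the complication that motivates extending the Hausdorff construction beyond metric spaces.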
Main Authors: Tuyen Pham, Hana Dal Poz Kouřimská, Hubert Wagner
Format: Article
Language: English
Published: MDPI AG, 2025-05-01
Series: Machine Learning and Knowledge Extraction
Online Access: https://www.mdpi.com/2504-4990/7/2/48
Similar Items
- Equivalence of Informations Characterizes Bregman Divergences
  by: Philip S. Chodrow
  Published: (2025-07-01)
- Bregman divergences for physically informed discrepancy measures for learning and computation in thermomechanics
  by: Andrieux, Stéphane
  Published: (2023-02-01)
- Properties of Shannon and Rényi entropies of the Poisson distribution as the functions of intensity parameter
  by: Volodymyr Braiman, et al.
  Published: (2024-07-01)
- Wavelet Entropy for Efficiency Assessment of Price, Return, and Volatility of Brent and WTI During Extreme Events
  by: Salim Lahmiri
  Published: (2025-03-01)
- Bounds on the Excess Minimum Risk via Generalized Information Divergence Measures
  by: Ananya Omanwar, et al.
  Published: (2025-07-01)