Abstract
The Sinkhorn divergence has become a popular metric for comparing probability distributions in optimal transport. However, most existing work applies the Sinkhorn divergence in Euclidean space, which limits its applicability to complex data with nonlinear structure. Equipping the Sinkhorn divergence with the ability to capture nonlinear structure is therefore a natural theoretical need, and we propose a theoretical and computational framework to bridge this gap. In this paper, we extend the Sinkhorn divergence from Euclidean space to the reproducing kernel Hilbert space, which we term the "Hilbert Sinkhorn divergence" (HSD). In particular, we use kernel matrices to derive a closed-form expression of the HSD, which is proved to be a tractable convex optimization problem. We also prove several attractive statistical properties of the proposed HSD, namely strong consistency, asymptotic behavior, and sample complexity. Empirically, our method yields state-of-the-art performance on image classification and topological data analysis. Copyright © 2021 IEEE.
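The paper itself derives a closed-form HSD from kernel matrices; as a rough, non-authoritative illustration of the two ingredients it combines (a kernel-induced ground cost and debiased entropic Sinkhorn iterations), the following NumPy sketch computes a Sinkhorn divergence under an RKHS distance obtained via the kernel trick. The RBF kernel, the regularization parameter `eps`, the iteration count, and the debiasing formula are assumptions made for illustration, not the paper's HSD construction.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def rkhs_cost(X, Y, gamma=1.0):
    # Squared RKHS distance between feature maps via the kernel trick:
    # ||phi(x) - phi(y)||^2 = k(x, x) + k(y, y) - 2 k(x, y).
    kxx = rbf_kernel(X, X, gamma).diagonal()[:, None]
    kyy = rbf_kernel(Y, Y, gamma).diagonal()[None, :]
    return kxx + kyy - 2.0 * rbf_kernel(X, Y, gamma)

def sinkhorn_cost(C, a, b, eps=0.1, n_iter=200):
    # Entropy-regularized OT: alternate scaling (Sinkhorn) iterations,
    # then return the transport cost <P, C> under the resulting plan.
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # approximate optimal coupling
    return (P * C).sum()

def sinkhorn_divergence(X, Y, eps=0.1, gamma=1.0):
    # Debiased divergence: S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2,
    # with uniform weights on the two samples.
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    ot_xy = sinkhorn_cost(rkhs_cost(X, Y, gamma), a, b, eps)
    ot_xx = sinkhorn_cost(rkhs_cost(X, X, gamma), a, a, eps)
    ot_yy = sinkhorn_cost(rkhs_cost(Y, Y, gamma), b, b, eps)
    return ot_xy - 0.5 * (ot_xx + ot_yy)
```

Note that the ground cost is expressed entirely through kernel evaluations, so no explicit feature map is ever formed; this mirrors why kernel matrices suffice to obtain a closed-form expression in the RKHS setting.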
| Field | Value |
| --- | --- |
| Original language | English |
| Title of host publication | Proceedings of 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 |
| Place of publication | Danvers, MA |
| Publisher | IEEE |
| Pages | 3834-3843 |
| ISBN (Electronic) | 9781665445092 |
| DOIs | |
| Publication status | Published - 2021 |