Hilbert Sinkhorn divergence for optimal transport

Qian Li, Zhichao Wang, Gang Li, Jun Pang, Guandong Xu

Research output: Chapter in Book/Report/Conference proceeding › Chapters

6 Citations (Scopus)

Abstract

The Sinkhorn divergence has become a very popular metric for comparing probability distributions in optimal transport. However, most works use the Sinkhorn divergence in Euclidean space, which greatly limits its application to complex data with nonlinear structure. There is therefore a theoretical need to equip the Sinkhorn divergence with the capability of capturing nonlinear structure. We propose a theoretical and computational framework to bridge this gap. In this paper, we extend the Sinkhorn divergence from Euclidean space to a reproducing kernel Hilbert space, yielding what we term the "Hilbert Sinkhorn divergence" (HSD). In particular, we use kernel matrices to derive a closed-form expression of the HSD, which we prove to be a tractable convex optimization problem. We also prove several attractive statistical properties of the proposed HSD, namely strong consistency, asymptotic behavior, and sample complexity. Empirically, our method yields state-of-the-art performance on image classification and topological data analysis. Copyright © 2021 IEEE.
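The paper's closed-form HSD construction is not reproduced in this record, but the general idea the abstract describes — a debiased Sinkhorn divergence whose ground cost lives in an RKHS — can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the authors' algorithm: it uses a standard RBF kernel, the kernel-trick squared RKHS distance ||φ(x) − φ(y)||² = k(x,x) + k(y,y) − 2k(x,y) as the transport cost, and plain entropy-regularized Sinkhorn iterations; all function names and parameters (`gamma`, `eps`, `n_iter`) are hypothetical choices for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix with entries k(x, y) = exp(-gamma * ||x - y||^2).
    sq = (X ** 2).sum(1)[:, None] + (Y ** 2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def rkhs_cost(X, Y, gamma=1.0):
    # Squared RKHS distance via the kernel trick:
    # ||phi(x) - phi(y)||^2 = k(x,x) + k(y,y) - 2 k(x,y).
    # For the RBF kernel, k(x,x) = k(y,y) = 1.
    return 1.0 + 1.0 - 2.0 * rbf_kernel(X, Y, gamma)

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=200):
    # Entropy-regularized OT cost <P, C> computed by Sinkhorn iterations
    # on the Gibbs kernel K = exp(-C / eps).
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(n_iter):
        u = a / (K @ (b / (K.T @ u)))
    v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]  # approximate transport plan
    return (P * C).sum()

def sinkhorn_divergence(a, b, X, Y, gamma=1.0, eps=0.1):
    # Debiased Sinkhorn divergence with the RKHS-induced ground cost:
    # S(a, b) = OT(a, b) - (OT(a, a) + OT(b, b)) / 2.
    ot_xy = sinkhorn_cost(a, b, rkhs_cost(X, Y, gamma), eps)
    ot_xx = sinkhorn_cost(a, a, rkhs_cost(X, X, gamma), eps)
    ot_yy = sinkhorn_cost(b, b, rkhs_cost(Y, Y, gamma), eps)
    return ot_xy - 0.5 * (ot_xx + ot_yy)
```

The debiasing terms make the divergence vanish when the two weighted point clouds coincide, while well-separated clouds receive a positive score; this is the standard Sinkhorn-divergence behavior the abstract's metric also exhibits.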

Original language: English
Title of host publication: Proceedings of 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021
Place of publication: Danvers, MA
Publisher: IEEE
Pages: 3834-3843
ISBN (electronic): 9781665445092
DOI: 10.1109/CVPR46437.2021.00383
Publication status: Published - 2021

Citation

Li, Q., Wang, Z., Li, G., Pang, J., & Xu, G. (2021). Hilbert Sinkhorn divergence for optimal transport. In Proceedings of 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2021 (pp. 3834-3843). IEEE. https://doi.org/10.1109/CVPR46437.2021.00383
