Moment distances for comparing high-entropy distributions
W. Zellinger, H. Eghbal-zadeh, B. Moser, M. Zwick, E. Lughofer, T. Natschläger, S. Saminger-Platz. Moment distances for comparing high-entropy distributions. August 2018.
Book of Abstracts of the International Conference on Computational Statistics (COMPSTAT 2018)
Given two samples, the goal is to enforce similarity between the distributions of the sample representations in the latent space of a discriminative model. Standard approaches are based on minimizing probability metrics, e.g. the Wasserstein metric, the Maximum Mean Discrepancy, or f-divergences. However, moment distances that do not satisfy the identity of indiscernibles, i.e. pseudo-metrics, have also performed well in many practical tasks. Yet the uniform distance between two distributions having only finitely many moments in common can be very large.
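As a minimal sketch of such a pseudo-metric, the following compares the first k raw moments of two samples. The function name `moment_distance` and this particular form are illustrative assumptions, not the specific distance analyzed in the paper; it vanishes whenever the first k moments agree, even if the underlying distributions differ elsewhere.

```python
import numpy as np

def moment_distance(x, y, k=3):
    """Sum of absolute differences of the first k raw sample moments.

    A pseudo-metric: it is zero whenever the first k moments coincide,
    regardless of how the distributions differ beyond those moments.
    (Illustrative sketch, not the paper's definition.)
    """
    return sum(abs(np.mean(x**j) - np.mean(y**j)) for j in range(1, k + 1))

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 100_000)                   # standard normal sample
b = rng.uniform(-np.sqrt(3), np.sqrt(3), 100_000)   # same mean and variance

print(moment_distance(a, b, k=2))  # small: first two moments match
print(moment_distance(a, b, k=4))  # larger: fourth moments differ (3 vs. 1.8)
```

The last two lines illustrate the pseudo-metric property: up to k=2 the normal and the matched uniform are indistinguishable, although the distributions clearly differ.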
The question is under which constraints on the distributions small values of moment distances imply distribution similarity. We show that the total variation distance between two distributions is small if the distributions are of high differential entropy, constrained at finitely many moments that are similar for the two distributions. We also discuss existing relations between moment convergence and moment-constrained entropy convergence in the one-dimensional case. Our analysis leads to a new target error bound for domain adaptation, which is underpinned by numerical evaluations.
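The role of the entropy constraint can be illustrated numerically. In the sketch below, three distributions share the same first two moments (mean 0, variance 1): a standard normal (the maximum-entropy member of this moment class), a matched uniform (high entropy), and a symmetric two-point distribution (low entropy). A crude histogram plug-in estimate of total variation, here called `tv_from_samples` (a hypothetical helper, not from the paper), shows that the low-entropy distribution stays far from the normal despite the matching moments.

```python
import numpy as np

def tv_from_samples(x, y, bins=200, lo=-4.0, hi=4.0):
    """Crude histogram plug-in estimate of the total variation distance."""
    px, _ = np.histogram(x, bins=bins, range=(lo, hi))
    py, _ = np.histogram(y, bins=bins, range=(lo, hi))
    return 0.5 * np.abs(px / len(x) - py / len(y)).sum()

rng = np.random.default_rng(1)
n = 200_000
g = rng.normal(0.0, 1.0, n)                    # N(0,1): max entropy for mean 0, var 1
u = rng.uniform(-np.sqrt(3), np.sqrt(3), n)    # high entropy, same first two moments
d = rng.choice([-1.0, 1.0], n)                 # low entropy, same first two moments

print(tv_from_samples(g, u))  # moderate: both high-entropy, moments matched
print(tv_from_samples(g, d))  # close to 1: moments matched, but entropy is low
```

This matches the abstract's point: moment similarity alone does not bound total variation, but under a high-entropy constraint the two high-entropy members of the moment class remain close.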