International Conference on Learning Representations

Joint paper presented

Successful scientific cooperation

The International Conference on Learning Representations (ICLR) is one of the most prestigious conferences in this field. Werner Zellinger of FLLL presented the paper:

W. Zellinger, T. Grubinger, E. Lughofer, T. Natschläger, S. Saminger-Platz. Central Moment Discrepancy (CMD) for domain-invariant representation learning. Proceedings of ICLR 2017 - 5th International Conference on Learning Representations, 2017.

Abstract

The learning of domain-invariant representations in the context of domain adaptation with neural networks is considered. In particular, a new regularization method (CMD) is proposed that is based on differences of higher-order central moments. CMD is used to minimize the domain discrepancy of the latent feature representations explicitly in the hidden activation space. In contrast to standard approaches, e.g. "Maximum Mean Discrepancy" (MMD), computationally expensive distance and kernel matrix computations are unnecessary. We define CMD to be an empirical estimate of a new metric introduced in this paper. We prove that convergence of bounded random variables w.r.t. the new metric implies convergence in distribution of the random variables. We test our approach on two different benchmark data sets for object recognition (Office) and sentiment analysis of product reviews (Amazon reviews). CMD outperforms domain-adversarial networks and networks trained with MMD on most domain adaptation tasks on Office and Amazon reviews. Notably, a new state-of-the-art performance (gain +4%) is obtained on the most challenging task of Office. In addition, a post-hoc parameter sensitivity analysis shows that the new approach is stable w.r.t. parameter changes in a certain interval. The source code is publicly available.
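The regularizer described in the abstract can be sketched in a few lines: the empirical CMD compares the means of two activation samples and then the central moments of orders 2 to K, with each term rescaled by the width of the activation interval [a, b]. The following NumPy sketch is an illustration based on that description, not the authors' released implementation; the function name `cmd` and the defaults (K=5, activations in [0, 1], as for sigmoid units) are assumptions.

```python
import numpy as np

def cmd(x, y, k=5, a=0.0, b=1.0):
    """Empirical Central Moment Discrepancy between two samples
    (rows = examples, columns = hidden activations bounded in [a, b]).
    Illustrative sketch, not the paper's reference implementation."""
    scale = abs(b - a)
    mx, my = x.mean(axis=0), y.mean(axis=0)
    # First-order term: distance between the sample means
    d = np.linalg.norm(mx - my) / scale
    # Higher-order terms: distances between central moments 2..k,
    # each normalized by the k-th power of the interval width
    for order in range(2, k + 1):
        cx = ((x - mx) ** order).mean(axis=0)
        cy = ((y - my) ** order).mean(axis=0)
        d += np.linalg.norm(cx - cy) / scale ** order
    return d
```

In training, such a term would be added to the task loss, computed between the hidden activations of a source-domain batch and a target-domain batch, so that the network is pushed toward domain-invariant representations.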
