Domain-invariant regression under Beer-Lambert’s Law

R. Nikzad-Langerodi, B. Moser, W. Zellinger, S. Saminger-Platz. Domain-invariant regression under Beer-Lambert's Law. In Proceedings of the 18th IEEE International Conference on Machine Learning and Applications (ICMLA 2019), pages 581-856, IEEE, February 2020. DOI: https://doi.org/10.1109/ICMLA.2019.00108

Authors
  • Ramin Nikzad-Langerodi
  • Bernhard A. Moser
  • Werner Zellinger
  • Susanne Saminger-Platz
Editors
  • M. Arif Wani
  • Taghi M. Khoshgoftaar
  • Dingding Wang
  • Huanjing Wang
  • Naeem (Jim) Seliya
Book: Proceedings of the 18th IEEE International Conference on Machine Learning and Applications (ICMLA 2019)
Type: In conference proceedings
Publisher: IEEE
DOI: https://doi.org/10.1109/ICMLA.2019.00108
ISBN: 978-1-7281-4549-5
Month: February
Year: 2020
Pages: 581-856
Abstract

We consider the problem of unsupervised domain adaptation (DA) in regression under the assumption of linear hypotheses (e.g. Beer-Lambert's law), a task recurrently encountered in analytical chemistry. Following the ideas of the non-linear iterative partial least squares (NIPALS) method, we propose a novel algorithm that identifies a low-dimensional subspace with the following two objectives: i) the projections of the source-domain samples are informative w.r.t. the output variable, and ii) the projected domain-specific input samples have a small covariance difference. In particular, the latent variable vectors that span this subspace are derived in closed form by solving a constrained optimization problem for each subspace dimension, which adds flexibility for balancing the two objectives. We demonstrate the superiority of our approach over several state-of-the-art (SoA) methods on two typical DA scenarios involving unsupervised adaptation of multivariate calibration models between different process lines in melamine production, and show that it matches SoA performance on a well-known benchmark dataset from analytical chemistry involving (unsupervised) model adaptation between different spectrometers. The former dataset is provided along with this paper.
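
The abstract only sketches the method, so the following minimal Python/NumPy snippet illustrates the general idea of a NIPALS-style loop in which each weight vector trades off covariance with the source response against the source/target covariance difference of the projections. The function name di_pls_sketch, the ridge-like regularised closed form used for the weights, and the trade-off parameter gamma are illustrative assumptions for this sketch and do not reproduce the authors' exact constrained optimization or its closed-form solution.

import numpy as np

def di_pls_sketch(Xs, ys, Xt, n_components=2, gamma=1.0):
    """Illustrative NIPALS-style loop for a domain-invariant PLS-type model.

    Xs, ys : labelled source-domain spectra and responses
    Xt     : unlabelled target-domain spectra
    gamma  : trade-off between predictiveness on the source domain and the
             source/target covariance difference of the projections
    Sketch only; the paper's constrained optimization differs in detail.
    """
    Xs = Xs - Xs.mean(axis=0)
    Xt = Xt - Xt.mean(axis=0)
    ys = ys - ys.mean()
    p = Xs.shape[1]
    W, P, q = [], [], []
    for _ in range(n_components):
        # Difference of the domain-specific input covariance matrices.
        D = Xs.T @ Xs / (len(Xs) - 1) - Xt.T @ Xt / (len(Xt) - 1)
        # Regularised weight vector: larger gamma shrinks the covariance
        # mismatch of the projections (D @ D keeps the regulariser
        # positive semidefinite, an illustrative choice).
        w = np.linalg.solve(np.eye(p) + gamma * (D @ D), Xs.T @ ys)
        w /= np.linalg.norm(w)
        # Scores, loadings and regression coefficient for this component.
        ts, tt = Xs @ w, Xt @ w
        ps = Xs.T @ ts / (ts @ ts)
        pt = Xt.T @ tt / (tt @ tt)
        qs = ys @ ts / (ts @ ts)
        # Deflate both domains before extracting the next component.
        Xs = Xs - np.outer(ts, ps)
        Xt = Xt - np.outer(tt, pt)
        ys = ys - qs * ts
        W.append(w); P.append(ps); q.append(qs)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    # Regression coefficients in the original input space.
    return W @ np.linalg.inv(P.T @ W) @ q

With gamma = 0 the loop reduces to an ordinary PLS1-like fit on the source domain; increasing gamma pushes the extracted scores towards domain invariance, and the squared regulariser keeps the linear system well-posed even though the covariance difference itself is indefinite.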