
Signal-to-noise ratio in reproducing kernel Hilbert spaces

Research output: Contribution to journal › Article

Standard

Signal-to-noise ratio in reproducing kernel Hilbert spaces. / Gómez-Chova, Luis; Santos-Rodríguez, Raúl; Camps-Valls, Gustau.

In: Pattern Recognition Letters, Vol. 112, 01.09.2018, p. 75-82.


Harvard

Gómez-Chova, L, Santos-Rodríguez, R & Camps-Valls, G 2018, 'Signal-to-noise ratio in reproducing kernel Hilbert spaces', Pattern Recognition Letters, vol. 112, pp. 75-82. https://doi.org/10.1016/j.patrec.2018.06.004

APA

Gómez-Chova, L., Santos-Rodríguez, R., & Camps-Valls, G. (2018). Signal-to-noise ratio in reproducing kernel Hilbert spaces. Pattern Recognition Letters, 112, 75-82. https://doi.org/10.1016/j.patrec.2018.06.004

Vancouver

Gómez-Chova L, Santos-Rodríguez R, Camps-Valls G. Signal-to-noise ratio in reproducing kernel Hilbert spaces. Pattern Recognition Letters. 2018 Sep 1;112:75-82. https://doi.org/10.1016/j.patrec.2018.06.004

Author

Gómez-Chova, Luis ; Santos-Rodríguez, Raúl ; Camps-Valls, Gustau. / Signal-to-noise ratio in reproducing kernel Hilbert spaces. In: Pattern Recognition Letters. 2018 ; Vol. 112. pp. 75-82.

Bibtex

@article{44e51c1e87374999a60027ef594280e9,
title = "Signal-to-noise ratio in reproducing kernel Hilbert spaces",
abstract = "This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (rkHs). The kSNR gives rise to considering complex signal-to-noise relations beyond additive noise models, and can be seen as a useful regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed independent. We give computationally efficient alternatives based on reduced-rank Nystr{\"o}m and projection on random Fourier features approximations, and analyze the bounds of performance and its stability. We illustrate the method through different examples, including nonlinear regression, nonlinear classification in channel equalization, nonlinear feature extraction from high-dimensional spectral satellite images, and bivariate causal inference. Experimental results show that the proposed kSNR yields more accurate solutions and extracts more noise-free features when compared to standard approaches.",
keywords = "Causal inference, Feature extraction, Heteroscedastic, Kernel methods, Noise model, Signal classification, Signal-to-noise ratio, SNR",
author = "Luis G{\'o}mez-Chova and Ra{\'u}l Santos-Rodr{\'i}guez and Gustau Camps-Valls",
year = "2018",
month = "9",
day = "1",
doi = "10.1016/j.patrec.2018.06.004",
language = "English",
volume = "112",
pages = "75--82",
journal = "Pattern Recognition Letters",
issn = "0167-8655",
publisher = "North-Holland Publishing Company",

}
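The abstract describes maximizing signal variance while minimizing estimated noise variance in feature space, which has the shape of a generalized Rayleigh quotient over kernel matrices. A minimal illustrative sketch of that idea (not the authors' implementation; the RBF kernel choice, the noise-sample estimate `N`, and the jitter `eps` are all assumptions for the example):

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of an RBF kernel between rows of X and Y
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                # observed signal samples
N = rng.normal(scale=0.3, size=(50, 3))     # estimated noise samples

Ks = rbf_kernel(X, X)                       # "signal" kernel matrix
Kn = rbf_kernel(N, N)                       # "noise" kernel matrix

# Maximize a^T Ks a subject to a^T (Kn + eps*I) a = 1: a generalized
# eigenproblem whose leading eigenvectors give high-SNR projections.
eps = 1e-6
vals, vecs = eigh(Ks, Kn + eps * np.eye(len(X)))
top = vecs[:, np.argsort(vals)[::-1][:2]]   # two leading directions
```

The jitter term plays the role of a regularizer, keeping the noise matrix positive definite so the generalized eigensolver is well posed.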

RIS - suitable for import to EndNote

TY - JOUR

T1 - Signal-to-noise ratio in reproducing kernel Hilbert spaces

AU - Gómez-Chova, Luis

AU - Santos-Rodríguez, Raúl

AU - Camps-Valls, Gustau

PY - 2018/9/1

Y1 - 2018/9/1

N2 - This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (rkHs). The kSNR gives rise to considering complex signal-to-noise relations beyond additive noise models, and can be seen as a useful regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed independent. We give computationally efficient alternatives based on reduced-rank Nyström and projection on random Fourier features approximations, and analyze the bounds of performance and its stability. We illustrate the method through different examples, including nonlinear regression, nonlinear classification in channel equalization, nonlinear feature extraction from high-dimensional spectral satellite images, and bivariate causal inference. Experimental results show that the proposed kSNR yields more accurate solutions and extracts more noise-free features when compared to standard approaches.

AB - This paper introduces the kernel signal-to-noise ratio (kSNR) for different machine learning and signal processing applications. The kSNR seeks to maximize the signal variance while minimizing the estimated noise variance explicitly in a reproducing kernel Hilbert space (rkHs). The kSNR gives rise to considering complex signal-to-noise relations beyond additive noise models, and can be seen as a useful regularizer for feature extraction and dimensionality reduction. We show that the kSNR generalizes kernel PCA (and other spectral dimensionality reduction methods), least squares SVM, and kernel ridge regression to deal with cases where signal and noise cannot be assumed independent. We give computationally efficient alternatives based on reduced-rank Nyström and projection on random Fourier features approximations, and analyze the bounds of performance and its stability. We illustrate the method through different examples, including nonlinear regression, nonlinear classification in channel equalization, nonlinear feature extraction from high-dimensional spectral satellite images, and bivariate causal inference. Experimental results show that the proposed kSNR yields more accurate solutions and extracts more noise-free features when compared to standard approaches.

KW - Causal inference

KW - Feature extraction

KW - Heteroscedastic

KW - Kernel methods

KW - Noise model

KW - Signal classification

KW - Signal-to-noise ratio

KW - SNR

UR - http://www.scopus.com/inward/record.url?scp=85048798324&partnerID=8YFLogxK

U2 - 10.1016/j.patrec.2018.06.004

DO - 10.1016/j.patrec.2018.06.004

M3 - Article

VL - 112

SP - 75

EP - 82

JO - Pattern Recognition Letters

JF - Pattern Recognition Letters

SN - 0167-8655

ER -
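The record mentions reduced-rank Nyström and random Fourier feature approximations as efficient alternatives. A generic sketch of the random-Fourier-features route for kernel ridge regression (a textbook construction, not the paper's specific algorithm; the function name, feature count, and regularizer are illustrative):

```python
import numpy as np

def rff_features(X, n_features=200, gamma=1.0, seed=0):
    # Random Fourier features approximating an RBF kernel (Rahimi & Recht)
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

Z = rff_features(X)                          # explicit low-rank feature map
lam = 1e-2
# Ridge regression in the approximate feature space: O(n*D^2) instead of
# the O(n^3) cost of solving with the full n-by-n kernel matrix.
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
pred = Z @ w
```

With an explicit feature map the linear-algebra cost scales with the number of features rather than the number of samples, which is the appeal of these reduced-rank schemes.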