Boloix Tortosa, R., Murillo Fuentes, J. J., Payan-Somet, F. J., & Pérez Cruz, F. (2018). Complex Gaussian Processes for Regression. IEEE Transactions on Neural Networks and Learning Systems, 29(11), 5499-5511. https://doi.org/10.1109/TNNLS.2018.2805019

Date of issue: 2018-11
Deposited: 2024-09-20
ISSN: 2162-237X
Handle: https://hdl.handle.net/11441/162674

Abstract: In this paper, we propose a novel Bayesian solution for nonlinear regression in complex fields. Previous kernel-method solutions usually follow a complexification approach, in which the real-valued kernel is replaced by a complex-valued one. This approach is limited. Building on results from complex-valued linear theory and Gaussian random processes, we show that a pseudo-kernel must also be included. This is the starting point for developing the new complex-valued formulation of Gaussian processes for regression (CGPR). We address the design of the covariance and pseudo-covariance based on a convolution approach and for several scenarios. Only in the particular case where the outputs are proper does the pseudo-kernel vanish. The hyperparameters of the covariance can be learned by maximizing the marginal likelihood using Wirtinger calculus and patterned complex-valued matrix derivatives. In the experiments included, we show how CGPR successfully solves systems in which the real and imaginary parts are correlated. In addition, we solve the nonlinear channel equalization problem by developing a recursive solution with basis removal. We report remarkable improvements over previous solutions: a 2-4 dB reduction in mean squared error with only a quarter of the training samples used by previous approaches. © 2012 IEEE.

Keywords: Complex-valued processes; Gaussian processes (GPs); Kernel methods; Regression
Format: application/pdf, 12 p., English
License: Attribution 4.0 International (CC BY 4.0), http://creativecommons.org/licenses/by/4.0/
Type: article (open access)
DOI: 10.1109/TNNLS.2018.2805019
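To illustrate the idea summarized in the abstract, the sketch below implements posterior-mean prediction for a complex Gaussian process using both a covariance (kernel) and a pseudo-covariance (pseudo-kernel), via the standard augmented representation of a complex Gaussian vector. This is a minimal illustration written for this record, not the paper's code: the function names (`cgp_predict`, `rbf`), the RBF choice, and the toy improper process `f = a*g + b*conj(g)` are all assumptions for the example; the paper's own covariance/pseudo-covariance designs are convolution-based.

```python
import numpy as np

def rbf(X, Y, ell=0.5):
    """Real-valued RBF kernel on (possibly complex) inputs via |x - x'|^2."""
    return np.exp(-np.abs(X[:, None] - Y[None, :]) ** 2 / (2 * ell ** 2))

def cgp_predict(X, y, Xs, kern, pkern, noise=1e-6):
    """Posterior mean of a complex GP with covariance and pseudo-covariance.

    kern  : covariance        K(x, x')  = E[f(x) f(x')^*]
    pkern : pseudo-covariance Kt(x, x') = E[f(x) f(x')]
    Works with the augmented vector [y; conj(y)], whose covariance is
    [[K + s*I, Kt], [conj(Kt), conj(K) + s*I]] (proper noise of variance s).
    """
    n = len(X)
    K, Kt = kern(X, X), pkern(X, X)
    Ka = np.block([[K + noise * np.eye(n), Kt],
                   [np.conj(Kt), np.conj(K) + noise * np.eye(n)]])
    ya = np.concatenate([y, np.conj(y)])
    # Cross-covariance of f(Xs) with the augmented observations: [K_*, Kt_*].
    cross = np.hstack([kern(Xs, X), pkern(Xs, X)])
    return cross @ np.linalg.solve(Ka, ya)

# Toy improper process: f = a*g + b*conj(g) with g a proper GP and a, b real,
# which gives K_f = (a^2 + b^2) K_g and a nonzero pseudo-kernel Kt_f = 2ab K_g.
a, b = 1.0, 0.5
kern  = lambda X, Y: (a ** 2 + b ** 2) * rbf(X, Y)
pkern = lambda X, Y: 2 * a * b * rbf(X, Y)

X = np.linspace(0.0, 2.0, 5)
y = np.exp(1j * X) + 0.3 * X            # arbitrary complex-valued targets
mu = cgp_predict(X, y, X, kern, pkern)  # near-interpolation at low noise
```

When the outputs are proper, `pkern` is identically zero and the augmented system decouples into the usual real-kernel GP equations, matching the special case noted in the abstract.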