
3 - Kernel-Based Adaptive Filtering

Published online by Cambridge University Press:  24 November 2022

Paulo S. R. Diniz, Universidade Federal do Rio de Janeiro
Marcello L. R. de Campos, Universidade Federal do Rio de Janeiro
Wallace A. Martins, University of Luxembourg
Markus V. S. Lima, Universidade Federal do Rio de Janeiro
Jose A. Apolinário, Jr., Military Institute of Engineering

Summary

This chapter explains the basic concepts of kernel-based methods, a widely used tool in machine learning, with the goal of presenting online parameter estimation of nonlinear models using kernel-based tools. The chapter's aim is to introduce the kernel versions of classical algorithms such as least mean square (LMS), recursive least squares (RLS), affine projection (AP), and set-membership affine projection (SM-AP). In particular, we discuss how to keep the kernel dictionary finite through a series of model-reduction strategies, so that all of the kernel algorithms discussed are tailored for online implementation.
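To make the idea concrete, the following is a minimal sketch of a kernel LMS (KLMS) update combined with a coherence-based criterion that keeps the dictionary finite, in the spirit of the model-reduction strategies the summary mentions. The hyperparameter names (step size `mu`, Gaussian kernel width `sigma`, coherence threshold `delta`) and their values are illustrative assumptions, not the chapter's notation.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-np.sum((np.asarray(x) - np.asarray(y)) ** 2)
                  / (2.0 * sigma ** 2))

class KLMS:
    """Kernel LMS with a coherence-based dictionary-growth criterion."""

    def __init__(self, mu=0.5, sigma=1.0, delta=0.9):
        self.mu = mu          # step size
        self.sigma = sigma    # kernel width
        self.delta = delta    # coherence threshold in (0, 1)
        self.dictionary = []  # stored input vectors (kernel centers)
        self.alphas = []      # expansion coefficients

    def predict(self, x):
        # Filter output: kernel expansion over the dictionary.
        return sum(a * gaussian_kernel(c, x, self.sigma)
                   for a, c in zip(self.alphas, self.dictionary))

    def update(self, x, d):
        # A priori error for desired signal d.
        e = d - self.predict(x)
        # Coherence criterion: admit x as a new center only if it is
        # sufficiently novel with respect to every stored center,
        # which keeps the dictionary finite on a bounded input domain.
        coherence = max((gaussian_kernel(c, x, self.sigma)
                         for c in self.dictionary), default=0.0)
        if coherence <= self.delta:
            self.dictionary.append(np.asarray(x, dtype=float))
            self.alphas.append(self.mu * e)
        return e
```

A typical use is online identification of a nonlinear map: feed `(x, d)` pairs to `update` and monitor the returned error, checking that `len(model.dictionary)` saturates while the error decreases.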

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2022


