Document Type
Article
Publication Date
9-2023
Abstract
This article focuses on online kernel learning over a decentralized network. Each agent in the network receives online streaming data and collaboratively learns a globally optimal nonlinear prediction function in a reproducing kernel Hilbert space (RKHS). To overcome the curse of dimensionality in traditional online kernel learning, we utilize random feature (RF) mapping to convert the nonparametric kernel learning problem into a fixed-length parametric one in the RF space. We then propose a novel learning framework, named online decentralized kernel learning via linearized ADMM (ODKLA), to efficiently solve the online decentralized kernel learning problem. To enhance communication efficiency, we introduce quantization and censoring strategies in the communication stage, resulting in the quantized and communication-censored ODKLA (QC-ODKLA) algorithm. We theoretically prove that both ODKLA and QC-ODKLA achieve the optimal sublinear regret O(√T) over T time slots. Through numerical experiments, we evaluate the learning effectiveness, communication efficiency, and computation efficiency of the proposed methods.
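The RF mapping mentioned in the abstract can be illustrated with a minimal sketch using random Fourier features for the Gaussian (RBF) kernel; the dimensions, variable names, and bandwidth below are illustrative assumptions, not taken from the paper itself:

```python
import numpy as np

# Hypothetical sketch: random Fourier features approximating the RBF
# kernel k(x, y) = exp(-||x - y||^2 / 2). A kernel evaluation is
# replaced by an inner product of fixed-length feature vectors, which
# sidesteps the growing-dictionary (curse of dimensionality) issue in
# online kernel learning.

rng = np.random.default_rng(0)
d, D = 5, 2000  # input dimension, number of random features (assumed)

# Draw frequencies from the kernel's spectral density (N(0, I) for the
# unit-bandwidth RBF kernel) and phases uniformly from [0, 2*pi).
W = rng.standard_normal((D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rf_map(x):
    """Map x in R^d to z(x) in R^D so that z(x) @ z(y) ~ k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
approx = rf_map(x) @ rf_map(y)                     # RF approximation
exact = np.exp(-0.5 * np.linalg.norm(x - y) ** 2)  # true kernel value
```

Learning then reduces to fitting a length-D weight vector on `rf_map(x)`, a parametric problem each agent can solve with (linearized) ADMM-style updates.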
Recommended Citation
Xu, Ping et al. "QC-ODKLA: Quantized and Communication-Censored Online Decentralized Kernel Learning via Linearized ADMM." IEEE Transactions on Neural Networks and Learning Systems, vol. PP, 13 Sep. 2023, https://doi.org/10.1109/tnnls.2023.3310499
Publication Title
IEEE Transactions on Neural Networks and Learning Systems
DOI
10.1109/tnnls.2023.3310499
Comments
Original published version available at https://doi.org/10.1109/tnnls.2023.3310499