Document Type

Article

Publication Date

9-2023

Abstract

This article focuses on online kernel learning over a decentralized network. Each agent in the network receives streaming data online and collaboratively learns a globally optimal nonlinear prediction function in a reproducing kernel Hilbert space (RKHS). To overcome the curse of dimensionality in traditional online kernel learning, we utilize random feature (RF) mapping to convert the nonparametric kernel learning problem into a fixed-length parametric one in the RF space. We then propose a novel learning framework, named online decentralized kernel learning via linearized ADMM (ODKLA), to efficiently solve the online decentralized kernel learning problem. To further enhance communication efficiency, we introduce quantization and censoring strategies in the communication stage, resulting in the quantized and communication-censored ODKLA (QC-ODKLA) algorithm. We theoretically prove that both ODKLA and QC-ODKLA achieve the optimal sublinear regret O(√T) over T time slots. Through numerical experiments, we evaluate the learning effectiveness, communication efficiency, and computation efficiency of the proposed methods.
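To illustrate the RF mapping step the abstract refers to, below is a minimal NumPy sketch of random Fourier features for a Gaussian (RBF) kernel — the standard construction behind RF approximation. The function and variable names (`rf_map`, `D`, `sigma`) are illustrative and not taken from the paper; the paper's specific algorithm (ODKLA) builds on such a mapping but is not reproduced here.

```python
import numpy as np

def rf_map(X, W, b):
    """Map inputs X of shape (n, d) into a D-dimensional random feature
    space via sqrt(2/D) * cos(X @ W + b), so that inner products in the
    RF space approximate the kernel k(x, y)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
d, D, sigma = 5, 2000, 1.0

# For k(x, y) = exp(-||x - y||^2 / (2 sigma^2)), the spectral measure is
# Gaussian: draw frequencies W ~ N(0, 1/sigma^2) and phases b ~ U[0, 2*pi).
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

x = rng.normal(size=(1, d))
y = rng.normal(size=(1, d))

# The RF inner product approximates the exact kernel value; the error
# shrinks at rate O(1/sqrt(D)) as the feature dimension D grows.
approx = float(rf_map(x, W, b) @ rf_map(y, W, b).T)
exact = float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))
```

Once inputs are mapped this way, the learner only updates a length-D weight vector instead of storing past samples, which is what makes the decentralized online problem tractable.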

Comments

Original published version available at https://doi.org/10.1109/tnnls.2023.3310499

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

DOI

10.1109/tnnls.2023.3310499
