School of Mathematical & Statistical Sciences Faculty Publications and Presentations
A Smoothing Algorithm with Constant Learning Rate for Training Two Kinds of Fuzzy Neural Networks and Its Convergence
Document Type
Article
Publication Date
4-2020
Abstract
In this paper, a smoothing algorithm with a constant learning rate is presented for training two kinds of fuzzy neural networks (FNNs): max-product and max-min FNNs. Weak and strong convergence results for the algorithm are provided: the error function decreases monotonically, its gradient tends to zero, and the weight sequence converges to a fixed value during the iteration. Furthermore, conditions on the constant learning rate are specified to guarantee convergence. Finally, three numerical examples are given to illustrate the feasibility and efficiency of the algorithm and to support the theoretical findings.
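The abstract describes the general idea of replacing the non-differentiable max/min operations in an FNN with a smooth approximation so that plain gradient descent with a constant learning rate can be applied. The paper's specific smoothing function, network architecture, and analytic gradients are given in the article itself; the Python sketch below is only an illustrative, hypothetical rendering of that idea, using a log-sum-exp smoothing of max/min and a finite-difference gradient as stand-in choices.

```python
# Illustrative sketch only: a smoothed max-min fuzzy "neuron"
# y(x; w) = max_j min(w_j, x_j), trained by gradient descent with a
# constant learning rate eta. The log-sum-exp smoothing and the
# numerical gradient below are assumptions for illustration, not the
# smoothing function or gradient derivation used in the paper.
import numpy as np

def smooth_max(z, beta=50.0):
    # log-sum-exp approximation of max; larger beta -> closer to the true max
    m = np.max(beta * z)
    return (m + np.log(np.sum(np.exp(beta * z - m)))) / beta

def smooth_min(z, beta=50.0):
    return -smooth_max(-z, beta)

def forward(w, x, beta=50.0):
    # smoothed output y = max_j min(w_j, x_j)
    mins = np.array([smooth_min(np.array([wj, xj]), beta) for wj, xj in zip(w, x)])
    return smooth_max(mins, beta)

def numerical_grad(w, x, t, beta=50.0, h=1e-6):
    # finite-difference gradient of the squared error 0.5*(y - t)^2
    g = np.zeros_like(w)
    for j in range(len(w)):
        wp, wm = w.copy(), w.copy()
        wp[j] += h; wm[j] -= h
        ep = 0.5 * (forward(wp, x, beta) - t) ** 2
        em = 0.5 * (forward(wm, x, beta) - t) ** 2
        g[j] = (ep - em) / (2 * h)
    return g

# Toy training loop with a constant learning rate
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(20, 4))            # fuzzy inputs in [0, 1]
w_true = rng.uniform(0, 1, size=4)
T = np.array([forward(w_true, x) for x in X])  # targets from a reference weight vector

w = rng.uniform(0, 1, size=4)
eta = 0.5                                      # constant learning rate
for epoch in range(200):
    grad = np.mean([numerical_grad(w, x, t) for x, t in zip(X, T)], axis=0)
    w = np.clip(w - eta * grad, 0.0, 1.0)      # keep weights in the fuzzy range [0, 1]
```

Under suitable conditions on the learning rate (made precise in the paper), iterations of this kind yield a monotonically decreasing error and a gradient tending to zero, which is the type of convergence behavior the abstract summarizes.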
Recommended Citation
Li, Long, Zhijun Qiao, and Zuqiang Long. "A smoothing algorithm with constant learning rate for training two kinds of fuzzy neural networks and its convergence." Neural Processing Letters 51, no. 2 (2020): 1093-1109. https://doi.org/10.1007/s11063-019-10135-4
Publication Title
Neural Processing Letters
DOI
10.1007/s11063-019-10135-4

Comments
https://rdcu.be/eL22X