School of Mathematical & Statistical Sciences Faculty Publications and Presentations

A Smoothing Algorithm with Constant Learning Rate for Training Two Kinds of Fuzzy Neural Networks and Its Convergence

Document Type

Article

Publication Date

4-2020

Abstract

In this paper, a smoothing algorithm with a constant learning rate is presented for training two kinds of fuzzy neural networks (FNNs): max-product and max-min FNNs. Weak and strong convergence results for the algorithm are provided: the error function decreases monotonically, its gradient tends to zero, and the weight sequence converges to a fixed value during the iteration. Furthermore, conditions on the constant learning rate are specified to guarantee convergence. Finally, three numerical examples illustrate the feasibility and efficiency of the algorithm and support the theoretical findings.
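The abstract summarizes the idea but does not reproduce the paper's smoothing function. As a rough, hedged illustration of the general technique, the sketch below smooths the non-differentiable max and min operators of a max-min fuzzy neuron with a p-norm approximation (one common smoothing choice, not necessarily the one used in the paper) and trains the weights by gradient descent with a constant learning rate. All names, constants, and the synthetic data are illustrative assumptions.

```python
import numpy as np

P = 20.0  # smoothing exponent; larger P -> closer to the true max/min

def smooth_max(z):
    # p-norm smoothing of max: (sum z_i^P)^(1/P) -> max(z) as P -> infinity
    # (one common smoothing choice; the paper's exact smoothing may differ)
    return np.sum(z ** P) ** (1.0 / P)

def smooth_min(z):
    # for z > 0: min(z) = 1 / max(1/z), so reuse the smoothed max
    return 1.0 / smooth_max(1.0 / z)

def neuron(w, x):
    # smoothed max-min fuzzy neuron: y ~ max_j min(w_j, x_j)
    return smooth_max(np.array([smooth_min(np.array([wj, xj]))
                                for wj, xj in zip(w, x)]))

# synthetic training data generated by a "true" weight vector (illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(0.1, 1.0, size=(20, 3))
w_true = np.array([0.9, 0.3, 0.6])
T = np.array([neuron(w_true, x) for x in X])

def error(w):
    # squared-error function the algorithm is meant to drive down
    return 0.5 * sum((neuron(w, x) - t) ** 2 for x, t in zip(X, T))

def grad(w, h=1e-6):
    # central-difference gradient of the smoothed error (illustration only;
    # a real implementation would use the analytic gradient of the smoothing)
    g = np.zeros_like(w)
    for j in range(len(w)):
        e = np.zeros_like(w)
        e[j] = h
        g[j] = (error(w + e) - error(w - e)) / (2.0 * h)
    return g

# gradient descent with a constant learning rate eta
w = np.array([0.5, 0.5, 0.5])
eta = 0.5
errs = [error(w)]
for _ in range(200):
    w = np.clip(w - eta * grad(w), 0.05, 1.0)  # safeguard: keep weights in (0, 1]
    errs.append(error(w))
```

Because the smoothed operators are differentiable, the error along the iteration can be checked directly; in this toy run it should decrease from its initial value, echoing (but not proving) the monotone-decrease behavior established in the paper.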

Comments

https://rdcu.be/eL22X

Publication Title

Neural Processing Letters

DOI

10.1007/s11063-019-10135-4
