Proceedings of 1994 28th Asilomar Conference on Signals, Systems and Computers (1994)
Pacific Grove, CA, USA
Oct. 31, 1994 to Nov. 2, 1994
ISSN: 1058-6393
ISBN: 0-8186-6405-3
pp: 907-911
K.M. Tao, Integrated Systems Inc., Santa Clara, CA, USA
ABSTRACT
The increasingly popular radial basis function (RBF) networks are smoothed piecewise-constant universal approximators. The (smoothed) piecewise-constant property, however, limits their effectiveness in extrapolation and in "trend" learning. This paper extends the RBF network model, in a natural manner, to smoothed piecewise-linear approximators, referred to as extended radial basis function (ERBF) networks. This extension is significant in (at least) the following respects: (1) it can function as a global nonlinear model that smoothly links together the various local linear models; (2) it extends the RBF network's ability to extrapolate and generalize more meaningfully; (3) it serves as a unifying model that brings together various approximators, including splines and CMAC neural network models; and (4) it makes possible the application of statistical modeling and experiment design techniques to the study of general neural network approximation models. Simulation results of learning various response surfaces are included for discussion and comparison.
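To make the piecewise-linear idea concrete, the following is a minimal illustrative sketch, not the paper's formulation: each Gaussian basis function gates a local linear model (instead of a constant weight), and the local models are blended into one global nonlinear approximator. The center placement, width, and least-squares fit below are assumptions chosen for the demo.

```python
# Illustrative sketch only: a smoothed piecewise-linear approximator in the
# spirit of the ERBF idea described in the abstract. Each Gaussian basis
# function gates a local *linear* model a_k + b_k * x rather than a constant.
# Centers, width, and the joint least-squares fit are assumptions for this demo.
import numpy as np

def erbf_design_matrix(x, centers, width):
    """Columns: [phi_k(x), phi_k(x) * x] for each center k (1-D input)."""
    # Gaussian basis functions, normalized so they form a partition of unity
    phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))
    phi /= phi.sum(axis=1, keepdims=True)
    # Each basis gates a local linear model a_k + b_k * x
    return np.hstack([phi, phi * x[:, None]])

rng = np.random.default_rng(0)
x_train = np.linspace(-3.0, 3.0, 60)
y_train = np.sin(x_train) + 0.5 * x_train + 0.05 * rng.standard_normal(60)

centers = np.linspace(-3.0, 3.0, 7)
width = 0.8
Phi = erbf_design_matrix(x_train, centers, width)
coef, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)  # fit all a_k, b_k jointly

# Beyond the training range the prediction follows the outermost local linear
# model (the "trend"), instead of flattening as a constant-weight RBF net would.
x_test = np.linspace(-5.0, 5.0, 11)
y_pred = erbf_design_matrix(x_test, centers, width) @ coef
print(np.round(y_pred, 3))
```

Setting the slope coefficients b_k to zero recovers an ordinary (normalized) RBF network, which is what makes the extrapolation contrast visible in this toy setup.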
INDEX TERMS
feedforward neural nets, smoothing methods, piecewise constant techniques, extrapolation, design of experiments, statistical analysis, splines (mathematics), learning (artificial intelligence), cerebellar model arithmetic computers
CITATION

K. Tao, "Extended radial basis function (ERBF) networks-linear extension and connections," Proceedings of 1994 28th Asilomar Conference on Signals, Systems and Computers(ACSSC), Pacific Grove, CA, USA, 1995, pp. 907-911.
doi:10.1109/ACSSC.1994.471592