Orthogonal least squares algorithm for RBF network design






It is desirable to have a neural network that can be trained fast and yet generalizes well. Radial basis function (RBF) networks can be trained faster than traditional back-propagation networks, but tend to suffer from poor generalization. Usually all the data points in the training set are used to place hidden nodes in an RBF network, and such networks usually have weak generalization capability. Reducing the number of degrees of freedom of the network by reducing the number of hidden nodes is one way to improve its generalization capability. One method that can be used to eliminate hidden nodes in a systematic manner is the orthogonal least squares (OLS) algorithm. The primary objective of this research is to evaluate the effectiveness of the OLS algorithm in center selection for RBF networks. This assessment is made with respect to some classification applications. Initially, all points in the training set were used to place hidden nodes. The OLS algorithm was then applied to such full-sized networks to eliminate data points that do not contribute significantly to the desired output energy. For two of the three data sets considered, the generalization capability of the network improved compared to that of the full-sized network; this occurred after the OLS algorithm reduced the number of hidden nodes to about a third of the total number of training points. For the third data set, the generalization capability did not increase when the number of hidden nodes was reduced using OLS. However, the same performance as that of the full-sized network was achieved with fewer hidden nodes.
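To make the center-selection idea concrete, the following is a minimal sketch of greedy OLS subset selection in the style of Chen, Cowan, and Grant (1991): every training point starts as a candidate RBF center, and candidates are picked one at a time by orthogonalizing their activation columns (Gram-Schmidt) and choosing the one with the largest error-reduction ratio, i.e. the largest share of the output energy explained. The Gaussian width, the function names, and the loop structure here are illustrative assumptions, not the specific implementation evaluated in this work.

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian RBF activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def ols_select_centers(X, y, width, n_select):
    """Greedy OLS center selection (illustrative sketch).

    Starts from the full-sized candidate set (one RBF per training point)
    and keeps the n_select candidates whose orthogonalized regressors
    account for the most output energy (largest error-reduction ratio).
    Returns the indices of the selected training points.
    """
    P = rbf_design_matrix(X, X, width)   # full-sized candidate matrix
    n = X.shape[0]
    selected = []
    Q = []                               # orthogonal basis built so far
    for _ in range(n_select):
        best_err, best_j = -1.0, -1
        for j in range(n):
            if j in selected:
                continue
            # Orthogonalize candidate column against the chosen basis.
            w = P[:, j].copy()
            for q in Q:
                w -= (q @ P[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:            # numerically dependent; skip
                continue
            g = (w @ y) / denom
            err = g * g * denom / (y @ y)  # error-reduction ratio of candidate j
            if err > best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        # Store the winner's orthogonalized column for later iterations.
        w = P[:, best_j].copy()
        for q in Q:
            w -= (q @ P[:, best_j]) / (q @ q) * q
        Q.append(w)
    return selected
```

In this sketch, reducing `n_select` to roughly a third of the training-set size corresponds to the pruning level at which generalization improved for two of the three data sets described above.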



Neural networks (Computer science), Least squares, Functions, Orthogonal