Sparse Bayesian approach to fast learning network for multiclassification

This paper proposes a novel artificial neural network called the sparse Bayesian-based fast learning network (SBFLN). In SBFLN, sparse Bayesian regression is used to train the fast learning network (FLN), which is an improved extreme learning machine (ELM). Training proceeds by randomly generating the input weights and hidden-layer biases, and then inferring the probability distribution of the remaining weights via the sparse Bayesian approach. SBFLN computes the predicted output through a Bayes estimator, so it provides a natural probabilistic output for classification problems and mitigates the overfitting caused by the least-squares estimation in FLN. In addition, the sparse Bayesian approach automatically prunes most of the redundant hidden-layer neurons, which makes the network more compact and more accurate. To verify the effectiveness of the proposed improvements, SBFLN is evaluated on 15 benchmark classification problems. The experimental results show that SBFLN is not sensitive to the number of hidden-layer neurons, and that its performance is competitive with or superior to several other state-of-the-art algorithms.
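The training pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact algorithm: it assumes a sigmoid hidden layer, an FLN-style design matrix that concatenates the raw inputs with the hidden activations, and standard ARD (automatic relevance determination) updates for the per-weight precisions; the toy data, sizes, and pruning threshold are all placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (placeholder; any small dataset works).
N, d = 200, 4
X = rng.normal(size=(N, d))
t = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # targets in {0, 1}

# ELM/FLN-style random hidden layer: input weights and biases are drawn
# once and never trained, as the abstract describes.
n_hidden = 30
W = rng.normal(size=(d, n_hidden))
b = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # sigmoid hidden activations

# FLN also keeps a direct input-to-output connection, so the design
# matrix concatenates the raw inputs with the hidden activations.
Phi = np.hstack([X, H])
M = Phi.shape[1]

# Sparse Bayesian (ARD) regression on the output weights: one precision
# alpha_i per weight; weights whose alpha diverges are pruned, which is
# how redundant hidden neurons get trimmed automatically.
alpha = np.ones(M)
beta = 1.0                                # noise precision
keep = np.arange(M)                       # indices of surviving weights
for _ in range(100):
    P = Phi[:, keep]
    Sigma = np.linalg.inv(beta * P.T @ P + np.diag(alpha[keep]))
    mu = beta * Sigma @ P.T @ t           # posterior mean of the weights
    gamma = 1.0 - alpha[keep] * np.diag(Sigma)
    alpha[keep] = gamma / (mu ** 2 + 1e-12)
    resid = t - P @ mu
    beta = max(N - gamma.sum(), 1e-12) / (resid @ resid + 1e-12)
    keep = np.where(alpha < 1e6)[0]       # prune near-irrelevant weights

# Bayes point prediction from the pruned model.
P = Phi[:, keep]
Sigma = np.linalg.inv(beta * P.T @ P + np.diag(alpha[keep]))
mu = beta * Sigma @ P.T @ t
pred = (P @ mu > 0.5).astype(float)
acc = (pred == t).mean()
print(f"kept {len(keep)}/{M} weights, train accuracy {acc:.2f}")
```

The pruning step is what makes the final network compact: any basis function (input feature or hidden neuron) whose precision `alpha_i` grows unbounded contributes a weight concentrated at zero and is dropped from the design matrix.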