Orthogonal Stochastic Configuration Networks with Adaptive Construction Parameter for Data Analytics

by Wei Dai, et al.

As a randomized learner model, stochastic configuration networks (SCNs) are remarkable in that their random weights and biases are assigned under a supervisory mechanism, which ensures the universal approximation property and fast learning. However, this randomness makes SCNs prone to generating approximately linearly correlated hidden nodes that are redundant and of low quality, resulting in a non-compact network structure. In light of a fundamental principle in machine learning, namely that a model with fewer parameters tends to generalize better, this paper proposes an orthogonal SCN, termed OSCN, which filters out low-quality hidden nodes to reduce the network structure by incorporating the Gram-Schmidt orthogonalization technique. The universal approximation property of OSCN and an adaptive setting for the key construction parameters are presented in detail. In addition, an incremental updating scheme is developed to dynamically determine the output weights, contributing to improved computational efficiency. Finally, experimental results on two numerical examples and several real-world regression and classification datasets substantiate the effectiveness and feasibility of the proposed approach.
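The Gram-Schmidt filtering idea mentioned in the abstract can be sketched as follows: a candidate hidden node's output vector (its activations over the training samples) is projected onto the span of the nodes already accepted, and the candidate is rejected when the orthogonal residual is negligible, i.e., when it is approximately linearly dependent on existing nodes. This is a minimal illustration under stated assumptions, not the paper's algorithm; the function name `try_add_node`, the tolerance `tol`, and the use of an explicit orthonormal basis `Q` are choices made here for the sketch.

```python
import numpy as np

def try_add_node(Q, h_new, tol=1e-6):
    """One Gram-Schmidt screening step.

    Q     : (N, k) matrix with orthonormal columns spanning the outputs of
            the k hidden nodes accepted so far, or None if none accepted yet.
    h_new : (N,) output vector of the candidate hidden node on N samples.
    tol   : relative threshold below which the candidate is treated as
            (approximately) linearly dependent and rejected.

    Returns (accepted, Q_updated).
    """
    r = h_new.astype(float).copy()
    if Q is not None:
        # Subtract the projection of h_new onto the accepted-node subspace.
        r -= Q @ (Q.T @ r)
    res_norm = np.linalg.norm(r)
    if res_norm <= tol * np.linalg.norm(h_new):
        # Residual is negligible: the candidate adds almost no new direction.
        return False, Q
    q = (r / res_norm)[:, None]  # normalize the novel component
    Q = q if Q is None else np.hstack([Q, q])
    return True, Q
```

A candidate whose output is a scalar multiple (or near-linear combination) of existing node outputs is thus discarded before it inflates the network, which is the structure-reduction effect the abstract describes.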


