Orthogonal Stochastic Configuration Networks with Adaptive Construction Parameter for Data Analytics

05/26/2022
by Wei Dai, et al.

As a randomized learner model, stochastic configuration networks (SCNs) are notable in that their random weights and biases are assigned under a supervisory mechanism, which guarantees the universal approximation property and fast learning. However, this randomness makes SCNs prone to generating approximately linearly correlated hidden nodes that are redundant and of low quality, resulting in a non-compact network structure. In light of a fundamental principle in machine learning, namely that a model with fewer parameters tends to generalize better, this paper proposes an orthogonal SCN, termed OSCN, which filters out low-quality hidden nodes to reduce the network structure by incorporating the Gram-Schmidt orthogonalization technique. The universal approximation property of OSCN and an adaptive setting for the key construction parameters are presented in detail. In addition, an incremental updating scheme is developed to dynamically determine the output weights, contributing to improved computational efficiency. Finally, experimental results on two numerical examples and several real-world regression and classification datasets substantiate the effectiveness and feasibility of the proposed approach.
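The core idea of filtering approximately linearly correlated hidden nodes via Gram-Schmidt orthogonalization can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the function names, the residual-norm acceptance test, and the tolerance `tol` are assumptions for the sketch.

```python
import numpy as np

def orthogonal_residual(h_new, H_ortho):
    """Gram-Schmidt step: subtract from a candidate hidden-node output
    vector its projections onto the existing orthonormal node outputs
    (the columns of H_ortho)."""
    v = h_new.astype(float).copy()
    for q in H_ortho.T:  # iterate over orthonormal columns
        v -= (q @ v) * q
    return v

def accept_node(h_new, H_ortho, tol=1e-3):
    """Reject a candidate node whose output is nearly a linear
    combination of existing node outputs (small orthogonal residual);
    otherwise return its orthonormalized output vector."""
    v = orthogonal_residual(h_new, H_ortho)
    if np.linalg.norm(v) <= tol * np.linalg.norm(h_new):
        return None  # low-quality: approximately linearly dependent
    return v / np.linalg.norm(v)
```

A node whose output lies (almost) in the span of the existing nodes contributes little new approximation capability, so discarding it keeps the network compact without sacrificing accuracy.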

