Parsimonious Random Vector Functional Link Network for Data Streams

by Mahardhika Pratama et al.

The theory of the random vector functional link network (RVFLN) has provided a breakthrough in the design of neural networks (NNs), since it conveys a solid theoretical justification of randomized learning. Existing RVFLNs are hardly scalable for data stream analytics because, lacking any structural learning scenario, they are prone to the issue of complexity. A novel class of RVFLN, namely the parsimonious random vector functional link network (pRVFLN), is proposed in this paper. pRVFLN features an open structure paradigm: its network structure can be built from scratch and is automatically generated in accordance with the degree of nonlinearity and the time-varying properties of the system being modelled. pRVFLN is equipped with complexity reduction scenarios in which inconsequential hidden nodes can be pruned and input features can be dynamically selected. pRVFLN puts into perspective an online active learning mechanism which expedites the training process and relieves operator labelling efforts. In addition, pRVFLN introduces a non-parametric type of hidden node, developed using an interval-valued data cloud. The hidden node fully reflects the real data distribution and is not constrained to a specific cluster shape. All learning procedures of pRVFLN follow a strictly single-pass learning mode, which is applicable for online real-time deployment. The efficacy of pRVFLN was rigorously validated through numerous simulations and comparisons with state-of-the-art algorithms, where it produced the most encouraging numerical results. Furthermore, the robustness of pRVFLN was investigated, and a new conclusion is drawn regarding the scope of the random parameters, which plays a vital role in the success of randomized learning.
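To make the underlying idea concrete, the following is a minimal NumPy sketch of a plain (batch, non-evolving) RVFL network, not of pRVFLN itself: the hidden-layer weights are drawn at random and kept fixed, the design matrix concatenates the direct input links with the random hidden features, and only the output weights are solved in closed form via ridge-regularized least squares. The data, hidden-layer size, activation, and regularization constant below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only)
X = rng.normal(size=(200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

# Random, untrained hidden layer: weights are drawn once and never updated.
# The scope of this random range is exactly the kind of design choice the
# paper argues is vital to randomized learning.
n_hidden = 50
W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))
b = rng.uniform(-1.0, 1.0, size=n_hidden)
H = np.tanh(X @ W + b)

# RVFL design matrix: direct input-to-output links concatenated with
# the randomized hidden-node outputs.
D = np.hstack([X, H])

# Only the output weights are learned, in closed form
# (ridge-regularized least squares).
lam = 1e-3
beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)

y_hat = D @ beta
```

Because only the linear readout is trained, this per-batch solve can be replaced by a recursive (single-pass) update for streaming data, which is the regime pRVFLN targets with its evolving, pruning, and active-learning machinery.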


