
Recurrent Stochastic Configuration Networks for Temporal Data Analytics

Dianhui Wang, Gang Dang

TL;DR

The paper addresses temporal data with unknown dynamic orders $d_A$ and $d_B$, and proposes a recurrent stochastic configuration network (RSCN) combining a supervisor-guided random reservoir with online projection updates of the readout weights. The method preserves the echo state property (ESP) and the offline universal approximation property, while online projection updates yield convergence of the readout weights to a target $\mathbf{W}_0$ under assumptions on $\alpha$, $\rho(\cdot)$, and $\sigma_{\max}(\cdot)$. The authors provide theoretical results covering the ESP, offline and online universal approximation, and online convergence, and demonstrate through Mackey-Glass forecasting, nonlinear system identification, and two industrial datasets that RSCN achieves lower NRMSE with smaller reservoirs than LSTM, ESN, SCR, PESN, and LIESN. The approach offers a fast, stable, data-adaptive solution for industrial temporal analytics with uncertain dynamics.
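The online readout update mentioned above can be illustrated with the standard normalized-gradient projection algorithm; this is a minimal sketch under that assumption (the paper's exact update rule and step-size conditions may differ), showing how repeated projection steps drive the readout toward weights consistent with the data:

```python
# Illustrative projection-algorithm update for readout weights (assumed
# standard normalized-gradient form; not taken verbatim from the paper).
import numpy as np

def projection_update(W, x, y, alpha=0.5, c=1e-6):
    """One online step nudging W toward fitting y = W @ x.

    W : (m, n) readout weights, x : (n,) reservoir state, y : (m,) target.
    alpha in (0, 2) and c > 0 keep each step a contraction toward any
    W0 satisfying y = W0 @ x, which is what underlies convergence claims
    for projection-type updates.
    """
    e = y - W @ x                                  # prediction error
    return W + alpha * np.outer(e, x) / (c + x @ x)

# Toy check: iterating on a fixed (state, target) pair drives the
# prediction error to zero.
rng = np.random.default_rng(0)
W0 = rng.standard_normal((1, 4))                   # unknown target readout
x = rng.standard_normal(4)
y = W0 @ x
W = np.zeros((1, 4))
for _ in range(200):
    W = projection_update(W, x, y)
```

Note that `W` converges to a weight matrix whose prediction on `x` matches `y`; recovering `W0` itself requires persistently exciting states, as is standard for projection algorithms.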

Abstract

Temporal data modelling techniques with neural networks are useful in many domain applications, including time-series forecasting and control engineering. This paper develops a recurrent version of stochastic configuration networks (RSCNs) for such problems, making no underlying assumption on the dynamic orders of the input variables. Given a collection of historical data, we first build an initial RSCN model in light of a supervisory mechanism, followed by an online update of the output weights using a projection algorithm. Theoretical results are established, including the echo state property, the universal approximation property of RSCNs for both offline and online learning, and the convergence of the output weights. The proposed RSCN model is clearly distinguished from the well-known echo state networks (ESNs) by the way the input random weight matrix is assigned and by a special structure of the random feedback matrix. A comprehensive comparison is carried out among the long short-term memory (LSTM) network, the original ESN, several state-of-the-art ESN variants, namely the simple cycle reservoir (SCR), the polynomial ESN (PESN), and the leaky-integrator ESN (LIESN), and RSCN. Numerical results clearly indicate that the proposed RSCN performs favourably across all of the datasets.
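The echo state property referenced in the abstract can be demonstrated with a minimal reservoir sketch. This is illustrative only: the RSCN builds its reservoir incrementally under a supervisory mechanism rather than drawing it in one shot, and here we use the standard sufficient condition $\sigma_{\max}(\mathbf{W}) < 1$ (scaling by the largest singular value) so that two state trajectories driven by the same input provably contract together, forgetting their initial conditions:

```python
# Minimal ESN-style reservoir illustrating the echo state property.
# Scaling the feedback matrix so sigma_max < 1 makes the tanh state map a
# contraction, so the state forgets its initial condition. (Illustrative
# sketch; the paper's RSCN constructs its reservoir differently.)
import numpy as np

def make_reservoir(n, gain=0.9, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n))
    W *= gain / np.linalg.norm(W, 2)        # enforce sigma_max(W) = gain < 1
    W_in = rng.uniform(-0.1, 0.1, size=(n, 1))
    return W, W_in

def run(W, W_in, u_seq, x0):
    x = x0
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
    return x

W, W_in = make_reservoir(50)
u_seq = np.sin(0.1 * np.arange(300))        # same input drives both runs
xa = run(W, W_in, u_seq, np.zeros(50))      # start at the origin
xb = run(W, W_in, u_seq, np.ones(50))       # start far away
# With sigma_max < 1 the two trajectories converge to the same state.
```

Since `tanh` is 1-Lipschitz, the gap between the two trajectories shrinks by at least the factor `gain` per step, which is one simple sufficient condition for the ESP; spectral-radius-based conditions $\rho(\mathbf{W}) < 1$ are weaker and more commonly quoted.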


Paper Structure

This paper contains 20 sections, 72 equations, 12 figures, 3 tables, and 2 algorithms.

Figures (12)

  • Figure 1: Architecture of the recurrent stochastic configuration network.
  • Figure 2: Prediction fitting curves of the RSCN for MG tasks.
  • Figure 3: Errors between the output weights updated by the projection algorithm and trained offline on the MG1 and MG2 tasks.
  • Figure 4: The prediction curves of each model for the nonlinear system identification task.
  • Figure 5: Performance comparison of various models with different reservoir sizes on the nonlinear system identification task.
  • ...and 7 more figures