
A Robbins-Monro algorithm for non-parametric estimation of NAR process with Markov-Switching: asymptotic normality

Lisandro Fermin, Ricardo Rios, Luis-Ángel Rodríguez

Abstract

This paper is the second part of our study on the non-parametric estimation of MS-NAR processes, started with [L. Fermin et al. 2017]. We consider the Nadaraya-Watson type regression function estimator for non-linear autoregressive Markov-switching processes. In this context, the regression function estimator is interpreted as the solution of a local weighted least-squares problem. In the first work we introduced a restoration-estimation Robbins-Monro algorithm to approximate the estimator, and we proved the identifiability of the model and the consistency of the non-parametric estimator. In this work, we obtain a central limit theorem for the non-parametric estimator, whether the Markov chain is observed or not. Finally, we present a detailed simulation study illustrating the performance of our estimation procedure.
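To fix ideas, the Nadaraya-Watson estimator is a kernel-weighted local average of the responses; in a Markov-switching setting, each observation can additionally be weighted, e.g. by restored regime probabilities. The following is a minimal illustrative sketch with a Gaussian kernel (the weighting scheme and function names are assumptions for illustration, not the paper's algorithm):

```python
import numpy as np

def nadaraya_watson(x_grid, X, Y, h, weights=None):
    """Weighted Nadaraya-Watson regression estimate on a grid.

    weights: optional per-observation weights (e.g. posterior regime
    probabilities in a Markov-switching model); defaults to all ones.
    """
    if weights is None:
        weights = np.ones_like(Y)
    # Gaussian kernel evaluated at every (grid point, sample) pair
    K = np.exp(-0.5 * ((x_grid[:, None] - X[None, :]) / h) ** 2)
    W = K * weights[None, :]
    # Kernel-weighted local average of the responses Y
    return (W @ Y) / W.sum(axis=1)
```

With regime weights set to the posterior probabilities $\mathbb{P}(X_k = i \mid Y_{0:n})$, one such estimate per regime yields regime-specific regression curves like $\hat r_{i,n}$ in the figures below.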

Paper Structure

This paper contains 21 sections, 5 theorems, 77 equations, and 13 figures.

Key Result

Theorem 3.1

Assume that the MS-NAR model satisfies conditions E1-E4, E6-E7, D1, B1-B2, M2, S1, and R2-R3 on a compact set $C$. Then, for each fixed $y \in C$, whenever $nh_n \to \infty$, the non-parametric estimator is asymptotically normal (the full statement is given in the paper).
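The precise limiting law is not recoverable from this summary, but central limit theorems for kernel regression estimators generically take the following form (notation assumed for illustration, not the paper's exact statement):

$$\sqrt{n h_n}\,\bigl(\hat r_{i,n}(y) - r_i(y)\bigr) \xrightarrow{\ d\ } \mathcal N\!\bigl(0, \sigma_i^2(y)\bigr),$$

where $\sigma_i^2(y)$ is an asymptotic variance depending on the noise level, the kernel, and the stationary density at $y$; such a result is what underlies the pointwise confidence bands shown in Figure 5.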

Figures (13)

  • Figure 1: Simulated sample path $Y_{0:n}$ from an MS-NAR model with $m = 3$ states and $n = 3000$.
  • Figure 2: Scatterplot $Y_{k}$ versus $Y_{k-1}$ with real state classification. The points are labeled with respect to the real state of $X_k$: cross for $X_k=1$, triangles for $X_k=2$, and circles for $X_k=3$.
  • Figure 3: Non-parametric kernel regression function estimates in the fully observed data case. The real functions $r_i$ are shown with solid lines and the Nadaraya-Watson kernel estimators with: line with cross for $\hat{r}_{1,n}$, line with triangles for $\hat{r}_{2,n}$, line with circles for $\hat{r}_{3,n}$.
  • Figure 4: Non-parametric kernel density function estimate $\hat{p}_0(y)$ and the frequency histogram of $Y_{0:n}$.
  • Figure 5: Non-parametric kernel regression function estimates in the fully observed data case. The Nadaraya-Watson kernel estimators are shown with: line with cross for $\hat{r}_{1,n}$, line with triangles for $\hat{r}_{2,n}$, line with circles for $\hat{r}_{3,n}$. The real functions $r_i$ are shown with solid lines, and the fill areas correspond to the $95\%$ pointwise confidence bands.
  • ...and 8 more figures
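Figures 1-2 are built from a simulated MS-NAR sample path with $m = 3$ states and $n = 3000$. A minimal simulation sketch, with hypothetical regime functions $r_i$, transition matrix, and noise level (not those used in the paper), could look like:

```python
import numpy as np

def simulate_ms_nar(n, rng=None):
    """Simulate an illustrative 3-regime MS-NAR path (X hidden chain, Y observed)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    # Hypothetical regime-specific autoregression functions r_i
    regimes = [lambda y: 0.8 * y,
               lambda y: -0.5 * y,
               lambda y: 0.3 * np.tanh(y)]
    # Hypothetical Markov transition matrix (rows sum to 1)
    P = np.array([[0.90, 0.05, 0.05],
                  [0.05, 0.90, 0.05],
                  [0.05, 0.05, 0.90]])
    X = np.empty(n, dtype=int)
    Y = np.empty(n)
    X[0], Y[0] = 0, 0.0
    for k in range(1, n):
        X[k] = rng.choice(3, p=P[X[k - 1]])                    # hidden chain step
        Y[k] = regimes[X[k]](Y[k - 1]) + 0.2 * rng.standard_normal()  # NAR step
    return X, Y
```

Plotting $Y_k$ against $Y_{k-1}$ with markers determined by $X_k$ reproduces the kind of regime-labeled scatterplot described in Figure 2.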

Theorems & Definitions (10)

  • Remark 2.1
  • Remark 3.1
  • Theorem 3.1
  • Lemma 3.1
  • Lemma 3.2
  • Theorem 3.2
  • Remark 3.2
  • Remark 3.3
  • Remark 3.4
  • Lemma 3.3