Reweighted Nadaraya-Watson Estimator of the Regression Mean

In this paper, we consider estimation of the regression mean using the Reweighted Nadaraya-Watson (RNW) estimator, a modification of the Nadaraya-Watson (NW) estimator designed to yield a more refined estimator. We state conditions under which the asymptotic normality of the proposed estimator is derived, and we then generalize this result to the multivariate case by considering estimation of the regression mean at distinct points.


Introduction
Theory and methodology for nonparametric regression are now well developed for estimation of the regression mean. To motivate the problem, consider a sequence of independent and identically distributed real random variables \{(X_i, Y_i)\}_{i=1}^{n} with the joint pdf f(x, y) of the bivariate random variable (X, Y). The simple nonparametric regression model is written as

Y_i = m(X_i) + \varepsilon_i, \quad i = 1, 2, \ldots, n,

where X_i, i = 1, 2, \ldots, n, are called the predictors, Y_i, i = 1, 2, \ldots, n, are the corresponding responses, m(\cdot) is the unknown regression mean function to be estimated nonparametrically, and \varepsilon_i, i = 1, 2, \ldots, n, denote the measurement errors, where \varepsilon_i \sim N(0, \sigma^2).
The regression mean function m(x) is the conditional mean, given by

m(x) = E(Y \mid X = x) = \int y f(y \mid x)\,dy = \frac{\int y f(x, y)\,dy}{f(x)},

where f(x) is the marginal density function of X. Using kernel estimation, the regression mean function m(x) is estimated by \hat{m}(x). Conditional density estimation was introduced by Rosenblatt (1969). A bias correction was proposed by Hyndman et al. (1996). Fan et al. (1996) proposed a direct estimator based on local polynomial estimation. One of the most widely used and studied estimators in the literature is the one proposed independently by Nadaraya (1964) and Watson (1964). The Nadaraya-Watson kernel estimator of the conditional density is denoted by \hat{f}(y \mid x) and defined as

\hat{f}(y \mid x) = \frac{\sum_{i=1}^{n} K_h(x - X_i) K_h(y - Y_i)}{\sum_{i=1}^{n} K_h(x - X_i)},

where K_h(u) = K(u/h)/h for a kernel function K and bandwidth h > 0. This gives the NW estimator \hat{m}(x) of m(x), where

\hat{m}(x) = \frac{\sum_{i=1}^{n} K_h(x - X_i) Y_i}{\sum_{i=1}^{n} K_h(x - X_i)}.  (1.6)
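As an illustration, the NW estimator in (1.6) can be computed in a few lines of Python. The Gaussian kernel, the regression function sin(x), the sample size, and the bandwidth below are illustrative choices, not taken from the paper:

```python
import numpy as np

def nw_estimator(x, X, Y, h, kernel=None):
    """Nadaraya-Watson estimate: sum_i K_h(x - X_i) Y_i / sum_i K_h(x - X_i)."""
    if kernel is None:
        # Gaussian kernel as an illustrative choice; any symmetric density works.
        kernel = lambda u: np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    w = kernel((x - X) / h)          # unnormalized kernel weights
    return np.sum(w * Y) / np.sum(w)

# Illustrative data from Y = m(X) + eps with m(x) = sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, 400)
Y = np.sin(X) + rng.normal(0.0, 0.2, 400)
m_hat = nw_estimator(0.0, X, Y, h=0.3)   # should be close to sin(0) = 0
```

With a moderate sample size and bandwidth, the estimate at an interior point tracks the true regression mean closely; near the boundary of the support, the large bias mentioned in the next section appears.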

Reweighted Nadaraya-Watson Estimator
The large bias and boundary effects are considered the most important defects of the Nadaraya-Watson estimator. The Nadaraya-Watson estimator was therefore modified to obtain a more refined estimator, called the Reweighted Nadaraya-Watson (RNW) estimator; see Cai (2001) and De Gooijer and Zerom (2003).
The RNW estimator is derived by a slight modification of the well-known Nadaraya-Watson estimator in order to overcome these disadvantages. The RNW conditional density estimator is defined by

\hat{f}_{RNW}(y \mid x) = \frac{\sum_{i=1}^{n} p_i(x) K_h(x - X_i) K_h(y - Y_i)}{\sum_{i=1}^{n} p_i(x) K_h(x - X_i)},

where the probability weights p_i(x) \ge 0 satisfy \sum_{i=1}^{n} p_i(x) = 1 and the discrete moment condition \sum_{i=1}^{n} p_i(x)(X_i - x) K_h(x - X_i) = 0, and maximize \prod_{i=1}^{n} p_i(x) subject to these constraints. The solution can be derived directly by the Lagrange multiplier method and takes the form

p_i(x) = \frac{1}{n\{1 + \lambda (X_i - x) K_h(x - X_i)\}},

where the multiplier \lambda solves \sum_{i=1}^{n} \frac{(X_i - x) K_h(x - X_i)}{1 + \lambda (X_i - x) K_h(x - X_i)} = 0. Salha and Shekh Ahmed (2009) showed that \lambda can be derived as the unique minimizer of

L_n(\lambda) = -\sum_{i=1}^{n} \log\{1 + \lambda (X_i - x) K_h(x - X_i)\}.

The weights p_i(x) depend on the computed value of \lambda, which makes the RNW estimator convenient in practice. The corresponding RNW estimator of the regression mean is

\hat{m}_{RNW}(x) = \frac{\sum_{i=1}^{n} p_i(x) K_h(x - X_i) Y_i}{\sum_{i=1}^{n} p_i(x) K_h(x - X_i)}.

Main Results
In this section, we state conditions under which we derive the two main results of this paper. The first result is stated in Theorem 1, where the asymptotic normality of the RNW estimator is shown. The second result is presented in Theorem 2, where the result of Theorem 1 is generalized to the multivariate case.
First, we consider the following notations.

Conditions
Consider the following conditions:

C1. The kernel K(\cdot) is a symmetric and bounded density with bounded support [-1, 1].

C2. The functions f(\cdot) and \sigma^2(\cdot) are continuous at x, and m(\cdot) has continuous second-order derivatives in a neighborhood of x.

C3. The conditional density function of
Now, we state and prove the first main result in this paper.
Lemma 3. Under conditions C1-C5, the following holds. To show the asymptotic normality of J_1, we use Liapounov's theorem. It is sufficient to show that, for some \delta > 0, the corresponding Liapounov condition holds, which follows from C4. Therefore, the Liapounov ratio tends to 0, and this completes the proof of the lemma. A combination of Lemmas 1-3 and Equation (3.1) completes the proof of Theorem 1.
In the next theorem, we generalize Theorem 1 to the multivariate case. Schuster (1972) generalized the asymptotic normality of the Nadaraya-Watson estimator shown in Nadaraya (1965) to joint asymptotic normality at distinct points, where the limiting covariance matrix is diagonal (all off-diagonal entries are zero), so the estimators at distinct points are asymptotically independent. We prove part (2) to illustrate the method; the proofs of the remaining parts follow by the same techniques. The proof of Lemma 4 can be obtained by using the same techniques as in the proof of Lemma 1 in Schuster (1972).
Lemma 5. Under the conditions of Lemma 4, the following holds. To complete the proof, define the function H from \mathbb{R}^4 to \mathbb{R}^2 by

H(y_1, y_2, y_3, y_4) = \left(\frac{y_2}{y_1}, \frac{y_4}{y_3}\right).

Now, the proof of the theorem is completed using the Mann-Wald theorem.

Simulation Study

In this section, the performance of the RNW kernel estimator of the regression mean function is tested using two simulated data sets. The performance is measured by the mean squared error (MSE), defined by

MSE = \frac{1}{n} \sum_{i=1}^{n} \{m(x_i) - \hat{m}(x_i)\}^2,

and by the squared correlation coefficient R^2 between the predicted values \hat{y}_i and the actual values y_i,

R^2 = \left[\frac{\sum_{i=1}^{n}(y_i - \bar{y})(\hat{y}_i - \bar{\hat{y}})}{\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2 \sum_{i=1}^{n}(\hat{y}_i - \bar{\hat{y}})^2}}\right]^2,

where \bar{y} and \bar{\hat{y}} are the means of the actual and predicted values, respectively. Figure 1 and Figure 2 present scatter plots of the simulated data, the true curve, the RNW estimator, and the NW estimator for the first and second models, respectively. The results of the simulation studies are collected in Table 1.
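A minimal version of such a simulation experiment can be sketched as follows. The regression model, noise level, sample size, and bandwidth below are hypothetical (the paper's two models are not reproduced in this text), and a plain NW fit stands in for the estimators being compared:

```python
import numpy as np

# Hypothetical simulation model: Y = sin(2X) + eps, eps ~ N(0, 0.3^2),
# X ~ Uniform(0, pi), n = 400 (the paper's actual models are not shown here).
rng = np.random.default_rng(1)
n = 400
X = np.sort(rng.uniform(0.0, np.pi, n))
m_true = np.sin(2.0 * X)
Y = m_true + rng.normal(0.0, 0.3, n)

def nw_fit(grid, X, Y, h):
    """Plain NW fit evaluated on a grid (Gaussian kernel, vectorized)."""
    K = lambda u: np.exp(-0.5 * u**2)
    W = K((grid[:, None] - X[None, :]) / h)   # n_grid x n weight matrix
    return (W @ Y) / W.sum(axis=1)

m_hat = nw_fit(X, X, Y, h=0.2)
mse = np.mean((m_true - m_hat) ** 2)          # MSE criterion from the text
r2 = np.corrcoef(m_hat, Y)[0, 1] ** 2         # squared correlation criterion
```

Replacing `nw_fit` with an RNW fit (using the reweighted weights of the previous section) and comparing the two MSE and R^2 values reproduces the structure of the comparison reported in Table 1.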

Conclusion
In this paper, we considered the RNW kernel estimator of the regression mean function. We derived the asymptotic normality of the estimator at distinct points. Two applications using simulated data indicate that the performance of the RNW kernel estimator is reasonably good and that it outperforms the NW kernel estimator.

A comparison between the RNW kernel estimator and the NW kernel estimator has also been given. Two samples of size 400 are simulated from the following two models.

Figure 1. RNW and NW estimators for the first model.
Figure 2. RNW and NW estimators for the second model.

Table 1. Results of the simulation studies.