Paper Title

Profile least squares estimators in the monotone single index model

Authors

Fadoua Balabdaoui, Piet Groeneboom

Abstract


We consider least squares estimators of the finite-dimensional regression parameter $α$ in the single index regression model $Y=ψ(α^T X)+ε$, where $X$ is a $d$-dimensional random vector, $E(Y|X)=ψ(α^T X)$, and where $ψ$ is monotone. It has been suggested to estimate $α$ by a profile least squares estimator, minimizing $\sum_{i=1}^n(Y_i-ψ(α^T X_i))^2$ over monotone $ψ$ and $α$ on the boundary $S_{d-1}$ of the unit ball. Although this suggestion has been around for a long time, it is still unknown whether the estimate is $\sqrt{n}$-convergent. We show that a profile least squares estimator, using the same pointwise least squares estimator for fixed $α$, but using a different global sum of squares, is $\sqrt{n}$-convergent and asymptotically normal. The difference between the corresponding loss functions is studied, and a comparison with other methods is given.
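To illustrate the profile idea described in the abstract: for a fixed direction $α$, the monotone (here assumed increasing) least squares fit of $Y$ on $α^T X$ is an isotonic regression, and the profile criterion is the resulting residual sum of squares, to be minimized over the unit sphere $S_{d-1}$. The sketch below is illustrative only and is not the estimator or algorithm of the paper; the function name `profile_loss`, the simulated model, and the crude random search over the sphere are all assumptions for demonstration.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def profile_loss(alpha, X, y):
    """Residual sum of squares of the monotone (increasing) least squares
    fit of y on the index alpha^T x, computed via isotonic regression."""
    t = X @ alpha
    psi_hat = IsotonicRegression().fit(t, y).predict(t)
    return float(np.sum((y - psi_hat) ** 2))

# Simulated data from a monotone single index model (hypothetical example):
# psi(u) = u^3 is increasing, alpha0 lies on the unit sphere S_{d-1}.
rng = np.random.default_rng(0)
d = 3
alpha0 = np.array([1.0, 2.0, -1.0])
alpha0 /= np.linalg.norm(alpha0)
X = rng.normal(size=(500, d))
y = (X @ alpha0) ** 3 + 0.1 * rng.normal(size=500)

# Crude random search over the unit sphere; the paper studies the
# asymptotics of the minimizer, not a particular optimization scheme.
candidates = rng.normal(size=(2000, d))
candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)
best = min(candidates, key=lambda a: profile_loss(a, X, y))
print(best)
```

With a strong monotone signal, the minimizing direction among the candidates lands near `alpha0`; in practice one would replace the random search with a proper optimizer on the sphere.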
