Paper Title

Theoretical analysis of deep neural networks for temporally dependent observations

Paper Authors

Mingliang Ma, Abolfazl Safikhani

Paper Abstract

Deep neural networks are powerful tools for modeling observations over time with non-linear patterns. Despite the widespread use of neural networks in such settings, most theoretical developments for deep neural networks assume independent observations, and theoretical results for temporally dependent observations are scarce. To bridge this gap, we study theoretical properties of deep neural networks for modeling non-linear time series data. Specifically, non-asymptotic bounds on the prediction error of (sparse) feed-forward neural networks with ReLU activation functions are established under mixing-type assumptions. These assumptions are mild, so they cover a wide range of time series models, including autoregressive models. Compared to the independent-observation setting, the established convergence rates carry additional logarithmic factors that compensate for the extra complexity arising from dependence among data points. The theoretical results are supported by various numerical simulation settings as well as an application to a macroeconomic data set.
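To make the setting concrete, below is a minimal sketch of the kind of task the abstract describes: a nonlinear autoregressive series X_t = f(X_{t-1}) + eps_t is simulated, and a small feed-forward ReLU network is fit for one-step-ahead prediction. The map f, lag order, network depth/width, and training setup here are illustrative assumptions, not the paper's actual experimental configuration or estimator.

```python
# Illustrative sketch only: fit a small feed-forward ReLU network to a
# simulated nonlinear AR(1) series for one-step-ahead prediction.
# The map f, network size, and training setup are assumptions chosen for
# demonstration, not the settings used in the paper.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# Simulate X_t = f(X_{t-1}) + eps_t with a smooth nonlinear f.
T = 2000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * np.tanh(2.0 * x[t - 1]) + 0.1 * rng.standard_normal()

# Build (lagged input, next value) pairs; here the lag order is p = 1.
X = torch.tensor(x[:-1], dtype=torch.float32).unsqueeze(1)
y = torch.tensor(x[1:], dtype=torch.float32).unsqueeze(1)
n_train = int(0.8 * len(X))
X_tr, y_tr = X[:n_train], y[:n_train]
X_te, y_te = X[n_train:], y[n_train:]

# Small feed-forward ReLU network; depth and width are arbitrary here.
model = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train with full-batch gradient descent on squared prediction error.
for epoch in range(500):
    opt.zero_grad()
    loss = loss_fn(model(X_tr), y_tr)
    loss.backward()
    opt.step()

# Held-out one-step-ahead prediction error on the later part of the series.
with torch.no_grad():
    test_mse = loss_fn(model(X_te), y_te).item()
print(f"one-step-ahead test MSE: {test_mse:.4f}")
```

Note that the train/test split keeps temporal order (earlier observations for fitting, later ones for evaluation), which matches the dependent-data setting the paper studies; the prediction-error bounds themselves concern the population risk of such fitted networks rather than this particular empirical split.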
