Title


Functional dimension of feedforward ReLU neural networks

Authors

Grigsby, J. Elisenda, Lindsey, Kathryn, Meyerhoff, Robert, Wu, Chenxi

Abstract


It is well known that the parameterized family of functions representable by fully-connected feedforward neural networks with ReLU activation function is precisely the class of piecewise linear functions with finitely many pieces. It is less well known that for every fixed ReLU neural network architecture, the parameter space admits positive-dimensional spaces of symmetries, and hence the local functional dimension near any given parameter is lower than the parametric dimension. In this work we carefully define the notion of functional dimension, show that it is inhomogeneous across the parameter space of ReLU neural network functions, and continue an investigation, initiated in [14] and [5], into when the functional dimension achieves its theoretical maximum. We also study the quotient space and fibers of the realization map from parameter space to function space, supplying examples of fibers that are disconnected, fibers upon which functional dimension is non-constant, and fibers upon which the symmetry group acts non-transitively.
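The positive-dimensional symmetries the abstract refers to can be seen concretely in the standard positive-rescaling symmetry of ReLU networks: because ReLU is positively homogeneous (relu(c·t) = c·relu(t) for c > 0), scaling the incoming weights and bias of a hidden unit by c while dividing its outgoing weights by c leaves the realized function unchanged. The following is a minimal numpy sketch of this fact; the network size and the random parameters are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def net(x, W1, b1, W2, b2):
    # A fully-connected feedforward ReLU network R^2 -> R with one hidden layer.
    return W2 @ relu(W1 @ x + b1) + b2

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), rng.normal(size=3)   # hidden layer of 3 units
W2, b2 = rng.normal(size=(1, 3)), rng.normal(size=1)

# Rescaling symmetry: multiply the incoming weights and bias of hidden unit 0
# by c > 0, and divide its outgoing weights by c.  Since relu(c*t) = c*relu(t)
# for c > 0, the factor of c cancels and the realized function is unchanged.
c = 2.5
W1s, b1s, W2s = W1.copy(), b1.copy(), W2.copy()
W1s[0] *= c
b1s[0] *= c
W2s[:, 0] /= c

x = rng.normal(size=2)
assert np.allclose(net(x, W1, b1, W2, b2), net(x, W1s, b1s, W2s, b2))
```

Varying c over (0, ∞) for each hidden unit independently traces out a positive-dimensional family of parameters realizing the same function, which is why the local functional dimension is strictly below the parametric dimension.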
