Paper Title

Neural Power Units

Authors

Niklas Heim, Tomáš Pevný, Václav Šmídl

Abstract

Conventional Neural Networks can approximate simple arithmetic operations, but fail to generalize beyond the range of numbers that were seen during training. Neural Arithmetic Units aim to overcome this difficulty, but current arithmetic units are either limited to operate on positive numbers or can only represent a subset of arithmetic operations. We introduce the Neural Power Unit (NPU) that operates on the full domain of real numbers and is capable of learning arbitrary power functions in a single layer. The NPU thus fixes the shortcomings of existing arithmetic units and extends their expressivity. We achieve this by using complex arithmetic without requiring a conversion of the network to complex numbers. A simplification of the unit to the RealNPU yields a highly transparent model. We show that the NPUs outperform their competitors in terms of accuracy and sparsity on artificial arithmetic datasets, and that the RealNPU can discover the governing equations of a dynamical system only from data.
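The key idea behind the abstract's claim is that the complex logarithm, log x = log|x| + iπ for x < 0, lets a single layer represent products of signed powers while keeping all weights real. Below is a minimal NumPy sketch of a RealNPU-style forward pass based on that idea; the function name, the relevance gate `g`, and the `eps` clipping are illustrative choices, not the authors' reference implementation.

```python
import numpy as np

def realnpu_forward(x, W, g, eps=1e-8):
    """Sketch of a RealNPU-style layer (not the reference implementation).

    x: (batch, n_in) real inputs, possibly negative
    W: (n_in, n_out) real exponent matrix
    g: (n_in,) relevance gate in [0, 1]; g_i = 0 makes input i neutral
    """
    # Gate each input toward the multiplicative identity 1,
    # so irrelevant inputs do not affect the product.
    r = g * np.abs(x) + (1.0 - g)          # gated magnitude
    k = g * (x < 0).astype(x.dtype)        # 1 where x is negative (gated)

    log_r = np.log(np.maximum(r, eps))     # real part of complex log(x)
    # Since log(x) = log|x| + i*pi for x < 0, the real part of
    # exp(W^T log(x)) is exp(W^T log|x|) * cos(pi * W^T k),
    # which recovers the correct sign of signed power functions.
    return np.exp(log_r @ W) * np.cos(np.pi * (k @ W))

# Example: a single unit representing z = x1^3 * x2.
x = np.array([[-2.0, 3.0]])
W = np.array([[3.0], [1.0]])
g = np.array([1.0, 1.0])
print(realnpu_forward(x, W, g))  # ~[[-24.0]], i.e. (-2)^3 * 3
```

The full NPU described in the paper additionally carries an imaginary exponent matrix, which this real-valued sketch omits; the RealNPU restriction shown here is the simplification the abstract credits with yielding a highly transparent model.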
