Paper Title
An Informal Introduction to Multiplet Neural Networks
Paper Authors
Paper Abstract
In the artificial neuron, I replace the dot product with the weighted Lehmer mean, which may emulate different cases of a generalized mean. The single neuron instance is replaced by a multiplet of neurons that share the same averaging weights. A group of outputs feeds forward in lieu of the single scalar. The generalization parameter is typically set to a different value for each neuron in the multiplet. I further extend the concept to a multiplet derived from the Gini mean. Derivatives with respect to the weight parameters and with respect to the two generalization parameters are given. Some properties of the network are investigated, showing the capacity to emulate the classical exclusive-or problem organically in two layers and to perform some multiplication and division. The network can instantiate truncated power series and variants, which can be used to approximate different functions, provided that the parameters are constrained. Moreover, a mean-case slope score is derived that can facilitate a novel learning-rate adjustment based on the homogeneity of the selected elements. The multiplet neuron equation provides a way to segment regularization timeframes and approaches.
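To make the core construction concrete, the following is a minimal sketch of a Lehmer-mean multiplet as the abstract describes it: a weighted Lehmer mean replaces the dot product, and one multiplet shares its averaging weights while each member uses its own generalization parameter, emitting a vector of outputs. The function names, the assumption of strictly positive inputs, the eps guard, and the example values are my illustrative choices, not the paper's reference implementation; the Gini-mean variant is included only to show the stated generalization.

```python
import numpy as np

def weighted_lehmer_mean(x, w, p, eps=1e-12):
    """Weighted Lehmer mean: L_p(x; w) = sum(w * x**p) / sum(w * x**(p-1)).

    Assumes strictly positive inputs x; eps is an illustrative guard
    against division by zero, not part of the paper's formulation.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    return np.sum(w * x**p) / (np.sum(w * x**(p - 1)) + eps)

def weighted_gini_mean(x, w, p, q, eps=1e-12):
    """Weighted Gini mean: G_{p,q}(x; w) = (sum(w*x**p) / sum(w*x**q))**(1/(p-q)),
    for p != q. It reduces to the Lehmer mean when q = p - 1.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    ratio = np.sum(w * x**p) / (np.sum(w * x**q) + eps)
    return ratio ** (1.0 / (p - q))

def lehmer_multiplet(x, w, ps):
    """One multiplet: every member neuron uses the same averaging weights w,
    but its own generalization parameter p, so a group of outputs feeds
    forward instead of a single scalar."""
    return np.array([weighted_lehmer_mean(x, w, p) for p in ps])

# Hypothetical usage: a three-member multiplet over positive inputs.
x = np.array([0.5, 1.0, 2.0])   # inputs (assumed positive)
w = np.array([0.2, 0.3, 0.5])   # shared averaging weights
print(lehmer_multiplet(x, w, ps=[0.0, 1.0, 2.0]))
```

With p = 1 the Lehmer mean recovers the weighted arithmetic mean and with p = 0 the weighted harmonic mean, which is the sense in which a single multiplet can emulate different cases of a generalized mean.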