Paper Title

Analyzing Tree Architectures in Ensembles via Neural Tangent Kernel

Authors

Ryuichi Kanoh, Mahito Sugiyama

Abstract

A soft tree is an actively studied variant of a decision tree that updates splitting rules using the gradient method. Although soft trees can take various architectures, their impact is not theoretically well known. In this paper, we formulate and analyze the Neural Tangent Kernel (NTK) induced by soft tree ensembles for arbitrary tree architectures. This kernel leads to the remarkable finding that only the number of leaves at each depth is relevant for the tree architecture in ensemble learning with an infinite number of trees. In other words, if the number of leaves at each depth is fixed, the training behavior in function space and the generalization performance are exactly the same across different tree architectures, even if they are not isomorphic. We also show that the NTK of asymmetric trees like decision lists does not degenerate when they get infinitely deep. This is in contrast to the perfect binary trees, whose NTK is known to degenerate and leads to worse generalization performance for deeper trees.
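The setting described in the abstract can be made concrete with a small numerical sketch. The snippet below is illustrative only, not the paper's code: it builds a depth-2 perfect binary soft tree with sigmoid gating and estimates the ensemble NTK by Monte Carlo, as the average inner product of per-tree parameter gradients under random initialization. The function names, the finite-difference gradient, and the standard-normal parameter draws are all assumptions made for the sketch.

```python
import numpy as np

def tree_output(x, theta):
    """Output of one depth-2 perfect binary soft tree (3 internal nodes, 4 leaves).

    theta = (w, leaf): w has shape (3, d), leaf has shape (4,).
    Each internal node routes "left" with probability sigmoid(w_i @ x);
    a leaf's routing probability is the product of the gate probabilities
    on its root-to-leaf path, and the output is the probability-weighted
    sum of leaf values.
    """
    w, leaf = theta
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    g0, g1, g2 = sig(w[0] @ x), sig(w[1] @ x), sig(w[2] @ x)
    probs = np.array([g0 * g1, g0 * (1 - g1), (1 - g0) * g2, (1 - g0) * (1 - g2)])
    return probs @ leaf

def grad_flat(x, theta, eps=1e-5):
    """Central-difference gradient of the tree output w.r.t. all parameters."""
    w, leaf = theta
    flat = np.concatenate([w.ravel(), leaf])
    unpack = lambda f: (f[:w.size].reshape(w.shape), f[w.size:])
    g = np.empty_like(flat)
    for i in range(flat.size):
        up, dn = flat.copy(), flat.copy()
        up[i] += eps
        dn[i] -= eps
        g[i] = (tree_output(x, unpack(up)) - tree_output(x, unpack(dn))) / (2 * eps)
    return g

def empirical_ntk(x1, x2, n_trees=200, d=3, seed=0):
    """Monte-Carlo NTK estimate for the ensemble: the average, over trees with
    i.i.d. standard-normal parameters, of <grad f(x1), grad f(x2)>.  As the
    number of trees grows, this approaches the deterministic limiting kernel."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_trees):
        theta = (rng.standard_normal((3, d)), rng.standard_normal(4))
        total += grad_flat(x1, theta) @ grad_flat(x2, theta)
    return total / n_trees
```

Fixing the seed makes the estimator deterministic, so `empirical_ntk(x1, x2)` and `empirical_ntk(x2, x1)` agree by symmetry of the inner product; comparing such estimates across architectures with the same number of leaves at each depth is the kind of experiment the paper's result concerns.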
