Paper title
Multiclass learning with margin: exponential rates with no bias-variance trade-off
Paper authors
Abstract
We study the behavior of error bounds for multiclass classification under suitable margin conditions. For a wide variety of methods we prove that the classification error under a hard-margin condition decreases exponentially fast, without any bias-variance trade-off. Different convergence rates can be obtained corresponding to different margin assumptions. With a self-contained and instructive analysis we are able to generalize known results from the binary to the multiclass setting.
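For context, a hard-margin condition of the kind referenced in the abstract is commonly stated as follows. This is a sketch under standard notation, not taken from the paper itself: the symbols $\eta_y$, $\delta$, and the constants $C, c$ are illustrative.

```latex
% Conditional class probabilities for a label set \{1,\dots,K\}:
%   \eta_y(x) = \mathbb{P}(Y = y \mid X = x).
% A hard-margin condition asserts the existence of \delta \in (0,1] such that,
% for almost every x,
\[
  \eta_{y^*(x)}(x) \;-\; \max_{y \neq y^*(x)} \eta_y(x) \;\ge\; \delta,
  \qquad y^*(x) = \arg\max_{y} \eta_y(x).
\]
% Under a condition of this form, bounds of the type discussed in the abstract
% take an exponential-in-n shape (schematically):
\[
  \mathbb{E}\!\left[ R(\hat f_n) - R^* \right] \;\le\; C\, e^{-c\,n},
\]
% where R is the misclassification risk, R^* the Bayes risk, and C, c > 0
% are constants depending on \delta and the method.
```

The exponential rate with no bias-variance trade-off contrasts with the polynomial rates typical under softer (e.g. Tsybakov-type) margin assumptions, which is the sense in which different margin conditions yield different convergence rates.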