Title
Variably Scaled Persistence Kernels (VSPKs) for persistent homology applications
Authors
Abstract
In recent years, various kernels have been proposed in the context of persistent homology to handle persistence diagrams in supervised learning approaches. In this paper, we take the idea of variably scaled kernels, originally introduced for approximating functions and data, and interpret it in the framework of persistent homology. We call the resulting kernels Variably Scaled Persistence Kernels (VSPKs). These new kernels are then tested in different classification experiments. The results obtained show that they can improve the performance and the efficiency of existing standard kernels.
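To make the variably scaled kernel idea concrete, here is a minimal sketch, not the paper's actual VSPK construction: a base kernel is evaluated on points augmented with an extra coordinate produced by a scaling function ψ, i.e. K_ψ(u, v) = K((u, ψ(u)), (v, ψ(v))). The Gaussian base kernel and the particular choice of ψ (weighting a diagram point (b, d) by its persistence d − b) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(u, v, eps=1.0):
    """Standard Gaussian (RBF) kernel between two points."""
    return np.exp(-eps * np.sum((u - v) ** 2))

def vsk(u, v, psi, eps=1.0):
    """Variably scaled kernel: evaluate the base kernel on points
    augmented with an extra coordinate given by the scaling
    function psi, i.e. K_psi(u, v) = K((u, psi(u)), (v, psi(v)))."""
    u_aug = np.append(u, psi(u))
    v_aug = np.append(v, psi(v))
    return gaussian_kernel(u_aug, v_aug, eps)

# Hypothetical scaling function for illustration: weight a
# persistence-diagram point (birth, death) by its persistence.
psi = lambda p: p[1] - p[0]

p1 = np.array([0.1, 0.9])   # (birth, death)
p2 = np.array([0.2, 0.3])
print(vsk(p1, p2, psi))
```

Since ψ adds a coordinate before the base kernel is applied, any positive-definite base kernel yields a positive-definite variably scaled kernel, which is what makes the construction usable in kernel-based classifiers.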