Paper Title


Quantum-inspired algorithm applied to extreme learning

Authors

Iori Takeda, Souichi Takahira, Kosuke Mitarai, Keisuke Fujii

Abstract


Quantum-inspired singular value decomposition (SVD) is a technique to perform SVD in logarithmic time with respect to the dimension of a matrix, given access to the matrix embedded in a segment-tree data structure. The speedup is made possible by efficiently sampling matrix elements according to their norms. Here, we apply it to extreme learning, a machine learning framework that performs linear regression on random feature vectors generated by a random neural network. Extreme learning is well suited for quantum-inspired SVD in that it first requires transforming each data point into a random feature, during which we can construct the data structure with only logarithmic overhead with respect to the number of data points. We implement the algorithm and observe that it runs orders of magnitude faster than exact SVD when we use high-dimensional feature vectors. However, we also observe that, for random features generated by random neural networks, we can replace the norm-based sampling in the quantum-inspired algorithm with uniform sampling and obtain the same level of test accuracy, due to the uniformity of the matrix in this case. Norm-based sampling becomes effective for more non-uniform matrices obtained by optimizing the feature mapping. This implies that the non-uniformity of matrix elements is a key property for the quantum-inspired SVD. This work is a first step toward practical applications of quantum-inspired algorithms.
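The two ingredients described in the abstract, random-feature extreme learning and norm-based ("length-squared") row sampling, can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: the tanh random layer, the problem sizes, and the synthetic target are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Extreme learning: random features + linear regression ------------
n, d, m = 1000, 8, 64            # samples, input dim, random-feature dim
X = rng.normal(size=(n, d))
y = np.sin(X.sum(axis=1))        # arbitrary smooth regression target

W = rng.normal(size=(d, m))      # random, untrained hidden layer
b = rng.normal(size=m)
F = np.tanh(X @ W + b)           # n x m random-feature matrix

# Only the output layer is trained, by ordinary least squares.
coef, *_ = np.linalg.lstsq(F, y, rcond=None)
train_mse = np.mean((F @ coef - y) ** 2)

# --- Norm-based (length-squared) row sampling -------------------------
# Row i of F is drawn with probability ||F_i||^2 / ||F||_F^2; rescaling
# the sampled rows makes S^T S an unbiased estimator of F^T F, the
# primitive on which the quantum-inspired SVD is built.
s = 256                          # number of sampled rows
row_sq = np.sum(F ** 2, axis=1)
p = row_sq / row_sq.sum()
idx = rng.choice(n, size=s, p=p)
S = F[idx] / np.sqrt(s * p[idx, None])
rel_err = np.linalg.norm(S.T @ S - F.T @ F) / np.linalg.norm(F.T @ F)
```

Because the tanh features are bounded, the rows of `F` here have nearly equal norms, so `p` is close to the uniform distribution; this is exactly the regime in which the paper finds uniform sampling to perform as well as norm-based sampling.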
