Paper Title
Learning Geometric Word Meta-Embeddings
Paper Authors
Paper Abstract
We propose a geometric framework for learning meta-embeddings of words from different embedding sources. Our framework transforms the embeddings into a common latent space, where, for example, simple averaging of different embeddings (of a given word) is more amenable. The proposed latent space arises from two particular geometric transformations: the orthogonal rotations and the Mahalanobis metric scaling. Empirical results on several word similarity and word analogy benchmarks illustrate the efficacy of the proposed framework.
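The abstract does not spell out the implementation, so the sketch below is only one plausible instantiation of the two transformations it names: an orthogonal-Procrustes rotation to align the sources and a ZCA-style whitening as the Mahalanobis metric scaling, followed by simple averaging in the shared latent space. The function names (`orthogonal_align`, `mahalanobis_whiten`) and the toy data are hypothetical and not taken from the paper.

```python
import numpy as np

def orthogonal_align(X, Y):
    """Orthogonal Procrustes: find rotation R minimizing ||X R - Y||_F."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def mahalanobis_whiten(X, eps=1e-8):
    """Rescale embeddings so their covariance is the identity (a Mahalanobis-style metric scaling)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / Xc.shape[0]
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(1.0 / np.sqrt(vals + eps)) @ vecs.T
    return Xc @ W

# Toy example: two embedding sources for the same 1000-word vocabulary (synthetic data).
rng = np.random.default_rng(0)
E1 = rng.normal(size=(1000, 300))                                   # e.g., one pretrained source
E2 = E1 @ rng.normal(size=(300, 300)) * 0.5 + rng.normal(size=(1000, 300)) * 0.1

E1w, E2w = mahalanobis_whiten(E1), mahalanobis_whiten(E2)
R = orthogonal_align(E2w, E1w)          # rotate source 2 onto source 1 in the whitened space
meta = 0.5 * (E1w + E2w @ R)            # simple averaging yields the meta-embeddings
```

In this sketch the whitening makes plain averaging better behaved, and the orthogonal rotation aligns the two spaces before the average is taken; the rows of `meta` would then be evaluated on word similarity and analogy benchmarks as described in the abstract.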