Paper Title


MEIM: Multi-partition Embedding Interaction Beyond Block Term Format for Efficient and Expressive Link Prediction

Paper Authors

Hung Nghiep Tran, Atsuhiro Takasu

Abstract


Knowledge graph embedding aims to predict the missing relations between entities in knowledge graphs. Tensor-decomposition-based models, such as ComplEx, provide a good trade-off between efficiency and expressiveness, which is crucial because of the large size of real-world knowledge graphs. The recent multi-partition embedding interaction (MEI) model subsumes these models by using the block term tensor format and provides a systematic solution for the trade-off. However, MEI has several drawbacks, some of which are carried over from its subsumed tensor-decomposition-based models. In this paper, we address these drawbacks and introduce the Multi-partition Embedding Interaction iMproved beyond block term format (MEIM) model, with independent core tensors for ensemble effects and soft orthogonality for max-rank mapping, in addition to multi-partition embedding. MEIM improves expressiveness while still being highly efficient, helping it to outperform strong baselines and achieve state-of-the-art results on difficult link prediction benchmarks using fairly small embedding sizes. The source code is released at https://github.com/tranhungnghiep/MEIM-KGE.
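The abstract describes MEI's core idea: split each entity embedding into several partitions and score a triple by summing bilinear interactions within each partition, following the block term tensor format. The shapes and names below are illustrative assumptions based only on the abstract, not the paper's actual implementation; a minimal sketch of such a multi-partition bilinear score might look like this:

```python
import numpy as np

def multi_partition_score(h, r, t, num_partitions, part_dim):
    """Block-term-style score: sum of per-partition bilinear interactions.

    h, t: flat entity embeddings of size num_partitions * part_dim.
    r: flat relation parameters, one (part_dim x part_dim) interaction
       matrix per partition (an illustrative simplification).
    """
    # Reshape flat vectors into per-partition blocks.
    H = h.reshape(num_partitions, part_dim)
    R = r.reshape(num_partitions, part_dim, part_dim)
    T = t.reshape(num_partitions, part_dim)
    # For each partition k: H[k] @ R[k] @ T[k]; sum over all partitions.
    return np.einsum('kd,kde,ke->', H, R, T)
```

With identity interaction matrices, the score reduces to the plain dot product of head and tail embeddings, which makes the sketch easy to sanity-check; MEIM's additions (independent core tensors, soft orthogonality on the mappings) would sit on top of this basic interaction.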
