Paper Title
An Advance on Variable Elimination with Applications to Tensor-Based Computation
Paper Authors
Paper Abstract
We present new results on the classical algorithm of variable elimination, which underlies many algorithms, including ones for probabilistic inference. The results relate to exploiting functional dependencies, allowing one to perform inference and learning efficiently on models that have very large treewidth. The highlight of the advance is that it works with standard (dense) factors, without the need for sparse factors or techniques based on knowledge compilation that are commonly utilized. This is significant, as it permits a direct implementation of the improved variable elimination algorithm using tensors and their operations, leading to extremely efficient implementations, especially when learning model parameters. Moreover, the proposed technique does not require knowledge of the specific functional dependencies, only that they exist, and so can be used when learning these dependencies. We illustrate the efficacy of our proposed algorithm by compiling Bayesian network queries into tensor graphs and then learning their parameters from labeled data using a standard tool for tensor computation.
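To make the connection between variable elimination and tensor operations concrete, here is a minimal illustrative sketch (not the paper's algorithm, and not exploiting functional dependencies): eliminating a variable from a small Bayesian network amounts to a sum-product contraction over dense factor tables, which maps directly onto a tensor `einsum`. The network A → B and its probability tables are hypothetical.

```python
import numpy as np

# Dense factors of a tiny Bayesian network A -> B (illustrative numbers).
p_a = np.array([0.6, 0.4])            # P(A)
p_b_given_a = np.array([[0.9, 0.1],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)

# Eliminating variable A is the sum-product operation
#   P(B) = sum_a P(A=a) * P(B | A=a),
# which is exactly a contraction over the shared A axis of two dense tensors.
p_b = np.einsum("a,ab->b", p_a, p_b_given_a)

print(p_b)  # marginal P(B) -> [0.62, 0.38]
```

Because each elimination step is an ordinary tensor contraction, an entire query can be compiled into a graph of such contractions and executed (and differentiated, for parameter learning) by a standard tensor library.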