Paper Title

Local Attention Graph-based Transformer for Multi-target Genetic Alteration Prediction

Authors

Daniel Reisenbüchler, Sophia J. Wagner, Melanie Boxberg, Tingying Peng

Abstract

Classical multiple instance learning (MIL) methods are often based on the identical and independent distributed assumption between instances, hence neglecting the potentially rich contextual information beyond individual entities. On the other hand, Transformers with global self-attention modules have been proposed to model the interdependencies among all instances. However, in this paper we question: Is global relation modeling using self-attention necessary, or can we appropriately restrict self-attention calculations to local regimes in large-scale whole slide images (WSIs)? We propose a general-purpose local attention graph-based Transformer for MIL (LA-MIL), introducing an inductive bias by explicitly contextualizing instances in adaptive local regimes of arbitrary size. Additionally, an efficiently adapted loss function enables our approach to learn expressive WSI embeddings for the joint analysis of multiple biomarkers. We demonstrate that LA-MIL achieves state-of-the-art results in mutation prediction for gastrointestinal cancer, outperforming existing models on important biomarkers such as microsatellite instability for colorectal cancer. Our findings suggest that local self-attention sufficiently models dependencies on par with global modules. Our LA-MIL implementation is available at https://github.com/agentdr1/LA_MIL.
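The core idea of restricting self-attention to local regimes can be illustrated with a small sketch. The snippet below is not the authors' implementation (see the linked repository for that); it assumes, hypothetically, that each WSI patch attends only to its k nearest neighbours in tile-coordinate space, implemented as a k-NN adjacency mask applied before the softmax. Projection matrices are omitted (identity Q/K/V) to keep the masking logic in focus.

```python
import numpy as np

def knn_adjacency(coords, k):
    """Boolean adjacency: each patch is connected to its k nearest
    neighbours (including itself) in 2D tile-coordinate space."""
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]       # k smallest distances per row
    adj = np.zeros_like(d2, dtype=bool)
    np.put_along_axis(adj, idx, True, axis=1)
    return adj

def local_self_attention(x, coords, k=8):
    """Self-attention over instance features x (n, d), with attention
    scores masked to the local k-NN neighbourhood of each patch."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)             # identity Q/K for illustration
    adj = knn_adjacency(coords, k)
    scores = np.where(adj, scores, -np.inf)   # forbid non-neighbour attention
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)         # row-wise softmax over neighbours
    return w @ x                              # identity V for illustration
```

Because the mask zeroes out all non-neighbour weights, the cost and the inductive bias both become local: each output embedding is a convex combination of only the k spatially nearest instance features, rather than of all instances as in a global Transformer.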
