Paper Title
Data-Efficient Brain Connectome Analysis via Multi-Task Meta-Learning
Paper Authors
Paper Abstract
Brain networks characterize complex connectivities among brain regions as graph structures, which provide a powerful means to study brain connectomes. In recent years, graph neural networks have emerged as a prevalent paradigm for learning with structured data. However, most brain network datasets are limited in sample size due to the relatively high cost of data acquisition, which hinders deep learning models from being sufficiently trained. Inspired by meta-learning, which learns new concepts quickly from limited training examples, this paper studies data-efficient training strategies for analyzing brain connectomes in a cross-dataset setting. Specifically, we propose to meta-train the model on datasets with large sample sizes and transfer the knowledge to small datasets. In addition, we explore two brain-network-oriented designs, including atlas transformation and adaptive task reweighing. Compared to other pre-training strategies, our meta-learning-based approach achieves higher and more stable performance, which demonstrates the effectiveness of our proposed solutions. The framework is also able to derive new insights regarding the similarities among datasets and diseases in a data-driven fashion.
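The abstract describes the approach only at a high level (meta-training on large source datasets, transferring to small target datasets, and adaptive task reweighing). As a rough illustration of that idea, the sketch below shows a first-order MAML-style meta-training loop in PyTorch where each source dataset is treated as a task and query losses set soft task weights. The BrainNetEncoder stand-in, the softmax-based reweighing scheme, and all hyperparameters are illustrative assumptions, not the authors' implementation.

    # Minimal first-order MAML-style sketch for cross-dataset meta-training.
    # All components here are illustrative assumptions, not the paper's code.
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BrainNetEncoder(nn.Module):
        """Toy stand-in for a brain-network GNN: flattens a connectivity
        matrix (n_rois x n_rois) and classifies it with a small MLP."""
        def __init__(self, n_rois=100, n_classes=2):
            super().__init__()
            self.net = nn.Sequential(
                nn.Flatten(),
                nn.Linear(n_rois * n_rois, 64),
                nn.ReLU(),
                nn.Linear(64, n_classes),
            )

        def forward(self, x):
            return self.net(x)

    def inner_adapt(model, support_x, support_y, lr=1e-2, steps=5):
        """Clone the meta-model and take a few SGD steps on one task's support set."""
        adapted = copy.deepcopy(model)
        opt = torch.optim.SGD(adapted.parameters(), lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            F.cross_entropy(adapted(support_x), support_y).backward()
            opt.step()
        return adapted

    def meta_train(model, tasks, meta_lr=1e-3, epochs=100):
        """First-order meta-update; each task is (support_x, support_y, query_x, query_y),
        e.g. batches drawn from one large-sample source dataset."""
        meta_opt = torch.optim.Adam(model.parameters(), lr=meta_lr)
        for _ in range(epochs):
            query_losses, grads_per_task = [], []
            for support_x, support_y, query_x, query_y in tasks:
                adapted = inner_adapt(model, support_x, support_y)
                adapted.zero_grad()
                loss = F.cross_entropy(adapted(query_x), query_y)
                loss.backward()  # gradients w.r.t. the adapted copy (first-order approx.)
                grads_per_task.append([p.grad.clone() for p in adapted.parameters()])
                query_losses.append(loss.detach())
            # Hypothetical adaptive task reweighing: harder tasks (higher query loss)
            # receive larger weight in the meta-update.
            weights = torch.softmax(torch.stack(query_losses), dim=0)
            meta_opt.zero_grad()
            for p, *task_grads in zip(model.parameters(), *grads_per_task):
                p.grad = sum(w * g for w, g in zip(weights, task_grads))
            meta_opt.step()

After meta-training on the source tasks, the resulting initialization would be fine-tuned on the small target dataset in the usual way; the atlas transformation mentioned in the abstract (mapping brain networks defined on different parcellation atlases to a common space) is not shown here.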