Paper Title

Neuro-BERT: Rethinking Masked Autoencoding for Self-supervised Neurological Pretraining

Paper Authors

Di Wu, Siyuan Li, Jie Yang, Mohamad Sawan

Paper Abstract

Deep learning associated with neurological signals is poised to drive major advancements in diverse fields such as medical diagnostics, neurorehabilitation, and brain-computer interfaces. The challenge in harnessing the full potential of these signals lies in the dependency on extensive, high-quality annotated data, which is often scarce and expensive to acquire, requiring specialized infrastructure and domain expertise. To address deep learning's appetite for data, we present Neuro-BERT, a self-supervised pre-training framework for neurological signals based on masked autoencoding in the Fourier domain. The intuition behind our approach is simple: the frequency and phase distributions of neurological signals can reveal intricate neurological activities. We propose a novel pre-training task dubbed Fourier Inversion Prediction (FIP), which randomly masks out a portion of the input signal and then predicts the missing information using the Fourier inversion theorem. The pre-trained models can potentially be used for various downstream tasks such as sleep stage classification and gesture recognition. Unlike contrastive methods, which rely heavily on carefully hand-crafted augmentations and Siamese structures, our approach works reasonably well with a simple Transformer encoder and requires no augmentations. By evaluating our method on several benchmark datasets, we show that Neuro-BERT improves downstream neurological tasks by a large margin.
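
To make the Fourier Inversion Prediction idea concrete, below is a minimal NumPy sketch of masking a 1-D signal and building a reconstruction target via Fourier inversion. The function name `fip_target`, the `mask_ratio` parameter, and the per-sample masking scheme are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
# Minimal sketch of Fourier-domain masked reconstruction, assuming a
# per-sample random mask; names and masking scheme are hypothetical.
import numpy as np

def fip_target(signal: np.ndarray, mask_ratio: float = 0.5, seed: int = 0):
    """Mask a portion of a 1-D signal and build a reconstruction target
    from the Fourier inversion of the full signal."""
    rng = np.random.default_rng(seed)
    n = signal.shape[0]
    mask = rng.random(n) < mask_ratio           # True = masked-out samples
    visible = signal.copy()
    visible[mask] = 0.0                         # encoder only sees the unmasked part

    spectrum = np.fft.rfft(signal)              # frequency/phase content of the full signal
    reconstruction = np.fft.irfft(spectrum, n)  # Fourier inversion recovers the signal
    target = reconstruction[mask]               # model predicts the masked samples
    return visible, mask, target

# Example: a toy 256-sample "neurological" signal
x = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * np.random.default_rng(1).standard_normal(256)
visible, mask, target = fip_target(x, mask_ratio=0.5)
```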
