Title
Compression supports low-dimensional representations of behavior across neural circuits
Authors
Abstract
Dimensionality reduction, a form of compression, can simplify representations of information to increase efficiency and reveal general patterns. Yet, this simplification also forfeits information, thereby reducing representational capacity. Hence, the brain may benefit from generating both compressed and uncompressed activity, and may do so in a heterogeneous manner across diverse neural circuits that represent low-level (sensory) or high-level (cognitive) stimuli. However, precisely how compression and representational capacity differ across the cortex remains unknown. Here we predict different levels of compression across regional circuits by using random walks on networks to model activity flow and to formulate rate-distortion functions, which are the basis of lossy compression. Using a large sample of youth ($n=1,040$), we test predictions in two ways: by measuring the dimensionality of spontaneous activity from sensorimotor to association cortex, and by assessing the representational capacity for 24 behaviors in neural circuits and 20 cognitive variables in recurrent neural networks. Our network theory of compression predicts the dimensionality of activity ($t=12.13, p<0.001$) and the representational capacity of biological ($r=0.53, p=0.016$) and artificial ($r=0.61, p<0.001$) networks. The model suggests how a basic form of compression is an emergent property of activity flow between distributed circuits that communicate with the rest of the network.
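The abstract's core modeling move is to treat activity flow as a random walk on a network and to relate that walk to the information rate that lossy compression must trade off against distortion. The following is a minimal illustrative sketch of that idea, not the paper's actual pipeline: it builds a random-walk transition matrix from a toy adjacency matrix and computes the walk's entropy rate, a standard proxy for the information rate of the process. The network, its size, and the use of entropy rate here are all assumptions for illustration.

```python
import numpy as np

# Hypothetical toy circuit: a small symmetric weighted adjacency matrix.
# All values here are illustrative, not from the paper.
rng = np.random.default_rng(0)
A = rng.random((8, 8))
A = (A + A.T) / 2          # symmetrize (undirected network)
np.fill_diagonal(A, 0.0)   # no self-loops

# Random-walk transition matrix: row-normalize the connection weights.
P = A / A.sum(axis=1, keepdims=True)

# Stationary distribution of the walk (degree-weighted for undirected graphs).
pi = A.sum(axis=1) / A.sum()

# Entropy rate of the walk: average uncertainty (bits) per step.
# This is a simple proxy for the information rate that any lossy
# compression of activity flow must balance against distortion.
logP = np.zeros_like(P)
mask = P > 0
logP[mask] = np.log2(P[mask])
H = -np.sum(pi[:, None] * P * logP)
print(f"entropy rate: {H:.3f} bits/step")
```

A denser, more uniformly connected circuit yields a higher entropy rate (harder to compress without distortion), while a sparse or modular circuit yields a lower one, which is the qualitative contrast the paper's rate-distortion framing exploits across regional circuits.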