Paper Title

Temporal optical neurons for serial deep learning

Paper Authors

Lin, Zhixing; Sun, Shuqian; Azana, Jose; Li, Wei; Zhu, Ninghua; Li, Ming

Paper Abstract

Deep learning is able to functionally simulate the human brain and thus, it has attracted considerable interest. Optics-assisted deep learning is a promising approach to improve the forward-propagation speed and reduce the power consumption. However, present methods are based on a parallel processing approach that is inherently ineffective in dealing with serial data signals at the core of information and communication technologies. Here, we propose and demonstrate a serial optical deep learning concept that is specifically designed to directly process high-speed temporal data. By utilizing ultra-short coherent optical pulses as the information carriers, the neurons are distributed at different time slots in a serial pattern, and interconnected to each other through group delay dispersion. A 4-layer serial optical neural network (SONN) was constructed and trained for classification of both analog and digital signals with simulated accuracy rates of over 90% with proper individuality variance rates. Furthermore, we performed a proof-of-concept experiment of a pseudo-3-layer SONN to successfully recognize the ASCII (American Standard Code for Information Interchange) codes of English letters at a data rate of 12 Gbps. This concept represents a novel one-dimensional realization of artificial neural networks, enabling an efficient application of optical deep learning methods to the analysis and processing of serial data signals, while offering a new overall perspective on temporal signal processing.
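
For intuition about the serial architecture sketched in the abstract, the following is a minimal numerical sketch, not the authors' implementation: it assumes each neuron is a coherent pulse occupying its own time slot, models group-delay dispersion as a quadratic spectral phase that spreads every pulse over neighbouring slots (the interconnection), applies per-slot modulation weights, and uses square-law detection as the nonlinearity. The function name sonn_layer and all parameter values (slot width, dispersion strength, weights) are arbitrary assumptions chosen only for illustration.

# Minimal toy sketch (not the paper's code): one layer of a serial optical
# neural network in which each neuron is a coherent pulse in its own time
# slot, group-delay dispersion spreads every pulse across neighbouring slots
# (the interconnection), a temporal modulator applies per-slot weights, and
# square-law detection supplies the nonlinearity. All parameter values are
# arbitrary assumptions chosen only for demonstration.
import numpy as np

def sonn_layer(slot_amplitudes, weights, samples_per_slot=64, gdd=20.0):
    """Propagate one toy SONN layer and return the detected power per slot."""
    n_slots = len(slot_amplitudes)
    n = n_slots * samples_per_slot
    field = np.zeros(n, dtype=complex)

    # Place an ultrashort pulse (a single sample here) at the centre of each slot.
    centres = np.arange(n_slots) * samples_per_slot + samples_per_slot // 2
    field[centres] = slot_amplitudes

    # Group-delay dispersion = quadratic spectral phase. It broadens each pulse
    # so that adjacent time slots overlap and interfere (neuron interconnection).
    omega = 2.0 * np.pi * np.fft.fftfreq(n)
    field = np.fft.ifft(np.fft.fft(field) * np.exp(-0.5j * gdd * omega ** 2))

    # Temporal modulation: every slot is weighted by its own coefficient.
    field *= np.repeat(weights, samples_per_slot)

    # Square-law photodetection integrated over each slot (the nonlinearity).
    power = np.abs(field) ** 2
    return power.reshape(n_slots, samples_per_slot).sum(axis=1)

# Toy forward pass: a 5-bit pattern through two cascaded layers with random weights.
rng = np.random.default_rng(0)
x = np.array([0, 1, 0, 0, 1], dtype=complex)
hidden = sonn_layer(x, rng.uniform(0.0, 1.0, size=5))
output = sonn_layer(np.sqrt(hidden), rng.uniform(0.0, 1.0, size=5))
print(np.round(output, 4))

In this toy picture, cascading such layers and training the per-slot weights would play the role of the 4-layer SONN mentioned in the abstract; the actual optical implementation and training procedure are described in the paper itself.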
