Paper Title
MicroNets: Neural Network Architectures for Deploying TinyML Applications on Commodity Microcontrollers
Paper Authors
Paper Abstract
Executing machine learning workloads locally on resource-constrained microcontrollers (MCUs) promises to drastically expand the application space of IoT. However, so-called TinyML presents severe technical challenges, as deep neural network inference demands a large compute and memory budget. To address this challenge, neural architecture search (NAS) promises to help design accurate ML models that meet the tight MCU memory, latency, and energy constraints. A key component of NAS algorithms is their latency/energy model, i.e., the mapping from a given neural network architecture to its inference latency/energy on an MCU. In this paper, we observe an intriguing property of NAS search spaces for MCU model design: on average, model latency varies linearly with model operation (op) count under a uniform prior over models in the search space. Exploiting this insight, we employ differentiable NAS (DNAS) to search for models with low memory usage and low op count, where op count is treated as a viable proxy for latency. Experimental results validate our methodology, yielding our MicroNet models, which we deploy on MCUs using TensorFlow Lite Micro, a standard open-source NN inference runtime widely used in the TinyML community. MicroNets demonstrate state-of-the-art results for all three TinyMLperf industry-standard benchmark tasks: visual wake words, audio keyword spotting, and anomaly detection. Models and training scripts can be found at github.com/ARM-software/ML-zoo.
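The abstract's key observation, that latency varies roughly linearly with op count under a uniform prior over the search space, implies a simple latency proxy: fit a line to (op count, measured latency) pairs once, then score candidate architectures by op count alone instead of measuring each one on-device. The sketch below is not the paper's implementation; it uses synthetic measurements and a hypothetical slope/intercept purely to illustrate the idea.

```python
import numpy as np

# Hypothetical calibration data: (op count in MOps, measured latency in ms)
# for models sampled uniformly from a search space. The true relation here
# (slope 0.8 ms/MOp, intercept 3.0 ms) is synthetic, chosen for illustration.
rng = np.random.default_rng(0)
ops = rng.uniform(5.0, 120.0, size=50)                      # MOps per inference
latency = 0.8 * ops + 3.0 + rng.normal(0.0, 1.0, size=50)   # noisy measurements, ms

# Least-squares fit of the linear latency model: latency ≈ a * ops + b
A = np.vstack([ops, np.ones_like(ops)]).T
(a, b), *_ = np.linalg.lstsq(A, latency, rcond=None)

def predicted_latency(op_count_mops: float) -> float:
    """Latency proxy: estimate MCU inference latency from op count alone."""
    return a * op_count_mops + b

# In a DNAS-style search, this proxy would enter the objective as a penalty,
# e.g. loss = task_loss + lambda_lat * predicted_latency(op_count(model)),
# avoiding on-device measurement for every candidate architecture.
print(f"fit: latency ≈ {a:.2f} * ops + {b:.2f}")
```

Because the proxy is monotone in op count, minimizing it during search is equivalent to minimizing op count itself, which is why the paper can use op count directly as the latency term.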