Paper Title
Pruning In Time (PIT): A Lightweight Network Architecture Optimizer for Temporal Convolutional Networks
Paper Authors
Paper Abstract
Temporal Convolutional Networks (TCNs) are promising Deep Learning models for time-series processing tasks. One key feature of TCNs is time-dilated convolution, whose optimization requires extensive experimentation. We propose an automatic dilation optimizer, which tackles the problem as weight pruning along the time axis and learns dilation factors together with weights in a single training. Our method reduces the model size and inference latency on a real SoC hardware target by up to 7.4x and 3x, respectively, with no accuracy drop compared to a network without dilation. It also yields a rich set of Pareto-optimal TCNs starting from a single model, outperforming hand-designed solutions in both size and accuracy.
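The equivalence the abstract relies on can be illustrated with a small sketch (not the paper's implementation): a causal 1D convolution with dilation factor d is identical to a dense (dilation-1) convolution whose kernel taps at non-multiples of d are pruned to zero, so learning which taps to prune along the time axis amounts to learning the dilation. The helper `dilated_conv1d` below is a hypothetical name introduced only for this illustration.

```python
import numpy as np

def dilated_conv1d(x, w, d):
    """Causal 1D convolution of signal x with kernel w and dilation d.

    Illustrative reference implementation, not the paper's code:
    y[t] = sum_k w[k] * x[t - k*d], with zero-padding for t - k*d < 0.
    """
    K = len(w)
    rf = d * (K - 1) + 1                         # receptive field
    xp = np.concatenate([np.zeros(rf - 1), x])   # causal zero-padding
    return np.array([
        sum(w[k] * xp[t + rf - 1 - k * d] for k in range(K))
        for t in range(len(x))
    ])

# A dilation-d kernel equals a dense kernel with the "in-between"
# time-axis taps pruned to zero -- the view the optimizer exploits.
x = np.random.randn(32)
w = np.array([0.5, -1.0, 0.25])
d = 2
w_dense = np.zeros(d * (len(w) - 1) + 1)
w_dense[::d] = w                                 # keep every d-th tap
assert np.allclose(dilated_conv1d(x, w, d), dilated_conv1d(x, w_dense, 1))
```

Under this view, a trainable binary mask over the dense kernel's time positions can select regular pruning patterns, recovering dilation factors during ordinary weight training rather than via a separate architecture search.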