Paper Title
Concurrent Neural Tree and Data Preprocessing AutoML for Image Classification
Paper Authors
Paper Abstract
Deep Neural Networks (DNNs) are a widely used solution for a variety of machine learning problems. However, a data scientist must often invest significant time to pre-process input data, test different neural network architectures, and tune hyper-parameters for optimal performance. Automated machine learning (AutoML) methods automatically search the architecture and hyper-parameter space for optimal neural networks. However, current state-of-the-art (SOTA) methods do not include traditional methods for manipulating input data as part of the algorithmic search space. We adapt the Evolutionary Multi-objective Algorithm Design Engine (EMADE), a multi-objective evolutionary search framework for traditional machine learning methods, to perform neural architecture search. We also integrate EMADE's signal processing and image processing primitives, which allow EMADE to manipulate input data before ingestion into the simultaneously evolved DNN. We show that including these methods in the search space has the potential to improve performance on the CIFAR-10 image classification benchmark dataset.
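To illustrate the idea of evolved preprocessing primitives applied before DNN ingestion, here is a minimal NumPy sketch. The primitive names (`normalize`, `channel_mean_subtract`) and the pipeline representation are hypothetical stand-ins for illustration only; they are not EMADE's actual primitive set or API.

```python
import numpy as np

def normalize(batch):
    # Illustrative primitive: scale uint8 pixel values to [0, 1].
    return batch.astype(np.float32) / 255.0

def channel_mean_subtract(batch):
    # Illustrative primitive: subtract the per-channel mean over the batch.
    return batch - batch.mean(axis=(0, 1, 2), keepdims=True)

def preprocess(batch, primitives):
    # Apply each primitive in sequence before the data reaches the DNN.
    for primitive in primitives:
        batch = primitive(batch)
    return batch

# Dummy CIFAR-10-shaped batch: 8 images of 32x32 pixels with 3 channels.
rng = np.random.default_rng(0)
batch = rng.integers(0, 256, size=(8, 32, 32, 3), dtype=np.uint8)

# In an evolutionary search, this pipeline would be part of the genome,
# evolved jointly with the downstream network architecture.
pipeline = [normalize, channel_mean_subtract]
out = preprocess(batch, pipeline)
```

In a framework like EMADE, such a preprocessing chain would be encoded in the candidate individual and mutated or recombined alongside the network it feeds, so the search can discover data manipulations that benefit a particular architecture.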