Paper Title
Co-eye: A Multi-resolution Symbolic Representation to Time Series Diversified Ensemble Classification
Paper Authors
Paper Abstract
Time series classification (TSC) is a challenging task that has attracted many researchers in the last few years. One main challenge in TSC is the diversity of domains where time series data come from. Thus, there is no "one model that fits all" in TSC. Some algorithms are very accurate in classifying a specific type of time series when the whole series is considered, while others only target the existence/non-existence of specific patterns/shapelets. Yet other techniques focus on the frequency of occurrence of discriminating patterns/features. This paper presents a new classification technique that addresses the inherent diversity problem in TSC using a nature-inspired method. The technique is inspired by how flies look at the world through "compound eyes" that are made up of thousands of lenses, called ommatidia. Each ommatidium is an eye with its own lens, and thousands of them together create a broad field of vision. The developed technique similarly uses different lenses and representations to look at the time series, and then combines them for broader visibility. These lenses are created through hyper-parameterisation of symbolic representations (Piecewise Aggregate and Fourier approximations). The algorithm builds a random forest for each lens, then performs soft dynamic voting to classify new instances using the most confident eyes, i.e., forests. We evaluate the new technique, coined Co-eye, using the recently released extended version of the UCR archive, which contains more than 100 datasets across a wide range of domains. The results show the benefits of bringing together different perspectives, reflected in the accuracy and robustness of Co-eye in comparison to other state-of-the-art techniques.
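For illustration, the sketch below shows a Co-eye-style pipeline in Python: several symbolic "lenses" are obtained by varying the word length and alphabet size of a simplified SAX-like (Piecewise Aggregate Approximation based) transform, one random forest is trained per lens, and new instances are classified by soft voting over the most confident forests. This is a minimal sketch under stated assumptions, not the authors' implementation: the transform, the lens grid, and the 0.1 confidence margin are illustrative choices, and the Fourier-based (SFA) lenses used in the paper are omitted.

```python
# Illustrative Co-eye-style ensemble: multiple symbolic "lenses" over the same
# series, one random forest per lens, and confidence-based soft voting.
# Simplified sketch only; the SAX-like transform, lens grid, and 0.1 confidence
# margin are assumptions. Assumes series longer than the largest word length.
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestClassifier


def sax_transform(X, word_length, alphabet_size):
    """Simplified SAX: z-normalise, piecewise aggregate, then Gaussian binning."""
    Xz = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + 1e-8)
    segments = np.array_split(np.arange(X.shape[1]), word_length)
    paa = np.stack([Xz[:, idx].mean(axis=1) for idx in segments], axis=1)
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.digitize(paa, breakpoints)      # integer symbols, shape (n, word_length)


class CoEyeSketch:
    def __init__(self, lenses=((8, 4), (16, 6), (32, 8)), n_trees=100):
        self.lenses = lenses                   # (word_length, alphabet_size) pairs
        self.n_trees = n_trees

    def fit(self, X, y):
        self.forests_ = []
        for wl, a in self.lenses:              # one random forest per lens
            rf = RandomForestClassifier(n_estimators=self.n_trees, random_state=0)
            rf.fit(sax_transform(X, wl, a), y)
            self.forests_.append(rf)
        self.classes_ = self.forests_[0].classes_
        return self

    def predict(self, X):
        # Soft dynamic voting: per instance, average the class probabilities of
        # the forests whose confidence is within 0.1 of the best forest's.
        probas = np.stack([rf.predict_proba(sax_transform(X, wl, a))
                           for (wl, a), rf in zip(self.lenses, self.forests_)])
        conf = probas.max(axis=2)              # (n_lenses, n_samples)
        keep = conf >= conf.max(axis=0) - 0.1  # dynamic selection of confident eyes
        votes = (probas * keep[:, :, None]).sum(axis=0) / keep.sum(axis=0)[:, None]
        return self.classes_[votes.argmax(axis=1)]
```

With training data X_train of shape (n_samples, series_length) and labels y_train, `CoEyeSketch().fit(X_train, y_train).predict(X_test)` returns predicted labels; the paper's Co-eye additionally tunes the lens parameters per dataset rather than using a fixed grid.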