Paper Title
Distributed Learning of Neural Lyapunov Functions for Large-Scale Networked Dissipative Systems
Paper Authors
Abstract
This paper considers the problem of characterizing the stability region of a large-scale networked system composed of dissipative nonlinear subsystems, in a distributed and computationally tractable way. One standard approach to estimating the stability region of a general nonlinear system is to first find a Lyapunov function for the system and characterize its region of attraction as the stability region. However, classical approaches for finding a Lyapunov function, such as sum-of-squares methods and quadratic approximation, either do not scale to large systems or give very conservative estimates of the stability region. In this context, we propose a new distributed learning-based approach that exploits the dissipativity structure of the subsystems. Our approach has two parts: the first is a distributed method for learning the storage functions (analogous to Lyapunov functions) of all the subsystems, and the second is a distributed optimization method that combines the learned storage functions of the subsystems into a Lyapunov function for the networked system. We demonstrate the superior performance of our proposed approach through extensive case studies on microgrid networks.
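The compositional idea behind the second part can be illustrated with a minimal numerical sketch. This is not the paper's algorithm; it assumes a standard simplification in which each subsystem is L2-dissipative with a known squared gain `gamma[i]` and the network couples outputs to inputs linearly via `u = M @ y`. Under that assumption, a classical dissipativity-composition test certifies that a nonnegatively weighted sum of the subsystem storage functions is a Lyapunov candidate for the interconnection; all numbers below are hypothetical:

```python
import numpy as np

# Hypothetical data: subsystem i is assumed dissipative with storage V_i and
# supply rate s_i(u_i, y_i) = gamma_i * |u_i|^2 - |y_i|^2 (L2-gain form),
# and the interconnection is u = M @ y.
gamma = np.array([0.25, 0.25, 0.25])   # assumed squared L2 gains
M = np.array([[0.0, 0.5, 0.0],         # assumed ring interconnection
              [0.0, 0.0, 0.5],
              [0.5, 0.0, 0.0]])

def certifies(lam):
    """Check the composition condition
        M' diag(lam * gamma) M - diag(lam)  <=  0   (neg. semidefinite),
    which guarantees that V(x) = sum_i lam_i * V_i(x_i) decreases along
    trajectories of the interconnected system."""
    Q = M.T @ np.diag(lam * gamma) @ M - np.diag(lam)
    return float(np.max(np.linalg.eigvalsh(Q))) <= 0.0

lam = np.ones(3)            # uniform weights suffice for this weak coupling
print(certifies(lam))       # True: the weighted sum is a Lyapunov candidate
```

Searching over the weights `lam` (a linear/semidefinite feasibility problem) is what makes the composition step an optimization; the paper's contribution is to solve such a certification step distributedly, with storage functions that are learned rather than assumed.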