Paper Title

Fundamental Limits of Online Network-Caching

Paper Authors

Rajarshi Bhattacharjee, Subhankar Banerjee, Abhishek Sinha

Paper Abstract

Optimal caching of files in a content distribution network (CDN) is a problem of fundamental and growing commercial interest. Although many different caching algorithms are in use today, the fundamental performance limits of network caching algorithms from an online learning point of view remain poorly understood to date. In this paper, we resolve this question in the following two settings: (1) a single user connected to a single cache, and (2) a set of users and a set of caches interconnected through a bipartite network. Recently, an online gradient-based coded caching policy was shown to enjoy sub-linear regret. However, due to the lack of known regret lower bounds, the question of the optimality of the proposed policy was left open. In this paper, we settle this question by deriving tight non-asymptotic regret lower bounds in both of the above settings. In addition, we propose a new Follow-the-Perturbed-Leader-based uncoded caching policy with near-optimal regret. Technically, the lower bounds are obtained by relating the online caching problem to the classic probabilistic paradigm of balls-into-bins. Our proofs make extensive use of a new result on the expected load in the most populated half of the bins, which might also be of independent interest. We evaluate the performance of the caching policies by experimenting with the popular MovieLens dataset and conclude the paper with design recommendations and a list of open problems.

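For intuition, the following minimal Python sketch illustrates the general shape of a Follow-the-Perturbed-Leader (FTPL) uncoded caching policy of the kind the abstract refers to. It is not the paper's exact construction: the Gaussian noise, the sqrt(t) scaling of the perturbation, and all names and parameters below are illustrative assumptions.

```python
import numpy as np

def ftpl_caching(requests, num_files, cache_size, seed=0):
    """Hedged sketch of an FTPL-style uncoded caching policy.

    Before each request arrives, cache the `cache_size` files whose
    perturbed cumulative request counts are largest. The perturbation
    is sampled once and rescaled by ~sqrt(t); this tuning is an
    assumption chosen to mimic standard FTPL analyses.
    """
    rng = np.random.default_rng(seed)
    counts = np.zeros(num_files)            # cumulative request counts
    gamma = rng.standard_normal(num_files)  # one-shot Gaussian perturbation
    hits = 0
    for t, f in enumerate(requests, start=1):
        eta = np.sqrt(t)                    # perturbation scale (assumed)
        scores = counts + eta * gamma
        # indices of the cache_size largest perturbed counts
        cache = np.argpartition(scores, -cache_size)[-cache_size:]
        hits += int(f in cache)             # hit if the request was cached
        counts[f] += 1                      # update only after committing
    return hits

# Toy usage: 10,000 Zipf-like requests over 100 files, cache of size 10.
rng = np.random.default_rng(1)
reqs = np.minimum(rng.zipf(1.2, size=10_000) - 1, 99)
print(ftpl_caching(reqs, num_files=100, cache_size=10))
```

Note that the policy commits to a cache configuration before observing each request, which is what makes the hit count a meaningful online (regret-style) performance measure.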