Paper Title

Coded Caching for Broadcast Networks with User Cooperation

Authors

Jiahui Chen, Xiaowen You, Youlong Wu, Shuai Ma

Abstract

In this paper, we investigate the transmission delay of cache-aided broadcast networks with user cooperation. Novel coded caching schemes are proposed for both centralized and decentralized caching settings by efficiently exploiting time and cache resources and creating parallel data delivery at the server and users. We derive a lower bound on the transmission delay and show that the proposed centralized coded caching scheme is \emph{order-optimal}, in the sense that it is within a constant multiplicative gap of the lower bound. Our decentralized coded caching scheme is also order-optimal when each user's cache size is larger than the threshold $N(1-\sqrt[K-1]{1/(K+1)})$, which approaches 0 as $K\to \infty$, where $K$ is the total number of users and $N$ is the size of the file library. Moreover, for both the centralized and decentralized caching settings, our schemes obtain an additional \emph{cooperation gain} offered by user cooperation and an additional \emph{parallel gain} offered by parallel transmission among the server and users. It is shown that, in order to reduce the transmission delay, the number of users sending signals in parallel should be chosen appropriately according to the users' cache size; always letting more users send information in parallel could cause a high transmission delay.
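One way to see why this threshold vanishes as $K$ grows (the limit claimed above) is to rewrite the $(K-1)$-th root in exponential form:

$$N\left(1-\sqrt[K-1]{\tfrac{1}{K+1}}\right)=N\left(1-e^{-\frac{\ln(K+1)}{K-1}}\right)\;\longrightarrow\;N\left(1-e^{0}\right)=0\quad\text{as } K\to\infty,$$

since $\ln(K+1)/(K-1)\to 0$. Consequently, as the number of users grows, the order-optimality condition for the decentralized scheme covers essentially any positive cache size.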
