Paper Title

Randomised Composition and Small-Bias Minimax

Paper Authors

Shalev Ben-David, Eric Blais, Mika Göös, Gilbert Maystre

Paper Abstract

We prove two results about randomised query complexity $\mathrm{R}(f)$. First, we introduce a "linearised" complexity measure $\mathrm{LR}$ and show that it satisfies an inner-optimal composition theorem: $\mathrm{R}(f\circ g) \geq \Omega(\mathrm{R}(f)\,\mathrm{LR}(g))$ for all partial $f$ and $g$, and moreover, $\mathrm{LR}$ is the largest possible measure with this property. In particular, $\mathrm{LR}$ can be polynomially larger than previous measures that satisfy an inner composition theorem, such as the max-conflict complexity of Gavinsky, Lee, Santha, and Sanyal (ICALP 2019). Our second result addresses a question of Yao (FOCS 1977). He asked if the $\epsilon$-error expected query complexity $\bar{\mathrm{R}}_\epsilon(f)$ admits a distributional characterisation relative to some hard input distribution. Vereshchagin (TCS 1998) answered this question affirmatively in the bounded-error case. We show that an analogous theorem fails in the small-bias case $\epsilon = 1/2 - o(1)$.
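For context, it may help to recall the standard notion of composition used in the abstract (this is assumed background from query complexity, not a definition introduced by the paper): $f \circ g$ applies the outer function $f$ to $n$ independent copies of the inner function $g$. In that notation, the inner-optimal composition theorem can be read as

$$(f \circ g)(x^1, \dots, x^n) \;=\; f\bigl(g(x^1), \dots, g(x^n)\bigr), \qquad \mathrm{R}(f \circ g) \;\geq\; \Omega\bigl(\mathrm{R}(f) \cdot \mathrm{LR}(g)\bigr),$$

where each $x^i$ is an input to the inner function $g$ and $f$ is defined on $n$ bits.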
