Paper Title

LlamaTune: Sample-Efficient DBMS Configuration Tuning

Paper Authors

Konstantinos Kanellis, Cong Ding, Brian Kroth, Andreas Müller, Carlo Curino, Shivaram Venkataraman

Paper Abstract

Tuning a database system to achieve optimal performance on a given workload is a long-standing problem in the database community. A number of recent works have leveraged ML-based approaches to guide the sampling of large parameter spaces (hundreds of tuning knobs) in search of high-performance configurations. Looking at Microsoft production services operating millions of databases, sample efficiency emerged as a crucial requirement for using tuners on diverse workloads. This motivates our investigation in LlamaTune, a tuner design that leverages domain knowledge to improve the sample efficiency of existing optimizers. LlamaTune employs an automated dimensionality reduction technique based on randomized projections, a biased-sampling approach to handle special values for certain knobs, and knob value bucketization to reduce the size of the search space. LlamaTune compares favorably with state-of-the-art optimizers across a diverse set of workloads: it identifies the best-performing configurations with up to $11\times$ fewer workload runs, reaching up to $21\%$ higher throughput. We also show that the benefits of LlamaTune generalize across both BO-based and RL-based optimizers, as well as different DBMS versions. While the journey to perform database tuning at cloud scale remains long, LlamaTune goes a long way in making automatic DBMS tuning practical at scale.
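To make the randomized-projection idea concrete, below is a minimal sketch of how an optimizer could search a low-dimensional latent space that is mapped onto hundreds of knobs via a sparse random projection. This is an illustration only, not the paper's implementation: the function names (`make_projection`, `project`), the HeSBO-style one-axis-per-knob mapping, and the specific dimensions are assumptions for the example.

```python
import numpy as np

def make_projection(n_knobs, d_low, seed=0):
    """Sparse random projection (HeSBO-style sketch): each knob is driven
    by exactly one low-dimensional coordinate, with a random sign.
    Assumption: knobs are normalized to the range [-1, 1]."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, d_low, size=n_knobs)      # which latent axis drives each knob
    sign = rng.choice([-1.0, 1.0], size=n_knobs)    # random sign per knob
    def project(x_low):
        # Map a point in [-1, 1]^d_low to a full knob vector in [-1, 1]^n_knobs.
        return np.clip(sign * np.asarray(x_low)[idx], -1.0, 1.0)
    return project

# Usage: the optimizer proposes points in 4 latent dimensions
# instead of directly searching a 100-knob space.
project = make_projection(n_knobs=100, d_low=4)
x_low = np.array([0.5, -1.0, 0.0, 1.0])
config = project(x_low)   # 100-dimensional knob vector to evaluate on the DBMS
```

The low-dimensional point is what a BO- or RL-based optimizer would actually sample; each projected configuration is then denormalized to real knob values and benchmarked against the workload.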
