Paper Title

Bridging Fairness and Environmental Sustainability in Natural Language Processing

Authors

Marius Hessenthaler, Emma Strubell, Dirk Hovy, Anne Lauscher

Abstract

Fairness and environmental impact are important research directions for the sustainable development of artificial intelligence. However, while each topic is an active research area in natural language processing (NLP), there is a surprising lack of research on the interplay between the two fields. This lacuna is highly problematic, since there is increasing evidence that an exclusive focus on fairness can actually hinder environmental sustainability, and vice versa. In this work, we shed light on this crucial intersection in NLP by (1) investigating the efficiency of current fairness approaches through surveying example methods for reducing unfair stereotypical bias from the literature, and (2) evaluating a common technique to reduce energy consumption (and thus environmental impact) of English NLP models, knowledge distillation (KD), for its impact on fairness. In this case study, we evaluate the effect of important KD factors, including layer and dimensionality reduction, with respect to: (a) performance on the distillation task (natural language inference and semantic similarity prediction), and (b) multiple measures and dimensions of stereotypical bias (e.g., gender bias measured via the Word Embedding Association Test). Our results lead us to clarify current assumptions regarding the effect of KD on unfair bias: contrary to other findings, we show that KD can actually decrease model fairness.
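For readers unfamiliar with the two techniques the abstract leans on, the sketch below illustrates (1) Hinton-style soft-target knowledge distillation, the standard formulation of the KD objective the paper studies, and (2) the WEAT effect size used to quantify stereotypical bias. This is a minimal sketch assuming PyTorch and NumPy, not the authors' implementation; the function names, the temperature and alpha hyperparameters, and the embedding inputs are illustrative assumptions.

    # Minimal sketch (not the paper's code) of soft-target KD and the WEAT
    # effect size. All names and hyperparameters here are illustrative.
    import numpy as np
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
        """Hinton et al. (2015) KD loss: weighted sum of the task
        cross-entropy and the KL divergence between temperature-softened
        teacher and student distributions."""
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        kd = F.kl_div(log_soft_student, soft_teacher,
                      reduction="batchmean") * temperature ** 2
        ce = F.cross_entropy(student_logits, labels)
        return alpha * kd + (1 - alpha) * ce

    def weat_effect_size(X, Y, A, B):
        """WEAT effect size (Caliskan et al., 2017): normalized difference in
        mean association of two target word sets (X, Y) with two attribute
        word sets (A, B). Inputs are arrays of embeddings, one row per word."""
        def cos(u, v):
            return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

        def assoc(w):  # s(w, A, B): mean cosine to A minus mean cosine to B
            return np.mean([cos(w, a) for a in A]) - np.mean([cos(w, b) for b in B])

        x_assoc = np.array([assoc(x) for x in X])
        y_assoc = np.array([assoc(y) for y in Y])
        pooled = np.concatenate([x_assoc, y_assoc])
        return (x_assoc.mean() - y_assoc.mean()) / pooled.std(ddof=1)

In a setup like the paper's, such an effect size would be computed on the student model's representations before and after distillation to test whether KD shifts the measured association, e.g., between gendered target words and stereotyped attribute words.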
