Paper Title

GenderRobustness: Robustness of Gender Detection in Facial Recognition Systems with Variation in Image Properties

Authors

Sharadha Srinivasan, Madan Musuvathi

Abstract

In recent times, there have been increasing accusations against artificial intelligence systems and computer vision algorithms of possessing implicit biases. Even though these conversations are more prevalent now, and systems are improving through extensive testing and broadened scope, biases still exist. One class of systems where bias is said to exist is facial recognition systems, where bias has been observed on the basis of gender, ethnicity, skin tone, and other facial attributes. This is even more disturbing given that these systems are used in practically every sector of industry today. From tasks as critical as criminal identification to ones as simple as registering attendance, these systems have gained a huge market, especially in recent years. That in itself is reason enough for developers of these systems to ensure that bias is kept to a bare minimum, or ideally eliminated, to avoid major issues such as favoring a particular gender, race, or class of people, or making a class of people susceptible to false accusations due to the inability of these systems to correctly recognize them.
