Paper Title

How Platform-User Power Relations Shape Algorithmic Accountability: A Case Study of Instant Loan Platforms and Financially Stressed Users in India

Paper Authors

Divya Ramesh, Vaishnav Kameswaran, Ding Wang, Nithya Sambasivan

Paper Abstract

Accountability, a requisite for responsible AI, can be facilitated through transparency mechanisms such as audits and explainability. However, prior work suggests that the success of these mechanisms may be limited to Global North contexts; understanding the limitations of current interventions in varied socio-political conditions is crucial to help policymakers facilitate wider accountability. To do so, we examined the mediation of accountability in the existing interactions between vulnerable users and a 'high-risk' AI system in a Global South setting. We report on a qualitative study with 29 financially-stressed users of instant loan platforms in India. We found that users experienced intense feelings of indebtedness for the 'boon' of instant loans, and perceived huge obligations towards loan platforms. Users fulfilled obligations by accepting harsh terms and conditions, over-sharing sensitive data, and paying high fees to unknown and unverified lenders. Users demonstrated a dependence on loan platforms by persisting with such behaviors despite risks of harms such as abuse, recurring debts, discrimination, privacy harms, and self-harm to them. Instead of being enraged with loan platforms, users assumed responsibility for their negative experiences, thus releasing the high-powered loan platforms from accountability obligations. We argue that accountability is shaped by platform-user power relations, and urge caution to policymakers in adopting a purely technical approach to fostering algorithmic accountability. Instead, we call for situated interventions that enhance agency of users, enable meaningful transparency, reconfigure designer-user relations, and prompt a critical reflection in practitioners towards wider accountability. We conclude with implications for responsibly deploying AI in FinTech applications in India and beyond.
