
Pulitzer Center Update April 21, 2023

Why AI Accountability Reporting Matters More Than Ever


Call for applications open for AI Accountability Fellowships

Since ChatGPT captured public attention worldwide, the flurry of excitement, fear, and chatter around generative AI has prompted an overheated race among tech companies. At the same time, big tech firms are quietly compromising their ethical commitments in an attempt to keep up with the dizzying pace. AI ethicists formerly at Twitch, Microsoft, and Twitter were included in recent rounds of layoffs. These cuts are taking place at the height of the so-called AI boom (or hype), precisely when ethical reviews are most needed. Companies have reportedly been lowering the bar for reducing bias before launching a new product by simply calling their releases “experiment[s].”

With this first line of defense against product risks hamstrung, companies can more freely put profit over public benefit and shrug off an inconvenient truth: These tools use public resources (the internet), as well as some not-so-public ones, for grist. The need for journalists to build capacity for responsible, human-centered AI accountability reporting is more urgent than ever before. This is why we launched the AI Accountability Fellowships in 2022: to expand and diversify the field of journalists reporting on the impact of AI, and to help build a community of like-minded colleagues who learn and discuss the best practices of responsible AI reporting. 

Our inaugural cohort of AI Fellows reported on a wide range of in-depth stories that give us a nuanced look at the real-life consequences of these technologies. These include reconstructing a welfare fraud risk-scoring system in the Netherlands; examining how gig work platforms are affecting the livelihoods of laborers in India; exposing gender bias in widely used content moderation algorithms; revealing how a campus safety tool was instead used as a surveillance tool to forestall student protests; investigating how a border surveillance tool that could have saved a young Syrian girl failed to do so; and following how a controversial crime-tracking app is experimenting with ways to serve vulnerable communities that do not trust police. We are proud to have supported these important stories. Yet we know more journalists should be covering the impact of AI with an accountability lens and a human-centric approach.

We are now accepting applications for the 2023-2024 cohort of AI Fellows. While we welcome projects on a broad range of issues related to the impact of AI on society, this year we are also placing special emphasis on certain topics. We are seeking to support at least one project that examines the intersection of AI and conflict, war, and peace. In partnership with the Digital Witness Lab at Princeton University, we are also recruiting one project that focuses on the role the messaging platform WhatsApp plays in influencing public discourse in a particular community. Find more information here. Apply here. The deadline is July 1, 2023.

Best,


Impact

This week, two Pulitzer Center-supported projects were cited in major accountability-seeking investigations.
 
Crediting the Pulitzer Center-supported reporting project Land-Grab Universities as its motivation, the Minnesota Indian Affairs Council launched The TRUTH Project for Indigenous land rights and reparations in March 2023 and published its report, Renewing Systems Landscapes Through Traditional Indigenous Management Practices. This report, created in partnership with 11 recognized Minnesota tribes, focuses on the University of Minnesota’s historical and contemporary mistreatment of Indigenous people.

On Last Week Tonight, host John Oliver quoted Investigate Midwest’s reporting on migrant workers’ housing, part of the Pulitzer Center-supported project Farmworker Housing in America. The project uncovers the dehumanizing and, at times, fatal living conditions endured by America’s agricultural workers. Pulitzer Center grantees Sky Chadde and Esther Honig address the scope of the issue and the lack of oversight and enforcement by state and federal agencies.


This message first appeared in the April 21, 2023, edition of the Pulitzer Center's weekly newsletter. Subscribe today.

Click here to read the full newsletter.

RELATED INITIATIVES

AI Accountability Network

RELATED TOPICS

AI Accountability