Block by Block: Motahhare Eslami
By Belen Torres
- Communications Manager
- Email bcaldero@andrew.cmu.edu
Block by Block: Research at Work is a research spotlight series that highlights the innovative work being done by CMU researchers through the Block Center, showcasing how their projects are driving impactful solutions at the intersection of technology and society.
During a recent conversation with the Block Center for Technology and Society, Motahhare Eslami discussed her research on how everyday users can participate in auditing generative AI systems.
Generative AI Meets Responsible AI: Supporting Participatory AI Auditing of Generative AI Systems
According to a recent Pew Research Center survey, 53% of adults under 30 interact with AI at least once a day. As generative AI (genAI) becomes more embedded in daily life, questions of responsibility and oversight grow more urgent. While expert auditors exist, can they realistically anticipate and mitigate every instance of bias or discrimination? That is where everyday users come in.
An ongoing research project led by Dr. Eslami, in collaboration with Wesley Hanwen Deng, Ken Holstein, Jason Hong, and a large team of developers and designers, aims to empower everyday users to participate in auditing genAI systems. Through the development of the WeAudit platform, the team seeks to identify and mitigate harmful biases in AI, supporting the creation of fair and equitable AI technologies.
The founders of today's leading AI platforms likely could not have predicted how diversely people would use genAI, from writing bedtime stories to producing full slide decks. Because users now rely on these systems for a wide array of tasks, it is no longer feasible for a small group of experts to conduct meaningful audits alone. Everyday users are more likely to encounter misinformation or biased content simply through regular use. Preliminary findings show that non-experts are particularly effective at identifying such issues because they are experts in their own use cases, even if not in the underlying algorithms.
The WeAudit platform allows users to report perceived harms and biases by comparing AI-generated outputs. While platforms like ChatGPT provide simple thumbs-up/down feedback tools, WeAudit offers space for deeper reflection, helping users articulate and detect issues they might otherwise overlook. The project aims to explore incentive models inspired by cybersecurity "bug bounties," where companies like Google and Meta reward individuals for identifying vulnerabilities. WeAudit adapts this model, offering monetary incentives that encourage users to creatively "break" AI systems with diverse prompts that reveal underlying biases.
The research team is also examining fair compensation structures for user contributions. Volunteer-based auditing systems (such as Wikipedia's model) offer valuable public insight but risk exploiting unpaid labor. WeAudit aims to avoid this by developing a compensation framework that supports participation from users without technical training, distinguishing it from traditional bug bounty systems that require security expertise.
Still, there are limits to what everyday users alone can address. The next phase of the research focuses on human-AI collaboration in auditing. In this approach, AI systems assist in auditing other AI systems, while humans remain the critical decision makers in the loop. The team is currently collaborating with eBay to develop tools that improve AI-generated product descriptions with human-guided quality control. The partnership demonstrates the potential for WeAudit to extend beyond CMU and shape real-world interactions with generative AI.
Are you a CMU student interested in getting involved? The Block Center will soon offer research opportunities to support this project. Students with technical backgrounds or interests in human-computer interaction are encouraged to apply. Those interested can submit their information using this form. Students can also explore coursework related to this topic, including Human-AI Interaction (05-318 and 05-618), co-taught by Dr. Eslami.
Are you a CMU faculty member? The Block Center is gearing up to release this year's seed fund call for proposals. Keep an eye out for an announcement later this fall semester.