
CMU and NIST Team To Manage AI Risk

Media Inquiries:
Peter Kerwin
University Communications & Marketing

Working closely with the National Institute of Standards and Technology (NIST), the Responsible AI Initiative of Carnegie Mellon University's Block Center for Technology and Society hosted a workshop this July with the goal of operationalizing the AI Risk Management Framework (AI RMF).

The framework, the result of a broad collaboration with the public and private sectors, provides guidelines to better manage the potential risks of AI systems at all levels of society. Its use, which is voluntary, can help integrate trustworthiness into AI by considering how those AI systems are designed, developed, used and evaluated.

For nearly 70 years, Carnegie Mellon University (CMU) has advanced artificial intelligence (AI) to shape the future of society. From health care and robotics to data science and, occasionally, self-driving Zamboni machines, CMU researchers are at the forefront of the AI revolution.

To continue that mission, the Block Center will provide funds to CMU faculty teams pursuing research ideas, generated at the workshop, to operationalize the AI RMF.

Elham Tabassi, chief of staff in the Information Technology Laboratory at the National Institute of Standards and Technology (NIST), speaks during the workshop.

Steve Wray, executive director of CMU's Block Center for Technology and Society.

"Artificial intelligence isn't a sector; it's a tool that will be used in every sector," said Steve Wray, executive director of the Block Center. "Carnegie Mellon is the right place for this effort because of the practical work we are doing on the ground. CMU can help users identify the issues they may be facing with AI and how to use that AI responsibly, because we know the incredible value that AI can bring. But if it's not done well, it can be risky and dangerous."

The event paired government officials and private sector leaders with CMU AI experts. Ramayya Krishnan, dean of the Heinz College of Information Systems and Public Policy and faculty director of the Block Center, who serves on the National Artificial Intelligence Advisory Committee, said, "Carnegie Mellon has served as the epicenter of AI, and our contribution to the field has only grown in recent decades. Our commitment to innovation with a focus on responsible operationalization of AI technology in consequential societal systems will inform both the policy and practice of important frameworks such as the AI RMF."

Rayid Ghani, Distinguished Career Professor in the Machine Learning Department and the Heinz College of Information Systems and Public Policy at CMU.

Jodi Forlizzi, left, the Herbert A. Simon Professor in Computer Science and HCII and the associate dean for Diversity, Equity and Inclusion in the School of Computer Science; and Hoda Heidari, K&L Gates Career Development Assistant Professor in Ethics and Computational Technologies with joint appointments in the Machine Learning Department and the Software and Societal Systems Department.

Organizers included Rayid Ghani, a Distinguished Career Professor in the Machine Learning Department and the Heinz College; Jodi Forlizzi, the Herbert A. Simon Professor of Computer Science and Human-Computer Interaction in the School of Computer Science (SCS); and Hoda Heidari, the K&L Gates Career Development Assistant Professor in Ethics and Computational Technologies in the Machine Learning Department and the Software and Societal Systems Department. All three are among the co-leads of the Responsible AI Initiative.

Heidari said the seed funding distributed through the Responsible AI Initiative aims to support use-case-focused research projects that address AI risks, such as bias, data privacy, and lack of transparency, through the NIST AI Risk Management Framework.

"Our goal is to ensure that the high-quality research our faculty does gets translated into positive impact in the policy and practice of AI," Heidari said. "External partnerships are extremely important to this effort. They close the gap between our research and educational efforts and the needs of stakeholders on the ground."

Martial Hebert, dean of the School of Computer Science.

Tabassi, left, and Hebert speak during the workshop.

Martial Hebert, dean of the School of Computer Science, said that CMU has spent decades building a culture where people care about using technology to solve real problems.

"Building on the work that NIST has done and CMU's knowledge of the NIST AI Risk Management Framework, we will work to ensure that we deploy this powerful technology in a way that acknowledges and manages the risks that accompany innovation and exploration. I am looking forward to participating in these conversations and to furthering this relationship going forward," Hebert said.

As for mitigating the risks and exploring the full potential of AI, Wray pointed out that all tools are only as good as the people who build them.

"If we're talking about AI, a lot of it was invented here at CMU. We're still inventing it. We have engineers working with computer scientists, public policy folks working with our business school and ethicists. That interdisciplinary approach is just part of our DNA at Carnegie Mellon," Wray said. "We understand AI, and we bring a willingness to roll up our sleeves and get to work."
