Image: A drawing of people talking, with chat bubbles. (Image credit: JiWoong Jang and Sanika Moharana)

People with Autism Turn to ChatGPT for Advice on Workplace Issues

Media Inquiries: Aaron Aupperlee, School of Computer Science

A new Carnegie Mellon University study explores how people with autism interact with ChatGPT and similar artificial intelligence tools for help and advice as they confront problems in their workplaces.

But the research team, led by the School of Computer Science's Andrew Begel, also found that such systems sometimes dispense questionable advice. And controversy remains within the autism community as to whether this use of chatbots is even a good idea.

"What we found is there are people with autism who are already using ChatGPT to ask questions that we think ChatGPT is partly well-suited and partly poorly suited for," said Begel, an associate professor in the School of Computer Science's Human-Computer Interaction Institute (HCII). "For instance, they might ask: 'How do I make friends at work?'"

Begel heads a lab that seeks to develop workplaces where all people, including those with disabilities and those who are neurodivergent, can successfully work together. Unemployment and underemployment are problems for many autistic adults, and many workplaces either lack or don't prioritize the resources to help employees with autism and their coworkers overcome social or communication problems as they arise.

To better understand how large language models (LLMs) could be used to address this shortcoming, Begel and his team recruited 11 people with autism to test online advice from two sources: a chatbot based on OpenAI's GPT-4, and what looked to the participants like a second chatbot but was really a human counselor.

Somewhat surprisingly, the users overwhelmingly preferred the real chatbot to the disguised adviser. It's not that the chatbot gave better advice, Begel said, but rather the way it dispensed that advice.

"The participants prioritized getting quick and easy-to-digest answers," Begel said.

The chatbot provided answers that were black and white, without a lot of subtlety and usually in the form of bullets. The counselor, by contrast, often asked questions about what the user wanted to do or why they wanted to do it. Most users preferred not to engage in such back-and-forth, Begel said.
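To illustrate the behavior the participants preferred, here is a minimal sketch (not the study's actual code) of how a GPT-4-backed advice chatbot might be prompted toward short, direct, bulleted answers rather than counselor-style follow-up questions. The function name and the system prompt wording are illustrative assumptions.

```python
# Illustrative sketch only: one way to steer a chat model toward the
# quick, bulleted, no-follow-up answers the study's participants preferred.

def build_advice_request(question: str) -> list[dict]:
    """Build a chat-completion message list for a workplace-advice query."""
    system_prompt = (
        "You are a workplace-advice assistant. "
        "Answer in at most five short bullet points. "
        "Be direct and concrete; do not ask follow-up questions."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": question},
    ]

# Sending the request with the OpenAI Python client would look like this
# (network call, requires an API key, so left commented out):
#
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-4",
#     messages=build_advice_request("How do I make friends at work?"),
# )
# print(resp.choices[0].message.content)
```

A human counselor would instead probe the question ("Why do you want to?"), which is exactly the back-and-forth most participants preferred to avoid.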

Participants liked the concept of a chatbot. One explained: "I think, honestly, with my workplace, it's the only thing I trust because not every company or business is inclusive."

But when a professional who specializes in supporting job seekers with autism evaluated the answers, she found that some of the LLM's answers weren't helpful. For instance, when one user asked for advice on making friends, the chatbot suggested the user just walk up to people and start talking with them. The problem, of course, is that a person with autism usually doesn't feel comfortable doing that, Begel said.

The findings were presented by first author and HCII Ph.D. student JiWoong (Joon) Jang at the Association for Computing Machinery's Conference on Human Factors in Computing Systems (CHI 2024) last month in Honolulu. In addition to Begel and Jang, co-authors include HCII Ph.D. student Sanika Moharana and Patrick Carrington, an assistant professor in the HCII.

It's possible that a chatbot trained specifically to address the problems of people with autism could avoid dispensing bad advice, but not everyone in the autism community is likely to embrace it, Begel said. While some might see it as a practical tool for supporting autistic workers, others see it as yet another instance of expecting people whose brains work differently from most to accommodate everyone else.

"There's this huge debate over whose perspectives we privilege when we build technology without talking to people. Is this privileging the neurotypical perspective of 'This is how I want people with autism to behave in front of me?' Or is it privileging the person with autism's wishes that 'I want to behave the way I am,' or 'I want to get along and make sure others like me and don't hate me?'"

At heart, it's a question of whether people with autism are given a say in research that is intended to help them. It's also an issue explored in a separate study on which Begel is a co-author with Naba Rizvi and other researchers at the University of California, San Diego. In that study, the researchers analyzed 142 papers published between 2016 and 2022 on developing robots to help people with autism. They found that 90% of this human-robot interaction research did not include the perspectives of people with autism. One result, Begel said, was the development of a lot of assistive technology that people with autism didn't necessarily want, while some of their needs went unaddressed.

"We noticed, for instance, that most of the interactive robots designed for people with autism were nonhuman, such as dinosaurs or dogs," Begel said. "Are people with autism so deficient in their own humanity that they don't deserve humanoid robots?"

Technology can certainly contribute to a better understanding of how people with and without autism interact. For instance, Begel is collaborating with colleagues at the University of Maryland on a project using AI to analyze conversations between these two groups. The AI can help identify gaps in understanding by either or both speakers that could result in jokes falling flat or in the perception that someone is being dishonest. Technology could also help speakers prevent or repair these conversational problems, Begel said, and the researchers are seeking input from a large group of people with autism on the kind of help they would like to see.

"We've built a video calling tool to which we've attached this AI," said Begel, who has also developed an Autism Advisory Board to ensure that people with autism have a say in which projects his lab should pursue. "One possible intervention might be a button on this tool that says 'Sorry, I didn't hear you. Can you please repeat your question?' when I don't feel like saying that out loud. Or maybe there's a button that says, 'I don't understand.' Or even a tool that could summarize the meeting agenda so you can help orient your teammates when you say, 'I'd like to go back to the first topic we spoke about.'"

Andrew Begel, associate professor of computer science
