People Could Get Help With Tasks by Talking With Robots
Residents at an independent living facility near 麻豆村 recently tested a new kind of caregiving assistant: a conversational robotic arm designed to understand and respond to their speech.
As a robot scratched a resident's arm, it announced, "I'm slowing down and increasing the pressure."
"That's too light," the resident said. "A little more pressure."
鈥淚鈥檓 increasing the pressure slightly,鈥 the robot responded, and then smoothly changed its movements to match the request.聽
The interaction represents a notable advance in human-robot interaction. The robot working with residents at Providence Point, a retirement community, wasn't just following commands. It was conversing with them, interpreting what they said and responding in real time.
"The interaction with the robot through spoken word was amazing to me," said Jim Strader, a Baptist Senior Family resident. "When it adjusted its movement based on what I said, that was the most interesting part, and I was happy to participate."
These human-robot exchanges lie at the center of a new communication system developed by the lab in 麻豆村's Robotics Institute (RI). Researchers are exploring what happens when assistive robots can converse through natural dialogue and adapt their actions according to human preferences. The lab's project gives robots the ability to speak their intentions, listen for human input, and provide verbal responses while physically interacting with someone.
"Natural language is a way that a lot of people communicate with others in their daily lives. We wanted to create an interface that many users could pick up and use without prior training," said Jim Wang, a Ph.D. student in the RI and lead researcher on the project. "We wanted it to be intuitive for the robot to verbalize its plans and for the user to hear them as the robot moves."
The team built a system that allows the robot to interpret user speech through a large language model (LLM), grounding spoken commands in the robot鈥檚 planned movement trajectory and ongoing conversation. The system enables bidirectional communication, in which the robot listens and adjusts its movements based on user input, while also speaking to confirm an intended adjustment or ask a clarifying question, much as a human caregiver would.
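The bidirectional loop described above can be sketched in a few lines of Python. This is a minimal illustration, not the lab's implementation: a keyword matcher stands in for the LLM, and the names `TrajectoryState` and `interpret_command` are hypothetical.

```python
# Hypothetical sketch of speech-grounded trajectory adjustment. A real
# system would ground an LLM in the planned trajectory and dialogue
# history; a simple keyword matcher stands in for the model here.
from dataclasses import dataclass


@dataclass
class TrajectoryState:
    speed: float     # normalized 0..1
    pressure: float  # normalized 0..1


def interpret_command(utterance: str, state: TrajectoryState):
    """Map a spoken request onto an adjusted trajectory.

    Returns (new_state, spoken_reply), so the robot both acts on the
    request and verbally confirms the change, or asks for clarification
    when the request is ambiguous.
    """
    text = utterance.lower()
    if "more pressure" in text or "too light" in text:
        new = TrajectoryState(state.speed, min(1.0, state.pressure + 0.1))
        return new, "I'm increasing the pressure slightly."
    if "less pressure" in text or "too hard" in text:
        new = TrajectoryState(state.speed, max(0.0, state.pressure - 0.1))
        return new, "I'm decreasing the pressure slightly."
    if "slow" in text:
        new = TrajectoryState(max(0.0, state.speed - 0.1), state.pressure)
        return new, "I'm slowing down."
    # Ambiguous feedback triggers a clarifying question instead of motion.
    return state, "Could you tell me what you'd like me to change?"


state = TrajectoryState(speed=0.5, pressure=0.3)
state, reply = interpret_command("That's too light. A little more pressure.", state)
print(reply)  # -> I'm increasing the pressure slightly.
```

The key design point mirrored from the article is that every interpreted command produces both a motion update and a spoken confirmation, keeping the user informed before and during physical contact.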
In the early stages of the research, the robot focused on unidirectional narration, where it simply announced its planned motions before executing them. This technique helped users anticipate physical contact and promoted trust. As the project progressed, the team moved beyond narration to full bidirectional communication, where user commands actively shape the robot鈥檚 motions and the robot responds with confirmation or clarifying questions.
"LLMs have developed significantly to understand many phrases, even if they are dependent on context," Wang said. "That gives our robot the power to interpret vague instructions or to know that it should ask a follow-up question."
The team built a filtering system that helps the robot focus only on task-relevant input. Casual remarks are ignored, but feedback about the robot's performance triggers a response. For example, if a user notes that pressure from the robot arm attachment feels inconsistent, the robot asks a follow-up question to pinpoint where the issue is and adjust its behavior. This selective listening keeps the robot grounded in the task while bringing the interactions closer to natural conversation.
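The selective-listening idea can be sketched as a small routing function. This is an illustrative assumption, not the lab's filter: a fixed term list stands in for the LLM-based relevance judgment, and `route_utterance` is a hypothetical name.

```python
# Hypothetical sketch of selective listening: route task-relevant
# feedback to the dialogue system and ignore casual remarks. A fixed
# term list stands in for an LLM-based relevance classifier.
TASK_TERMS = ("pressure", "speed", "harder", "softer", "slower",
              "faster", "inconsistent", "too light", "too hard")


def route_utterance(utterance: str) -> str:
    """Return 'respond' for task-relevant feedback, 'ignore' otherwise."""
    text = utterance.lower()
    if any(term in text for term in TASK_TERMS):
        return "respond"
    return "ignore"


print(route_utterance("The pressure feels inconsistent."))  # -> respond
print(route_utterance("My grandson visited yesterday."))    # -> ignore
```

Routing before interpretation keeps the robot from reacting to small talk while still catching performance feedback, which is the behavior the article describes.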
"These findings underscore the importance of transparency in physical human-robot interactions," Wang said. "Transparency not only calms anxieties humans have about working with a robot, but it also continually builds trust between robot and user. That trust is what makes safe, effective caregiving possible."
The team's work was funded by Honda R&D Americas Inc. and was accepted to the 2026 ACM/IEEE International Conference on Human-Robot Interaction.