Reporting From The Future

Would You Step In If Someone Bullied a Robot?

As South Korean researchers have discovered, how humans behave around service robots reveals something profound: when we watch someone insult or strike one of these machines, our reactions, whether empathy or indifference, may expose the moral boundaries of our relationship with technology itself.

When a human scolds or strikes a robot, bystanders face an unexpected moral test — one that could determine not just how we treat machines, but how we treat each other in an age of automation. Photo/Courtesy

At a time when robots are becoming fixtures in hotels, airports, and restaurants, a new study from Hanyang University ERICA in South Korea raises a pressing question: how do humans react when they see someone mistreat a machine?

The research sheds light on an overlooked dimension of this relationship — not how people treat robots directly, but how onlookers react when they witness a service robot being mistreated. The study, led by Professor Taeshik Gong, reveals that such moments trigger two competing psychological forces: behavioral contagion, which normalizes rudeness, and empathy, which prompts kindness and moral concern.

Published in the Journal of Retailing and Consumer Services, the study used controlled, video-based experiments to observe how people respond after seeing another customer insult or mishandle a robot. The results point to a subtle but consequential finding: even without saying a word, observers play a crucial role in shaping the moral tone of human–robot interaction.

“The way people react to the mistreatment of service robots can influence what kind of behavior becomes acceptable in public,” Prof. Gong told this writer. “That’s why understanding these social dynamics is vital as robots become part of our daily service environments.”

The researchers found that whether observers respond with empathy or mimic incivility depends largely on two factors: how humanlike the robot appears and how strong the observer’s moral identity is. When a robot displayed humanlike traits — such as expressive eyes, an emotive voice, or small gestures — onlookers were more likely to empathize and intervene rather than mirror the aggressor’s behavior.

“Anthropomorphism influences empathy,” Prof. Gong explained. “Designers can use humanlike cues in frontline robots to increase bystanders’ empathic responses when mistreatment occurs. This does not just improve likability — it sends a moral signal.”

The findings go beyond psychology, offering tangible insights for businesses. The researchers suggest that managers in hotels, restaurants, airports, and retail stores could implement training protocols and visual cues — like signage, scripted responses, or pre-recorded reminders — to discourage customer abuse toward robots. Such interventions, they argue, could “prevent a contagion of incivility” and foster a culture of respect.

These design principles, Prof. Gong says, are not merely about protecting machines but about preserving human dignity. “If the mistreatment of robots becomes normalized, it could spill over into how customers treat human staff,” he said. “Conversely, reinforcing prosocial norms toward robots can strengthen the overall moral climate of the service environment, benefiting employees’ dignity and customer satisfaction simultaneously.”

The research also feeds into a growing ethical debate: should robots, particularly those embedded in public service roles, be afforded some form of protection? If people perceive them as legitimate victims under certain conditions, societies may soon consider codes of conduct or ethical guidelines for human–robot interaction — a step that could blur the line between moral courtesy and legal obligation.

For now, Prof. Gong and his team believe their study opens a path toward more thoughtful robot design and policymaking. As artificial intelligence continues to move into the physical world, they argue, ensuring respectful and prosocial human–AI interaction will be as important as improving the technology itself.

“In a decade,” Prof. Gong predicted, “we may see relevant workplace guidelines or even legal provisions to ensure that human–robot interactions remain civil. How we treat our robots, in the end, reflects how we treat one another.”
