
Why robots can be culturally insensitive – and how scientists are trying to fix it

Pepper the robot bows after preaching to visitors during a demonstration of a funeral ceremony with a Buddhist priest in Tokyo in 2017. EPA-EFE/KIMIMASA MAYAMA

Written By

Henry Taylor, University of Birmingham and Masoumeh Mansouri, University of Birmingham


A robot is chatting to an elderly British man in his bedroom. The robot has a cheery demeanour and a pleasantly high-pitched voice.

The robot – perhaps because of the man’s age – starts asking him about his memories of the second world war: “Please tell me what was the most difficult thing you and your family had to go through?” The elderly man goes on to talk about how his father was in the Royal Air Force and they didn’t see him for almost four years.

But why was a robot bluntly asking him about what may have been one of the most traumatic experiences of his life? The robot’s behaviour was the product of the Caresses project (Culture-Aware Robots and Environmental Sensor Systems for Elderly Support).

This project fits into the new field of “cultural robotics”, which aims to design robots that can take into account the cultural background of the person they’re talking to and adjust their behaviour accordingly. That’s why the robot chatted about the war: the man was British, so it presumed he would be interested.
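To make that design pattern concrete, here is a minimal sketch (in Python) of the kind of lookup-based approach such a system might take, in which a cultural label indexes a fixed table of conversation topics. The table, names and topics below are our own hypothetical illustrations, not code or data from the Caresses project.

import random

# Hypothetical table mapping a cultural label to "suitable" topics.
CULTURE_TOPICS = {
    "british": ["the weather", "memories of the second world war", "the Royal Air Force"],
    "japanese": ["seasonal festivals", "favourite foods", "baseball"],
}

def pick_topic(cultural_background):
    # Everyone tagged with the same label draws from the same fixed pool,
    # which is how broad generalisations get baked into behaviour.
    topics = CULTURE_TOPICS.get(cultural_background.lower())
    if topics is None:
        return "how your day has been"  # fallback when the label is unknown
    return random.choice(topics)

print(pick_topic("British"))  # may well land on the war, as in the story above

The problem discussed in the rest of this article is visible here in miniature: the robot’s behaviour is determined entirely by a crude label.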

In the future, we can expect robots to be deployed more and more in our personal and social lives. There is currently active research into fields as diverse as delivery robots for supermarkets, entertainment robots, service robots for healthcare, fetching robots for warehouses, robots for dementia support, robots for people on the autism spectrum and care robots for the elderly.

There are even robot priests that can deliver blessings in five languages, and robot monks that can educate people about Buddhism.

Cultural stereotypes

Cultural robotics is part of a wider movement to make AI and robotics more culturally inclusive.

Concerns about this movement have been raised before. For example, large language models (LLMs), such as the one behind OpenAI’s ChatGPT, are trained on massive amounts of text. But because the internet is still predominantly English-language, LLMs are primarily trained on English text – with the cultural assumptions and biases therein.

In a similar way, the move to make robots and AI more culturally sensitive is well meaning, but we’re concerned about where it could lead.

For example, one study compared cultural preferences in China, Germany and Korea to draw conclusions about how people in these countries would like their robots to look.

By drawing on previous work on cultural preferences, the authors suggested that more “masculine” societies tend to think of “big and fast” things as beautiful, while more “feminine” societies find “small and slow” things beautiful. They referenced work that claims to show that Korean culture is “middle masculinity” while German culture is “high masculinity”, and hypothesised that Korean people are more likely than Germans to find service robots (which tend to be small or medium-sized, and slow) likeable.

Another study compared the personal space preferences of Germans and “Arabs”. But these categories are not comparable. “Arab” is a potentially offensive term for many people, and can be used to describe people from many different cultural and national backgrounds. It is certainly not on a par with a category like “German”, which is a non-offensive term for people of a single nationality.

It’s also becoming increasingly apparent that humans react differently to robots depending on their own cultural background. For example, people from different cultures have different expectations around personal space, which affects how far away they prefer robots to stand.

Different cultures interpret facial expressions differently too. One study found that people are better able to understand a robot if it communicates using facial expressions they are familiar with.

Another way?

If we want to avoid designing robots based on broad and crude generalisations and stereotypes, then we will need a more nuanced approach to culture in robotics.

Culture is a notoriously fuzzy and nuanced concept, open to many interpretations. One survey lists over 300 potential definitions of culture.

In our recent research, we argued that culture is “conceptually fragmented”. In short, our view is that there are so many different ways of understanding culture, and so many different kinds of robots, that we should not expect there to be a one-size-fits-all approach.

We think that different applications within robotics will require radically different approaches to culture. For example, imagine an entertainment robot in a theatre that has the job of dancing for audiences.

For this job, the best way of approaching culture might involve concentrating on what kinds of entertainment the people in the local area prefer. This might involve asking what kind of dancing styles are common locally, and modelling the robot’s design around that.

Other applications may require a different approach to culture. For example, for a robot that is expected to interact with the same small number of humans over an extended period of time (like a service robot in a care home), it might be more important for the robot to change its behaviour over time, to adapt to the changing preferences of the people it is helping.

For this case, it might be better to think of culture as something that emerges slowly and dynamically through the interaction of different subjects.
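As a rough illustration of that alternative, the Python sketch below (the class, names and update rule are our own illustrative choices, not a published method) has a robot refine a per-person estimate – say, a comfortable standing distance in metres – from what it observes across repeated interactions, rather than reading the value off a nationality label.

class AdaptivePreference:
    # Tracks one person's preference, e.g. a comfortable robot distance in metres.
    def __init__(self, initial=1.0, learning_rate=0.2):
        self.estimate = initial
        self.learning_rate = learning_rate

    def update(self, observed):
        # Nudge the estimate toward what this person actually preferred,
        # so the behaviour adapts gradually over repeated interactions.
        self.estimate += self.learning_rate * (observed - self.estimate)
        return self.estimate

space = AdaptivePreference(initial=1.0)  # neutral starting default
for observed in [0.8, 0.7, 0.75]:        # hypothetical comfort distances observed over visits
    space.update(observed)
print(round(space.estimate, 2))          # prints 0.88, drifting toward the observed ~0.75 m

Here nothing about the person’s nationality is assumed in advance; the preference emerges from the interaction itself.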

This means that approaching culture in robotics is likely to be complex, multifaceted and specific to each situation.

If we design robots based on relatively crude stereotypes and sweeping generalisations about different cultures, then we risk propagating those stereotypes.


Written By

Henry Taylor, Associate Professor, Department of Philosophy, University of Birmingham and Masoumeh Mansouri, Associate Professor, School of Computer Science, University of Birmingham

This article is republished from The Conversation under a Creative Commons license. Read the original article.
