A robot chats with an elderly British man in his bedroom. The robot has a cheerful demeanor and a pleasantly high voice.

The robot, perhaps because of the man's age, begins to ask him about his memories of the Second World War: "Please tell me, what was the most difficult thing you and your family had to go through?" The older man replies that his father was in the Royal Air Force and that the family did not see him for almost four years.

But why was a robot bluntly asking him about some of the most traumatic experiences of his life? The robot's behavior was a product of the CARESSES project (Culture-Aware Robots and Environmental Sensor Systems for Elderly Support).

This project belongs to the new field of "cultural robotics," which aims to design robots that can take into account the cultural background of the person they are talking to and adapt their behavior accordingly. That is why the robot chatted about the war: because the man was British, it was assumed he would be interested.

In the future, we can expect robots to be used more and more frequently in our personal and social lives. There is currently active research in areas as diverse as delivery robots for supermarkets, entertainment robots, healthcare service robots, collection robots for warehouses, dementia support robots, robots for people on the autism spectrum, and care robots for older people.

There is even a robot priest that can deliver blessings in five languages, and robot monks that can teach people about Buddhism.

Cultural stereotypes

Cultural robotics is part of a broader movement aimed at making AI and robotics more culturally inclusive.

Concerns have already been raised about this movement. For example, large language models (LLMs), such as those used by OpenAI's ChatGPT, are trained on massive amounts of text. But because the internet is still predominantly English, LLMs are trained mainly on English-language text, along with the cultural assumptions and biases it contains.

Similarly, the move to make robots and AI more culturally sensitive is well-intentioned, but we are concerned about where it might lead.

For example, one study compared the cultural preferences of China, Germany and Korea to draw conclusions about what people in those countries would like their robots to look like.

Drawing on earlier work on cultural preferences, the authors suggested that more "masculine" societies tend to find "big and fast" things beautiful, while more "feminine" societies find "small and slow" things beautiful. They pointed to work purporting to show that Korean culture exhibits "medium masculinity" while German culture exhibits "high masculinity," and hypothesized that Koreans are therefore more likely to find service robots (which tend to be small or medium-sized and slow) friendly.

Another study compared the personal-space preferences of Germans and "Arabs." But these are not comparable categories. "Arab" is a term many people find offensive, and it can be used to describe people from many different cultural and national backgrounds. It is certainly not equivalent to a category like "German," a non-offensive term for people of a single nationality.

It is also becoming increasingly clear that people react differently to robots depending on their cultural background. For example, different cultures have different expectations around personal space, and this affects how far away robots should stand from people.

Different cultures also interpret facial expressions differently. One study found that people understand a robot better when it communicates using facial expressions they are familiar with.

Another way?

If we want to avoid designing robots based on broad, crude generalizations and stereotypes, we need a more nuanced approach to culture in robotics.

Culture is a notoriously fuzzy, nuanced term that is open to many interpretations. One survey lists more than 300 possible definitions of culture.

In our recent research, we argued that culture is "conceptually fragmented." In short, we believe there are so many different ways of understanding culture, and so many different types of robots, that we should not expect a one-size-fits-all approach.

We believe that different applications within robotics will require radically different approaches to culture. For example, imagine an entertainment robot in a theater tasked with dancing for an audience.

For this job, the best way to approach culture might be to focus on what kinds of entertainment people in the area prefer. This could involve asking which dance styles are popular locally and designing the robot's routines accordingly.

Other applications may require a different approach. For example, for a robot expected to interact with the same small group of people over an extended period (like a service robot in a nursing home), it may be more important that the robot changes its behavior over time to adapt to the changing preferences of the people it helps.

In this case, it might be better to think of culture as something that emerges slowly and dynamically through the interaction of different individuals.

This means that dealing with culture in robotics is likely to be complex, multi-layered and specific to each situation.

If we design robots based on relatively crude stereotypes and sweeping generalizations about different cultures, we risk perpetuating those very stereotypes.

This article was originally published at theconversation.com