With the proliferation of female robots such as Sophia and the popularity of female virtual assistants such as Siri (Apple), Alexa (Amazon) and Cortana (Microsoft), artificial intelligence seems to have a gender problem.

This gender imbalance in AI is a pervasive trend that has drawn sharp criticism in the media (even Unesco has warned against the dangers of this practice) because it could reinforce stereotypes of women as objects.

But why is femininity injected into artificially intelligent objects? If we wish to curb the widespread use of female gendering in AI, we need to better understand the deep roots of this phenomenon.

Making the inhuman more human

In a recently published journal article, we argue that research on what makes people human can offer a new perspective on why feminization is systematically used in AI. We suggest that if women tend to be more objectified in AI than men, it is not simply because they are perceived as the perfect assistant, but also because people attribute more humanness to women (versus men) in the first place.

Trailer for Ex Machina, a 2015 film starring Domhnall Gleeson and Oscar Isaac.

Why? Because women are perceived as warmer and more likely to experience emotions than men, gendering AI objects as female helps to humanize them. Warmth and the capacity for experience (but not competence) are indeed seen as fundamental qualities of being fully human, and as qualities that machines lack.

Drawing on theories of dehumanization and objectification, we show across five studies with a total sample of more than 3,000 participants that:

  • Women are perceived as more human than men, both overall and relative to non-human entities (animals and machines).

  • Female bots are endowed with more positive human qualities than male bots, and they are perceived as more human than male bots, compared with both animals and machines.

  • The humanness attributed to female bots increases the perceived uniqueness of the treatment they provide in a health context, resulting in more favorable attitudes toward AI solutions.

We used several different measures of perceived humanness, comparing bots with both animals and machines. For example, to measure the blatant humanness of female and male bots relative to animals, we used the ascent humanization scale based on the classic “march of progress” illustration. We explicitly asked online respondents to indicate how “evolved” they perceived female or male bots to be, using a continuous progression from ancient apes to modern humans.


Illustration of the scales used in the studies. Author provided

To measure the blatant perceived humanness of female and male bots compared with machines, we created a scale of blatant mechanistic (de)humanization, picturing an evolution from robot to human (instead of ape to human). Of course, we created both a female and a male version of each of these scales.

Other measures captured more subtle and implicit perceptions of humanness, by asking respondents to what extent they attributed certain emotions to female and male bots. Some emotions are said to distinguish humans from machines (for example, “friendly”, “fun-loving”), and others to distinguish humans from animals (for example, “organized”, “polite”). Finally, we also used an implicit association test to examine whether female bots are more likely than male bots to be associated with the concept “human” rather than “machine”.

The ghost in the machine

While we found that women and female robots are perceived as more human on most of the subtle measures and on all of the blatant and implicit measures of humanness, we also found that men and male robots are perceived as more human on the negative dimensions of the subtle measures. Taken together, these results indicate that female robots are not only endowed with more positive human qualities than male robots (benevolent sexism), but that they are also perceived as more human and are expected to be more likely to consider our unique needs in a service context.

These findings point to a possible new explanation of why female bots are favored over their male counterparts: people may prefer female intelligent machines because such machines are more strongly associated with humanness.

Trailer for Her, a 2013 film starring Joaquin Phoenix and Scarlett Johansson.

If femininity is used to humanize non-human entities, this research suggests that treating women like objects in AI may lie precisely in the recognition that they are not objects. The prevailing assumption, often called the dehumanization hypothesis, is that it is necessary to view outgroup members as animals or instruments before objectifying them. In other words, dehumanization would be a prerequisite for objectification, with targets of objectification typically being denied their humanness. Contrary to this dominant view, the transformation of women into objects in AI might occur not because women are perceived as subhuman, but because they are perceived as superhuman in the first place.

This is in line with Martha C. Nussbaum’s assertion: “Objectification entails making into a thing… something that is really not a thing” (Nussbaum, 1995, p. 256–7). It also fits with Kate Manne’s view on misogyny and dehumanization: “Often, it’s not a sense of women’s humanity that is lacking. Her humanity is precisely the problem” (Manne, 2018, p. 33). Therefore, the widespread use of female identity in AI artefacts may be rooted in the implicit recognition that women are perceived to be human, and more so than men.

Objectification of women in the real world?

This research builds on what makes people human compared with machines to better understand the deep roots of the widespread female gendering of AI. Because feelings are at the very core of our humanness, and because women are perceived as more likely to experience feelings, we argue that female gendering of AI objects makes them seem more human and more likely to consider our unique needs. However, this process of transforming women into objects could lead to women’s objectification by conveying the idea that women are objects and simple tools designed to fulfill their owners’ needs. This may, in turn, fuel women’s objectification and dehumanization in the non-digital world.

This research thus highlights the moral quandary faced by AI designers and policymakers: women are said to be transformed into objects in AI, but injecting women’s humanity into AI objects is precisely what makes these objects seem more human and acceptable.

These results are not particularly encouraging for the future of gender parity in AI, nor for ending the objectification of women in AI. The development of gender-neutral voices could be one way to move away from the female gendering of AI and prevent the perpetuation of this benevolent sexism. Another solution, similar to Google’s recent experimentation, would be to assign a default voice at random, giving users either a male or a female intelligent bot with equal probability.


This article was originally published at theconversation.com