07 June 2023

Publication: Interacting with agents without a mind: the case for artificial agents

In 2017, Sophia, a humanoid robot, was granted citizenship — a fundamental human right. Five years later, one of Google's engineers reported that the company's Artificial Intelligence (AI) had become sentient. Although peculiar, these events highlight that AI's abilities, rights, responsibilities, and societal roles remain ambiguous: do people consider AI as a machine or a human-like agent when interacting with it?

People predict the actions of other humans based on the fundamental assumption that they are intentional agents with minds. However, the mind is in the eye of the beholder: it can be withdrawn from human agents (i.e. dehumanisation) but also ascribed to non-human agents (i.e. anthropomorphism), depending on cognitive or motivational features of the perceiver as well as physical and behavioural features of the perceived entity. Perceived humanness assumes that the other can act (has agency) and has experiences (thoughts and feelings).

In this review, EMERGE partners from Ludwig Maximilian University of Munich explore whether and when people consider — perceive, understand, predict, and manipulate — AI as a non-human or a human agent, utilising the socio-cognitive and interactive repertoire reserved for humans. They show that AI fails to fully elicit these two dimensions of mind perception. Embodied AI, such as social robots, may trigger the attribution of agency, but only humans trigger the attribution of experience. Importantly, people are more likely to attribute mind in general, and agency in particular, to AI that resembles the human form. Lastly, people's predispositions and the social context affect their tendency to attribute human traits to AI.

Source: Deroy, O., Current Opinion in Behavioral Sciences, 51, 101282, 2023. DOI: 10.1016/j.cobeha.2023.101282.

Access EMERGE publications via the link below.