Technical agents are increasingly deployed for social tasks in nursing and care. How does this interaction between human and machine change our understanding of empathy? Can machines be empathic at all?
Thoughts on empathy between humans and machines
A couple of years ago, I watched the movie “Her” with my partner at the time. It started an argument. He found the thought of an artificial girlfriend (assuming the technology were sufficiently advanced) generally acceptable. The idea of a robotic partner triggered a deep intuitive resistance in me that grew into more elaborate arguments.
Recently, I was invited as a speaker on a panel on creativity and empathy in a second machine age. As it turned out, not only the topic but also the surrounding conference was most inspiring. This article explores what implications the artificial empathy of robotic agents might have for the human condition.
All watched over by machines of loving grace
The more influential artificial intelligence becomes for our infrastructures and societal orders, the more we are thrown back on the question of what the limits of machine learning are. Can we teach a machine to care for a human? Can machines nurture our need for empathy? To fully answer these questions, we must revisit how we understand empathy in the first place. Empathy, described by the Oxford Dictionary as “the ability to understand and share the feelings of another“, is often understood as a human social skill that creates understanding beyond rational thought. Empathy is one driver of human interaction on small and large scales. In our personal lives, empathy creates collective identities and fosters friendships; in societies, it supports solidarity and loyalty. To be clear, empathy is not a virtue in itself. Decisions based on pure empathy can be as ethically questionable as decisions we make on a purely rational basis. This is no reprise of empathy as always good in itself – I rather want to make the point that empathy, with all its weaknesses, is an essential part of the human condition. Humans only exist in plurality, and empathy is the intuitive language of connection between the many.
Empathy, which I understand as one form of human intelligence, is often proclaimed to be (almost) unique to our species. Nevertheless, AI engineering is pursuing the idea of integrating machines into the process of care, e.g. to assist and entertain the elderly or children. As MIT researcher Sherry Turkle argues, this raises many questions and worries. On the one hand, it triggers a debate about the precarious situation of nursing staff losing their jobs to machines. Furthermore, it raises the question of what consequences this will have for the quality of care. The question I want to discuss here is how the acceptance of machines as empathic actors changes society’s understanding of empathy.
If we create robots that are empathetic actors, how does this affect us in our human condition? How does it affect our understanding of empathy and human interaction in the long run?
Anybody out there?
One could ask now (cynically, but validly): why is human empathy important for care at all? I want to argue here that empathy expressed in words or actions communicates mainly this: you are not alone. Simply by being there, as a unique human being spending their time on caring attention towards another human being, the nursing staff inevitably communicates this most basic message of empathy: I am with you now.
Robotic assistance might outdo any human caretaker in its physical abilities, its memory and its precision. But it will fail at curing human loneliness, and it entails the danger of spreading isolation in society. Empathy, I believe, is a highly complex part of human intelligence that machines can only partly assimilate, simply by the premises of their “machine condition” of not being natal beings – of not being born, and therefore not being alive. An encounter with a machine can only reproduce the known, while a human encounter is always a precarious act: fragile, unpredictable and unknown.
The three depths of empathy
Empathy certainly has a performative aspect. Humans can show care without necessarily feeling empathetic on an interpersonal level. We show care in our well-cultivated expressions in conversations, our donations or our politically correct thinking. Empathy is often part of our socially trained and expected behaviour; a lack of empathy, to the contrary, is rather deemed socially awkward. This mimetic performance of empathy can be understood as the lowest level of empathy between humans, and I have no doubt machine learning will progress quickly in training machines to mimic empathetic behaviour similar to ours.
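To make the mimetic level concrete: at its simplest, performed empathy needs nothing more than detecting a cue and returning a scripted response. The following is a deliberately minimal, hypothetical sketch – all names and phrases are illustrative, and no real care system works this crudely – but it shows why mimicry alone is performance without feeling.

```python
# Hypothetical sketch of "mimetic" empathy: a canned reply keyed to a
# crude sentiment guess. The machine performs the social script of care
# without any inner state that corresponds to it.

NEGATIVE_CUES = {"sad", "lonely", "tired", "afraid", "hurt"}

RESPONSES = {
    "negative": "I'm sorry to hear that. That sounds hard.",
    "neutral": "I see. Tell me more.",
}

def classify(utterance: str) -> str:
    """Crude keyword-based sentiment detection."""
    words = set(utterance.lower().split())
    return "negative" if words & NEGATIVE_CUES else "neutral"

def mimic_empathy(utterance: str) -> str:
    """Return a scripted 'empathetic' reply."""
    return RESPONSES[classify(utterance)]

print(mimic_empathy("I feel so lonely today"))
```

Modern systems replace the keyword lookup with learned models, but the structural point stands: the output is a rehearsed gesture, not a shared feeling.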
The next higher level of empathy is a cognitive function, an analytic and imaginative ability of humans. From reading someone’s behaviour and communication, an empathetic person can imagine being in the role of the other and predict feelings similar to their own (Dymond 1949). This not only requires an act of introspection (knowing my own feelings) but also the assumption that my counterpart has the same spectrum of emotions as I do.
Whatever a robot might learn about empathy on this level is therefore limited and biased. First, robotic empathy will be restricted to a quantifiable representation of emotion; second, its capacities will be predetermined by the emotional spectrum and intelligence of its makers.
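Both restrictions can be made visible in a toy model. The sketch below is hypothetical – the state names and the valence/arousal coordinates are illustrative choices, not a real affective-computing API – but it shows how a machine’s “emotional spectrum” is fixed in advance by whatever its makers chose to encode.

```python
# Hypothetical sketch: a machine's emotion model is bounded by the
# dimensions and states its designers quantified in advance. Anything
# outside that predefined spectrum cannot be represented at all.

from dataclasses import dataclass

@dataclass
class EmotionEstimate:
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  #  0.0 (calm)     .. 1.0 (excited)

# The spectrum the makers decided to encode.
KNOWN_STATES = {
    "joy":     EmotionEstimate(valence=0.8, arousal=0.7),
    "sadness": EmotionEstimate(valence=-0.7, arousal=0.2),
    "anger":   EmotionEstimate(valence=-0.6, arousal=0.9),
}

def estimate(label: str) -> EmotionEstimate:
    """Map an observed cue to a predefined state; anything the makers
    never gave coordinates for collapses to a neutral default."""
    return KNOWN_STATES.get(label, EmotionEstimate(0.0, 0.0))

print(estimate("sadness").valence)  # a number, not a feeling
```

A state like grief that was never encoded simply falls through to the neutral default – the quantified representation cannot exceed the spectrum of its makers.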
The deepest level of empathy goes beyond this cognitive ability and is rather an intuitive, vicarious emotional response to the perceived emotional experiences of others (see Mehrabian & Epstein 1972). This not only enables us to understand and emotionally respond to the emotions of others, but allows human beings to share emotions – even to feel things one has never felt before as an individual. Empathy not only transmits emotions; it creates new shared ones, and thereby creates unique connections between human beings (and possibly certain animals).
Empathy of this kind is fragile, unlikely and unique in each experience. It requires nothing less than the presence, heartfelt attention and emotional effort of a unique human being whom we intuitively recognize in their similarities and differences. Empathy of this kind is the emotional medium of true existential encounter, the undeniable proof that we are not alone.
The idea that empathy can be simulated and trained reduces it to its most basic mimetic social functions and thereby misses the true existential function of empathy. Watched over by machines of loving grace, we will always be alone.
We are always human
Humanity and technology have always coexisted in a deeply interconnected reciprocity. Despite all our developments around artificial intelligence on the verge of a post-human age, we have barely understood what human intelligence and empathy mean in their full spectrum. Ironically, it might be the very attempt to assign human abilities to machines – by researching, questioning and failing in this attempt – that enables us to better understand who we are after all.
This post reflects the opinion of its authors and neither necessarily nor exclusively the opinion of the institute. For more information on the content of this post and the associated research projects, please contact email@example.com