John Wyatt
 

Artificial intelligence and simulated relationships (III)

As technologically simulated relationships become ever more realistic and superficially convincing, we must be aware of the risk that the simulacrum will exert a seductive appeal to our hearts.

Jubilee Centre · John Wyatt · 5 March 2020
Photo: Franck V. (Unsplash, CC0).

This is the third part of this Cambridge Paper published by Prof. John Wyatt with the Jubilee Centre. You can read the first part here and the second part here.



 



CHRISTIAN RESPONSES



Humans are created to be embodied relational persons



In biblical thinking, human beings are created as embodied persons, sharing a biological inheritance with animals but uniquely made as God’s image-bearers.



We are created to represent God’s loving care for the world, and for relationships with God himself, with one another, and with the non-human world.



So we are created beings of a particular kind, embodied, fragile, and dependent. We are mortal and limited, but designed for union and communion with one another and ultimately with God himself.



We are persons created by a relational God for relationships. And our humanity embodied in flesh is central to our relationships (Genesis 2:23, 24). Instead of being superseded, our fleshly embodiment is vindicated in the Incarnation and Resurrection when the Word became flesh (John 1:14; Luke 24:39).



Machines, on the other hand, cannot share our fleshly embodiment. They are artefacts of human creativity, with the potential to support our unique human calling, but they can never enter into genuine human relationships.



As we have already seen, behind the simulated compassion of AI bots and companion robots it is possible to identify a shallow and instrumentalised understanding of relationships, seen as orientated towards the satisfaction of my internal emotional needs.



But the Christian faith provides a richer and deeper perspective on human relationality. At their most exalted, human relationships can mirror and participate in the union and communion, the self-giving agape love, of the Persons of the Triune God.



In the Gospels, Christ himself models voluntary and freely chosen self-sacrificial love for others. ‘Whoever would be great among you must be your servant, and whoever would be first among you must be slave of all. For even the Son of Man came not to be served but to serve and to give his life as a ransom for many’ (Mark 10:43–45).



The paradoxical nature of Christlike love, whose concern is not for the meeting of one’s own needs but is instead self-forgetful because it is focused on the other, is beautifully expressed in the prayer of St Francis of Assisi:



O Divine Master, grant that I may not so much seek to be consoled as to console;

to be understood as to understand;

to be loved as to love.

For it is in giving that we receive;

it is in pardoning that we are pardoned;

and it is in dying that we are born to eternal life.



Authentic Christlike compassion depends on freedom, the freedom to choose to serve and give to the other. And it depends on human solidarity, on our common humanity and shared experience.



A machine cannot know what it means to suffer, to be anxious, or to fear death, and its simulated compassion (even if well-meant by its creators and users) is ultimately inauthentic.



However, it is not sufficient to concentrate only on the fundamental ontological difference between humans and machines. The machine is nothing but a sophisticated artefact, but if it becomes capable of simulating many of the most profound aspects of human persons and human relations, and hence of evoking in other human beings responses of love, care, commitment and respect, this raises new and troubling issues.



Simulated personhood raises the question of whether I can ever be certain that the entity I am relating to is a human rather than a machine. One possible regulatory approach is that of the ‘Turing Red Flag’, first proposed by Toby Walsh, a professor of computing, in 2015.[16]



A Turing Red Flag law would require that every autonomous system be designed so that it cannot be mistaken for one controlled by a human. In the case of a chatbot, for example, the law might require that in every interaction you are reminded that you are speaking to a clever simulation and not to a real human person.
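Walsh’s proposal is regulatory rather than technical, but the design requirement it implies is simple to illustrate. The short Python sketch below is offered only as one assumed way such a requirement might be met; the chatbot function and the wording of the disclosure are hypothetical and are not drawn from Walsh’s paper. It prefixes every automated reply with an explicit notice that the speaker is a machine:

# Minimal sketch of a 'Turing Red Flag' disclosure wrapper (hypothetical example).
# Every reply from an automated agent is prefixed with an explicit notice so the
# system cannot be mistaken for a human interlocutor.

DISCLOSURE = "[Automated system] You are speaking with a software simulation, not a human."


def generate_reply(message: str) -> str:
    """Placeholder for whatever model or service actually produces the reply."""
    return f"I hear that you said: {message!r}"


def red_flag_reply(message: str) -> str:
    """Attach the mandatory disclosure to every generated reply."""
    return f"{DISCLOSURE}\n{generate_reply(message)}"


if __name__ == "__main__":
    print(red_flag_reply("I feel lonely today."))

The point of the sketch is architectural: the reminder is attached at the system level, on every interaction, rather than being left to the discretion of the underlying chatbot.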



 



THE EYES AND THE VOICE



It is striking to reflect on the priorities of social robotics from the perspective of the biblical narrative. The focus on the face and eyes reflects the Hebraic use of the face of God to represent his personal presence, as in the words of the Aaronic blessing: ‘The Lord bless you and keep you; the Lord make his face to shine upon you and be gracious to you; the Lord lift up his countenance upon you and give you peace’ (Numbers 6:22–27).



Moses’ face shone because he had been in the presence of the Lord, and the Apostle Paul uses the same metaphor: ‘We all, with unveiled face, beholding the glory of the Lord, are being transformed into the same image from one degree of glory to another’ (2 Corinthians 3:18).



Jesus taught that the eye is the lamp of the body (Matthew 6:22), pointing to the moral significance of what we choose to focus our vision upon.



The centrality of speech in the biblical narrative is just as striking. The spoken word of God is the very means of creation, the word expresses the hidden thoughts and purposes of the divine mind, and Christ himself is the Logos, the ultimate expression and revelation of God.



So the spiritual significance of the face and of spoken words, and their foundational role in divine and human relationships, cannot be avoided. The simulation, for commercial motives, of these precious and theologically rich means of divine and human communication seems to point to a spiritually malign element that is facilitated by current technological developments.



The Apostle Paul describes Satan as disguising himself as an angel of light (2 Corinthians 11:14). The Greek word that Paul employed means ‘to change from one form into another’, and it is perhaps not too fanciful to see the possibility of spiritual evil accompanying the simulation of the most precious aspects of human relationality.



 



THE IDOL



Biblical scholars have pointed to the link between the Genesis description of human beings as being created in the image (selem) of God, and the subsequent use of the same word selem to refer to idols or ‘graven images’ in the later Old Testament.[17]



The implication seems to be that our creation in God’s image reflects our profound creaturely dependence upon him, but this is subverted when we transfer the divine image to a human artefact.



As Richard Lints puts it, ‘Human identity is rooted in what it reflects’.[18] The idol may be ontologically vacuous but its false image is capable of exerting a malign and destructive hold on its worshippers.



There seems to be a strange parallel between the evil consequences of creating a human artefact as an image of God and those of creating a robotic artefact as an image of humanity.



As technologically simulated relationships become ever more realistic and superficially convincing, we must be aware of the risk that the simulacrum will exert a seductive appeal to our hearts.



 



PRACTICAL IMPLICATIONS



‘How then shall we live’ in a society which seems to be increasingly promoting AI-simulated relationships in many aspects of care, therapy, education and entertainment?



These challenges are complex and multifaceted, but an initial response is to ask: what are the underlying questions and needs to which AI-simulated relationships appear to provide a technological solution?



As we saw above, a common narrative is that the need for care across the planet is too great and that we have to find a technical solution to the shortage of human carers, therapists and teachers.



But the current shortage of carers is, of course, in part a reflection of the low status and low economic valuation which our society places on caring roles. There are more than enough human beings who could undertake the work of caring, both in paid roles and also in unpaid voluntary caring within families and communities.



It is surely better that, as a society, we strive to facilitate and encourage human carers rather than resorting to technological replacements for human beings.



In the world of healthcare, although AI technology can provide remarkable benefits through improved diagnosis, image analysis and treatment planning, it cannot replace the centrality of the human-to-human encounter.



The realities of illness, ageing, psychological distress and dementia all threaten our personhood at a profound level. In response, the therapeutic and caring encounter between two humans provides an opportunity for human solidarity which understands, empathises with and protects the frailty of the other.



In my experience as a paediatrician, with the privilege of caring for children and parents confronted with tragic and devastating loss, I have learnt afresh that the essence of caring is to say both in our words and our actions, ‘I am a human being like you; I too understand what it means to fear, to suffer and to be exposed to terrible loss. I am here to walk this path with you, to offer you my wisdom, expertise and experience, and to covenant that I will not abandon you, whatever happens.’



So, in conclusion, while we may see wide economic and practical benefits from advancing AI technology, as biblical Christians we are called to safeguard and to celebrate the centrality of embodied human-to-human relationships, particularly in essential caring and therapeutic roles, and in our families and Christian communities.



There is no substitute for human empathy, solidarity and love expressed in the face-to-face gaze of embodied human beings and in compassionate, thoughtful words spoken by human mouths.



 



John Wyatt is Emeritus Professor of Neonatal Paediatrics, Ethics & Perinatology at University College London, and a senior researcher at the Faraday Institute for Science and Religion, Cambridge.



This paper was first published by the Jubilee Centre.



 



NOTES



[16] Toby Walsh, ‘Turing’s Red Flag’, 2015, https://arxiv.org/abs/1510.09033



[17] Richard Lints, Identity and Idolatry, IVP, 2015.



[18] Ibid.


 

 

