"I call him religious who understands the suffering of others," wrote Mahatma Gandhi. We might fairly substitute for "religion" the word "human".
Academics from two British universities are part of an international team developing social robots for use in UK aged care homes.
The so-called “Pepper” robots are manufactured by SoftBank, the group that has provided similar machines to care facilities throughout Japan.
As a social futurist who constantly researches changes in technology and attitudes towards it, I am well aware of the awesome power of modern machines.
Much is now written about machine (or artificial) intelligence, yet one big question remains: can robots develop emotional intelligence, and in particular the ability to empathise?
A few years ago, Sony couldn’t get a robot to walk. Now robots can run, climb stairs and even play a rudimentary form of football. The sexy-sounding “social bots” of the new generation are so named because their programming allows them to respond to human emotion.
A visit to a mobile phone store in Tokyo will often mean being greeted by a social bot. It will “read” your emotions by biometrically scanning your facial expressions and muscle movements. It will then respond in kind, according to a series of pre-programmed reactions.
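To make that last point concrete, the response step amounts to little more than a lookup table. The sketch below is a deliberately simplified illustration; the emotion labels, scripted replies and function name are hypothetical stand-ins, not SoftBank’s actual software.

```python
# A minimal sketch of a pre-programmed response loop. The labels, scripts
# and function name are illustrative assumptions, not a real bot's code.

SCRIPTED_RESPONSES = {
    "happy": "You look cheerful today! How can I help?",
    "sad": "I'm sorry you seem down. Would you like some assistance?",
    "angry": "I apologise for any inconvenience. Let me fetch a staff member.",
    "neutral": "Welcome! How may I help you?",
}

def respond(detected_emotion: str) -> str:
    # Every reply is looked up from a fixed table: the machine matches a
    # label to a script; it does not share or imagine the feeling itself.
    return SCRIPTED_RESPONSES.get(detected_emotion, SCRIPTED_RESPONSES["neutral"])

print(respond("sad"))  # "I'm sorry you seem down. Would you like some assistance?"
```

However sophisticated the scanning step may be, the reply is selected, not felt – a distinction that matters later in this argument.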
It’s not surprising, then, to learn that Japanese aged care homes have been using robots for a few years now. They have proven particularly helpful in assisting clients who have emotional or dementia-related mental health issues.
A survey of one thousand such care homes found that the vast majority of clients preferred the robotic carers to the human variety.
The new breed of robot is indeed remarkable, as are “Smart Age” machines generally. There is evidence that machines can, in a sense, learn. This may seem a little too sci-fi to some observers, but let’s remember that wi-fi was sci-fi not that long ago.
Already, in the growing field of Big Data Analytics, machines are processing huge amounts of information in a way – or at least a timeframe – that would be impossible for even huge human networks.
Speedily identifying patterns within screeds of data allows machines to predict such things as economic shifts, voting patterns and consumer trends. On a very practical level, this predictive analysis is now being used in the design of furniture, buildings, streetscapes and even cities. It is also proving invaluable in the creation of new prisons and crime prevention programmes.
A lesser-known product of Big Data is that it allows machines to improve on their own programming. In this sense, machines like IBM’s Watson computer – it’s actually a network of computers – are able to improve themselves, to “learn”.
I use quotation marks here because our common usage of the word “learn” is in the human sense – denoting learning as people do it. We can’t apply that word fully to machines, partly because we still don’t fully understand how learning actually happens within the human mind.
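For readers who want a feel for that narrower, machine sense of “learning”, here is a toy sketch: a single parameter that adjusts itself to fit data, with no human re-programming it. It is a generic illustration of statistical self-adjustment, not IBM Watson’s actual method, and the numbers are invented for the example.

```python
# A toy illustration of "learning" in the machine sense: a parameter that
# adjusts itself to fit data. A generic sketch, not Watson's method.

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]  # (x, y) pairs, roughly y = 2x

w = 0.0                 # the single "learned" parameter
learning_rate = 0.01

for epoch in range(200):
    for x, y in data:
        error = w * x - y                # how wrong the current model is
        w -= learning_rate * error * x   # nudge w to shrink that error

print(round(w, 2))      # ≈ 2: the slope has been inferred from the data alone
```

The updating here is entirely mechanical; “learning” means nothing more than self-adjustment against data, which is precisely why the word needs its quotation marks.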
Experience in US hospitals has shown that machines can be useful with relatively routine tasks, such as safely dispensing daily medications to people who might otherwise forget to take them. They have also been used in remote surgery – where a medico in one location guides a robotic arm in another.
Machines can also offer a limited form of company for the lonely. On that level, they will prove helpful in care homes.
What happens, however, when over-stretched care facilities, facing rapidly ageing populations and tighter budgets, start relying just a little too much on robots, social or otherwise? They will then experience a form of human entropy.
When an electric kettle is unplugged, its energy dissipates. This is entropy. Without the application of outside energy, any natural system winds down. The same can happen within human systems.
This is arguably already happening within some care homes in the UK. There are many reputable and caring facilities, but we still read too many news reports about elderly people who are neglected or, worse, badly treated by care workers.
Faced with rapidly ageing populations, and without strict internal checks and balances or external government regulation, care homes may come to lean too heavily on their mechanised helpers.
A two-track approach is needed – not in the near future when social bots are a day-to-day reality in our care homes, but right now.
Alongside the building of machines to improve care, we must develop new regulations and rigorous training schemes for caregivers, to ensure that machines are not overestimated while the human factor is undervalued.
Early in my professional life I worked as a youth worker and then as a minister leading a local church. Every week for several years I was involved in offering pastoral counsel and support to parishioners and others who faced a range of social, emotional and psychological challenges. On the relatively rare occasions that I came across a serious mental health challenge, my team and I were able to refer people to trusted professionals.
Developing empathy skills became an important part of my service to the community. I learned very quickly that the practice of true empathy can be hard work, requiring a great deal of mental and emotional engagement.
This is because empathy involves the capacity not simply to sympathise with another person’s situation but to share their feelings about it, through imagination.
At its root, empathy is placing oneself firmly in the shoes of another and trying to see the world through their eyes. The goal is not simply to feel for them but to feel with them.
Empathy is, of course, central to all relationships – friendship, marriage and parenting all benefit from an ability to understand and respond to emotions. Some fortunate people are blessed with an innately high level of emotional intelligence. Yet empathy skills can also be acquired, through study and practice.
For all the programming genius that goes into their production, however, this is one thing that robots – at least as we currently know them – cannot do.
They may be able to “sense” our emotions and even to respond in kind. Yet they cannot truly empathise, for empathy demands at the very least a shared human experience, which by definition human substitutes don’t possess.
Being a process of imagination, empathy necessitates a capacity to draw upon a palette of shared human emotions, rather than programmed or pretend versions of the same.
In fact, when psychologists see a human being displaying only manufactured emotional responses to the suffering of others, they will often read this as a sign of narcissistic personality disorder or an expression of sociopathy.
We must recognise that, despite our best machine designs and human intentions, entropy is always a very real danger. The introduction of social bots, whatever their bells-and-whistles technology, must be accompanied by a new culture of much more rigorous oversight.
We must never see interactions with machines as true substitutes for human fellowship.