Artificial empathy: the case of Be My Eyes

Icon of Be My Eyes

I. The app

Be My Eyes is a free application whose foundational idea caught my attention from the beginning because, just as Dialogue in the Dark does, it is built around the encounter. Its slogan is "Bringing Sight to the Blind and Visually Impaired."

From their website: "Be My Eyes was created to help people who are blind or visually impaired. The app is made up of a global community of blind and visually impaired people and sighted volunteers. Be My Eyes harnesses the power of technology and human connection to bring sight to people with vision loss. Through a live video call, volunteers assist blind and visually impaired people by providing visual assistance with tasks such as color recognition or checking whether the lights are on..."

II. Empathy

Be My Eyes is a tangible example of empathy in action. Empathy is about understanding the emotional and cognitive world of others: what they feel and what they think. Empathy ends there, although it can lead to action.

In the case of this app, the developers practiced empathy. How does a person living with a visual disability feel when they cannot do things like identify the colors of their clothes or simply know whether the light in their bedroom is on? Perhaps frustrated, irritated or dependent. What do people with visual disabilities think about that? Maybe that it is not fair, that there should be some form of support, that it seems impossible that, with so much technology available, this has not been solved.

The developers then moved on to empathy in action. The concept was simple: an app that lets visually impaired people place a call to sighted volunteers who can help them.

III. Two hasty conclusions

The app's user and volunteer statistics catch my attention: about 600,000 users with a visual disability and about 6 million volunteers. First conclusion, hasty if you will: there are more people who want to help than people who want to be helped.

More than a few volunteers in my small social circle have told me they were disappointed at never having received a call.

My hypothesis can be refuted, and there are several arguments for doing so. The first is the ratio of one population to the other: there are far fewer people living with a visual disability than people who can see. The second is that the number of people with disabilities who have access to a smartphone is surely very small compared to the proportion of people without disabilities who use one of these devices.

The second conclusion I have come to, having observed a few users of this app, and which perhaps explains this disproportion, is that some visually impaired users were overcome by shyness when the moment came to call a volunteer.

IV. Artificial Intelligence

Recently the app introduced its new AI tool, which quickly became popular among its community of users. In addition to the option of making a video call, the app now offers an AI button: you take a picture of whatever you want described, say a box of cookies, and after a few seconds the app returns an AI-generated description, along with the option to chat with a bot that can give you more details based on the picture.

Cool, right? The tool works pretty well.
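Out of curiosity, here is a minimal sketch of what such a photo-to-description flow could look like when wired to a general-purpose vision-language model. This is not Be My Eyes' actual implementation; the model name, prompt, and file name are illustrative assumptions.

```python
import base64
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def describe_image(path: str, question: str = "Describe this photo for a blind user.") -> str:
    """Send a photo to a vision-language model and return a text description."""
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice, not necessarily what the app uses
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": question},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                    },
                ],
            }
        ],
    )
    return response.choices[0].message.content


# Hypothetical usage: describe a photo of a box of cookies
print(describe_image("cookie_box.jpg"))
```

A follow-up chat, like the one the app offers, would amount to appending the user's further questions to the same conversation history so the model keeps the original photo as context.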

V. Artificial Empathy

Why did this AI feature cause such a stir?

Because now there is room for the shy ones who don't want to make a video call. Even those who would dare to call a volunteer will, I think, prefer to interact with the AI first.

The act of empathy that Be My Eyes embodied has been automated, and today we interact with a bot. Today we experience artificial empathy.

And, as is happening in many other areas of our relationship with technology, we are coming to prefer interacting with these artificial intelligences over interacting with people.

VI. Consequences

I see two consequences of the excessive use of artificial empathy: the first is that the shy will become shyer; the second is that encounters between people with and without disabilities will be reduced.

One of my hypotheses is that we, people with disabilities, need better education in social-emotional skills. The quality of our relationships with others largely determines the quality of our lives, and that applies in much greater measure to a group such as people with disabilities, who in many situations can be dependent, vulnerable and in need of support from those around us. However, because of social isolation, our ability to form deep bonds has suffered, and it will only get worse if most of our interactions are with bots.

The second consequence, fewer encounters between people with and without disabilities, is not a hypothesis but what experience tells me. Ask yourself: when you have a question, whom do you ask first, Google or another person?

It has been shown that the most effective way to improve the inclusion of people with disabilities and to reduce prejudice is through encounters. There is no better way to convince yourself that a person with a disability is worth the same as you than having friends, classmates, colleagues or some other kind of relationship with a person living with a disability.

Be My Eyes was a great platform for facilitating these encounters, and today I fear they will diminish, leaving people without disabilities in their comfort zone, wanting to help but not interacting with people living with disabilities, and vice versa.