Alexa do you Care?

J: My son wakes up and says “Alexa, what time is it?” And she replies. He has conversations with her. So he goes and tells her very silly things like “you are such a goof” or “you are stupid” or “you are dumb” and things like that, and she replies “Well, you are not so nice”, so he comes to me saying that Alexa told him that he was not very nice. He is really excited about it. I think for a small kid, it’s exciting. There is this voice that is coming out of this box, and responds to his commands.

This is an extract from an interview with Julie, the mother of a 4-year-old living in Los Angeles, whom I interviewed for the Child | Data | Citizen project. An article that appeared in The Times on the 15th of July, which discussed how children are treating AI virtual assistants as part of the family, reminded me of the interview with Julie. It also made me realise that many questions are emerging at the moment about the complex ways in which AI virtual assistants are impacting family life.

One question that I would like to investigate, if I had the time and resources, is what it means for children to grow up with artificial intelligence devices, which provide them with the illusion of humanity and teach them to respect and relate to an AI/object as a quasi-person (Elgan, 2018). During the interview with Julie, for instance, I could not prevent myself from noticing that her son joked with Alexa, especially by questioning its (because Alexa is not a she) intelligence, and that he knew that Alexa was making a clear moral judgement about him not being nice. Unfortunately, during the Child | Data | Citizen project I didn’t have the time or resources to engage with these questions. What I had the time to do was to look at the question about data.

This post is influenced by a chapter I am writing on the complexity of ‘home life data’. In the chapter I argue that in order to understand how virtual assistants are impacting family life, we first need to consider the current discourses that dominate the business world. In the chapter I talk about three interconnected cultural narratives: techno-solutionism, data-fetishism, and the semi-humanity of AI. Here I want to focus on the last.

When we think about AI in the home we need to think about the techno-dream (and dystopia) of building systems that are quasi-human because of their cognitive faculties. This techno-utopian myth about the creation of ‘thinking machines’ that are able to emulate the cognitive faculties of humans dates back to the 1950s (Natale and Ballatore, 2017). The promotional cultures of home hubs stress the cognitive faculties of their virtual assistants. Apple’s Siri, for instance, is currently promoted in the following way: “Siri is always learning. So the more you use it, the more helpful it can be.”

It is not surprising that virtual assistants in home hubs are promoted for their cognitive faculties. Yet what is starting to become a common discourse amongst developers is that AI virtual assistants are offering something extraordinarily new in terms of social interaction; something that differs from other types of AI. According to Zuckerberg (2016), for instance, the fact that we can talk to AI virtual assistants implies that we are seeking/experiencing more emotional depth with our technologies. In his post on ‘Building Jarvis’ (2016) he notes:

“… once you can speak to a system, you attribute more emotional depth to it than a computer you might interact with using text or a graphic interface. One interesting observation is that ever since I built voice into Jarvis, I’ve also wanted to build in more humor. Part of this is that now it can interact with Max and I want those interactions to be entertaining for her, but part of it is that it now feels like it’s present with us. I’ve taught it fun little games like Priscilla or I can ask it who we should tickle and it will randomly tell our family to all go tickle one of us, Max or Beast.”

A survey carried out by Google/Peerless Insights of 1,642 users of voice-activated speakers has shown not only that people are engaging with these technologies as if they were human, by saying “please”, “thank you” and even “sorry” to their virtual assistants, but also that 41% of users said that talking to their voice-activated virtual assistant feels like talking to a friend or another person (Kleinberg, 2018).

There are many questions that are emerging about the impact of these technologies on home life; questions that cannot really be solved by surveys. Numbers alone rarely speak about context, intention, and the human passions, desires, and contradictions that define our interaction with virtual assistants. During the Child | Data | Citizen project, however, I did not have the time or resources to carry out an ethnographic exploration of these interactions. What I had the time to focus on is the question about privacy. My understanding is that one of the first steps that we need to take if we want to understand shifting notions of privacy in family life is to look at the powerful discourses that dominate business models.

When we think about home hubs, the “thinking machine”, which is also a “talking machine”, is constructed as a machine that “can think about you” and “care about you”. Amazon, for instance, encourages users to “enjoy a home that takes care of you”. The emphasis is not only reflected in the ways in which virtual assistants are advertised but also in specific features that they are offering. In fact, through home hubs users can usually set up personalised routines (a set of actions, such as lighting, music, heating etc. that can be activated with a unique command) so that their homes can take care of them when they come back from work or sit down for a family dinner. The overall idea is that the home can take care of all your needs from the moment in which you wake up to the moment in which you go to bed. The images on the websites seem to emphasise precisely this level of intimacy and care.

It is within these discourses, within these techno-utopian myths that convince us that technologies “offer us a way to solve our problems”, that they “make our life easy” and “care about us”, that the crucial question of our times lies: what are we expected to give in return? At the moment the answer to this question seems simple: our home life data.