
What do we mean when we talk about the relationship between ‘surveillance’ and ‘privacy’? How do I approach this question in the Child | Data | Citizen project? Firstly, I would like to highlight two books that have strongly influenced my understanding of privacy and surveillance. On the one hand, Helen Nissenbaum’s Privacy in Context (2010) shows that privacy is not only defined by the very human need to protect one’s own personal information, but also by the need to regulate the way in which this information is shared from context to context. On the other hand, I am strongly influenced by David Lyon’s Surveillance Society (2001), which sees surveillance as a complex cultural process that affects everyday life and includes different types of surveillance: from the surveillance of citizens’ personal data and processes of identification (e.g. ID cards) to institutional surveillance (health, education etc.) and corporate surveillance (e.g. shopping loyalty cards, online interactions etc.). Like Lyon (2014), I also believe that the rise of Big Data has deeply intensified the everyday surveillance of citizens and made it easier.

Secondly, I am interested in what it means to study surveillance and privacy from an ethnographic perspective. How does it feel to be constantly tracked? What are the data anxieties (Pink et al. 2018) that emerge from these feelings? How can we understand the ethnographic tensions that define everyday social relationships as people negotiate digital surveillance? What tactics do they use to protect themselves from daily surveillance? How can we understand the human messiness that defines data production? How do people negotiate different understandings of what constitutes ‘privacy’?

To address these questions, at present I am working on the following connected themes:

Privacy in the Automated Home.  One dimension that the project is studying is how smart technologies are changing the very notion of privacy in the home. With the development of internet technologies, the relationship between personal privacy and the home has been challenged by increasingly intrusive media practices. In 1999, Charles Sykes suggested that digital media were ending people’s right to privacy in the home. Yet as AI and automated technologies enter the home, and as we lose control over the amount of personal data that is gathered and used, these questions seem to be acquiring new meanings. My research with parents suggests that this change is happening now: although smart technologies have been present in the home for a while (e.g. smart TVs, smart home systems), the home is becoming increasingly automated. My research is thus interested in highlighting how parents understand privacy in this new home, and in mapping what it ‘feels’ like to see our home technologies, from meters to toys, collecting ever more personalised and highly contextual data.

Social Media and Privacy.  Current debates about children’s digital traces and sharenting practices on social media are influencing families in substantial ways. One aspect that is emerging clearly from the interviews collected so far is that parents are deeply aware of the privacy issues that arise when they share information about their children on social media. Some mention that they used to post regularly in the past but have now stopped. Others comment on the ‘moral choices’ they make as they try not to disclose compromising information about their children.

What seems to be emerging, though, is that the parents I talk to are deeply aware of the problem of contextual integrity (Nissenbaum, 2010), or, in other words, of how important it is for individuals to control the personal information they share in different contexts. In fact, parents are deeply concerned that the surveillance of social media data is widespread, and that it is carried out not only by governments or corporations but also by anyone interested in finding out more about a person and making a decision about them. Some of the parents I interviewed mentioned that the first thing they do when they want to hire someone is to look at their social media profiles. In the interviews, consequently, they often talk about the impact the information they share about their children may have on their children’s future ability to be admitted to a specific school or to get a job.

Online Privacy and the Impossibility of Opting Out.  One dimension that I am researching at the moment focuses on how parents relate to Terms and Conditions, and whether they perceive that they can control the data they share and ‘opt out’ if they want to. Unsurprisingly, interviews are revealing that parents never read the Terms and Conditions of any of the services that they use or buy. Common complaints are: the inaccessibility of the language of privacy policies; the fact that they are perceived as deliberately ambiguous (“Even when I read them and I understand the content, I am really not sure about what happens to the data”, mentioned one parent in the U.S.); their length; and the impossibility of ‘opting out’. According to the parents I spoke to, one doesn’t really have a choice: you either accept the terms and conditions or you don’t use the service. Although some may see this as simply a ‘consumer’ problem, the issue is much more profound. Being cut off from some services (e.g. Facebook) automatically excludes the family from a range of social activities that are meaningful to them (e.g. Facebook school groups). The research is revealing, therefore, that in relation to online privacy parents often feel ‘trapped’ and ‘powerless’, and don’t know how to protect their own information and that of their children. Another dimension that is emerging is that, although many parents mention that ‘they don’t have anything to hide’, they are concerned about the intrusive nature of some technologies, the increased aggregation and personalisation of data, and digital profiling.