Data Justice for Children
“I don’t understand, I feel so incompetent and frustrated. I have been reading it again and again but fail to understand it. It is clear that the company recognizes that children interact with the virtual assistants or can create their own profiles connected to the adults. Yet I can’t find an exhaustive description or explanation of the ways in which their data is used, even in the section on Children’s Data. It seems so general and the responsibility comes all down to me. Do I consent for the company to have access to my children’s data? If yes, then tough luck for me. I am reading everything in detail and, even if I have to admit that the UK policy is certainly clearer, I can’t really say that my consent is informed, as I am not sure I understand…” (Barassi, 2018)
The above extract is taken from my own notes as I tried to express how I felt – as a concerned parent of two – when I read the Terms of Use of one of the home technologies, from virtual assistants to smart home devices, that I studied over the last months[1]. I felt angry and frustrated because I felt incompetent yet, at the same time, responsible. The experience made me aware that, when it comes to protecting my children’s data privacy, I am on my own. It is my responsibility to decide whether or not I agree to Terms and Conditions and to give away my children’s data in exchange for a specific service. It is my responsibility to set up privacy settings, to stop sharing photos on social media, to learn about encryption, and to delete my browsing history. Yet even if I feel responsible, I also feel powerless, because most of the time my children’s data escapes my control and understanding.
As I progress in the research, I am coming to the conclusion that one of the most problematic aspects of children’s data is that, more often than not, this data is produced, stored and processed through the digital practices and choices of others: friends, educators, healthcare providers, state officials and business developers. Children’s data is simultaneously collective and personal. It is personal because it enables us to identify children as unique individuals (e.g. biometric data). It is collective because it is the product of complex social relationships and messy digital practices (e.g. see my concept of home life data). So where does this leave us when we think about the issue of children’s privacy?
“Privacy” is a contested term. The very notion of ‘privacy’ is influenced by the cultural belief that there is a clear distinction between what is personal and should be private and what is public and should be visible. Yet, as Nissenbaum (2010) has shown, not only is it becoming increasingly impossible in our new data environments (and especially after the advent of social media) to divide the world into private and public, but our claims to privacy are all too often dependent upon context.
One might feel happy to share his or her love of wine and partying amongst like-minded colleagues, yet may feel uncomfortable if this information surfaced at a school meeting with other parents. It is for this reason that Nissenbaum (2010) developed her theory of privacy based on the concept of contextual integrity. She rightly noticed that, many times, when we are concerned about privacy we are not really concerned about sharing personal data in public. What we are concerned about is having control over our personal data flows, so that we can determine which personal data is shared in which context. Her work shows that ‘privacy’ is not only individual but also ‘social’, because our understanding of privacy is always shaped by the social environments that we live in, by their social norms and conventions.
I believe that this understanding of privacy – as being related to our need to control our personal data flows in different contexts – should be at the centre of any debate around children’s data privacy. This is because the control of personal information gives individuals a certain degree of moral autonomy, in the sense that it enables us to define who we are in public by freely choosing the values that we want to be associated with (van den Hoven in Nissenbaum, 2010: 81). Most debates about children and data traces, and especially the debates around social media and sharenting, focus on this understanding of privacy. They highlight the fact that social media data could impact on children’s right to moral autonomy and self-representation. All these debates are crucial in understanding the importance of protecting children’s right to privacy in the digital age and, if you are interested in them, you can find key contributions in a special issue edited by Livingstone and Third (2017).
Yet here I want to take a step further. It seems to me that when we look at children’s rights in our new data environments we need to start investigating not only their individual rights to privacy, self-representation and moral autonomy, but also their right to data justice.
The concept of data justice has come to the fore in the last two years (see Taylor, 2017 for a good review of the literature). At a historical moment when data traces are used to build narratives about who we are and the groups we belong to, it has become clear that the issue at the heart of the matter is no longer only privacy and surveillance; it is the ways in which we are being profiled as data subjects and citizen subjects.
The term data justice is used to remind us that we need to act against the fact that algorithms discriminate and have a clear impact on our civic rights. This is particularly true if we consider how algorithms discriminate against minorities (see works like Hintz et al., 2018 or Noble, 2018). The concept of data justice is also used to warn us that we have no control over, or understanding of, the ways in which we are being profiled (see the works of Pasquale, 2015 or Cheney-Lippold, 2017). Hence seeking data justice entails not only demanding transparency about the ways in which our data is aggregated and processed, but also being able to appeal against how we are being profiled and fighting for our right to fair representation.
When we think about children’s rights, therefore, data justice is particularly important for three main reasons.
In the first place, it can enable us to shed light on the fact that their personal data is constantly produced, archived, shared, understood and made to act for them by others, and therefore to request greater clarity about these processes.
In the second place, it can enable us to start uncovering the fact that children are often constructed as data subjects that are part of specific social groups. As Taylor (2017) argued, data technologies often sort, profile and inform action based on group rather than individual characteristics and behaviour (ethnicity, class, family etc.), and some legal systems have formalised this relationship between surveillance and the collective (e.g. Mexican law including the family in the sphere of the individual with regard to data protection). Therefore, the concept of data justice can be used to demand that children are not judged or profiled on the basis of the families and collectives they are brought up in.
In the third place, the concept of data justice is particularly important because it enables us to tackle the problem of algorithmic inaccuracy. As I argue in a forthcoming chapter titled The Child as Datafied Citizen: Algorithmic Inaccuracies and Data Justice in Family Life, the data that is produced about children in the family is often the product of imprecise, fragmented and inaccurate data practices. During my research it became evident that, most of the time, families did not use the technologies as they were supposed to, or they employed specific tactics to protect their children. Hence the data they produced was broken, fragmented and imprecise. The concept of data justice, therefore, can enable us to question what types of assumptions are reached on the basis of this inaccurate data and to tackle the messiness of children’s data traces.
In conclusion, much more needs to be done to protect children’s data privacy. It is not fair to make parents feel both responsible and powerless when it comes to protecting their children’s privacy. The very notion of data justice can enable us to rethink children’s rights.
[1] During the research I made sure that I read both the U.K. and the U.S. versions of the data policies carefully (to take into account GDPR changes), and I searched for specific keywords in the terms and conditions, such as ‘aggregated profiles’ or ‘biometric data’. If you are interested, some of the results of the research were published in a report on Home Life Data and Children’s Privacy.