Why we need responsible data for children
Posted on 23rd of March 2020 by Andrew Young, Marilla Li, Stefaan Verhulst
Reposted from The Conversation
Around the world, humanitarian and development organizations working with children are increasingly reliant on a wide range of technologies to improve the efficacy of service delivery and to respond to, for instance, pandemics and other dynamic threats.
Child rights organizations are using or exploring the use of a variety of data-driven technologies to bolster services provided to children, including biometrics, digital identity systems, remote-sensing technologies, mobile and social media messaging apps, and administrative data systems. The data generated by these tools and systems includes potentially sensitive data, such as personally identifiable information (PII) and demographically identifiable information (DII) – data points that enable the identification, classification, and tracking of individuals, groups, or multiple groups of individuals by demographically defining factors.
Given this increasingly datafied environment, and the emerging challenges involved in upholding the Convention on the Rights of the Child in our data age, there is a clear need to develop and disseminate responsible approaches for handling data for and about children. Last year, The GovLab and UNICEF launched the Responsible Data for Children initiative (RD4C) to support actors around the world in avoiding unintended negative consequences for data subjects and beneficiaries and, in turn, ensuring the effective use and positive impact of data.
Growing opportunities, and risks
Collecting, storing, preparing, sharing, analyzing, and using data about children create unique opportunities and risks. These opportunities and risks are distinct from those involved in the datafication of the general public or other vulnerable groups. To achieve responsible data for children, the public sector, data-holding businesses, and civil society organizations delivering services for children need to better understand the distinct risks and opportunities of an increasingly connected and quantified environment for children.
Without question, the increased use of data poses unique risks for, and responsibilities toward, children. While practitioners may have good intentions when leveraging data for and about children, the data systems they use are often designed with (consenting) adults in mind, without a focus on the unique needs and vulnerabilities of children. This can lead to the collection of inaccurate and unreliable data, as well as the inappropriate and potentially harmful use of data for and about children.
Trends and realities
Research undertaken in the context of the RD4C initiative uncovered the following trends and realities. These issues make clear why we need a dedicated data responsibility approach for children.
Today’s children are the first generation growing up at a time of rapid datafication, in which almost all aspects of their lives, both on- and offline, are turned into data points. An entire generation of young people is being datafied, often starting even before birth. With each passing year, the average child will have more data collected about them over their lifetime than a similar child born any year prior. The potential uses of such large volumes of data, and their impact on children’s lives, are unpredictable, and the data could potentially be used against them.
Children typically do not have full agency to make decisions about their participation in programs or services which may generate and record personal data. Children may also lack the understanding to assess a decision’s purported risks and benefits. Privacy terms and conditions are often barely understood by educated adults, let alone children. As a result, there is a higher duty of care for children’s data.
Disaggregating data according to socio-demographic characteristics can improve service delivery and assist with policy development. However, it also creates risks for group privacy. Children can be identified, exposing them to possible harms. Disaggregated data about groups such as child-headed households and children experiencing gender-based violence can put vulnerable communities and children at risk. Even data about children’s location can be risky, especially if they have an additional vulnerability that could expose them to harm.
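To make the group-privacy risk concrete, the following minimal sketch in Python (illustrative only, not drawn from the RD4C guidance) shows one common safeguard: suppressing counts for very small groups before disaggregated figures are shared. The threshold and field names are hypothetical.

# Illustrative sketch: suppress small groups before publishing disaggregated counts.
# The MIN_GROUP_SIZE threshold and the field names are hypothetical examples.
from collections import Counter

MIN_GROUP_SIZE = 10  # hypothetical small-cell threshold

def disaggregated_counts(records, key):
    """Count records per group, withholding groups too small to share safely."""
    counts = Counter(record[key] for record in records)
    return {
        group: (n if n >= MIN_GROUP_SIZE else "suppressed")
        for group, n in counts.items()
    }

# A tiny example: case counts by district. Districts with only a handful of
# cases would effectively identify individual children, so they are withheld.
records = [
    {"district": "A", "case_type": "gbv"},
    {"district": "A", "case_type": "gbv"},
    {"district": "B", "case_type": "gbv"},
]
print(disaggregated_counts(records, "district"))
# {'A': 'suppressed', 'B': 'suppressed'}

The right threshold depends on context; the point is simply that raw disaggregated counts for very small groups can function as identifying information.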
Mishandling data can cause children and families to lose trust in institutions that deliver essential services, including vaccines, medicine, and nutrition supplies. For organizations dealing with child well-being, this loss of trust can have severe consequences: distrust can cause families and children to refuse health, education, child protection, and other public services. Such privacy-protective behavior can affect children throughout their lifetimes and potentially exacerbate existing inequities and vulnerabilities.
As volumes of collected and stored data increase, obligations and protections traditionally put in place for children may be difficult or impossible to uphold. The interests of children are not always prioritized when organizations define their legitimate interest in accessing or sharing children’s personal information. The immediate benefit of a service provided does not always justify the risk or harm it might cause in the future. Data analysis may be undertaken by people who lack expertise in child rights, in contrast to traditional research, where practitioners are specifically educated in research involving child subjects. Similarly, service providers collecting children’s data are not always specially trained to handle it, as international standards recommend.
Recent events around the world reveal the promise and pitfalls of algorithmic decision-making. While it can expedite certain processes, algorithms and their inferences can carry biases that have adverse effects on people, for example those seeking medical care or attempting to secure jobs. The danger posed by algorithmic bias is especially pronounced for children and other vulnerable populations, who often lack the awareness or resources necessary to respond to instances of bias or to rectify misconceptions or inaccuracies in their data.
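One basic way to surface such bias, sketched below in Python purely for illustration (the data, group labels, and threshold are hypothetical and not part of the RD4C materials), is to compare an automated system’s outcome rates across demographic groups before relying on it.

# Illustrative sketch: compare approval rates of an automated eligibility
# decision across demographic groups. All data here is hypothetical.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> approval rate per group."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {group: approved[group] / totals[group] for group in totals}

decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]
rates = approval_rates(decisions)
disparity_ratio = min(rates.values()) / max(rates.values())
print(rates)  # group_a approved ~0.67 of the time, group_b ~0.33
print(f"disparity ratio: {disparity_ratio:.2f}")  # 0.50

A low disparity ratio is not proof of unfairness, but it is a signal that the system should be reviewed before it informs decisions affecting children.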
Many of the children served by child welfare organizations have suffered trauma. Whether that trauma is physical, social, or emotional in nature, repeatedly making children register for services or provide confidential personal information can amount to revictimization, re-exposing them to trauma or instigating unwarranted feelings of shame and guilt.
These trends and realities make clear the need for new approaches for maximizing the value of data to improve children’s lives, while mitigating the risks posed by our increasingly datafied society.
The full-length Responsible Data for Children Synthesis Report and other resources are available on the Responsible Data for Children website, RD4C.org.
The authors would like to thank Stuart Campo, Senior Fellow at The GovLab, for his important contributions to the RD4C initiative.