News and developments from RD4C

New Publication

Eight reasons responsible data for and about children matters

Eight reasons why child welfare advocates should pay more attention to data, and why we need a framework for responsible data collection and use.

Posted on 3 March 2022

This week, the peer-to-peer government learning platform apolitical featured an article by Stefaan Verhulst and Andrew Young of The GovLab highlighting the rationale behind the Responsible Data for Children (RD4C) initiative. The article lists eight reasons why responsible data for children matters. 

The article starts from the following observation: “The relationship between the datafication of everyday life and child welfare has generally been under-explored. This neglect is a lost opportunity, and also poses a risk to children, who are in many ways at the forefront of the steady incursions of data into our lives.”

In response, the article outlines eight reasons why child welfare advocates should pay more attention to data, and why we need a framework for responsible data collection and use for children:

  • Children are at the forefront of datafication: Today’s children are the first generation to grow up amid the rapid datafication of virtually every aspect of life.
  • Children have less agency: Unlike adults, children typically do not have full agency to make decisions about their participation in programmes or services that may generate and record personal data.
  • Even aggregated data can be dangerous: Aggregated, anonymized data is not a panacea. The risk of re-identification persists through the mosaic effect and other challenges.
  • Data violations can result in lifelong loss of trust: When data is mishandled, data subjects lose trust in institutions. This, in turn, can reduce the uptake of essential services and stunt the benefits of technology.
  • Children’s interests can be overlooked: As technologies are implemented and the volume of data increases, existing protections for children may be overlooked.
  • AI and algorithmic bias pose particular risks: AI can expedite processes but contain hard-to-detect biases that result in real adverse effects. These risks are only heightened when it comes to children.
  • Risk of revisiting trauma: Children served by organizations may have suffered trauma. Asking them to provide data or register for services may force them to revisit that trauma.
  • Privacy matters for children’s self-development: Having the freedom and autonomy to experiment with different identities, without prying eyes or chilling dataveillance, is important for children’s self-development.

Read Part 1 and Part 2 of the article. Those interested in learning more or partnering with us can contact rd4c [at] To join the RD4C conversation and be alerted to future releases, subscribe at this link.

Cover image by Alex Radelich/Unsplash is licensed under CC0.

Back to the Blog