The RD4C Principles
Principles to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence.
Read about our principles on the
Principles Page
From our blog
New developments from RD4C.
New Publication
Hear the Voice of Youth in the Digital Age

One in two people in the world is under the age of 30, and among youth aged 10 to 24, 9 out of 10 live in the Global South. Yet young people's voices are the least represented and heard when their future is decided. Worldwide, only 2.6% of parliamentarians are under 30. Many young people have also noted in workshops conducted by UNICEF that while they are frequently consulted during policy development, they are seldom treated as decision-makers. In one U-Report survey with nearly 80,000 respondents, only 3 in 10 felt they have much control over their digital future.

This disconnect is exacerbated by the growing importance of data. Data drives service delivery, policy formulation, and governance improvements worldwide. Yet it also poses significant risks: data can be misused, from enabling discrimination and surveillance to facilitating criminal activities. When it comes to data governance, a youth perspective is urgently needed, not only because young people are the most active generation online, but also because they are the most affected: their physical, emotional, social, and psychological developmental needs add an extra layer of vulnerability. On International Youth Day, it is essential to ask: how can data policies better serve and empower young people in shaping their digital futures?

Global Best Practices

A number of countries and institutions have already taken the lead in engaging young people on the issues and concerns that affect their daily lives. In Zimbabwe, researchers worked with 475 children between the ages of 12 and 17 to develop the Child Online Protection policy. Their inputs formed the core of the proposal from initial conception to final draft. Yet the engagement did not stop there: the research team went back to hear their feedback after the draft was validated by the government, making sure the policy reflected their vision for a safe digital future. Similar exercises have taken place in Zambia and Germany.

In the United Nations, UNICEF has taken a similar approach. It developed a digital community, U-Report, for young people in nearly 100 countries to raise their voices and express their concerns. The UN Youth Office has launched a number of campaigns, including the latest, World Leaders: It's Time to Let #YouthLead, to call for more youth engagement across the world. Multiple country teams have been working relentlessly to enhance youth participation, from capacity building to policy advising. Big tech companies are also joining this effort. Microsoft has created the Council for Digital Good in the United States and Europe, bringing teenagers together to listen to their expectations and hopes for responsible online interactions. Meta, Facebook's parent company, has also launched a youth advisory network to invite young people to share their thoughts on their ideal digital future.

However, current efforts are not enough. Much more can be done.

Driving Engagement Forward

First, the digital divide still poses a significant challenge to young people that needs to be addressed. Disparities between urban and rural areas, as well as between wealthy and poor households, persist widely in all countries. Several studies have found that data literacy is relatively low in the Global South; the gaps between middle-income and high-income countries are also considerable.
Meanwhile, although gender differences in understanding data have narrowed among younger generations, the lack of comprehensive, disaggregated data specifically on girls obstructs targeted interventions and perpetuates inequalities. For example, without specific data on girls' school attendance and dropout rates, it is difficult for policymakers to effectively address the barriers they face, such as early marriage, household responsibilities, or a lack of sanitary facilities. As advocated by the people-centric principle of RD4C, it is essential that both the public and private sectors bear these divisions in mind and work together to bridge them, prioritising the best interests of the child over potential efficiency gains or other process-oriented objectives.

Second, when it comes to data governance, youth should be involved in the entire cycle of policymaking. The development of a policy involves multiple steps, and each of them, from issue identification and research to consultation and formulation, requires the participation of youth. The monitoring and evaluation phase is no exception. Research suggests that if young people consider their participation unsatisfactory, unequal, or superficial, they may feel "used" or regard the involvement as "tokenistic". Today, the common practice of youth engagement favors consulting with young people rather than involving them in decision-making, and there is a lack of mechanisms for monitoring youth feedback. A comprehensively participatory approach allows the young generation to participate meaningfully in the entire process of making data policies, ensuring their opinions are fully considered and respected throughout the data lifecycle.

Third, understand the youth to leverage their wisdom. To promote effective youth engagement, it is essential to gain a thorough understanding of young people's behaviors and preferences. By leveraging these insights, we can craft innovative platforms tailored to their needs and interests. According to a policy brief published by UN DESA, young people prefer instant feedback and are highly sensitive to reward; they have strong attachments to their identities yet can easily be influenced by peers. Therefore, in addition to traditional engagement methods like discussion roundtables and youth-led research, more creative approaches can be adopted. One of the most interesting examples is the UN Big Data Hackathon, which has separate youth tracks. Such events not only help young people better understand the data around them, but also teach them how to use it fully to make their future brighter. Another example comes from Manchester, a city in the northwest of England. The city developed the Greater Manchester Youth Combined Authority, an organization consisting entirely of young people, who are responsible for advising on and scrutinising the work of the mayor on key issues of concern to the young generation. Under this mechanism, young people can see how their peers' inputs have a direct impact on the policies that affect them. This arrangement empowers the youth to inspire the youth.

***

Growing up in a digital age, many young people have never seen a world without the internet. While some may know how to use the internet to entertain themselves and make life easier, they remain vulnerable in an ecosystem flooded with data. Given that young people initially lack agency in handling data and asserting their rights, it becomes all too easy for adults to step in and make decisions on their behalf.
However, a more constructive approach is to nurture their agency rather than simply take decisions out of their hands. Nearly all countries in the world have ratified the Convention on the Rights of the Child, a universal treaty whose guarantees also extend to data responsibility. According to paragraph 17 of General Comment No. 25 of the Committee on the Rights of the Child, when developing legislation, policies, programmes, services and training on children's rights in relation to the digital environment, States parties should involve all children, listen to their needs and give due weight to their views. This should not be overlooked or forgotten.

To hear the voice of youth, RD4C is cooperating with Generation Unlimited to hold a virtual discussion panel in August, featuring young leaders from around the world. Through this event, we aim to provide a platform where they can share their perspectives, experiences, and insights on data responsibility and data governance. By empowering young people and amplifying their voices, we believe they can lead the way as change-makers in our constantly evolving digital world.

Youth is the future. Let's not let the future down.

Image by John Schnobrich / Unsplash, licensed under CC0.
Read more
New Publication
Why Responsible Data for Children matters when designing a child wellbeing monitoring program

In our increasingly datafied world, the responsible handling of data for and about children has become a critical issue. UNICEF, governments, and their partners acknowledge the vital importance of acting to advance children's well-being while preserving their rights and agency. We engage with partners around the world to inform practices around data (re)use for and about children.

In June, the RD4C team travelled to Beijing, where UNICEF China organized a social policy salon. Attendees discussed how to better understand and respond to children's development needs through data in the field of social protection. Representatives from the Ministry of Civil Affairs and six local civil affairs departments attended the meeting and shared their experiences in tracking the wellbeing of low-income populations and in developing a child module. Reflecting on this recent policy forum in Beijing as well as past experiences in several other contexts, we have identified three broad clusters of rationales that highlight the significance of data responsibility for children in the context of government-owned, administrative data systems.

1. Building Trust to Enable and Improve Service Delivery

Trust is foundational in any relationship, especially when it comes to children and their data. Responsible data management fosters trust among children, parents, and the broader community, enabling more effective service delivery. Trust in data practices means that individuals and communities believe their data is being handled securely, ethically, and efficiently. For children, whose understanding of data and its implications may be limited, trust is even more vital. When parents and guardians believe that their children's data is being managed responsibly, they are more likely to engage with and support services and programs that require them to provide their own and their children's data.

Without trust in how data is handled, children and their caregivers can adopt "privacy-protective behaviour": they take actions to protect their privacy, often by limiting their engagement with services they perceive as untrustworthy. This can mean that children do not receive the services they need. By establishing and maintaining trust through responsible data practices, organisations can prevent privacy-protective behaviour and ensure that children and communities overall have access to essential services.

Along the same lines, responsible data practices support an organisation's reputation as a responsible player, which equally contributes to trust in the institution. In an era where data breaches and privacy scandals can significantly damage credibility and trust, demonstrating a commitment to responsible data management is crucial. This reputation can enhance relationships with stakeholders, including parents and caregivers, communities, and funding bodies. Transparency in data practices and a proactive approach to safeguarding children's data can significantly bolster an organisation's standing in the community, at both the national and international level.

2. Enhancing Performance through Better System Design

Another main objective of implementing more responsible data practices is improving the performance of data systems and the quality of related services.
Responsible data practices require organisations to think critically about their data management processes and about what the data will be used for, to ensure their efforts make a (responsible) difference and drive impact. For that reason, a clear purpose and the corresponding information needs should be defined from the design stage onward, though it is never too late to improve responsible data practices with children's best interests in mind. Once an overarching purpose is determined, the insights gained through data can be used for policymaking and for improving the quality of service delivery. For example, data collected in the social protection sector can be used to identify children in need of support, to track progress, and to determine eligibility for additional services and interventions tailored to their specific needs. By monitoring progress in purpose-driven systems, social protection programs can track the effectiveness of interventions, adjust strategies as needed, and ensure continuous improvement. Additionally, data allows for efficient resource allocation, ensuring that social assistance, basic health insurance, and other essential support reach the children who need them most, thereby maximising the overall impact of social protection initiatives.

3. Protecting Children and Safeguarding them from Harm

The protection of children and their rights is a fundamental rationale for responsible data practices. Children are particularly vulnerable to the consequences of data misuse and its potential harms, making it imperative to prioritise their safety in the digital age we live in. Data about children's personal lives is particularly sensitive. Misuse of this data can lead to significant harm, including identity theft, exploitation, discrimination, psychological impact, fatigue among populations repeatedly asked to provide data, and, as previously mentioned, a loss of trust in service providers. Another potential harm is the missed use of data for social good: inefficiencies in handling data can mean that opportunities for intervention, for example at the policy level, go unidentified and critical needs remain unaddressed. Responsible data practices aim to mitigate these risks by implementing robust safeguards and ensuring that children's data is handled with the utmost care. International conventions, such as the United Nations Convention on the Rights of the Child (UNCRC), mandate the protection of children's privacy and data. Adhering to these legal frameworks is not only a compliance issue but also a moral imperative to promote and protect children's well-being.

Implementing Responsible Data Practices in Government-Owned Data Systems: Key Considerations

The rationales mentioned above highlight the importance of implementing responsible data principles, especially when handling data for and about children, and provide practical insights into how these principles can be translated into practices and policies. By exploring the three broad rationales (trust and reputation, performance of services and related data systems, and protection from harm), we have highlighted the multifaceted benefits of prioritising responsible data for children. Trust enables effective service delivery by preventing privacy-protective behaviour, and supporting an organisation's reputation as a responsible player fosters this trust. A focus on data responsibility by design enhances system performance from the outset, with a clear purpose and defined impact paths for both the programme and the data management system in question.
The insights gained from data can be used by public services and policymakers to improve the quality of their offering with children's and their families' best interests at heart.

***

The bottom line is that responsible data management for children goes beyond mere compliance with existing regimes; it requires a multifaceted approach intersecting with trust, performance, protection, and reputation, with a positive impact on children's lives at the centre. If you'd like to follow the progress of this work, sign up for our newsletter here for further updates.
Read more
Guest Blog
Empowering Youth With Child-Aligned AI

Guest blog by John-Matthew Conely (as part of his internship at the Responsible Data for Children Initiative); RD4C will delve deeper into AI and how it relates to the RD4C Principles.

Children are growing up in a world where artificial intelligence (AI) is increasingly a part of their daily life: they interact with AI-powered technologies when they use social media, smart toys, and smart devices or watch online videos; their lives are affected by automated decision-making for educational assessment; and their personal development is likely influenced by the algorithms that permeate our digital environment. AI (defined by the OECD as machine-based systems that transform inputs into outputs, such as content or predictions, that influence the physical or virtual environment) can be highly positive for children, creating opportunities to flourish and grow sustainably while supercharging education and medical care. On the other hand, it also has the potential to cause harm to children. Social media algorithms have increasingly been shown to negatively affect children's mental health, while AI assistants can impair children's emotional and cognitive development. AI-powered IoT (Internet of Things) devices, even toys, can violate children's privacy by harvesting their data without their consent. Generative AI has the potential to flood the internet with harmful content, including child-abuse material, further endangering children's safety.

A growing recognition of AI threats and opportunities has led to greater public-sector commitments to governing AI ethically, but such initiatives are still in early stages and place minimal focus on children. For AI to truly benefit children, AI systems must be designed, developed, and deployed with the well-being of children explicitly in mind. That is to say, the outputs and applications of AI must be aligned with values and norms that recognize the preferences of children and their caregivers, as well as their right to privacy, safety, and freedom of thought. This process of determining AI values is also known as AI alignment.

What is AI Alignment?

The AI alignment problem is concerned with ensuring that AI delivers output in a manner consistent with human goals, values, and preferences. To better illustrate what is meant here by "alignment", consider designing a hypothetical AI chatbot. In developing this chatbot, you (the designer) would like to make ethical decisions about the kinds of responses the chatbot can provide to its users. For example, you might wish to restrict the chatbot from giving discriminatory or offensive answers. How would you determine which answers are appropriate or inappropriate for users? Would you decide unilaterally, believing that you already know what is best for them? Would you seek to gather input on users' preferences, in order to more accurately reflect their needs? If your chatbot serves users in a variety of regional and cultural contexts, will you adapt the chatbot's responses for particular contexts?

Ultimately, every AI system embodies design choices that reflect specific decisions about preferences and values, yet the designers themselves may not be in a position to decide arbitrarily which values are most appropriate. To properly align these systems with human preferences, participation is needed from the people and communities that the AI system affects. For this reason, participatory methods need to form the basis of human-aligned AI.

How to Facilitate Child-Aligned AI?
AI-based systems rely on data to produce their valuable output, which is itself more data. As a data-driven technology, AI's usefulness and power therefore depend on the (responsible) data on which it is trained. The Responsible Data for Children initiative was formed out of a desire to serve the rights of children in such data-oriented contexts. Its principles are flexible by design and applicable in a broad variety of contexts in which data is used. To see how our principles and tools apply to AI, consider the sample principle below.

Participatory / People-Centric: For AI systems to be aligned with and prioritize children's needs and preferences, there is a need to bolster efforts for greater data literacy in children, so that children themselves can better voice their concerns and recommendations, and to foster participation mechanisms that allow children and young people to engage actively. We should promote measures to educate children, parents, and related stakeholders on AI, the nature of consent, children's rights and protections, and risks to children. These measures ought to be as inclusive as possible, giving voice to children from communities around the world, including the Global South.

Incorporating these participatory, democratic approaches into the alignment of even the largest-scale AI is not a pipe dream. One private company, Anthropic, is currently developing a process to align large language models by incorporating public input on high-level ethical principles. Incidentally, this process results in a less biased model, with no appreciable degradation in math or language performance and no shift in perceived political stance. If properly leveraged, such approaches could allow all members of society, even the most vulnerable, to have a say in determining how technology will affect their future. Such approaches should be harnessed specifically to incorporate the values and goals of children into AI.

Various vehicles to engage and interface with youth on emerging policy have already been developed, such as the Youth Solutions Lab's participatory workshops, which gather sentiments, insights, and preferences from youth on pressing policy issues. Crowdsourced input gathered from youth on specific moral preferences for artificial agents also shows promise (see MIT Media Lab's Moral Machine project). These participatory approaches could open a path to creating direct linkages between the high-level principles informing AI model development and the preferences of the children and youth who will eventually interact with those models. In tandem with these methods, boosting data literacy and communicating AI concepts to the public, especially youth, in a relatable, concrete manner will aid effective discussion and participation.

Join the Responsible Data for Children Efforts

As we enter our fifth year, the Responsible Data for Children initiative continues to pioneer ways to address new and emerging data challenges affecting children around the world. We need thought partners to dedicate more substantial research to the field of AI alignment and child rights, empowerment, and self-determination. If you would like to collaborate, please reach out to [email protected].

Image by Jamie Street | Unsplash, licensed under CC0.
Read more
New Publication
Testimonials from the Field

Since the beginning of the year, the Responsible Data for Children (RD4C) initiative has been actively seeking feedback to refine its focus for 2024. Following the insightful recommendations gathered during our recent RD4C dinner event, we have diversified our outreach, collecting valuable testimonials from field practitioners we previously collaborated with in different contexts.

In these interviews, Mariana Rozo-Paz (Policy, Research, and Project Management Lead at the Datasphere Initiative), Risdianto Irawan (Data Specialist, UNICEF Indonesia), and Lisa Zimmermann (Chief of Child Protection, UNICEF Madagascar), working in Latin America, Asia, and Africa respectively, stress the importance of engaging with young people. They emphasize the value of working particularly in the Global South, where the majority of adolescents and youth are located, to use technology for their well-being while minimizing harm. Our interviewees recognise a need for increased capacity building among practitioners working with data for and about children, particularly on the foundational principles of responsible data handling and the risks generated by data and data technologies. This is especially true when the data is highly sensitive, such as protection or mental health related data. These firsthand accounts offer a unique perspective on the real-world challenges practitioners face, guiding us towards areas where our initiatives can make the most meaningful difference. Join us as we explore these testimonials, shedding light on our past work as well as the path forward for RD4C this year.

***

As we enter our fifth year, the Responsible Data for Children initiative continues to pioneer ways to address new and emerging data challenges. The themes that emerged from our engagements with practitioners in the field will help guide a yearly Responsible Data for Children strategic dialogue to set priorities and select projects for the year. Our next strategy session is in mid-April; please do not hesitate to reach out with suggestions at [email protected]! If you'd like to stay up to date with us as we continue this journey, we encourage you to sign up for our newsletter.
Read more
Check out other articles in our
Blog Section
About us
The RD4C initiative is a joint endeavor between UNICEF and The GovLab at New York University to highlight and support best practice in our work; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management.
The work is intended to address practical considerations across the data lifecycle, including both routine and one-off data collection, and complements work on related topics being addressed by the development community, such as guidance on specific data systems and technologies, technical standardization, and digital engagement strategies.
Additional tools and materials will be posted on this website as they become available. Join the conversation to receive regular updates.