The RD4C Principles
Principles to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence.
Read about our principles on the Principles Page.
From our blog
New developments from RD4C.
Data Governance for Children
What Makes a Child Thrive? Abu Dhabi Has a New Answer

What does it really mean for a young child to thrive? It's a question that often goes unanswered, partly because the information we rely on is scattered across sectors and shaped by narrow definitions. Health statistics might tell one story, education scores another, while broader aspects of a child's environment often remain invisible.

The Abu Dhabi Early Childhood Authority (ECA) is attempting to change that. With the development of the Thriving Child Index, ECA is creating a framework that rethinks how we monitor, and ultimately improve, child wellbeing. Rather than relying on isolated figures, the Index will bring together a range of indicators, from child health and learning outcomes to the family and care systems that support them, to give a comprehensive view of what it takes for young children to flourish.

In this Q&A, Saleha Al Azri, Director of Research and Cognitive Solutions at ECA, speaks to the thinking behind the Index, the knowledge gaps it aims to fill, and how it offers an integrated approach to measuring early childhood wellbeing. Al Azri also reflects on the impact of data governance, partnerships like RD4C, and how evidence-based results must be translated into practical policy action. This conversation offers insight into how one government is working to turn disconnected data into an effective tool for equity, transparency, and sustainable child wellbeing.

Q: Before we dive in, could you please share a bit about the Abu Dhabi Early Childhood Authority?

A: The Abu Dhabi Early Childhood Authority, established in 2019, promotes optimal child development and wellbeing from the early stages of pregnancy to the age of eight. We influence policies, inform decision-making through research, and incubate innovative ideas, all while enabling the early childhood sector and tracking the impact it is making.
We focus on Health and Nutrition, Child Protection, Family Support, and Education and Early Care. Our vision is that every young child can flourish to their full potential in a safe and family-friendly environment.

Q: What led to the development of the Thriving Child Index, and how does it differ from other existing indices evaluating children's welfare?

A: As part of our work over the last five years, we noticed a gap: there was no unified framework for assessing how well children are growing. Currently, different government entities globally assess child wellbeing through their own sector-specific indicators. For instance, health agencies prioritize medical wellbeing, education bodies assess learning outcomes, and social services focus on family support. While these criteria are all important, measuring children's wellbeing in that siloed manner does not provide a holistic overview. A unified framework is needed to ensure child wellbeing is not measured through a single metric, but takes into consideration the different aspects that make up young children's lives and development. This structure would give us a better lens through which to understand how new initiatives and policy changes actually affect young children's lives.

This led us to start working on the Thriving Child Index in 2025, which we are leading in collaboration with key government agencies, experts, and researchers. Through this initiative, ECA is stepping in to fill that gap and help develop a holistic framework to answer the critical question: "How do we know children are thriving?"

Q: How does your team define 'thriving' in the context of this Index?

A: This is exactly the question the Thriving Child Index aims to answer. Through careful consideration of all the key areas that contribute to the healthy development of a young child, we will develop a framework that defines 'thriving' in a holistic manner.
The metrics to be used are currently being identified in collaboration with a coalition of global and local experts to ensure the Index covers all necessary sectors.

Q: Could you please walk us through the Index's scope, such as geographic coverage, age ranges, and core indicators?

A: The Thriving Child Index will include indicators that cover all pillars of early childhood development, including education, health, parental support, culture, values, and wellbeing, ensuring it acts as a carefully considered framework for measuring a child's wellbeing as they progress from infancy up to age eight. As the Index is intended to support and influence policymaking, several levels of data quality validation are required to finalize the indicators and reach an agreed-upon list of metrics, which we hope to achieve by the end of the year.

Q: Abu Dhabi ECA and the RD4C Initiative have been long-time partners. How has this collaboration contributed to the development of the Thriving Child Index?

A: The collaboration with RD4C has yielded many benefits for ECA and the work we are doing with children's data. Child data privacy has always been a high priority, and the collaboration has allowed us to refine and improve our methods across different aspects of the initiative, from the intent behind data collection and its scope to future access and data-sharing practices. While defining the Thriving Child Index, we made certain that the work aligns with the principles of RD4C, which are flexible and easily adapted to Abu Dhabi's context.

Q: Let's talk about data and RD4C in action. Where do you source your data, and how do you ensure it represents all children? When reliable data is limited, how do you address the gaps?
A: At ECA, we source data from cross-sector government systems, surveys, and community partnerships, ensuring representation of all children, including vulnerable groups, through representative sampling and localized data collection. When administrative data is scarce or insufficient to answer our questions, we employ predictive analytics, expert consultations, and community engagement to fill gaps while adhering to RD4C's principles of purpose-driven and proportional data use.

Q: Once the data is collected, how do you ensure that the analysis accurately reflects children's realities without reinforcing unintended biases?

A: We apply strict bias-mitigation techniques, including disaggregation by demographics (e.g., gender, socioeconomic status) and regular model audits, to ensure our analysis reflects children's diverse realities and environments. Additionally, we validate findings through ECA's local and global panel of experts, community feedback where applicable, and cross-referencing with global benchmarks, ensuring accuracy and representativeness while minimizing biases.

Q: How do you envision policymakers using this Index to drive meaningful change?

A: The Thriving Child Index is designed to help policymakers make well-informed decisions about young children's lives in the Emirate of Abu Dhabi. In addition to helping them assess the true impact of their work, the Index will act as a long-term policy compass, enabling leaders to continuously monitor trends, identify emerging challenges, and proactively adapt strategies to ensure every child thrives, not just today but for generations to come. We hope the Index can inspire and inform similar efforts across the region and globally, offering a practical model that turns child wellbeing into actionable policy.

Q: ECA and RD4C share a similar mission to protect children's data while making the most of it. How does your team strike that balance?
Any lessons others working in this area might learn from your experience?

A: Our approach to collecting and using data has always put child protection first. Our work is heavily based on research and data, and we understand the importance of using that data responsibly and maintaining the trust of our community. We are transparent about our intentions whenever we collect data, and we use secure tools and procedures when working with the data we have gathered. Additionally, we apply strict anonymization and access controls by default, and we make sure not to expose data elements that could be used to reverse-engineer the identity of children through probabilistic matching. At the same time, we encourage data-driven decision-making and policy, as long as responsible data use and compliance with privacy laws are observed. We have found that being transparent with our stakeholders leads to higher levels of trust and improves collaboration between all parties involved, including the community.

Q: Thank you for sharing these valuable insights. We look forward to seeing the Thriving Child Index take shape. When can we expect the first results to be made public?

A: We are currently in the initial stages of developing the Thriving Child Index, which was announced in February 2025. The first baseline measurement will take place in 2026, and we aim to officially launch the Index in 2027. The framework and findings of the Thriving Child Index will be made public after validation, and aggregated trends and insights will be communicated through multiple platforms.
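The safeguard mentioned in the interview, not releasing data elements whose combinations are rare enough to enable re-identification through probabilistic matching, can be illustrated with a minimal k-anonymity-style check. This is a sketch under stated assumptions: the field names, records, and threshold below are hypothetical, not ECA's actual data or tooling.

```python
# Minimal k-anonymity-style audit: flag records whose combination of
# quasi-identifiers appears fewer than k times in the dataset, since
# such rare combinations are candidates for probabilistic re-identification.
# All field names and values here are illustrative.
from collections import Counter

def flag_risky_records(records, quasi_identifiers, k=5):
    """Return indices of records whose quasi-identifier combination
    is shared by fewer than k records (a re-identification risk)."""
    keys = [tuple(r[q] for q in quasi_identifiers) for r in records]
    counts = Counter(keys)
    return [i for i, key in enumerate(keys) if counts[key] < k]

records = [
    {"age_band": "0-4", "district": "A", "score": 72},
    {"age_band": "0-4", "district": "A", "score": 65},
    {"age_band": "5-8", "district": "B", "score": 80},
]
# With k=2, the lone ("5-8", "B") record is flagged for suppression.
print(flag_risky_records(records, ["age_band", "district"], k=2))
```

In practice an auditor would suppress or generalize the flagged records (e.g., widening age bands) before any release, rather than publishing them.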
Data Governance for Children
Advancing Child and Youth-Centered Data Governance: Insights from Engagements with Young People From Around the World

(Note: This policy recommendation has been reviewed by Rubina Adhikari, Navina Mutabazi, and Juan Felipe Montenegro. These young leaders have participated in youth engagement activities related to the UN World Data Forum through the Commitment to Data Governance Fit for Children, a pioneering initiative led by UNICEF's Data Governance Fit for Children programme. Their insights have greatly enriched this work, and we sincerely appreciate their valuable contributions.)

Introduction

Children and youth have become deeply integrated into today's digital ecosystems, but the governance of their data remains a pressing issue. We live amid a proliferation of digital platforms, AI-driven technologies, and online gaming, yet regulation and governance struggle to keep pace. While the United Nations Convention on the Rights of the Child recognizes children's rights to privacy and participation, global digital governance frameworks often fail to adequately protect and fulfil those rights. UNICEF has been a vocal advocate for better governance of children's data and for youth engagement. Still, UNICEF alone is unlikely to be enough in a complex ecosystem that includes governments, companies, and many other actors. This piece explores actionable policy measures for data governance fit for children, drawing on youth engagement insights led by UNICEF's Data Governance Fit for Children (DG4C) programme at the UN World Data Forum (UNWDF), youth article campaigns, and research on existing global practices.

Policy Context

Due to differing levels of infrastructure and governance capacity, countries in the Global North and Global South face distinct challenges when handling children's data. The former tend to have more resources to develop data technologies and systems, with commercial interests potentially hindering children's digital rights.
These advances are at times enabled by testing periods in the less regulated environments of the Global South, where countries struggle to use the resulting technologies in ways that protect their people. Even in jurisdictions with comprehensive regulatory frameworks, such as the European Union's General Data Protection Regulation, irresponsible practices regarding children's data persist. This is partly due to general enforcement gaps, such as difficulties in regulating cross-border data flows, and the difficulty of keeping up with a rapidly evolving technological landscape, including advances in AI.

At the same time, children's rights are most often recognized in narratives that frame children as recipients of protection; their right to participate in decision-making and contribute their unique insights is frequently overlooked. Their rights can also be obscured by age-generic policies that regard children as immature actors in need of guidance and fail to consider how their needs change as they develop.

These issues matter given the large role children play online. With an estimated one-third of internet users under 18, children's digital interactions generate vast amounts of data. Poor data governance exposes them to serious risks, such as privacy violations through unlawful data collection, commercial exploitation, and discrimination. Algorithmic bias can reinforce inequalities, particularly for children in the Global South or from marginalized groups. Weak protections also leave their information vulnerable to breaches, re-identification, and misuse. Beyond security, unchecked data practices can cause psychological harm. That is why policymakers need to take data governance fit for children seriously: not just listening to young people, but recognizing and strengthening their agency, empowering them as leaders to shape policies themselves.
To build a safer digital world for children, policymakers need to treat them as equal stakeholders and respect their needs and voices. Only then can policies be developed that truly act in their best interests.

Key Insights

1. Children and youth have a growing understanding of data. As they become more immersed in digital spaces, their awareness of data deepens.

In one of the pre-UNWDF exchange sessions that the DG4C programme held with over 30 young leaders from UNICEF's Generation Unlimited, attendees from across the world shared how their perception of data has changed over time. When asked what they thought about data when they were younger, many recalled that they didn't think much of it, seeing it as a math-related term or an administrative record. Their understanding expanded after experiencing how pervasive data is in this rapidly developing digital age. They have come to realize that the meaning of data has broadened tremendously: it is not just numbers but, in their terms, a collection of records from various sources, from social media posts to healthcare archives, holding great power to shape their lives.

This growing awareness indicates that, with the prevalence of datafication, children and youth today are becoming more data-literate. Given this, it should not be difficult to equip them with the knowledge and skills they need to navigate an increasingly data-driven world. Moreover, young people need such education and exposure to become more aware of the risks and opportunities associated with data, so that they can make more informed decisions.

2. Children and youth recognize the potential of data but also its risks. They sometimes struggle to understand how their data is collected and used, and what they can do to protect their privacy and rights.
At the UNWDF, five young representatives from the Commitment to DG4C had the opportunity to interview senior global leaders from government statistical authorities, international organizations, and data-focused NGOs. The representatives were enthusiastic about data's "greater good" possibilities for society. They were eager, for example, to learn how children's data could be processed to enhance educational opportunities, and how they could develop their own data literacy to prepare for and contribute to a better digital future.

While discussing their concerns, they repeatedly questioned the transparency of global data practices. They were most concerned about unauthorized access to and misuse of their data, and many stressed the urgent need for greater responsibility and accountability throughout the data lifecycle. The participants also noted that most children and youth lack the knowledge to meaningfully mitigate these risks. During the pre-event discussion session, participants expressed frustration over the complexity of data policies and the ambiguous, overly technical language used by those handling their data. They highlighted the need for structured data education, urging schools to integrate data literacy into their curricula, as many felt their parents lacked the expertise to support them.

3. Children and youth are aware of the disparities in data infrastructure and data governance in our world, particularly in underserved regions and marginalized communities, and advocate for urgent attention and action.

In the youth-moderated cross-generational breakfast discussion at the UNWDF, 18 youth representatives from diverse backgrounds spoke out about the unequal distribution of data resources. Drawing on firsthand experience, representatives shared how some communities in the Global South still lack essential data infrastructure, preventing them from benefiting from data-driven opportunities.
A young leader also pointed out in the pre-event discussion that, because most major AI models are trained on mainstream languages like English, they are likely to be biased against other regions of the world. They urged senior policymakers to take real action to close this gap, ensuring that disadvantaged areas are not left behind in the digital era. Without serious reform and investment, the digital divide will only widen.

Participants were also worried about the visibility of gender minorities and gender bias in data collection. At the UNWDF, one representative questioned whether existing data frameworks actually capture gender diversity, challenging the inclusivity of current data practices. Similarly, during the pre-event discussion, one attendee pointed out that gender bias in data is not just an oversight but has real consequences, for example in resource allocation.

4. Children and youth call for meaningful inclusion in data-related decision-making. They want a real seat at the digital table, rather than being merely symbolic participants.

A recurring theme in the pre-event article campaign, an initiative soliciting youth voices on responsible data, was the desire for meaningful, institutionalized youth participation in data governance discussions. Nearly two-thirds of the submissions mentioned the importance of youth involvement in data policymaking. In one particularly compelling article, a 17-year-old argued for their rightful presence at the "digital table." The author pointed out that "data is our most precious asset when we are online, but often, we are powerless, victims to the companies and advertisers scraping our data and selling it to the highest bidder," calling for a more sincere attitude towards youth in digital policymaking. In a recent webinar organized by the UN Data Strategy, two young advocates reflected on their experience at the UNWDF.
As they reported, mere presence is not enough; young people must be meaningfully consulted and engaged in shaping the future of data governance, because it affects them too.

5. Children and youth are intrigued by AI and what it could mean for their lives, but they are also worried about how it might harm them. They have an ardent desire to learn about this rapidly developing technology.

Throughout the youth-led engagement sessions before, during, and after the UNWDF, AI emerged as one of the most frequently discussed topics. From improving conflict prediction in humanitarian response to advancing personalized education, young people see AI as a powerful tool that can transform many aspects of our lives. They are keen to learn how the technology actually works and how it can be better leveraged to benefit humankind.

Children and youth are also worried about AI's negative impacts. One main concern is the creation and spread of misinformation, which could magnify the complexity of social problems. They also questioned the fairness of AI algorithms and pointed to bias, lack of transparency, and invasions of privacy. Given the heterogeneous state of regulatory frameworks across the world and the rapid development of AI, discrepancies in governance and oversight pose pressing challenges that demand urgent solutions.

Policy Recommendations

Based on the insights collected from our recent youth-led engagement activities, the DG4C programme proposes the following five policy recommendations, some of which were included in the young representatives' declaration presented during the forum's closing ceremony, reaching an audience of 2,700 attendees, including global policymakers, data experts, and civil society leaders.

1. Reform and strengthen data governance frameworks to prioritize children's rights while continuously improving them to suit the evolving digital world.
A UNICEF manifesto has indicated that existing data governance systems are insufficient to protect and empower children. Despite more resources being poured into this field, it remains a challenge. To build data governance frameworks that put children's rights first, governments, research institutions, international organizations, and relevant companies should take proactive steps to:

- Invest in evidence-based research to document youth voices and needs in data governance, ensuring the governance system considers the needs and concerns of young people, particularly those from marginalized communities.
- Conduct comprehensive audits of existing data frameworks, based on these insights and other successful practices, to assess their fitness for purpose with regard to protecting and promoting children's rights, including but not limited to consent mechanisms, data collection practices, and security protocols.
- Implement child-specific safeguards, such as default high-privacy settings, strict age verification, and limits on data tracking, to prevent misuse and exploitation of children's data, fixing problems and filling gaps identified in the audits.
- Develop a comprehensive legal structure that aligns national laws with global standards like the UN Convention on the Rights of the Child, ensuring consistency in protecting children's rights.
- Keep policies up to date with, or even ahead of, the fast-changing digital landscape to ensure their effectiveness, especially regarding newly emerging risks associated with AI, such as biometric data misuse and digital identity theft.

One useful resource is Responsible Data for Children (RD4C), a joint initiative by UNICEF's Chief Data Office and The GovLab at NYU that offers a principles-led framework, tailorable guidance, and practical tools to help organizations adopt inclusive, child-rights-centered, and responsible data practices throughout the data lifecycle.

2. Promote international collaboration to bridge digital and data equity gaps.

Addressing the inequalities in digital access and data governance requires global coordination. International organizations, governments, and the private sector should work together to:

- Increase investment in digital infrastructure for underserved regions, ensuring equitable access to data-driven opportunities and closing the digital divide.
- Foster cross-border collaboration by sharing knowledge gained and lessons learned in practice, allowing regions in need to adapt effective policies to their unique local contexts, and by providing financial support.
- Ensure data collection reflects diversity, especially when it comes to gender minorities and marginalized communities, so that data-driven policies do not reinforce existing inequalities.

International organizations such as the United Nations (especially the ITU) and the World Bank, and regional networks like the African Union, can play a pivotal role in facilitating dialogue, mobilizing resources, and setting global standards for ethical and inclusive data governance fit for children. The Commitment to DG4C also contributes to this purpose, bringing together coalition members ranging from government and academic institutions to NGOs and private companies.

3. Invest in capacity building and data literacy, with a particular focus on AI education.

Children today are the first generation of citizens to be datafied, with their personal information collected even before birth. As digital technologies advance, companies, governments, and other actors have an unprecedented ability to collect and analyze children's data. Meanwhile, AI systems are becoming more embedded in children's daily lives. Without strong data and AI literacy, young people will be vulnerable to many risks, including but not limited to misinformation, algorithmic bias, and privacy threats.
To address this and respond to young people's desire to learn more, governments and educational institutions can:

- Integrate data and AI literacy into school curricula, ensuring young people understand how their data is collected, stored, and used, how AI systems are developed and operate, and how they can better protect themselves while making the most of these technologies.
- Build virtual learning hubs beyond the traditional education system (especially in regions where in-person education is challenging) and develop accessible learning resources that break down complex data and AI-related concepts into age-appropriate content.
- Support children by supporting adults, equipping teachers (and even parents) with the knowledge and tools to educate students and children about responsible data practices and their digital rights.

UNICEF Innocenti's foresight toolkit is a valuable resource for encouraging young people to imagine what responsible data governance looks like in an ideal digital future. The DG4C programme is also driving efforts towards this goal in collaboration with UN Global Pulse in Uganda. With all stakeholders working together, a data-literate next generation will not only be better equipped to advocate for its own rights but also better positioned to shape a more open data governance framework and build a more equitable digital future.

4. Ensure more responsible and authentic youth engagement.

Children and youth are typically portrayed as recipients of data protection, yet policymakers must remember that they also have the right, as well as the capability, to shape their own digital future. There must be efforts to balance data protection with self-determination, and to engage young people actively in the policymaking process.
To ensure meaningful and diverse youth engagement, governments, international organizations, and the private sector should:

- Establish youth advisory councils within international and national data governance bodies to institutionalize youth participation in data governance. While some organizations, governments, and tech companies have taken action, efforts remain insufficient and should be scaled up.
- Require youth representation in key data policy discussions, ensuring their voices shape regulations related to their digital future. For the first time at a UN Data Forum, the perspectives of children and youth are being heard; such inclusion should happen more often and become the norm.
- Support youth-led research and advocacy, providing targeted and sustainable funding, mentorship, and platforms to amplify their impact.
- Foster intergenerational collaboration, taking advantage of both young people's fresh perspectives and first-hand experience of digital ecosystems and older generations' institutional knowledge and policy expertise.

Authentic engagement goes beyond symbolic inclusion. UNICEF's DG4C programme exemplifies this, centering its work on close collaboration with global youth networks such as Generation Unlimited, PSDD's Data Values Advocates, and local youth groups in countries like Colombia. It has also initiated cross-generational discussions between youth and decision makers. DG4C recognizes and emphasizes the need for young people to take an active role in decisions affecting their data rights, while providing platforms and support for youth-led engagement and initiatives, ensuring that children and young people can meaningfully contribute to shaping digital policies and governance.

5. Launch public awareness campaigns on responsible data and data rights.
From gaps in data literacy to digital inequality and the exclusion of youth from decision-making, many of the issues raised above by young people have been broadly acknowledged. Yet they often go unaddressed, frequently due to limited resources, institutional inertia, or a lack of awareness and political will. To drive real change, we need widespread public awareness campaigns that:

- Leverage mass media, academic research, symposiums, and other platforms to make children's digital rights a widely accepted norm and foster a culture where protecting them is a shared societal responsibility.
- Employ creative approaches, such as visual arts, short films, games, and other tools, to make complex data concepts more accessible and engaging.
- Empower children and the public to hold governments and other data governance bodies accountable, while ensuring businesses understand the consequences of non-compliance. Governments should initiate public engagement initiatives and encourage scrutiny of data practices related to children.
- Encourage youth-led awareness campaigns that bring all stakeholders together, through youth-led workshops, school programs, social media campaigns, and industry events. Governments, international organizations, and tech companies should work hand in hand to support these campaigns.

Not until awareness is raised at all levels of society will the digital ecosystem truly respect children's rights, foster responsible data behavior across the board, and promote an inclusive digital future.

Conclusion

The digital world continues to evolve at an unprecedented speed, yet the need to protect children's rights remains constant. To ensure a fair digital future for the next generation, it is crucial to strengthen data governance, bridge the digital divide, institutionalize youth participation, invest in AI and data literacy, and raise data awareness. These are not just policy choices.
They are fundamental commitments to human rights, equity, and long-term sustainability. Every policymaker, industry leader, and civil society member was once a child, dreaming of a society that would serve their best interests. We must honour that hope and take decisive action now, for the decisions made today will shape the digital world of tomorrow.
Guest Blog
Opening the Black Box: Leveraging Youth Power to Decode AI in Humanitarian Crises

As young researchers deeply interested in the intersection of technology and social impact, we have been investigating the use of artificial intelligence (AI) in humanitarian and peacebuilding efforts. This exploration has revealed a complex landscape of significant potential and serious ethical concerns. At a time when AI is rapidly reshaping how crises are predicted, managed, and responded to, it is crucial for young voices to be heard on the responsible development and governance of AI, particularly in conflict-affected contexts.

Risks and Opportunities of AI in Humanitarian Settings

AI offers extraordinary opportunities to enhance humanitarian and peacebuilding efforts and accelerate the delivery of support to beneficiaries. For instance, machine learning (ML) algorithms can analyze vast amounts of data to predict potential conflict hotspots, facilitating more proactive peacekeeping interventions. AI-powered chatbots can provide mental health support to refugees, bridging critical gaps in care. Natural language processing (NLP) tools can break down language barriers in crisis communication, and AI-powered early warning systems can analyze online news articles and social media posts to predict the likelihood of violent events in a given area.

However, these technologies also carry significant risks, especially when deployed in vulnerable communities. Our research has identified several key areas of concern:

Algorithmic Bias: AI models trained on non-representative data can perpetuate and amplify existing biases, leading to discriminatory outcomes in aid distribution or conflict analysis. A 2021 study found that widely used NLP models exhibited significant biases against certain dialects and linguistic variations, leading to higher false-positive rates for hate speech in texts written by marginalized communities.
The study evaluated popular NLP models like BERT and RoBERTa on a dataset of Arabic tweets, finding that the models struggled to accurately classify hate speech in dialectal Arabic and often misclassified innocuous phrases as offensive.

Privacy and Consent: The collection of sensitive data for AI applications raises serious privacy concerns, especially in contexts where individuals may feel pressured to provide personal information to access vital services. The World Food Programme's implementation of the SCOPE system in Uganda's Bidi Bidi refugee settlement in 2018 highlights these issues. Many refugees reported feeling compelled to provide their biometric data to receive food aid, raising questions about forced consent among people living in insecure environments.

Lack of Transparency: Many AI systems operate as "black boxes," making it difficult for affected individuals to understand or contest decisions made about them. This opacity is particularly problematic in humanitarian contexts where decisions can have life-altering consequences. The Dutch government's use of an algorithmic risk assessment system (SyRI) to detect welfare fraud, which was found to violate human rights by a Dutch court in 2020, is one example of how opaque AI systems in social services can harm intended beneficiaries.

Erosion of Human Agency: Over-reliance on AI in humanitarian contexts risks undermining participatory decision-making processes, sidelining the communities these efforts aim to support.

Empowering Youth Through AI Literacy

To navigate this complex landscape, it is crucial that young people become better informed about AI technologies and their implications. This goes beyond fostering basic digital skills to developing a deep understanding of how AI systems work and their limitations—including machine learning, neural networks, and deep learning.
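The bias check described above, comparing a model's false positive rate across dialect groups, can be sketched in a few lines of Python. The records, field names, and group labels below are invented for illustration; a real audit would run over a labeled moderation dataset:

```python
from collections import defaultdict

def false_positive_rate(records):
    """FPR = benign posts the model flagged / all benign posts."""
    benign = [r for r in records if not r["is_hate_speech"]]
    flagged = [r for r in benign if r["model_flagged"]]
    return len(flagged) / len(benign) if benign else 0.0

def audit_by_group(records):
    """Compute the false positive rate separately for each dialect group."""
    groups = defaultdict(list)
    for r in records:
        groups[r["dialect"]].append(r)
    return {g: false_positive_rate(rs) for g, rs in groups.items()}

# Invented toy data: each record is one moderated post.
posts = [
    {"dialect": "standard",  "is_hate_speech": False, "model_flagged": False},
    {"dialect": "standard",  "is_hate_speech": False, "model_flagged": False},
    {"dialect": "standard",  "is_hate_speech": True,  "model_flagged": True},
    {"dialect": "dialectal", "is_hate_speech": False, "model_flagged": True},
    {"dialect": "dialectal", "is_hate_speech": False, "model_flagged": False},
    {"dialect": "dialectal", "is_hate_speech": True,  "model_flagged": True},
]

print(audit_by_group(posts))  # a large gap between groups signals the disparity described above
```

A gap like this (innocuous dialectal posts flagged at a much higher rate) is exactly the pattern the 2021 study reported.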
Young people can participate in identifying potential biases in AI applications and learn how to mitigate them through diverse data collection and algorithmic fairness measures. AI literacy also involves an awareness of data rights and privacy implications, including concepts like data minimization, purpose limitation, and the right to explanation under regulations like GDPR. Educational institutions and youth organizations should prioritize AI literacy programs that equip young people to understand and engage with AI systems. Participatory workshops in which young people analyze real-world AI systems used in humanitarian contexts would be particularly valuable: youth could, for example, examine the UNHCR's Project Jetson, which uses machine learning to forecast forced displacement, and discuss gaps in its governance, its ethical implications, and ways to strengthen protections for the beneficiaries it affects.

Youth-Led AI Governance: From Consultation to Co-Creation

Young people shouldn't just be subjects of AI governance; we should be active participants who help shape it. Organizations developing AI for humanitarian use should establish youth advisory boards with real decision-making power.
Beyond traditional policy bodies, youth can influence AI governance through:

- Grassroots campaigns raising awareness of AI ethics in humanitarian contexts, such as social media campaigns highlighting the potential risks of biometric data collection in refugee camps
- Developing youth-led ethical guidelines for AI in crisis response, drawing inspiration from existing frameworks like the IEEE's Ethically Aligned Design principles
- Participating in "algorithmic audits" to assess AI systems for potential bias or harm, using tools like IBM's AI Fairness 360 toolkit or Google's What-If Tool
- Creating youth-centric resources on responsible AI development, such as interactive online courses or podcasts discussing real-world case studies of AI ethics in humanitarian contexts
- Engaging with tech companies and NGOs on ethical AI design and governance, potentially through internship programs or youth innovation challenges focused on developing ethical AI solutions for humanitarian challenges

Young people have an inherently valuable perspective on AI and technology, having grown up in a digital world. We are inheriting a rapidly changing AI landscape in which this technology is being deployed in every field while its regulation lags far behind. Governance must keep pace with technological advancement, and we can help mitigate many of the unintended consequences of AI systems that older policymakers and technologists are less equipped to confront.

Ethical AI: Harnessing Youth Innovation

Young people can also advocate for transparency in, and education about, AI systems used in crisis response by demanding clear documentation of training data sources and decision-making processes. On the development side, young people can create AI applications that address humanitarian challenges while prioritizing ethics, such as applications that use privacy-preserving federated learning models for health data analysis in refugee camps.
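The core idea behind the privacy-preserving federated learning mentioned above is simple: model updates leave each device or clinic, raw records never do. Here is a minimal federated averaging sketch for a one-parameter linear model; the per-clinic datasets, learning rate, and round count are all invented for illustration, not a production recipe:

```python
def local_update(weight, local_data, lr=0.1):
    """One gradient step of a 1-parameter model y = w*x on device-local data.
    Only the updated weight leaves the device, never local_data itself."""
    grad = sum(2 * (weight * x - y) * x for x, y in local_data) / len(local_data)
    return weight - lr * grad

def federated_average(global_w, clients, rounds=20):
    """FedAvg sketch: each round, clients train locally; the server averages weights."""
    for _ in range(rounds):
        local_ws = [local_update(global_w, data) for data in clients]
        global_w = sum(local_ws) / len(local_ws)
    return global_w

# Invented per-clinic datasets, each roughly following y = 2x.
clients = [
    [(1.0, 2.1), (2.0, 3.9)],
    [(1.5, 3.0), (3.0, 6.2)],
    [(0.5, 1.0), (2.5, 5.1)],
]

w = federated_average(0.0, clients)
print(w)  # converges near 2.0 without pooling any raw records
```

Real deployments add secure aggregation and differential privacy on top of this loop, but the data-stays-local structure is the part that matters for refugee-camp health data.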
Participating in "AI for good" hackathons focused on ethical challenges in peacebuilding can promote AI literacy and youth participation in the ethical development of AI, and can result in AI-powered conflict early warning systems that respect privacy and avoid stigmatization. Young people can also collaborate with researchers to study AI's impact on vulnerable communities through participatory action research projects that involve affected populations and are informed by their experiences.

The Road Ahead: Building an Ethical AI Future

As we navigate the complexities of AI in humanitarian contexts, amplifying youth voices is essential. Young people have a vital stake in how these technologies are used in global crisis response and peacebuilding efforts, and we bring digital fluency, diverse perspectives, and a dedication to ethical development to these critical discussions. By engaging youth as partners in ethical AI governance, we can harness the potential of these powerful technologies while safeguarding human rights, promoting fairness, and upholding the dignity of vulnerable populations. To build a truly ethical AI future, we need sustained collaboration between youth, policymakers, humanitarian organizations, tech companies, and researchers. This means creating ongoing channels for youth input, investing in AI literacy programs, and empowering young innovators—in both fragile contexts and developed nations—to cultivate responsible AI solutions for humanitarian challenges.

About the Authors

Marine Ragnet and Aiza Shahid Qureshi are researchers at NYU, specializing in the examination of AI's impact in fragile contexts. Their research develops a multilevel framework to analyze AI-related risks and opportunities in humanitarian efforts and peacebuilding.
Drawing on their interdisciplinary backgrounds, they aim to shape ethical guidelines and governance mechanisms for responsible AI deployment in challenging environments, bridging the gap between technological innovation and humanitarian needs. (Photo by Salah Darwish on Unsplash)
Read more

Guest Blog
How Youth Can Reclaim Control Over Their Data

In today’s digital age, data has become the currency of the modern world, shaping everything from social media algorithms to targeted advertising. Every time young people log into their favorite apps or post on social platforms, they leave behind valuable data that is harvested, analyzed, and often sold. Yet, according to a 2022 survey by Pew Research Center, 79% of teenagers in the U.S. admitted they were concerned about how their personal data was being used but felt powerless to control it. This disconnect between data use and understanding creates a form of data hegemony—where large corporations wield control over users’ information, while users, particularly youth, remain largely uninformed about the fate of their data.

This issue is even more pronounced in developing countries, where young people often have less access to digital literacy education and privacy protections. According to UNICEF’s 2021 report, children and youth in low- and middle-income countries are particularly vulnerable to data misuse due to weaker data protection laws and limited regulatory oversight. For instance, in sub-Saharan Africa, only 28% of countries have implemented comprehensive data protection policies, compared to 80% in Europe. As mobile phone and internet usage rapidly expands in these regions, young people are increasingly exposed to privacy risks without the necessary tools to understand or control their data. This article will discuss how young people, especially in vulnerable regions, can become better informed about their data usage and break free from passive participation in the digital world.

Understanding the Dynamics of Data Usage

Data usage is a complex, often invisible process that occurs every time we interact with digital platforms.
When young people sign up for a new app or service, they aren’t just providing an email or username—they are potentially handing over much more, including their location, browsing habits, and interactions on the platform. This data is collected by companies to build detailed profiles that go beyond what users knowingly share. For instance, social media platforms like Instagram or TikTok don’t just record posts and likes; they also track how long users linger on certain posts, which ads catch their attention, and even the time of day they’re most active. This is known as behavioral tracking, and it fuels targeted advertising and content recommendations.

A famous real-world example of data misuse is the Facebook–Cambridge Analytica scandal, where the personal data of millions of Facebook users was harvested without their knowledge or consent. This data was used for political advertising, influencing voters in several countries. Many users were unaware that simply engaging with a quiz or app on Facebook could give third parties access to their personal information. While the Cambridge Analytica scandal occurred in a developed context, similar issues arise globally. For instance, in developing countries like Kenya, where internet access is rapidly growing, a 2021 Mozilla Foundation study found that many popular apps collect personal data without adequate user consent, further highlighting the risks for young people in regions with weaker privacy regulations.

As artificial intelligence (AI) increasingly shapes digital experiences, it’s important for young people to understand that AI models are trained on vast amounts of data, including their own. Without proper oversight, this data can be used in ways that reinforce biases or invade privacy. Youth must be aware of how their data contributes to AI systems and advocate for ethical AI use and for transparency in how these systems are trained on their data.
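The behavioral tracking described above is ordinary engineering rather than magic. A minimal sketch, using an invented event log and invented field names, shows how a raw stream of view events becomes a per-post dwell-time profile that the user never explicitly shared:

```python
from collections import defaultdict

def dwell_times(events):
    """Turn a stream of (timestamp_seconds, post_id, action) events into
    seconds spent on each post -- the kind of engagement signal behavioral
    tracking collects even though the user never explicitly 'shares' it."""
    entered = {}
    totals = defaultdict(float)
    for ts, post, action in sorted(events):
        if action == "view_start":
            entered[post] = ts
        elif action == "view_end" and post in entered:
            totals[post] += ts - entered.pop(post)
    return dict(totals)

# Invented session log: the user scrolls past post A quickly but lingers on post B.
session = [
    (0.0, "post_A", "view_start"),
    (1.2, "post_A", "view_end"),
    (1.2, "post_B", "view_start"),
    (9.7, "post_B", "view_end"),
]

print(dwell_times(session))  # post_B's long dwell time marks it as 'engaging'
```

Multiply this by thousands of sessions and dozens of signal types (scroll speed, ad hovers, time of day) and the detailed profile described above emerges from data the user never typed in.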
For young people across the world, understanding these dynamics of data collection and usage is therefore vital to safeguarding their privacy and digital rights.

Challenges to Informed Data Usage

One of the most significant barriers to informed data usage is the complexity of privacy policies and terms of service agreements. For instance, Instagram’s privacy policy states, “We collect and use your content, communications, and other information you provide when you use our Services,” which includes everything from the photos you post to metadata, such as time stamps and geolocation. This dense language, often found in terms of service agreements, makes it difficult for the average young user to fully comprehend what they are agreeing to. A 2021 Common Sense Media report revealed that only 9% of teens feel confident they understand the privacy policies they encounter, leaving them unaware of how much data they’re sharing.

This problem is exacerbated by the power imbalance between tech companies and young users. Large tech companies like Google, Facebook, and TikTok employ sophisticated algorithms capable of processing massive amounts of data. They collect, store, and analyze personal information, often selling it to third parties for targeted advertising. Most young people are unaware of this extensive data collection, believing they are merely engaging with a fun app or social media platform.

The psychological impact of this lack of understanding is profound. Studies show that young users, when confronted with the reality of how much data is collected about them, feel overwhelmed and powerless. A 2021 UNICEF survey found that 73% of teens across multiple countries, including Brazil and South Africa, felt concerned or anxious when they realized the extent of data being collected without their consent.

Empowering Youth Through Data Literacy

Educational programs are essential to empower young people with knowledge about data usage.
One solution is forming partnerships between tech companies and educational institutions to teach students about data rights using interactive, scenario-based learning. For example, companies like Google or Facebook could collaborate with schools to organize workshops where students make real-world privacy decisions. A 2020 MediaSmarts study found that students using privacy simulations were 40% more likely to understand data sharing implications than those who only read about it.

Simplifying privacy policies is also crucial. Many are filled with legal jargon that’s hard for young users to understand. For instance, the phrase, “We may collect and process personal information such as geolocation,” could be simplified to, “We use your location to show relevant ads.” In the United Kingdom, the Information Commissioner’s Office (ICO) has introduced child-friendly privacy policies using simpler language.

Furthermore, gamifying data literacy could make learning fun and engaging. Apps could offer rewards for securing profiles and adjusting privacy settings, similar to how Duolingo motivates users through gamification. This would encourage young people to actively manage their data while making privacy education more engaging.

The Role of Young People in Shaping Data Policies

Youth have a significant role in shaping data policies and advocating for responsible data usage. Across the globe, youth-led initiatives are gaining momentum, demonstrating how young people can actively participate in discussions around data governance. For example, in the U.S., the Future of Privacy Forum engages students to influence policy debates around digital privacy. Similarly, the Youth IGF (Internet Governance Forum), a global initiative, provides platforms for young people to contribute to discussions on internet governance, data privacy, and security. In developing economies, young leaders are also making their mark.
In Kenya, the Ajira Digital Program has empowered youth by equipping them with digital skills and advocacy tools to participate in data policy discussions. These initiatives showcase how Generation Z brings fresh perspectives to the challenges of responsible data usage and security, promoting data transparency and accountability. To engage further in policy-making, young people can work with policymakers by joining public consultations and online advocacy groups, and by collaborating with organizations dedicated to data rights. Platforms like the European Youth Parliament provide avenues for youth to voice concerns and shape policies that ensure data transparency and the protection of their digital rights.

Conclusion

Informed data usage is essential in today’s digital world. Young people are major contributors to the vast amounts of personal data being collected. By understanding how data is gathered and used, youth can break free from the passive role they often play in digital ecosystems. Education is the key—schools and tech companies must collaborate to teach data literacy through interactive learning, simplified privacy policies, and engaging tools like gamification. Youth also have a vital role to play in shaping the future of data governance by participating in advocacy efforts, influencing policies, and collaborating with policymakers. This helps promote data transparency and responsible usage. By becoming informed and actively engaging, young people can help create a digital world where their rights and privacy are fully protected.

About the Authors

Tuhinsubhra Giri is an Assistant Professor at the Centre for Studies in Population and Development, Department of Economics, Christ University (India). He is passionate about data governance, MSME development, and youth empowerment, consistently merging theoretical insights with practical applications to drive strategic policy recommendations.
Si Peng works as a Program Manager at the Institute for Sustainable Development Goals, Tsinghua University. Her areas of interest include advancing the Sustainable Development Goals (SDGs), digitalization, and designing training methodologies for government institutions.

References

Ajira Digital Program, “Youth and Digital Skills in Kenya,” 2021.
Common Sense Media, “Privacy Risks for Teens,” 2021.
Future of Privacy Forum, “Youth Privacy Advocates Program,” 2022.
MediaSmarts, “Interactive Privacy Education,” 2020.
Mozilla Foundation, “Privacy Concerns in the Global South,” 2021.
Pew Research Center, “Teens’ Concerns About Privacy in a Digital Age,” 2022.
The Guardian, “The Cambridge Analytica Scandal,” 2018.
UNICEF, “Children’s Data Protection in the Digital World,” 2021.

(Photo by Kateryna Hliznitsova on Unsplash)
Read more
Check out other articles in our
Blog Section
About us
The RD4C initiative is a joint endeavor between UNICEF and The GovLab at New York University to highlight and support best practice in our work; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management.
The work is intended to address practical considerations across the data lifecycle, including routine data collection and one-off data collections, and complements work on related topics being addressed by the development community, such as guidance on specific data systems and technologies, technical standardization, and digital engagement strategies.
Additional tools and materials are coming soon and will be posted on this website as they become available. Join the conversation to receive regular updates.