The RD4C Principles
Principles to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence.
Read about our principles on the Principles Page.
From our blog
New developments from RD4C.
Guest Blog
Opening the Black Box: Leveraging Youth Power to Decode AI in Humanitarian Crises

As young researchers deeply interested in the intersection of technology and social impact, we have been investigating the use of artificial intelligence (AI) in humanitarian and peacebuilding efforts. This exploration has revealed a complex landscape of significant potential and real ethical concerns. At a time when AI is rapidly reshaping how crises are predicted, managed, and responded to, it is crucial for young voices to be heard on the responsible development and governance of AI, particularly in conflict-affected contexts.

Risks and Opportunities of AI in Humanitarian Settings

AI offers extraordinary opportunities to enhance humanitarian and peacebuilding efforts and accelerate the delivery of support to beneficiaries. For instance, machine learning (ML) algorithms can analyze vast amounts of data to predict potential conflict hotspots, facilitating more proactive peacekeeping interventions. AI-powered chatbots can provide mental health support to refugees, bridging critical gaps in care. Natural language processing (NLP) tools can break down language barriers in crisis communication, and AI-powered early warning systems can analyze online news articles and social media posts to predict the likelihood of violent events in a given area.

However, these technologies also carry significant risks, especially when deployed in vulnerable communities. Our research has identified several key areas of concern:

Algorithmic Bias: AI models trained on non-representative data can perpetuate and amplify existing biases, leading to discriminatory outcomes in aid distribution or conflict analysis. A 2021 study found that widely used NLP models exhibited significant biases against certain dialects and linguistic variations, leading to higher false positive rates for hate speech in texts written by marginalized communities. The study evaluated popular NLP models such as BERT and RoBERTa on a dataset of Arabic tweets, finding that the models struggled to accurately classify hate speech in dialectal Arabic and often misclassified innocuous phrases as offensive. (A minimal sketch of how such a disparity can be measured follows this list.)

Privacy and Consent: The collection of sensitive data for AI applications raises serious privacy concerns, especially in contexts where individuals may feel pressured to provide personal information to access vital services. The World Food Programme’s implementation of the SCOPE system in Uganda’s Bidi Bidi refugee settlement in 2018 highlights these issues. Many refugees reported feeling compelled to provide their biometric data to receive food aid, raising questions about forced consent among people living in insecure environments.

Lack of Transparency: Many AI systems operate as “black boxes,” making it difficult for affected individuals to understand or contest decisions made about them. This opacity is particularly problematic in humanitarian contexts, where decisions can have life-altering consequences. The Dutch government’s use of an algorithmic risk assessment system (SyRI) to detect welfare fraud, which a Dutch court found to violate human rights in 2020, is one example of how opaque AI systems in social services can harm intended beneficiaries.

Erosion of Human Agency: Over-reliance on AI in humanitarian contexts risks undermining participatory decision-making processes, sidelining the communities these efforts aim to support.
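To make the algorithmic-bias concern concrete, the short Python sketch below shows one way a youth-led audit might compare a hate-speech classifier’s false positive rates across dialect groups. The group labels, predictions, and numbers are hypothetical placeholders for illustration only; they are not data from the study cited above.

```python
# Hypothetical audit: compare a hate-speech classifier's false positive
# rates across dialect groups. All records below are illustrative.
from collections import defaultdict

# Each record: (dialect_group, true_label, predicted_label)
# true_label / predicted_label: 1 = flagged as hate speech, 0 = not.
predictions = [
    ("standard_arabic", 0, 0), ("standard_arabic", 0, 0), ("standard_arabic", 1, 1),
    ("dialectal_arabic", 0, 1), ("dialectal_arabic", 0, 1), ("dialectal_arabic", 0, 0),
    ("dialectal_arabic", 1, 1),
]

def false_positive_rates(records):
    """Return, per group, the share of non-hateful texts wrongly flagged."""
    false_pos = defaultdict(int)
    negatives = defaultdict(int)
    for group, truth, pred in records:
        if truth == 0:              # the text is actually not hate speech
            negatives[group] += 1
            if pred == 1:           # ...but the model flagged it anyway
                false_pos[group] += 1
    return {g: false_pos[g] / negatives[g] for g in negatives}

for group, fpr in false_positive_rates(predictions).items():
    print(f"{group}: false positive rate = {fpr:.2f}")

# A large gap between groups (here roughly 0.00 vs 0.67) signals that
# innocuous posts from one community are disproportionately misclassified.
```

In practice, an audit like this would use a much larger, carefully labeled dataset and established tooling (the fairness toolkits mentioned later in this post provide many such metrics out of the box), but the underlying comparison is the same.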
Empowering Youth Through AI Literacy

To navigate this complex landscape, it is crucial that young people become better informed about AI technologies and their implications. This goes beyond fostering basic digital skills to developing a deep understanding of how AI systems (including machine learning, neural networks, and deep learning) work and where they fall short. Young people can participate in identifying potential biases in AI applications and learn how to mitigate them through diverse data collection and algorithmic fairness measures. AI literacy also involves an awareness of data rights and privacy implications, including concepts like data minimization, purpose limitation, and the right to explanation under regulations like the GDPR.

Educational institutions and youth organizations should prioritize AI literacy programs that equip young people to understand and engage critically with AI systems. Participatory workshops in which young people analyze real-world AI systems used in humanitarian contexts would be particularly valuable. Youth could, for example, examine UNHCR’s Project Jetson, which uses machine learning to forecast forced displacement, and discuss gaps in governance, the project’s ethical implications, and ways to strengthen protections for the beneficiaries it affects.

Youth-Led AI Governance: From Consultation to Co-Creation

Young people shouldn’t just be subjects of AI governance. We should be active participants who help shape it. Organizations developing AI for humanitarian use should establish youth advisory boards with real decision-making power. Beyond traditional policy bodies, youth can influence AI governance through:

Grassroots campaigns raising awareness of AI ethics in humanitarian contexts, such as social media campaigns highlighting the potential risks of biometric data collection in refugee camps;

Developing youth-led ethical guidelines for AI in crisis response, drawing inspiration from existing frameworks like the IEEE’s Ethically Aligned Design principles;

Participating in “algorithmic audits” to assess AI systems for potential bias or harm, using tools like IBM’s AI Fairness 360 toolkit or Google’s What-If Tool;

Creating youth-centric resources on responsible AI development, such as interactive online courses or podcasts discussing real-world case studies of AI ethics in humanitarian contexts;

Engaging with tech companies and NGOs on ethical AI design and governance, potentially through internship programs or youth innovation challenges focused on developing ethical AI solutions for humanitarian challenges.

Young people have a distinctly valuable perspective on AI and technology, having grown up in a digital world. We are inheriting a rapidly changing AI landscape in which this technology is being deployed in every field while its regulation lags severely behind. Governance must keep pace with technological advancement, and we can help mitigate many of the unintended consequences of AI systems that older policymakers and technologists are less equipped to confront.

Ethical AI: Harnessing Youth Innovation

Young people can also advocate for transparency in AI systems used in crisis response by demanding clear documentation of training data sources and decision-making processes. On the development side, young people can create AI applications that address humanitarian challenges while prioritizing ethics, such as privacy-preserving federated learning models for health data analysis in refugee camps; a simplified sketch of that idea follows below.
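As an illustration of the privacy-preserving approach mentioned above, the following minimal sketch implements the core of federated averaging in plain Python: each clinic trains a model on its own records locally, and only model parameters, never the raw health data, are shared and averaged. The clinic names, the toy linear model, and the data are hypothetical simplifications, not a production system.

```python
# Minimal federated averaging (FedAvg) sketch: hypothetical clinics keep
# their health records local and share only model parameters.

def local_train(weights, data, lr=0.1, epochs=20):
    """One clinic fits a one-feature linear model y ~ w*x + b on its own data."""
    w, b = weights
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y      # prediction error on a local record
            w -= lr * err * x          # gradient step; raw data never leaves the clinic
            b -= lr * err
    return (w, b)

def federated_average(weight_sets):
    """The coordinating server averages client weights without seeing any data.
    Equal weighting is assumed here (each clinic holds a similar amount of data)."""
    n = len(weight_sets)
    return (sum(w for w, _ in weight_sets) / n,
            sum(b for _, b in weight_sets) / n)

# Illustrative local datasets (x = symptom score, y = recovery time in weeks).
clinics = {
    "camp_clinic_a": [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)],
    "camp_clinic_b": [(1.5, 3.0), (2.5, 5.1), (3.5, 7.0)],
}

global_weights = (0.0, 0.0)
for _ in range(5):                     # a few federated rounds
    local_updates = [local_train(global_weights, data) for data in clinics.values()]
    global_weights = federated_average(local_updates)

print("Global model after federated training:", global_weights)
```

Real deployments add further safeguards on top of this pattern, such as secure aggregation and differential privacy, so that even the shared parameters reveal as little as possible about any individual record.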
Participating in “AI for good” hackathons focused on ethical challenges in peacebuilding can promote AI literacy and youth participation in the ethical development of AI, and can yield AI-powered conflict early warning systems that respect privacy and avoid stigmatization. Young people can also collaborate with researchers to study AI’s impact on vulnerable communities through participatory action research projects that involve affected populations and are informed by their experiences.

The Road Ahead: Building an Ethical AI Future

As we navigate the complexities of AI in humanitarian contexts, amplifying youth voices is essential. Young people have a vital stake in how these technologies are used in global crisis response and peacebuilding efforts, and we bring digital fluency, diverse perspectives, and a dedication to ethical development to these critical discussions. By engaging youth as partners in ethical AI governance, we can harness the potential of these powerful technologies while safeguarding human rights, promoting fairness, and upholding the dignity of vulnerable populations.

To build a truly ethical AI future, we need sustained collaboration between youth, policymakers, humanitarian organizations, tech companies, and researchers. This means creating ongoing channels for youth input, investing in AI literacy programs, and empowering young innovators, in both fragile contexts and developed nations, to cultivate responsible AI solutions for humanitarian challenges.

About the Author

Marine Ragnet and Aiza Shahid Qureshi are researchers at NYU specializing in the examination of AI’s impact in fragile contexts. Their research develops a multilevel framework to analyze AI-related risks and opportunities in humanitarian efforts and peacebuilding. Drawing on their interdisciplinary backgrounds, they aim to shape ethical guidelines and governance mechanisms for responsible AI deployment in challenging environments, bridging the gap between technological innovation and humanitarian needs.

(Photo by Salah Darwish on Unsplash)
Guest Blog
How Youth Can Reclaim Control Over Their Data

In today’s digital age, data has become the currency of the modern world, shaping everything from social media algorithms to targeted advertising. Every time young people log into their favorite apps or post on social platforms, they leave behind valuable data that is harvested, analyzed, and often sold. Yet, according to a 2022 survey by the Pew Research Center, 79% of teenagers in the U.S. said they were concerned about how their personal data was being used but felt powerless to control it. This disconnect between data use and understanding creates a form of data hegemony, in which large corporations wield control over users’ information while users, particularly youth, remain largely uninformed about the fate of their data.

This issue is even more pronounced in developing countries, where young people often have less access to digital literacy education and privacy protections. According to UNICEF’s 2021 report, children and youth in low- and middle-income countries are particularly vulnerable to data misuse due to weaker data protection laws and limited regulatory oversight. For instance, in sub-Saharan Africa, only 28% of countries have implemented comprehensive data protection policies, compared to 80% in Europe. As mobile phone and internet usage rapidly expands in these regions, young people are increasingly exposed to privacy risks without the necessary tools to understand or control their data. This article discusses how young people, especially in vulnerable regions, can become better informed about their data usage and break free from passive participation in the digital world.

Understanding the Dynamics of Data Usage

Data usage is a complex, often invisible process that occurs every time we interact with digital platforms. When young people sign up for a new app or service, they aren’t just providing an email or username; they are potentially handing over much more, including their location, browsing habits, and interactions on the platform. This data is collected by companies to build detailed profiles that go beyond what users knowingly share. For instance, social media platforms like Instagram or TikTok don’t just record posts and likes; they also track how long users linger on certain posts, which ads catch their attention, and even the time of day they’re most active. This is known as behavioral tracking, and it fuels targeted advertising and content recommendations.

A famous real-world example of data misuse is the Facebook–Cambridge Analytica scandal, in which the personal data of millions of Facebook users was harvested without their knowledge or consent. This data was used for political advertising, influencing voters in several countries. Many users were unaware that simply engaging with a quiz or app on Facebook could give third parties access to their personal information. While the Cambridge Analytica scandal occurred in a developed context, similar issues arise globally. For instance, in developing countries like Kenya, where internet access is rapidly growing, a 2021 Mozilla Foundation study found that many popular apps collect personal data without adequate user consent, further highlighting the risks for young people in regions with weaker privacy regulations.

As artificial intelligence (AI) increasingly shapes digital experiences, it is important for young people to understand that AI models are trained on vast amounts of data, including their own.
Without proper oversight, this data can be used in ways that reinforce biases or invade privacy. Youth must be aware of how their data feeds AI systems and advocate for ethical AI use and transparency in how training data is handled. Understanding these dynamics of data collection and usage is therefore vital for young people around the world to safeguard their privacy and digital rights.

Challenges to Informed Data Usage

One of the most significant barriers to informed data usage is the complexity of privacy policies and terms of service agreements. For instance, Instagram’s privacy policy states, “We collect and use your content, communications, and other information you provide when you use our Services,” which includes everything from the photos you post to metadata such as time stamps and geolocation. This dense language, often found in terms of service agreements, makes it difficult for the average young user to fully comprehend what they are agreeing to. A 2021 Common Sense Media report revealed that only 9% of teens feel confident they understand the privacy policies they encounter, leaving them unaware of how much data they’re sharing.

This problem is exacerbated by the power imbalance between tech companies and young users. Large tech companies like Google, Facebook, and TikTok employ sophisticated algorithms capable of processing massive amounts of data. They collect, store, and analyze personal information, often selling it to third parties for targeted advertising. Most young people are unaware of this extensive data collection, believing they are merely engaging with a fun app or social media platform. The psychological impact of this lack of understanding is profound. Studies show that young users, when confronted with the reality of how much data is collected about them, feel overwhelmed and powerless. A 2021 UNICEF survey found that 73% of teens across multiple countries, including Brazil and South Africa, felt concerned or anxious when they realized the extent of data being collected without their consent.

Empowering Youth Through Data Literacy

Educational programs are essential to empower young people with knowledge about data usage. One solution is forming partnerships between tech companies and educational institutions to teach students about data rights using interactive, scenario-based learning. For example, companies like Google or Facebook could collaborate with schools to organize workshops where students make real-world privacy decisions. A 2020 MediaSmarts study found that students using privacy simulations were 40% more likely to understand the implications of data sharing than those who only read about it.

Simplifying privacy policies is also crucial. Many are filled with legal jargon that is hard for young users to understand. For instance, the phrase “We may collect and process personal information such as geolocation” could be simplified to “We use your location to show relevant ads.” In the United Kingdom, the Information Commissioner’s Office (ICO) has introduced child-friendly privacy policies using simpler language. Furthermore, gamifying data literacy could make learning fun and engaging. Apps could offer rewards for securing profiles and adjusting privacy settings, similar to how Duolingo motivates users through gamification. This would encourage young people to actively manage their data while making privacy education more engaging.
The Role of Young People in Shaping Data Policies

Youth have a significant role in shaping data policies and advocating for responsible data usage. Across the globe, youth-led initiatives are gaining momentum, demonstrating how young people can actively participate in discussions around data governance. For example, in the U.S., the Future of Privacy Forum engages students to influence policy debates around digital privacy. Similarly, the Youth IGF (Internet Governance Forum), a global initiative, provides platforms for young people to contribute to discussions on internet governance, data privacy, and security. In developing economies, young leaders are also making their mark. In Kenya, the Ajira Digital Program has empowered youth by equipping them with digital skills and advocacy tools to participate in data policy discussions. These initiatives showcase how Generation Z brings fresh perspectives to the challenges of responsible data usage and security, promoting data transparency and accountability.

To engage further in policy-making, young people can work with policymakers by joining public consultations and online advocacy groups, and by collaborating with organisations dedicated to data rights. Platforms like the European Youth Parliament provide avenues for youth to voice concerns and shape policies that ensure data transparency and the protection of their digital rights.

Conclusion

Informed data usage is essential in today’s digital world. Young people are major contributors to the vast amounts of personal data being collected. By understanding how data is gathered and used, youth can break free from the passive role they often play in digital ecosystems. Education is the key: schools and tech companies must collaborate to teach data literacy through interactive learning, simplified privacy policies, and engaging tools like gamification. Youth also have a vital role to play in shaping the future of data governance by participating in advocacy efforts, influencing policies, and collaborating with policymakers. This helps promote data transparency and responsible usage. By becoming informed and actively engaging, young people can help create a digital world where their rights and privacy are fully protected.

About the Author

Tuhinsubhra Giri is an Assistant Professor at the Centre for Studies in Population and Development, Department of Economics, Christ University (India). He is passionate about data governance, MSME development, and youth empowerment, consistently merging theoretical insights with practical applications to drive strategic policy recommendations.

Si Peng works as a Program Manager at the Institute for Sustainable Development Goals, Tsinghua University. Her areas of interest include advancing the Sustainable Development Goals (SDGs), digitalization, and designing training methodologies for government institutions.

References

Ajira Digital Program, “Youth and Digital Skills in Kenya,” 2021.
Common Sense Media, “Privacy Risks for Teens,” 2021.
Future of Privacy Forum, “Youth Privacy Advocates Program,” 2022.
MediaSmarts, “Interactive Privacy Education,” 2020.
Mozilla Foundation, “Privacy Concerns in the Global South,” 2021.
Pew Research Center, “Teens’ Concerns About Privacy in a Digital Age,” 2022.
The Guardian, “The Cambridge Analytica Scandal,” 2018.
UNICEF, “Children’s Data Protection in the Digital World,” 2021.

(Photo by Kateryna Hliznitsova on Unsplash)
Data Governance for Children
From Local Voice to Global Digital Future: We Young People Must Go Through That Door

While I watched the thick clouds in the sky, thinking about the brilliant minds in the world of data that I was about to meet, a sense of weightlessness washed over me. It was November of last year. I had been proudly selected as a young representative to attend the UN World Data Forum on behalf of the Commitment to Data Governance Fit for Children, an initiative spearheaded by UNICEF’s Chief Data Office. I met my fellow representatives in person for the first time during our layover at Bogotá Airport en route to Medellín, Colombia. I was amazed at the variety of our group, with members ranging in age from 9 to 24 years old. With great excitement, we greeted each other and immediately bonded over our shared mission: children, adolescents, and youth coming together to work towards a data landscape that is truly safe for us.

(Youth representatives from the Alberto Merani Institute presenting their ArcGIS-based project on educational equity in Colombia at the World Data Forum.)

A first taste of the UN World Data Forum: fun and fast

It was really exciting to connect with peers from around the world, learn about their lives, and share our ambitions. We connected instantly, but I also realized that our conversations couldn’t be too technical. With such a wide range of backgrounds, we needed to communicate in a way that was accessible and practical for everyone. I was soon impressed by my fellow representatives’ knowledge and skills in the field of technology and data governance. A standout example was a group of students from the Alberto Merani Institute, who developed an award-winning project analyzing educational coverage in Colombia using geographic data and generously shared their findings with us. This demonstrated that, given the right platform, youth-led initiatives have huge potential to contribute to real-world solutions.

It was just past noon, but I already felt that the pace of this forum was exhilarating. I had just attended COP16 Colombia, an international conference on biological diversity. My body was exhausted after 15 intense days of youth engagement and advocacy, but I didn’t want to give up. If we, the young people, manage to open the door, we must go through it. With this determination, I participated in the first gathering convened by UNICEF along with my fellow representatives. We quickly delved into the history of the World Data Forum and understood why the involvement of children and youth was so important in this space. We then engaged in a series of discussions that prepared us for the upcoming formal sessions. We aligned our schedules and reviewed our presentations and interview questions for senior policymakers one last time. Looking at the young faces around me, I knew I was ready.

(Interviewing Dr. Ola Awad Shakhshir, Director of PCBS, on data and the lives of children in Palestine.)

Dialogue with global policymakers: mixed feelings

The most important experience of my UN World Data Forum journey began the next day. I was fortunate to meet Ola Awad Shakhshir, the Director of the Palestinian Central Bureau of Statistics (PCBS), and pose questions on behalf of young people in a series of interviews organized by UNICEF. Before my turn, I observed my peers engaging with senior policymakers such as the OECD Chief Statistician. Their discussions were truly eye-opening for me, demonstrating how complex data concepts can be translated into accessible, simple forms for non-experts, such as children.
And finally, it was my turn. I felt a mix of nerves and excitement. Conducting my first interview in English added an extra layer of challenge, but I was determined to speak up for my people. I started in English, asking how data is being used to protect children in Palestine, where such initiatives are critically needed. Without hesitation, Dr. Ola Awad answered me in a soft but firm voice. I had the feeling that she knew the programs on the ground like the back of her hand. Struggling to follow some of her references, I switched to Spanish, my mother tongue, and asked the interpreter for help. Dr. Ola Awad noticed my situation and slowed down, which largely soothed my nerves and encouraged me to resume the interview in English. My final question was about how the Palestinian government uses child statistics to expand educational and health opportunities. It was great to learn that data is genuinely making a difference in a place where many young people like me need support. And I believe some of these practices can be scaled and applied in other policy contexts, for example, in Colombia.

(Young people worldwide gathering together to shape the declaration.)

Youth Manifesto: unity is strength

The following day, we experienced perhaps one of the most important moments of the entire forum. Youth representatives, policymakers, and data experts from various countries came together to explore ways to foster unity and cooperation in the field of data. After a long day of engagement, my fellow representatives and I finally had the opportunity to gather in the same space, sit down, and reflect, as we prepared a declaration that captured some of our most urgent concerns. The debate quickly became heated, as we each tended to have different agendas. Meanwhile, due to logistical issues, we were asked to move to a new space, adding to the initial chaos. However, we quickly reconnected and found common ground. People with similar concerns were then divided into smaller groups for more focused discussions. At one point, we lost access to the interpretation service provided by UNICEF’s Colombia Office, but we still managed to communicate all the complex ideas. In just two hours, we agreed on some key points for the declaration. The drafted document obviously didn’t capture every local perspective, but it was already a great achievement considering the short span of time. Seeing this dynamic unfold, I felt so proud of how children and youth were cooperating. Coincidentally, right in the middle of this whirlwind, my university teacher called me, demanding that I take a virtual exam and answer the questions over the phone. The situation put a lot of pressure on me, but it solidified my belief that young people must learn to juggle multiple responsibilities while actively claiming their voices.

(Meeting with other young leaders from Colombia.)

A vision for the future: giving youth a place in the world of data

At the UN World Data Forum, more than 18 child and youth leaders co-created a joint declaration and set an example for deliberative participation, in the hope of promoting a more representative and inclusive decision-making space globally. The declaration was officially released the following day at a high-level meeting where global delegates discussed data governance. It was an unforgettable moment that undoubtedly planted the seeds of youth influence.
A few months later, invited by the UN Secretary-General's Data Strategy Office to attend a webinar, I had the opportunity to reconvene with peers and UNICEF colleagues involved in the forum. I was glad to learn that, even as time went by, the interest in young people’s perspectives on data governance never faded. We talked about what we had accomplished at the forum and what we could do better next time. While our declaration hasn't yet sparked major policy changes, it undeniably laid the groundwork for future youth influence.

As a young advocate, I always seek to widen the door for youth participation. I seize every opportunity I have to amplify the concerns of my generation, ensuring that our voices are heard, respected, and taken into account. After all, the future of data belongs not only to governments and big companies but also to every young innovator. We young people must go through that door, and we are all the more powerful when we go through it together.

(Note: The lead photo was taken after the closed-door session of the High-level Group for Partnership, Coordination and Capacity-Building for statistics for the 2030 Agenda for Sustainable Development, with global leaders including Stefan Schweinfest [second from left], Director of the UN Statistics Division, and Dr. Fahd Al-Dosari [far right], President of Saudi Arabia's General Authority for Statistics.)
Data Governance for Children
Youth at the Forefront of Data Governance: Reflections on a Powerful Commitment

The young generation is no longer content to sit on the sidelines, waiting for adults to shape their digital future. From Colombia’s rural areas to Tanzania’s workshops, young people are taking action to build data-driven solutions from the ground up. At a recent UN webinar hosted by the Secretary-General’s Data Strategy team, two youth leaders from the Commitment to Data Governance Fit for Children showed what this looks like in practice. Speaking directly to UN professionals worldwide, they talked about the power of data, a tool that can strip opportunities from marginalized communities but also open pathways to inclusion and justice, and called for greater investment in strengthening youth engagement. Their ask was clear: Find us. Trust us. Let us lead.

Reflections from the Frontlines

Juan Felipe, a youth leader with UNICEF’s Generation Unlimited from Colombia, spoke about the role of inclusive data practices in creating real change in rural communities. “Data helps us understand local contexts,” he said. “But the value isn’t just about the numbers; what’s more important is that data allows us to see what’s really happening.” Speaking from personal experience, Juan explained how data-driven policies often fail to capture “the full reality in society,” especially in underdeveloped regions where data infrastructure is outdated or incomplete. When data isn’t inclusive, vulnerable populations, such as children, are often underrepresented in the narrative. He urged policymakers and researchers to move beyond spreadsheets: Go to the field. Talk to young people directly. The problems we face don’t always show up in reports. With rural areas having limited resources and communication channels to convey their voices upwards, Juan stressed the need for decision-makers to take the initiative in engaging these communities. He also pointed out that young people are eager to speak up. They care deeply about their countries and are driven to build a better world for all. Reflecting on his own journey, he added: “I love Colombia, I love the opportunity to participate in meaningful discussions, and I love the mission to fight against social injustice. From local to global, we are the same.”

Navina, a Tanzanian feminist and Data Values Advocate from the Global Partnership for Sustainable Development Data (GPSDD), believes co-creation is key to meaningful youth engagement. In her initiative, Data Power Tanzania, she brought together dozens of young feminists across the country to discuss critical topics such as data ethics, AI bias, and digital safety. “This was not a lecture-style project, but rather, it was interactive in nature,” she explained. By listening closely to young women, assessing their needs, and identifying key gaps, Navina’s team and attendees co-designed a workshop that not only helped answer the questions young people had but also provided hands-on data skills training to empower them to tackle real-world challenges. “Youth and children should not be excluded from data governance discussions with the assumption that these topics are too complex for them,” Navina said. Drawing on her own experience, she pointed out that while data governance may sound like a big term at first, it can always be broken down and connected to local issues to make it more relatable. There are always data-related issues that concern different groups in the community, from data privacy to data protection.
Recognized for their impactful work, Juan and Navina were invited to join the United Nations World Data Forum (UNWDF) as youth representatives for the Commitment to Data Governance Fit for Children, an initiative led by UNICEF that aims to protect and promote children’s and youth’s rights in the evolving digital age.

A Platform Like No Other

In a packed hall of 2,700 attendees, 18 young leaders took the mic and seized their moment. For the first time at the UN World Data Forum, young people’s voices generated this much momentum. Sitting face-to-face with high-level decision-makers, including Piedad Urdinola Contreras, Head of Colombia’s National Administrative Department of Statistics (DANE), and Ola Awad, President of the Palestinian Central Bureau of Statistics, the young advocates didn’t hold back. They raised tough questions on issues like gender discrimination in data collection and the global gap in AI development. Not to provoke, but because they understood that genuine change begins with open and honest conversations. During an intimate, cross-generational breakfast discussion with senior policymakers from UN agencies, government ministries, and big tech organizations, the young representatives also didn’t shy away. They walked up to those who inspired them, asking thoughtful questions ranging from lessons learned in their careers to how data can be used to create better outcomes for society at large.

At the heart of their advocacy was the Declaration of Children, Adolescents, and Youth on Responsible Handling of Data. Presented during the Forum’s closing ceremony and again in a closed-door session with the High-Level Group for Partnership, Coordination, and Capacity-Building under the 2030 Agenda, the declaration urged leaders worldwide to ensure data serves the young generation, no matter where they are.

During the webinar, Juan and Navina reflected on their work with the Commitment to Data Governance Fit for Children and discussed what still needs to be done in the field of data governance. They spoke about the need to institutionalize youth participation in global platforms like the UNWDF and beyond. The brilliance of youth leaders shouldn’t depend on being “invited” to events. Their voices should be part of the conversation from the start, embedded into governance structures, advisory boards, and accountability frameworks. The reality is far from ideal. “There were so few of us young people at the conferences — we could easily spot each other in the hallways,” Navina said.

The Commitment to Data Governance Fit for Children makes a real effort to include diverse voices, offering translation services during discussions. But as Juan pointed out, language remains a barrier to young people participating on global stages, especially for those growing up speaking non-mainstream languages. And it’s not just about English: even within translated spaces, the language of data governance can be overly technical and hard to grasp. True inclusivity means using clear, accessible language and creating relatable resources that meet young people where they are. Funding was another big issue they raised. As Navina reflected, many young people doing meaningful work in their communities are missing from these global spaces simply because they can’t afford to be there. “The voices of young people are huge, but those who appear? The number is very small,” she said.
It’s not that young people don’t care, but that the broader system hasn’t yet made it a priority to support them. That needs to change, especially for young people in the Global South.

Let the Youth Lead

The stories of Juan and Navina are a snapshot of something bigger happening around the world: the next generation is more confident than ever and ready to take control of its own future. The webinar also echoes a trend that’s hard to ignore: young people are increasingly being recognized as capable and legitimate stakeholders in shaping global policies. This shift in recognition is just the beginning. When young people are truly heard and given the space to lead, data governance finds its most passionate allies: a strong force full of fresh ideas and the courage to challenge the status quo. And when they are brought into the process from the start, data governance transcends technical implementation; it becomes a powerful tool for genuine inclusion and meaningful change. That’s the vision behind UNICEF’s work and the Commitment to Data Governance Fit for Children it leads. It’s already starting to take root.

***

The UNWDF Commitment to Data Governance Fit for Children is a pioneering initiative spearheaded by UNICEF that unites a diverse group of organisations and young leaders committed to ensuring that children’s rights are prioritised in the rapidly evolving world of data and AI. This commitment brings together:

Youth representatives ‒ from Generation Unlimited’s Young People’s Action Team, the Global Partnership for Sustainable Development Data’s Data Values Advocates, the Office of the UN Secretary-General’s Youth Envoy’s Young Leaders for the SDGs, and UNICEF Colombia’s Red Nacional de Participación Adolescentes en Movimiento por sus Derechos ‒ to amplify youth perspectives and co-create solutions for positive change;

UNICEF’s Data Governance Fit for Children Programme, to advocate for programmes, policies and systems that are grounded in child and youth rights;

DevelopMetrics, to integrate ethical AI, supervised machine learning, and fine-tuned large language models in collaboration with youth at the onset of AI development;

Highway Child, to ensure that children’s voices are authentically represented and that the information they share through creative content is safeguarded;

Abu Dhabi Early Childhood Authority, to promote responsible government AI systems that prioritise child wellbeing and provide a data-driven perspective for decision-making;

The GovLab, to empower children by ensuring data and technology are used to make more effective, equitable, and legitimate decisions that solve public problems;

The Datasphere Initiative, to empower diverse youth communities by equipping them with knowledge and tools to amplify their participation in data governance and AI policy discussions; and

The Global Partnership for Sustainable Development Data, to amplify youth voices and empower young people to engage meaningfully in data governance, strengthening their ability to lead impactful, multi-stakeholder collaborations.

(Lead image by Jordan Elliott on Unsplash)
Check out other articles in our Blog Section.
About us
The RD4C initiative is a joint endeavor between UNICEF and The GovLab at New York University to highlight and support best practice in our work; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management.
The work is intended to address practical considerations across the data lifecycle, including routine data collection and one-off data collections; it complements work on related topics being addressed by the development community, such as guidance on specific data systems and technologies, technical standardization, and digital engagement strategies.
Additional tools and materials are coming soon and will be posted on this website as they become available. Join the conversation to receive regular updates.