The RD4C Principles
Principles to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence.
From our blog
New developments from RD4C.
Data Governance for Children
Advancing Child and Youth-Centered Data Governance: Insights from Engagements with Young People From Around the World

(Note: This policy recommendation has been reviewed by Rubina Adhikari, Navina Mutabazi, and Juan Felipe Montenegro. These young leaders have participated in youth engagement activities related to the UN World Data Forum through the Commitment to Data Governance Fit for Children, a pioneering initiative led by UNICEF's Data Governance Fit for Children programme. Their insights have greatly enriched this work, and we sincerely appreciate their valuable contributions.)

Introduction

Children and youth have become deeply integrated into today's digital ecosystems, but the governance of their data remains a pressing issue. We live amid a proliferation of digital platforms, AI-driven technologies, and online gaming, yet regulation and governance struggle to keep pace. While the United Nations Convention on the Rights of the Child recognizes children's rights to privacy and participation, global digital governance frameworks often fail to adequately protect and fulfil those rights. UNICEF has been a vocal advocate for better governance of children's data and for youth engagement. Still, it is not clear whether UNICEF alone is enough in a complex ecosystem that includes governments, companies, and more. This piece explores actionable policy measures for data governance fit for children, drawing on youth engagement insights from UNICEF's Data Governance Fit for Children (DG4C) programme at the UN World Data Forum (UNWDF), youth article campaigns, and research on existing global practices.

Policy Context

Due to differing levels of infrastructure and governance capacity, countries in the Global North and Global South face distinct challenges when handling children's data. The former tend to have more resources to develop data technologies and systems, with commercial interests potentially hindering children's digital rights.
These advances are at times enabled by testing periods in the less regulated environments of the Global South, where countries struggle to deploy such technologies in ways that protect their people. Even in jurisdictions with comprehensive regulatory frameworks, such as the European Union's General Data Protection Regulation, irresponsible practices involving children's data persist. This is partly due to general enforcement gaps, such as difficulties in regulating cross-border data flows, and partly due to the challenge of keeping up with a rapidly evolving technological landscape, including advances in AI. At the same time, children's rights are most often recognized in narratives that frame them as recipients of protection; their right to participate in decision-making and contribute their unique insights is frequently overlooked. Their rights can also be obscured by age-generic policies that regard children as immature actors in need of guidance and fail to consider how their needs change as they develop. These issues matter because of the large role children play online. With an estimated one-third of internet users under 18, children's digital interactions generate vast amounts of data. Poor data governance exposes them to serious risks, such as privacy violations through unlawful data collection, commercial exploitation, and discrimination. Algorithmic bias can reinforce inequalities, particularly for children in the Global South or from marginalized groups. Weak protections also leave their information vulnerable to breaches, re-identification, and misuse. Beyond security, unchecked data practices can cause psychological harm. That is why policymakers need to take data governance fit for children seriously: not just listening to young people, but recognizing and strengthening their agency and empowering them as leaders to shape policies themselves.
To build a safer digital world for children, policymakers need to treat them as equal stakeholders and respect their needs and voices. Only then can policies be developed that truly act in their best interests.

Key Insights

1. Children and youth have a growing understanding of data. As they become more immersed in digital spaces, their awareness of data deepens.

In one of the pre-UNWDF exchange sessions that the DG4C programme held with over 30 young leaders from UNICEF's Generation Unlimited, attendees from across the world shared how their perception of data has changed over time. When asked what they thought about data when they were younger, many recalled that they didn't think much of it, seeing it as a math-related term or an administrative record. Their understanding expanded, however, as they experienced how pervasive data is in this rapidly developing digital age. They have come to realize that the meaning of data has broadened tremendously: it is not just numbers but, in their terms, a collection of records from various sources, from social media posts to healthcare archives, holding great power to shape their lives. This growing awareness indicates that, with the prevalence of datafication, children and youth today are becoming more data-literate. Given this, equipping them with the knowledge and skills they need to navigate an increasingly data-driven world should be within reach. Moreover, young people need such education and exposure to become more aware of the risks and opportunities associated with data, so that they can make more informed decisions.

2. Children and youth recognize the potential of data but also its risks. They sometimes struggle to understand how their data is collected and used and what they can do to protect their privacy and rights.
At the UNWDF, five young representatives from the Commitment to DG4C had the opportunity to interview senior global leaders from government statistical authorities, international organizations, and data-focused NGOs. The representatives were enthusiastic about data's "greater good" possibilities for society. They were eager, for example, to learn how children's data could be processed to enhance educational opportunities, and how they could develop their own data literacy to prepare for and contribute to a better digital future. While discussing their concerns, they repeatedly questioned the transparency of global data practices. They were most concerned about unauthorized access to and misuse of their data. Many stressed the urgent need for greater responsibility and accountability throughout the data lifecycle. However, the participants noted that most children and youth lack the knowledge to meaningfully mitigate these risks. During the pre-event discussion session, participants expressed frustration over the complexity of data policies and the ambiguous, overly technical language used by those handling their data. They also highlighted the need for structured data education, urging schools to integrate data literacy into their curricula, as many felt their parents lacked the expertise to support them.

3. Children and youth are aware of the disparities in data infrastructure and data governance across the world, particularly in underserved regions and marginalized communities, and they advocate for urgent attention and action.

In the youth-moderated cross-generational breakfast discussion at the UNWDF, 18 youth representatives from diverse backgrounds spoke out about the unequal distribution of data resources. Drawing on firsthand experience, representatives shared how some communities in the Global South still lack essential data infrastructure, preventing them from benefiting from data-driven opportunities.
A young leader also pointed out in the pre-event discussion that, because most major AI models are trained on mainstream languages like English, they are likely to be biased against other regions of the world. They urged senior policymakers to take real action to close this gap, ensuring that disadvantaged areas are not left behind in the digital era. Without serious reform and investment, the digital divide will only widen. Participants were also worried about the visibility of gender minorities and gender bias in data collection. At the UNWDF, a representative questioned whether existing data frameworks actually capture gender diversity, challenging the inclusivity of current data practices. Similarly, during the pre-event discussion, one attendee pointed out that gender bias in data is not merely an oversight but has real consequences, for example in resource allocation.

4. Children and youth call for meaningful inclusion in data-related decision-making. They want a real seat at the digital table, rather than being merely symbolic participants.

A recurring theme in the pre-event article campaign, an initiative soliciting youth voices on responsible data, was the desire for meaningful, institutionalized youth participation in data governance discussions. Nearly two-thirds of the submissions mentioned the importance of youth involvement in data policymaking. In one particularly compelling article, a 17-year-old argued for their rightful presence at the "digital table." The author pointed out that "data is our most precious asset when we are online, but often, we are powerless, victims to the companies and advertisers scraping our data and selling it to the highest bidder," calling for a more sincere attitude towards youth in digital policymaking. In a recent webinar organized by the UN Data Strategy, two young advocates reflected on their experience at the UNWDF.
As they reported, mere presence is not enough; young people must be meaningfully consulted and engaged in shaping the future of data governance, as it also affects them.

5. Children and youth are intrigued by AI and what it could mean for their lives, but they are also worried about how it might harm them. They have an ardent desire to learn about this rapidly developing technology.

Throughout the youth-led engagement sessions before, during, and after the UNWDF, AI emerged as one of the most frequently discussed topics. From improving conflict prediction in humanitarian rescues to advancing personalized education, young people see AI as a powerful tool that can transform many aspects of our lives. They are keen to learn how this technology actually works and how it can be better leveraged to benefit humankind. Children and youth are also worried about AI's negative impacts. One main concern is the creation and spread of misinformation, which could magnify the complexity of social problems. They also questioned fairness and bias in AI algorithms, together with the lack of transparency and invasions of privacy. Given the heterogeneous state of regulatory frameworks across the world and the rapid development of AI, discrepancies in governance and oversight pose pressing challenges that demand urgent solutions.

Policy Recommendations

Drawing on the insights from our recent youth-led engagement activities, the DG4C programme proposes the following five policy recommendations, some of which were included in the young representatives' declaration presented during the forum's closing ceremony, reaching an audience of 2,700 attendees, including global policymakers, data experts, and civil society leaders.

1. Reform and strengthen data governance frameworks to prioritize children's rights while continuously improving them to suit the evolving digital world.
A UNICEF manifesto has indicated that existing data governance systems are insufficient to protect and empower children. Despite more resources being poured into this field, it remains a challenge. To build data governance frameworks that put children's rights first, governments, research institutions, international organizations, and relevant companies should take proactive steps to:

- Invest in evidence-based research to document youth voices and needs in data governance, ensuring the governance system considers the needs and concerns of young people, particularly those from marginalized communities.
- Conduct comprehensive audits of existing data frameworks, based on these insights and other successful practices, to assess their fitness for purpose with regard to protecting and promoting children's rights, including but not limited to consent mechanisms, data collection practices, and security protocols.
- Implement child-specific safeguards, such as default high-privacy settings, strict age verification, and limits on data tracking, to prevent misuse and exploitation of children's data, fixing problems and filling gaps identified in the audits.
- Develop a comprehensive legal structure that aligns national laws with global standards like the UN Convention on the Rights of the Child, ensuring consistency in protecting children's rights.
- Keep policies up to date with, or even ahead of, the fast-changing digital landscape to ensure their effectiveness and efficiency, especially regarding newly emerging risks associated with AI, such as biometric data misuse and digital identity theft.

One useful resource is Responsible Data for Children (RD4C), a joint initiative by UNICEF's Chief Data Office and The GovLab at NYU that offers a principles-led framework, tailorable guidance, and practical tools to help organizations adopt inclusive, child-rights-centered, and responsible data practices throughout the data lifecycle.

2.
Promote international collaboration to bridge digital and data equity gaps.

Addressing the inequalities in digital access and data governance requires global coordination. International organizations, governments, and the private sector should work together to:

- Increase investment in digital infrastructure for underserved regions, ensuring equitable access to data-driven opportunities and closing the digital divide.
- Foster cross-border collaboration by sharing knowledge gained and lessons learned in practice, allowing regions in need to adapt effective policies to their unique local contexts, as well as by providing financial support.
- Make sure data collection reflects diversity, especially when it comes to gender minorities and marginalized communities, so that data-driven policies do not reinforce existing inequalities.

International organizations, such as the United Nations (especially the ITU), the World Bank, and regional networks like the African Union, can play a pivotal role in facilitating dialogue, mobilizing resources, and setting global standards for ethical and inclusive data governance fit for children. The Commitment to DG4C also contributes to this purpose, bringing together coalition members ranging from governments and academic institutions to NGOs and private companies.

3. Invest in capacity building and data literacy, with a particular focus on AI education.

Our children today are the very first generation of citizens to be datafied, with their personal information collected even before birth. As digital technologies advance, companies, governments, and other actors have an unprecedented ability to collect and analyze children's data. Meanwhile, AI systems are becoming more embedded in children's daily lives. Without strong data and AI literacy, young people will be vulnerable to many risks, including but not limited to misinformation, algorithmic bias, and privacy threats.
To address this and respond to young people's desire to learn more, governments and educational institutions can:

- Integrate data and AI literacy into school curricula, ensuring young people understand how their data is collected, stored, and used, how AI systems are developed and operate, and how they can better protect themselves while making the most of these technologies.
- Build virtual learning hubs beyond the traditional education system (especially in regions where in-person education is challenging) and develop accessible learning resources that break down complex data and AI concepts into age-appropriate content.
- Support children by supporting adults, equipping teachers (and even parents) with the necessary knowledge and tools to educate students and children about responsible data practices and their digital rights.

UNICEF Innocenti's foresight toolkit is a valuable resource for encouraging young people to imagine what responsible data governance would look like for them in an ideal digital future. The DG4C programme is also driving efforts towards this goal in collaboration with UN Global Pulse in Uganda. With all stakeholders working together, a data-literate next generation will not only be better equipped to advocate for their own rights but also better positioned to shape a more open data governance framework and build a more equitable digital future.

4. Ensure more responsible and authentic youth engagement.

Children and youth are typically portrayed as recipients of data protection, yet policymakers must remember that they also have the right, as well as the capability, to shape their own digital future. There must be efforts to balance data protection and self-determination, and efforts to engage young people actively in the policymaking process.
To ensure meaningful and diverse youth engagement, governments, international organizations, and the private sector should:

- Establish youth advisory councils within international and national data governance bodies to institutionalize youth participation in data governance. While some organizations, governments, and tech companies have taken action, efforts remain insufficient and should be scaled up.
- Require youth representation in key data policy discussions, ensuring their voices shape regulations related to their digital future. For the first time at a UN Data Forum, the perspectives of children and youth were heard. Such inclusion should happen more often and become the norm.
- Support youth-led research and advocacy, providing targeted and sustainable funding, mentorship, and platforms to amplify their impact.
- Foster intergenerational collaboration, taking advantage of both young people's fresh perspectives and first-hand experience of digital ecosystems and older generations' institutional knowledge and policy expertise.

Authentic engagement goes beyond symbolic inclusion. UNICEF's DG4C programme exemplifies this, centering its work on close collaboration with global youth networks such as Generation Unlimited, PSDD's Data Values Advocates, and local youth groups in countries like Colombia. It has also initiated cross-generational discussions between youth and decision makers. DG4C recognizes and emphasizes the need for young people to take an active role in decisions affecting their data rights, while providing platforms and support for youth-led engagement and initiatives, ensuring that children and young people can meaningfully contribute to shaping digital policies and governance.

5. Launch public awareness campaigns on responsible data and data rights.
From gaps in data literacy to digital inequality and the exclusion of youth from decision-making, many of the issues raised above by young people have been broadly acknowledged. Yet they often go unaddressed, frequently due to limited resources, institutional inertia, or a lack of awareness and political will. To drive real change, we need widespread public awareness campaigns that:

- Leverage mass media, academic research, symposiums, and other platforms to make children's digital rights a widely accepted norm and foster a culture where protecting them is a shared societal responsibility.
- Employ creative approaches, such as visual arts, short films, games, and other tools, to make complex data concepts more accessible and engaging.
- Empower children and the public to hold governments and other data governance bodies accountable, while ensuring businesses understand the consequences of non-compliance. Governments should initiate public engagement initiatives and encourage scrutiny of data practices related to children.
- Encourage youth-led awareness campaigns that bring all stakeholders together, for example through youth-led workshops, school programs, social media campaigns, and industry events. Governments, international organizations, and tech companies should work hand in hand to support these campaigns.

Not until awareness is raised at all levels of society will the digital ecosystem truly respect children's rights, foster responsible data behavior across the board, and promote an inclusive digital future.

Conclusion

The digital world continues to evolve at unprecedented speed, yet the need to protect children's rights remains constant. To ensure a fair digital future for the next generation, it is crucial to strengthen data governance, bridge the digital divide, institutionalize youth participation, invest in AI and data literacy, and raise data awareness. These are not just policy choices.
They are fundamental commitments to human rights, equity, and long-term sustainability. Every policymaker, industry leader, and civil society member was once a child, dreaming of a society that would serve their best interests. We must honour that hope and take decisive action now, as the decisions made today will shape the digital world of tomorrow.
Guest Blog
Opening the Black Box: Leveraging Youth Power to Decode AI in Humanitarian Crises

As young researchers deeply interested in the intersection of technology and social impact, we have been investigating the use of artificial intelligence (AI) in humanitarian and peacebuilding efforts. This exploration has revealed a complex landscape of significant potential and serious ethical concerns. At a time when AI is rapidly reshaping how crises are predicted, managed, and responded to, it is crucial for young voices to be heard on the responsible development and governance of AI, particularly in conflict-affected contexts.

Risks and Opportunities of AI in Humanitarian Settings

AI offers extraordinary opportunities to enhance humanitarian and peacebuilding efforts and accelerate the delivery of support to beneficiaries. For instance, machine learning (ML) algorithms can analyze vast amounts of data to predict potential conflict hotspots, facilitating more proactive peacekeeping interventions. AI-powered chatbots can provide mental health support to refugees, bridging critical gaps in care. Natural language processing (NLP) tools can break down language barriers in crisis communication, and AI-powered early warning systems can analyze online news articles and social media posts to predict the likelihood of violent events in a given area. However, these technologies also carry significant risks, especially when deployed in vulnerable communities. Our research has identified several key areas of concern:

Algorithmic Bias: AI models trained on non-representative data can perpetuate and amplify existing biases, leading to discriminatory outcomes in aid distribution or conflict analysis. A 2021 study found that widely used NLP models exhibited significant biases against certain dialects and linguistic variations, leading to higher false positive rates for hate speech in texts written by marginalized communities.
The study evaluated popular NLP models like BERT and RoBERTa on a dataset of Arabic tweets, finding that the models struggled to accurately classify hate speech in dialectal Arabic and often misclassified innocuous phrases as offensive.

Privacy and Consent: The collection of sensitive data for AI applications raises serious privacy concerns, especially in contexts where individuals may feel pressured to provide personal information to access vital services. The World Food Programme's implementation of the SCOPE system in Uganda's Bidi Bidi refugee settlement in 2018 highlights these issues. Many refugees reported feeling compelled to provide their biometric data to receive food aid, raising questions about forced consent among people living in insecure environments.

Lack of Transparency: Many AI systems operate as "black boxes," making it difficult for affected individuals to understand or contest decisions made about them. This opacity is particularly problematic in humanitarian contexts where decisions can have life-altering consequences. The Dutch government's use of an algorithmic risk assessment system (SyRI) to detect welfare fraud, which a Dutch court found in 2020 to violate human rights, is one example of how opaque AI systems in social services can harm intended beneficiaries.

Erosion of Human Agency: Over-reliance on AI in humanitarian contexts risks undermining participatory decision-making processes, sidelining the communities these efforts aim to support.

Empowering Youth Through AI Literacy

To navigate this complex landscape, it is crucial that young people become better informed about AI technologies and their implications. This goes beyond fostering basic digital skills to developing a deep understanding of how AI systems work and their limitations, including machine learning, neural networks, and deep learning.
Young people can participate in identifying potential biases in AI applications and learn how to mitigate them through diverse data collection and algorithmic fairness measures. AI literacy also involves an awareness of data rights and privacy implications, including concepts like data minimization, purpose limitation, and the right to explanation under regulations like the GDPR. Educational institutions and youth organizations should prioritize AI literacy programs that equip young people to understand and engage with AI systems. Participatory workshops in which young people analyze real-world AI systems used in humanitarian contexts would be particularly valuable: youth could examine the UNHCR's Project Jetson, which uses machine learning to forecast forced displacement, and discuss gaps in governance, the ethical implications of the project, and ways to strengthen protections for the beneficiaries it affects.

Youth-Led AI Governance: From Consultation to Co-Creation

Young people shouldn't just be subjects of AI governance; we should be active participants who help shape it. Organizations developing AI for humanitarian use should establish youth advisory boards with real decision-making power.
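The kind of bias identification described above can be made concrete with a small sketch. The snippet below is plain Python with entirely invented labels and predictions (it is not data from the 2021 study or any real model); it compares false positive rates for a hypothetical hate-speech classifier across two dialect groups, the disparity that study documented:

```python
# Toy fairness audit: compare false positive rates for a hypothetical
# hate-speech classifier across two dialect groups. All records below
# are invented for illustration; a real audit would use a model's
# actual predictions on a labeled evaluation set.

def false_positive_rate(records):
    """FPR = benign texts wrongly flagged as hate / all benign texts."""
    benign = [r for r in records if not r["is_hate"]]
    flagged = [r for r in benign if r["predicted_hate"]]
    return len(flagged) / len(benign)

# Each record: dialect group, ground-truth label, model prediction.
predictions = [
    {"dialect": "standard",  "is_hate": False, "predicted_hate": False},
    {"dialect": "standard",  "is_hate": False, "predicted_hate": False},
    {"dialect": "standard",  "is_hate": False, "predicted_hate": True},
    {"dialect": "standard",  "is_hate": True,  "predicted_hate": True},
    {"dialect": "dialectal", "is_hate": False, "predicted_hate": True},
    {"dialect": "dialectal", "is_hate": False, "predicted_hate": True},
    {"dialect": "dialectal", "is_hate": False, "predicted_hate": False},
    {"dialect": "dialectal", "is_hate": True,  "predicted_hate": True},
]

for group in ("standard", "dialectal"):
    subset = [r for r in predictions if r["dialect"] == group]
    # A markedly higher FPR for one group is a signal of bias.
    print(group, round(false_positive_rate(subset), 2))
```

The same per-group comparison underlies the metrics in audit toolkits such as IBM's AI Fairness 360; the point of the sketch is that the core check is simple enough for a youth-led workshop to run by hand.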
Beyond traditional policy bodies, youth can influence AI governance through:

- Grassroots campaigns raising awareness of AI ethics in humanitarian contexts, such as social media campaigns highlighting the potential risks of biometric data collection in refugee camps
- Developing youth-led ethical guidelines for AI in crisis response, drawing inspiration from existing frameworks like the IEEE's Ethically Aligned Design principles
- Participating in "algorithmic audits" to assess AI systems for potential bias or harm, using tools like IBM's AI Fairness 360 toolkit or Google's What-If Tool
- Creating youth-centric resources on responsible AI development, such as interactive online courses or podcasts discussing real-world case studies of AI ethics in humanitarian contexts
- Engaging with tech companies and NGOs on ethical AI design and governance, potentially through internship programs or youth innovation challenges focused on developing ethical AI solutions for humanitarian challenges

Young people have an inherently valuable perspective on AI and technology, having grown up in a digital world. We are inheriting a rapidly changing AI landscape, where this technology is being deployed in every field and its regulation is severely lacking. Governance must keep up with technological advancement, and we can help mitigate many of the unintended consequences of AI systems that older policymakers and technologists are less equipped to confront.

Ethical AI: Harnessing Youth Innovation

Young people can also advocate for transparency and education around AI systems used in crisis response by demanding clear documentation of training data sources and decision-making processes. On the development side, young people can create AI applications that address humanitarian challenges while prioritizing ethics, such as privacy-preserving federated learning models for health data analysis in refugee camps.
Participating in "AI for good" hackathons focused on ethical challenges in peacebuilding can promote AI literacy and youth participation in the ethical development of AI, and can result in tools such as AI-powered conflict early warning systems that respect privacy and avoid stigmatization. Young people can also collaborate with researchers to study AI's impact on vulnerable communities through participatory action research projects that involve affected populations and are informed by their experiences.

The Road Ahead: Building an Ethical AI Future

As we navigate the complexities of AI in humanitarian contexts, amplifying youth voices is essential. Young people have a vital stake in how these technologies are used in global crisis response and peacebuilding efforts, and we bring digital fluency, diverse perspectives, and a dedication to ethical development to these critical discussions. By engaging youth as partners in ethical AI governance, we can harness the potential of these powerful technologies while safeguarding human rights, promoting fairness, and upholding the dignity of vulnerable populations. To build a truly ethical AI future, we need sustained collaboration between youth, policymakers, humanitarian organizations, tech companies, and researchers. This means creating ongoing channels for youth input, investing in AI literacy programs, and empowering young innovators, in both fragile contexts and developed nations, to cultivate responsible AI solutions for humanitarian challenges.

About the Authors

Marine Ragnet and Aiza Shahid Qureshi are researchers at NYU, specializing in the examination of AI's impact in fragile contexts. Their research develops a multilevel framework to analyze AI-related risks and opportunities in humanitarian efforts and peacebuilding.
Drawing on their interdisciplinary backgrounds, they aim to shape ethical guidelines and governance mechanisms for responsible AI deployment in challenging environments, bridging the gap between technological innovation and humanitarian needs. (Photo by Salah Darwish on Unsplash)
Guest Blog
How Youth Can Reclaim Control Over Their Data

In today's digital age, data has become the currency of the modern world, shaping everything from social media algorithms to targeted advertising. Every time young people log into their favorite apps or post on social platforms, they leave behind valuable data that is harvested, analyzed, and often sold. Yet, according to a 2022 survey by the Pew Research Center, 79% of teenagers in the U.S. said they were concerned about how their personal data was being used but felt powerless to control it. This disconnect between data use and understanding creates a form of data hegemony, in which large corporations wield control over users' information while users, particularly youth, remain largely uninformed about the fate of their data. The issue is even more pronounced in developing countries, where young people often have less access to digital literacy education and privacy protections. According to UNICEF's 2021 report, children and youth in low- and middle-income countries are particularly vulnerable to data misuse due to weaker data protection laws and limited regulatory oversight. For instance, in sub-Saharan Africa, only 28% of countries have implemented comprehensive data protection policies, compared to 80% in Europe. As mobile phone and internet usage rapidly expands in these regions, young people are increasingly exposed to privacy risks without the tools needed to understand or control their data. This article discusses how young people, especially in vulnerable regions, can become better informed about their data usage and break free from passive participation in the digital world.

Understanding the Dynamics of Data Usage

Data usage is a complex, often invisible process that occurs every time we interact with digital platforms.
When young people sign up for a new app or service, they aren't just providing an email or username—they are potentially handing over much more, including their location, browsing habits, and interactions on the platform. This data is collected by companies to build detailed profiles that go beyond what users knowingly share. For instance, social media platforms like Instagram or TikTok don't just record posts and likes; they also track how long users linger on certain posts, which ads catch their attention, and even the time of day they're most active. This is known as behavioral tracking, and it fuels targeted advertising and content recommendations. A famous real-world example of data misuse is the Facebook–Cambridge Analytica scandal, in which the personal data of millions of Facebook users was harvested without their knowledge or consent. This data was used for political advertising, influencing voters in several countries. Many users were unaware that simply engaging with a quiz or app on Facebook could give third parties access to their personal information. While the Cambridge Analytica scandal occurred in a developed context, similar issues arise globally. For instance, in developing countries like Kenya, where internet access is rapidly growing, a 2021 Mozilla Foundation study found that many popular apps collect personal data without adequate user consent, further highlighting the risks for young people in regions with weaker privacy regulations. As artificial intelligence (AI) increasingly shapes digital experiences, it's important for young people to understand that AI models are trained on vast amounts of data, including their own. Without proper oversight, this data can be used in ways that reinforce biases or invade privacy. Youth must be aware of how their data contributes to AI systems and advocate for ethical AI use and for transparency in how such systems are trained.
For young people across the world, understanding these dynamics of data collection and usage is therefore vital to safeguarding their privacy and digital rights.

Challenges to Informed Data Usage

One of the most significant barriers to informed data usage is the complexity of privacy policies and terms of service agreements. For instance, Instagram's privacy policy states, "We collect and use your content, communications, and other information you provide when you use our Services," which includes everything from the photos you post to metadata, such as time stamps and geolocation. This dense language, often found in terms of service agreements, makes it difficult for the average young user to fully comprehend what they are agreeing to. A 2021 Common Sense Media report revealed that only 9% of teens feel confident they understand the privacy policies they encounter, leaving them unaware of how much data they're sharing. This problem is exacerbated by the power imbalance between tech companies and young users. Large tech companies like Google, Facebook, and TikTok employ sophisticated algorithms capable of processing massive amounts of data. They collect, store, and analyze personal information, often selling it to third parties for targeted advertising. Most young people are unaware of this extensive data collection, believing they are merely engaging with a fun app or social media platform. The psychological impact of this lack of understanding is profound. Studies show that young users, when confronted with the reality of how much data is collected about them, feel overwhelmed and powerless. A 2021 UNICEF survey found that 73% of teens across multiple countries, including Brazil and South Africa, felt concerned or anxious when they realized the extent of data being collected without their consent.

Empowering Youth Through Data Literacy

Educational programs are essential to empower young people with knowledge about data usage.
One solution is forming partnerships between tech companies and educational institutions to teach students about data rights using interactive, scenario-based learning. For example, companies like Google or Facebook could collaborate with schools to organize workshops where students make real-world privacy decisions. A 2020 MediaSmarts study found that students using privacy simulations were 40% more likely to understand the implications of data sharing than those who only read about it. Simplifying privacy policies is also crucial. Many are filled with legal jargon that's hard for young users to understand. For instance, the phrase "We may collect and process personal information such as geolocation" could be simplified to "We use your location to show relevant ads." In the United Kingdom, the Information Commissioner's Office (ICO) has introduced child-friendly privacy policies using simpler language. Furthermore, gamifying data literacy could make learning fun and engaging. Apps could offer rewards for securing profiles and adjusting privacy settings, similar to how Duolingo motivates users through gamification. This would encourage young people to actively manage their data while making privacy education more engaging.

The Role of Young People in Shaping Data Policies

Youth have a significant role in shaping data policies and advocating for responsible data usage. Across the globe, youth-led initiatives are gaining momentum, demonstrating how young people can actively participate in discussions around data governance. For example, in the U.S., the Future of Privacy Forum engages students to influence policy debates around digital privacy. Similarly, the Youth IGF (Internet Governance Forum), a global initiative, provides platforms for young people to contribute to discussions on internet governance, data privacy, and security. In developing economies, young leaders are also making their mark.
In Kenya, the Ajira Digital Program has empowered youth by equipping them with digital skills and advocacy tools to participate in data policy discussions. These initiatives showcase how Generation Z brings fresh perspectives to the challenges of responsible data usage and security, promoting data transparency and accountability. To engage further in policy-making, young people can join public consultations, participate in online advocacy groups, and collaborate with organisations dedicated to data rights. Platforms like the European Youth Parliament provide avenues for youth to voice concerns and shape policies that ensure data transparency and the protection of their digital rights.

Conclusion

Informed data usage is essential in today's digital world. Young people are major contributors to the vast amounts of personal data being collected. By understanding how data is gathered and used, youth can break free from the passive role they often play in digital ecosystems. Education is the key: schools and tech companies must collaborate to teach data literacy through interactive learning, simplified privacy policies, and engaging tools like gamification. Youth also have a vital role to play in shaping the future of data governance by participating in advocacy efforts, influencing policies, and collaborating with policymakers. This helps promote data transparency and responsible usage. By becoming informed and actively engaging, young people can help create a digital world where their rights and privacy are fully protected.

About the Author

Tuhinsubhra Giri is an Assistant Professor at the Centre for Studies in Population and Development, Department of Economics, Christ University (India). He is passionate about data governance, MSME development, and youth empowerment, consistently merging theoretical insights with practical applications to drive strategic policy recommendations.
Si Peng works as a Program Manager at the Institute for Sustainable Development Goals, Tsinghua University. Her areas of interest include advancing the Sustainable Development Goals (SDGs), digitalization, and designing training methodologies for government institutions.

References

Ajira Digital Program, "Youth and Digital Skills in Kenya," 2021.
Common Sense Media, "Privacy Risks for Teens," 2021.
Future of Privacy Forum, "Youth Privacy Advocates Program," 2022.
MediaSmarts, "Interactive Privacy Education," 2020.
Mozilla Foundation, "Privacy Concerns in the Global South," 2021.
Pew Research Center, "Teens' Concerns About Privacy in a Digital Age," 2022.
The Guardian, "The Cambridge Analytica Scandal," 2018.
UNICEF, "Children's Data Protection in the Digital World," 2021.

(Photo by Kateryna Hliznitsova on Unsplash)
Data Governance for Children
From Local Voice to Global Digital Future: We Young People Must Go Through That Door

While I watched the thick clouds in the sky, thinking about the brilliant minds in the world of data that I was about to meet, a sense of weightlessness washed over me. It was November of last year. I had been proudly selected as a young representative to attend the UN World Data Forum on behalf of the Commitment to Data Governance Fit for Children, an initiative spearheaded by UNICEF's Chief Data Office. I met my fellow representatives in person for the first time during our layover at Bogotá Airport en route to Medellín, Colombia. I was amazed at the variety of our group, with members ranging in age from 9 to 24 years old. With great excitement, we greeted each other and immediately bonded over our shared mission – children, adolescents, and youth coming together to work towards a data landscape that is truly safe for us. (Youth representatives from the Alberto Merani Institute presenting their ArcGIS-based project on educational equity in Colombia at the World Data Forum.)

A first taste of the UN World Data Forum: fun and fast

It was really exciting to be able to connect with peers from around the world, learn about their lives, and share our ambitions. We connected instantly, but I also realized that our conversations couldn't be too technical. With such a wide range of backgrounds, we needed to communicate in a way that was more accessible and practical for everyone. I was soon impressed by my fellows' knowledge and skills in the field of technology and data governance. A standout example was a group of students from the Alberto Merani Institute, who developed an award-winning project analyzing educational coverage in Colombia using geographic data and generously shared their findings with us. This demonstrated that, given the right platform, youth-led initiatives have huge potential to contribute to real-world solutions.
It was just past noon, but I already felt that the pace of this forum was exhilarating. I had just attended COP16 Colombia, an international conference on biological diversity. My body was exhausted after 15 intense days of youth engagement and advocacy, but I didn't want to give up. If we, the young people, manage to open the door, we must go through it. With this determination, I participated in the first gathering convened by UNICEF along with my fellows. We quickly delved into the history of the World Data Forum and understood why the involvement of children and youth was so important in this space. We then engaged in a series of discussions that prepared us for the upcoming formal sessions. We aligned our schedules and reviewed our presentations and interview questions for senior policymakers one last time. Looking at the young faces around me, I knew I was ready. (Interviewing Dr. Ola Awad Shakhshir, Director of PCBS, on data and the lives of children in Palestine.)

Dialogue with global policymakers: mixed feelings

The most important experience of my UN World Data Forum journey began the next day. I was fortunate to meet Ola Awad Shakhshir, the Director of the Palestinian Central Bureau of Statistics (PCBS), and pose questions on behalf of young people, in a series of interviews organized by UNICEF. Prior to my turn, I observed my peers engaging with senior policymakers like the OECD Chief Statistician. Their discussions were truly eye-opening for me, demonstrating how complex data concepts can be translated into accessible and simple forms for non-experts, such as children. And finally, it was my turn. I felt a mix of nerves and excitement. Conducting my first interview in English added an extra layer of challenge. But I was determined to speak up for my people. I started in English, asking how data is being used to protect children in Palestine, where such initiatives are critically needed. Without hesitation, Dr.
Ola Awad answered me in a soft but firm voice. I had the feeling that she knew the programs on the ground like the back of her hand. Struggling to follow some of her references, I switched to Spanish—my mother tongue—and asked the interpreter for help. Dr. Ola Awad noticed my situation and slowed down, which largely soothed my nerves and encouraged me to resume the interview in English. My final question was about how the Palestinian government uses child statistics to expand educational and health opportunities. It was great to learn that data is genuinely making a difference in a place where many young people like me need support. And I believe some of these practices can be scaled and applied in other policy contexts, for example, in Colombia. (Young people worldwide gathering together to shape the declaration.)

Youth Manifesto: unity is strength

The following day, we experienced perhaps one of the most important moments of the entire forum. Youth representatives, policymakers, and data experts from various countries came together to explore ways to foster unity and cooperation in the field of data. After a long day of engagement, my fellow representatives and I finally had the opportunity to gather in the same space, sit down, and reflect, as we prepared a declaration that captured some of our most urgent concerns. The debate quickly became heated, as we each tended to have different agendas. Meanwhile, due to logistical issues, we were asked to move to a new space, adding to the initial chaos. However, we quickly reconnected and found common ground. People with similar concerns were then divided into smaller groups for more focused discussions. At one point, we lost access to the interpretation service provided by UNICEF's Colombia Office, but we still managed to communicate all the complex ideas. In just two hours, we agreed on some key points for this declaration.
The drafted document obviously didn't capture all local perspectives, but it was already a great achievement considering the short span of time. Seeing this dynamic at that moment, I felt so proud of how children and youth were cooperating. Coincidentally, right in the middle of this whirlwind, my university teacher called me, demanding that I take a virtual exam and answer the questions via phone call. The situation put a lot of pressure on me, but it solidified my belief that young people must learn to juggle multiple responsibilities while actively claiming their voices. (Meeting with other young leaders from Colombia.)

A vision for the future: giving youth a place in the world of data

At the UN World Data Forum, more than 18 child and youth leaders co-created a joint declaration and set an example for deliberative participation, in the hope of promoting a more representative and inclusive decision-making space globally. This very declaration was officially released at a high-level meeting on data governance with global delegates the following day. It was an unforgettable moment that undoubtedly planted the seeds of youth influence. A few months later, invited by the UN Secretary-General's Data Strategy Office to attend a webinar, I had the opportunity to reconvene with peers and UNICEF colleagues involved in the forum. I was glad to learn that, even as time went by, interest in young people's perspectives on data governance never faded. We talked about what we had accomplished at the forum and what we could do better next time. While our declaration hasn't yet sparked major policy changes, it undeniably laid the groundwork for future youth influence. As a young advocate, I always seek to widen the door for youth participation. I seize every opportunity I have to amplify the concerns of my generation, ensuring that our voices can be heard, respected, and taken into account.
After all, the future of data belongs not only to governments and big companies but also to every young innovator. We young people must go through that door, and we are all the more powerful when we do it together. (Note: The lead photo was taken after the closed-door session of the High-level Group for Partnership, Coordination and Capacity-Building for Statistics for the 2030 Agenda for Sustainable Development, with global leaders including Stefan Schweinfest (second from left), Director of the UN Statistics Division, and Dr. Fahd Al-Dosari (far right), President of Saudi Arabia's General Authority for Statistics.)
Check out other articles in our
Blog Section
About us
The RD4C initiative is a joint endeavor between UNICEF and The GovLab at New York University to highlight and support best practice in our work; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management.
The work is intended to address practical considerations across the data lifecycle, including both routine and one-off data collection, and complements work on related topics being addressed by the development community, such as guidance on specific data systems and technologies, technical standardization, and digital engagement strategies.
Additional tools and materials are coming soon and will be posted on this website as they become available. Join the conversation to receive regular updates.