The RD4C Principles
Principles to guide responsible data handling toward saving children’s lives, defending their rights, and helping them fulfill their potential from early childhood through adolescence.
Read about our principles on the Principles Page.
From our blog
New developments from RD4C.
Youth Voices
Youth Leading the Way: Innovating Data Governance in Humanitarian Aid

Note: The views and ideas shared in this article come directly from the inspiring young leaders who contributed to the campaign as part of the Responsible Data for Children (RD4C) initiative. These voices reflect the unique perspectives of youth from diverse backgrounds and regions. While RD4C provided light editorial support to enhance readability, the content remains entirely theirs—authored by young people for young people. Their insights are independent and do not necessarily represent official positions or endorsements by UNICEF or The GovLab.

As humanity strives to harness the power of data, the world continues to witness innocent lives affected by unprecedented humanitarian crises. While data responsibility and data governance can improve corporate performance and mitigate the risk of data breaches in the business world, they are even more crucial in fostering trust between those in need and humanitarian aid agencies. They enable the humanitarian community to respond more effectively and efficiently in complex contexts. In this evolving landscape, young people are at the forefront, bringing innovative perspectives to data governance and driving creative solutions that address the unique challenges of humanitarian efforts.

The Double-Edged Sword of Data in Humanitarian Aid

Potential risks

Data in humanitarian work is usually highly sensitive and can be viewed as strategically important in conflicts and disasters. If leaked, such data could identify vulnerable communities, potentially leading to risks like human trafficking or political persecution and violating the "Do No Harm" humanitarian principle. Inadequate data governance in humanitarian efforts could result in inconsistent information sharing across a cluster system, limiting organizations' ability to respond swiftly to disasters.
For example, the Food and Agriculture Organization (FAO) noted in the 170th Session of its Council that it needed to integrate and improve the internal coordination of FAO data and statistics to better fulfill its mandate. In China, FAO collaborates closely with national statistical offices to gather data on food, nutrition, and agriculture to optimize its work.

Opportunities

Emerging AI tools can help reduce data management costs and surface best practices in data management. Machine learning, natural language processing, and other techniques currently available can scan datasets for errors or inconsistencies and rectify them promptly, reducing the human labor involved in data management. Organizations and individuals can take advantage of data visualization tools to analyze the trends and frequencies of natural disasters or the movements of internally displaced people. These advanced analytics enable more accurate targeting of humanitarian aid missions and the design of tailored approaches using geospatial data to assist, for example, indigenous groups affected by rising sea levels in the South Pacific region.

Ensuring Transparency in Data Sharing

A universal database, combining elements of blockchain technology and ReliefWeb, the most popular humanitarian information service platform, should be established to better inform and monitor data usage. This system would automatically attach a dedicated tag recording the user's name and the data owner each time data is collected or used. Such a database should operate under the supervision of a multilateral agency like the UN. This approach would enhance transparency about data usage, encourage more open data sharing, and indirectly deter cybercrime. Operating on a voluntary-sharing basis, the database would aggregate data from UN agencies, NGOs, and national statistical offices.
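To make the tagging idea concrete, the proposed mechanism could be approximated with a hash-chained, append-only log of usage events. The sketch below is illustrative only and is not part of the author's proposal or any existing system; all field and class names are hypothetical. It borrows the tamper-evidence property of a blockchain (each entry's hash covers the previous entry's hash) without the distributed-ledger machinery.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceLog:
    """Append-only log of data-usage events, hash-chained like a blockchain."""
    entries: list = field(default_factory=list)

    def record(self, dataset_id: str, user: str, owner: str, action: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "dataset_id": dataset_id,
            "user": user,        # who accessed the data
            "owner": owner,      # whose data it is
            "action": action,    # e.g. "collected", "read", "shared"
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        # Hashing the entry together with the previous hash links the chain:
        # tampering with any earlier entry invalidates every later hash.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every hash to detect tampering."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A supervising agency could run `verify()` over the aggregated log to confirm that no usage record has been silently altered or deleted.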
However, challenges such as the inherently conflicting interests in international relations could impede the development of such a database. Leaders must therefore demonstrate political will and take collective action, as informed data usage is crucial for the 2030 Agenda. On a personal note, while living in England, I received numerous automated phone calls claiming to be from an immigration bureau, warning of visa issues and requesting personal information. This experience highlighted the vulnerability of individuals, especially those unfamiliar with their host country, to potential data theft schemes.

Youth as Catalysts for Inclusive Data Governance

"The tour guide"

Young people can explain data policies to less data-literate individuals at the community level, raising awareness. In China, the "College Graduates Serving as Village Officials" initiative has been operating for many years. Under this scheme, college graduates are encouraged by governments to work at the grassroots level, contributing to local poverty alleviation efforts. These dedicated young people could organize seminars or workshops with local communities on data policies, ensuring that people are not left behind in the data era.

"The third eye"

In remote areas with relatively poor IT infrastructure, young people with IT backgrounds can join the national civil service, becoming a "third eye" that helps address inconsistent data records caused by ill-informed officials or community mistrust. By engaging with communities, they can uncover previously unrecorded or unnoticed data, improving the efficacy of data policies.

Empowering Ethical AI Through Youth-Led Data Governance

As early adopters of new technologies, young people are often among the first to interact with AI algorithms in their earliest forms.
This unique position allows them to identify flaws or bugs that could jeopardize data governance processes; stay alert to ethical breaches in AI usage, such as customer profiling on e-commerce platforms; report problems to supervisors or relevant authorities, ensuring AI is used responsibly; and hold organizations accountable for ethical AI implementation in both the private and public sectors. By leveraging their technological savvy and ethical awareness, young people can play a crucial role in shaping responsible AI usage and robust data governance practices. They can actively identify bias and discrimination in AI systems to safeguard marginalised and vulnerable groups from further inequalities. As frequent users of social media platforms, young people have access to various channels to expose ethical risks that could threaten human rights, enabling public awareness and necessary adjustments.

Responsible data governance is essential for ethical and effective humanitarian efforts. By balancing the risks and opportunities of data use, embracing transparency, and empowering young people to lead change, we can build trust and ensure data helps protect those who need it most.

(Photo by Salah Darwish / Unsplash is licensed under CC0)

About the Author

Jin Xiaotong is a young advocate of transformative policies in humanitarian aid. One of her notable contributions is her work on refugee integration and basic education for children and teenagers in countries of destination along the Mediterranean migratory routes. Trained as a conference interpreter at the University of Bath, her communication expertise has enabled her to amplify her voice on global platforms with UN Offices in Nairobi, Geneva, and Vienna. She currently serves in the Communication and Advocacy Office at the East Asia Delegation of the International Federation of Red Cross and Red Crescent Societies (IFRC), promoting innovative policies for the humanitarian landscape in the Asia Pacific region.
Articles published in this series:
Juventudes en gobiernos hiperconectados
On the Digital Table, Youth Need a Voice, Not Just a Seat
Who's Really Watching? The Hidden Data Risks of Children's "Phone Watches"
AI时代的孩子:如何用"道德数据"守护他们的未来?
Youth Voices
Children in the AI Era: How Can "Ethical Data" Safeguard Their Future? (AI时代的孩子:如何用"道德数据"守护他们的未来?)

As we enter the age of artificial intelligence (AI), data has become one of the most valuable assets of our time. Governments, companies, and institutions are increasingly turning their attention to AI-related data governance. AI systems rely on vast amounts of data to learn, adapt, and improve; in this context, data governance refers to the frameworks and practices that ensure data is managed responsibly, ethically, and in line with the rights and interests of all stakeholders. Children are among the most vulnerable groups in our society, and their unique needs must be at the heart of any responsible data governance strategy.

In this article, I discuss the concept of "ethical data governance," which seeks a balance between protecting data and using it effectively, with particular attention to the rights of children as data creators, their vulnerabilities, and the long-term consequences of misuse of their data.

AI technologies have the potential to fundamentally transform education, healthcare, and other fields that directly affect children's lives. However, the data driving these technologies carries significant risks and challenges. Children's data, including their behavioral patterns, learning habits, and health records, is often used extensively as input to AI systems that may deeply influence decisions about their lives and futures. Children are also especially vulnerable to data misuse because they lack the agency to fully understand data collection, consent processes, and the potential consequences and risks. While adults may be able to protect their own data privacy, children cannot always make fully informed decisions about how their data is used, who may access it, or how long it will be retained. This means we need strict regulation of AI and data use, grounded in an understanding of children's developmental stages, cognitive limitations, and long-term interests.

In 2019, Google, YouTube's parent company, was fined USD 170 million by the US Federal Trade Commission (FTC) for violating the Children's Online Privacy Protection Act (COPPA). YouTube was accused of collecting data on children under 13 without parental consent over an extended period, using it to track children's online behavior and serve them targeted advertising. In 2014, Apple was accused of allowing children to make in-app purchases of virtual goods without parental authorization; the company ultimately settled with the FTC by agreeing to pay at least USD 32.5 million in full refunds to consumers. One consumer complained that her daughter had spent nearly USD 2,600 in a game called "Tap Pet Hotel." This is not an isolated case: because children are cognitively immature, they are easy targets for commercial exploitation.

The challenge in designing ethical data governance for AI lies in finding the balance between data protection and data use. On one hand, protecting privacy, especially children's privacy, is paramount. On the other hand, the potential benefits of data for improving children's services and unlocking their potential cannot be ignored.

An ideal balance must remain sensitive to this tension and produce policies that are both protective and productive. With the concept of ethical data governance in mind, I make the following calls to practitioners:

First, manage data collection. We should apply "Reduce, Reuse, Recycle" to data: minimize collection in the first place; reuse collected data where possible, sharing it under supervision so that different institutions do not repeatedly collect the same data from users; and when data reaches the end of its retention period, notify users and store it again only with renewed consent.

Second, ensure informed consent. Given children's limited capacity to understand the complexities of data use, obtaining their informed consent is not always practical or "ethical"; we usually need the consent of parents or guardians. Data governance frameworks should also aim to build systems that, as children grow, give them knowledge and training about their data rights, gradually cultivating their digital literacy.

Third, strengthen data security. As AI systems proliferate, the risks of data breaches, misuse, and unauthorized access keep growing. Children's data in particular demands stricter security measures; encryption, access controls, and regular audits form the pillars of ethical data governance.
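The "Reduce, Reuse, Recycle" retention rule described above can be sketched as a simple expiry check that flags records whose storage period has run out, so that users can be notified and asked to consent again. The category names and retention periods below are hypothetical illustrations, not drawn from any law or real product.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: maximum storage period per data category, in days.
RETENTION_DAYS = {"location": 30, "health": 90, "contact": 365}

def records_needing_renewal(records, now=None):
    """Return records whose retention period has expired, so the data subject
    can be notified and asked for renewed consent before re-storage."""
    now = now or datetime.now(timezone.utc)
    expired = []
    for rec in records:
        limit = timedelta(days=RETENTION_DAYS.get(rec["category"], 0))
        if now - rec["stored_at"] > limit:
            expired.append(rec)
    return expired
```

Unknown categories default to a zero-day limit, so anything collected outside the declared policy is immediately flagged rather than silently retained.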
Fourth, consider long-term impacts. We must recognize clearly that any decision made today about how children's data is used may have profound long-term consequences. For example, data misuse can lead to unwanted profiling, erosion of trust in systems, and psychological harm, endangering children's mental health and affecting their future educational and career opportunities. Ethical data governance must therefore focus not only on immediate protection but also on minimizing potential long-term harms.

Fifth, improve accountability and transparency. AI systems that handle children's data must operate with a high degree of transparency. Developers, deploying organizations, and all other parties involved must openly declare what data is being collected, how it is processed, and for what purposes. At the same time, every party that profits from data must ensure that responsibility is assigned at each step, with laws in place to regulate their conduct. Such accountability and transparency ensure that the rights of children and their caregivers are respected and that they are able to challenge improper data use.

Data governance in the AI era is full of challenges, but we must rise to them, especially where protecting children is concerned. AI can unlock new opportunities and improve lives, but not at the expense of children's rights, safety, or long-term development. The goal of ethical data governance should be to foster innovation while protecting the privacy, dignity, and future of our most vulnerable.

To achieve this, governments, companies, and institutions must work together to build robust, child-centered data governance frameworks. These frameworks should prioritize the best interests of the child and ensure that AI technologies are developed and deployed in ways that are fair, transparent, and protective of children's unique vulnerabilities. Only by striking the right balance between data protection and data use can we create a digital future for every child that respects their privacy, empowers them, and supports their growth.

(Photo by Igor Omilaev / Unsplash is licensed under CC0)

About the Author

Jiang Jian'ou is a doctoral student in the Department of Engineering Science at the University of Oxford, focusing on computational fluid dynamics combined with machine learning, with a strong interest in data governance and children's rights. He holds a bachelor's degree in geophysics from Imperial College London and a master's degree in scientific computing from the University of Cambridge, and currently serves as vice president of the Oxford Chinese Students and Scholars Association.
Youth Voices
Who's Really Watching? The Hidden Data Risks of Children's "Phone Watches"

With the advancement of technology and the rise in popularity of smart devices, children's phone watches have become a common feature of family life in China. These devices offer parents convenience by enabling real-time tracking of their children's location, communication, and even payment activities. While these watches provide significant benefits, such as enhanced security and ease of communication, they also raise important concerns regarding the collection, storage, and use of children's data. This piece explores the governance of children's data in relation to these watches, assesses the risks and opportunities involved in data utilization, and examines relevant laws and cases, particularly from the perspective of informed consent and youth participation.

Status and Challenges in Children's Data Governance

Children's phone watches, specialized smartwatches used by parents to monitor and stay in contact with their children, collect a variety of sensitive data, including location, communication history, payment information, and even biometric data like voiceprints. Managing such data, particularly for minors, requires stringent ethical and legal oversight.
However, there are notable deficiencies in the current market practices and regulatory framework for protecting children's data:

Insufficient Informed Consent: Many parents are unaware of the full scope of data collected by children's phone watches or how this data is used. Despite the provisions of China's Personal Information Protection Law (PIPL), which mandates informed consent for data collection, parental consent for children's data is often reduced to a simplified "one-click agreement." Parents may not receive adequate information about the types of data being collected, its uses, or the potential risks.

Risk of Data Misuse: Children's phone watches not only gather real-time location information but also connect to other smart devices, generating vast amounts of personal data. If this data is not properly managed, it can be misused by third parties. For instance, some applications may collect and analyze children's behavioral data for marketing or even data trading purposes, leading to significant privacy violations.

Challenges in Supervision: The regulation of children's smart devices lags behind their rapid adoption. Although laws like the Cybersecurity Law and the Law on the Protection of Minors exist, there are no specific regulations addressing the governance of data generated by children's phone watches. In the absence of dedicated regulations, industry self-regulation is also lacking, which allows smaller or less scrupulous manufacturers to cut corners on security, leading to potential data breaches and misuse.

Opportunities and Potential Value of Data Utilization

Despite the risks, the appropriate use of data generated by children's phone watches presents several opportunities that benefit both families and society at large:

Enhancing Child Safety: One of the key features of children's phone watches is real-time location tracking and communication, which help parents monitor their children's movements.
In urban areas with busy traffic, this technology can help prevent accidents or child disappearances, providing families with a much-needed sense of security.

Monitoring Education and Development: Children's phone watches can provide insights into daily activities and social interactions, allowing parents to better supervise and guide their children's development. Schools and educational institutions can also use this data to develop personalized teaching plans, thereby improving the allocation of educational resources.

Smart Health Management: Some high-end children's watches are equipped with health monitoring features such as heart rate tracking and activity data analysis. These features enable parents to monitor their child's health, detect potential issues early, and take preventative measures to ensure their well-being.

Informed Consent and Data Protection

Informed consent is a cornerstone of data governance, especially when it comes to children's data. Both the Personal Information Protection Law (PIPL) and the Law on the Protection of Minors require explicit consent from guardians for the processing of data of children under the age of 14. However, in practice, informed consent often becomes a formality, with many parents unaware of the scope and risks of the data collected by the device.

Improving the Informed Consent Process: Manufacturers of children's phone watches should ensure that parents are given clear and detailed information about the types of data collected, its intended use, and the potential risks. This can be achieved through pop-up notifications, text descriptions, or video tutorials. Additionally, parents should have the flexibility to control and limit the data collected, rather than merely agreeing to all terms upfront through a single consent process.

Enhancing Parents' Digital Literacy: To ensure that parents make informed decisions about the data use of their children's phone watches, efforts should be made to improve their digital literacy.
Educational institutions and community organizations can offer training programs or workshops that inform parents about data security, privacy protection, and the long-term implications of data misuse. By equipping parents with the knowledge to navigate the digital world, they will be better prepared to safeguard their children's data.

Coordinated Regulation by Government and Industry: The government should continue to refine and enforce laws that address data collection and usage in children's smart devices. At the same time, industry associations should develop self-regulatory frameworks that encourage companies to prioritize security and adhere to privacy protection standards. Recent advancements in China's data protection laws, such as the Cybersecurity Law (2017) and the Data Security Law (2021), have laid a solid foundation for personal data protection. However, more specific regulations focused on children's devices are needed to address the unique challenges posed by such products.

Youth Participation in Data Policies

Recognizing the importance of youth perspectives in shaping data governance, it is crucial to involve young people in the policy-making process:

Youth Advisory Boards: Establish youth advisory boards that can provide input on data policies related to children's smart devices. These boards can offer unique insights into how young people use technology and their concerns about data privacy.

Educational Initiatives: Develop programs that educate young people about data responsibility and governance, empowering them to make informed decisions about their own data and advocate for their rights.

Feedback Mechanisms: Create channels for young users to provide feedback on their experiences with smart devices, helping to identify potential issues and improvements in data governance practices.
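The per-category consent controls recommended earlier (letting parents grant or limit individual data types rather than accept a single blanket agreement) can be sketched as follows. This is a minimal illustration under assumed names; the data categories and the `collect` helper are hypothetical, not taken from any real device or the PIPL.

```python
from dataclasses import dataclass, field

# Hypothetical data categories a child's phone watch might collect.
CATEGORIES = ("location", "call_history", "payments", "voiceprint", "heart_rate")

@dataclass
class ParentalConsent:
    """Per-category consent record, instead of a one-click blanket agreement."""
    granted: dict = field(default_factory=dict)

    def grant(self, category: str):
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self.granted[category] = True

    def revoke(self, category: str):
        self.granted[category] = False

    def allows(self, category: str) -> bool:
        # Deny by default: nothing is collected unless explicitly granted.
        return self.granted.get(category, False)

def collect(consent: ParentalConsent, category: str, value):
    """Store a reading only if the guardian has opted in to that category."""
    if not consent.allows(category):
        return None  # dropped; a real device should also log the refusal
    return {"category": category, "value": value}
```

The deny-by-default `allows` check is the key design choice: a category the parent never saw, or later revoked, is never collected, which is the opposite of the "one-click agreement" pattern criticized above.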
Conclusion and Recommendations

Children's phone watches provide substantial convenience and security for parents, but the data governance issues they raise cannot be ignored. Children's privacy and data security are vital to their overall well-being and future development. Data governance must therefore prioritize the best interests of children, balancing the benefits of technology with the need to mitigate risks. To improve data governance in children's smart devices, the following measures are recommended:

Strengthen legal frameworks specific to children's data collection and usage, setting clear boundaries and obligations for manufacturers.

Increase efforts to educate parents about digital literacy and data privacy, ensuring they can make well-informed decisions regarding their children's data.

Enhance regulatory oversight and encourage industry self-regulation to ensure manufacturers comply with security standards and prevent data breaches.

Actively involve youth in the policy-making process to ensure their perspectives are considered in data governance decisions.

By fostering collaboration between government, industry, parents, and young people, we can create a safer digital environment in which children's data is protected and the benefits of smart technology are fully realized.

(Photo by Sophia Stark / Unsplash is licensed under CC0)

About the Author

Minyue Shi recently earned a master's degree in International Law from East China University of Political Science and Law. Her research focuses on data governance and the legal protection of personal and children's data rights. Since her sophomore year, she has published papers on topics such as personal information protection and the evolution of digital trade rules in the data era. Minyue is deeply committed to advancing ethical data practices that prioritize safety and responsible data use for individuals in the digital age.
Youth Voices
On the Digital Table, Youth Need a Voice, Not Just a Seat

For youth to have agency in building our future, we need to be consulted in the present. It has been widely established that technology evolves rapidly and that data is crucial to our society. Whether you sign up to a social media platform or are simply browsing online, your data is collected and exploited, very often to your detriment. Adult users may be fine with this automatic data grab, but more attention should be paid to data collection and processing for minors, including our perspectives and views on how our data is handled and on how policies are crafted. Indeed, it is concerning how often the voices of young people—the very demographic most impacted by these changes—are ignored in the conversations that determine our future. Despite the fact that data policies today will determine the future of privacy, safety, and access in the digital age, young people are frequently excluded from the policy-making process.
In my opinion, three main barriers prevent young people from contributing their perspectives and views about the digital world, and specifically about data policies: (1) the reluctance of tech companies to listen to criticism and make effective changes that weaken their bottom lines; (2) the fact that technological advancements are outpacing legislative guidelines; and (3) the frantic push by headline-seeking politicians for drastic measures such as phone or social media bans, without any youth consultation whatsoever.

Children and young people must be earnestly consulted regarding online data policies that affect us in unprecedented ways. I would like to place emphasis on the phrase "earnest consultation" because many companies promote initiatives like youth boards or advisory panels; however, many of those youth initiatives serve more as PR stunts than as platforms for real influence. The recommendations made by young people are rarely integrated and reflected in actual policy or product changes. This tokenism is not only disheartening but also a missed opportunity to harness the insights of those who are most familiar with, and most affected by, the digital world. Young people are not just passive consumers of technology; we are savvy, creative, and deeply aware of the impacts of digital policy on our lives. To truly involve youth, tech companies must move beyond superficial engagements and focus groups, beyond one-page campaigns and photo shoots, beyond lip service and disingenuous surveys; they must provide youth with formal authority and decision-making power within their organizations. This engagement could take the form of elected youth representatives who are not just consulted but given the ability to shape policy, product design, and ethical guidelines, similar to Members of Youth Parliament, for instance. And again, these are just my suggestions as a European teen.
Teens from North and South America, Africa, Asia, Australia, and beyond may have even more impactful suggestions. Teens who are neurodiverse, face physical challenges, or are otherwise "different" may have their own suggestions as well, but we will not know unless those diverse youth are consulted.

Laws that govern digital spaces, such as the Online Safety Act (OSA) or the Kids Online Safety Act (KOSA), are often outdated almost as soon as they are enacted and, ironically, then take even longer to update. Such legislation is problematic when it comes to the rights of children and young people online, which, some argue, politicians use as leverage to attempt to censor content they disagree with. While such legislation is indicative of progress, it is insufficient on its own. Technology will continue to evolve, and legislation must always be adaptive and inclusive of youth perspectives. Young people should be involved in the legislative process not just as consultees in a hastily organized focus group but as co-creators of these laws. We can ensure that laws remain relevant and responsive to the needs and realities of young people. All young people. Young people who have had their information leaked in instances of "doxxing." Young people who have had their intimate images leaked. Young people who have had their information leaked in a data breach. Young people who are underrepresented and do not have a voice.

The tendency of politicians to push for outright bans on emerging technologies without consulting those who will be most affected is yet another barrier to meaningful youth participation. It is easy for politicians to call for bans—on social media platforms, on certain types of content, on new technologies like AI—without even considering the implications. However, these decisions are rarely informed by the perspectives of young people and can often make matters worse.
We have seen cases of children and young people dying by suicide after their phones were taken away, the loss acting as a trigger. In one case, a young woman wrote a suicide note that read, "You shouldn't have taken my phone away." In such instances, bans are not the most proactive solution. Instead of opting for polarizing measures, policymakers should first consult with youth to understand their experiences and needs.

Take, for instance, the Children's Online Privacy Protection Act (COPPA), which requires age verification to ensure that children under 13 are not subjected to data collection or harmful content. This act may be helpful in combating some data collection and protecting younger users, but it does not explicitly attack the root cause of the issue: extensive data collection. Young people lie about their age online to get access to a service, and such legislation may then no longer serve them, because the service collecting their data now considers them adults. Regardless of whether the user is an adult or a child, extensive data collection is unacceptable, especially when it is used for profit (the only driver of most tech companies) or to produce (or attempt to produce) certain outcomes, as during the infamous Facebook–Cambridge Analytica scandal. Data is our most precious asset when we are online, yet we are often powerless victims of the companies and advertisers scraping our data and selling it to the highest bidder.

Young people need to actively participate in shaping the data policies that will govern our futures. We must recognize young people as equal partners in the policy-making process. Children and young people must not only have a seat at the table; they must also be empowered and earnestly consulted when it comes to policies that shape our digital lives. Calling us to the table when the meal has been decided, served, and sometimes even cleared away does a disservice to all those at the table.
(Photo by Kane Reinholdtsen / Unsplash is licensed under CC0)

About the Author

Maximilian Milovidov is a teen online safety advocate. He is an Ambassador to the Children's Commissioner for England, a former Youth Board Member for Childnet, and a Youth Ambassador for the Diana Award. Fluent in French, English, Russian, and Spanish, he serves as a Youth Ambassador for the 5Rights Foundation and People vs Big Tech, and as a Youth Advisor for Digitalem. Maximilian has been featured on ITV News and Sky News and in the Wall Street Journal. His interests lie in cyberpsychology, human nature, and technology.
Check out other articles in our Blog Section.
About us
The RD4C initiative is a joint endeavor between UNICEF and The GovLab at New York University to highlight and support best practice in our work; identify challenges and develop practical tools to assist practitioners in evaluating and addressing them; and encourage a broader discussion on actionable principles, insights, and approaches for responsible data management.
The work is intended to address practical considerations across the data lifecycle, including routine data collection and one-off data collections; it complements work on related topics being addressed by the development community, such as guidance on specific data systems and technologies, technical standardization, and digital engagement strategies.
Additional tools and materials are coming soon and will be posted on this website as they become available. Join the conversation to receive regular updates.