Amsterdam – A group of academics, lawyers, and scientists has come together to comment on the second UNESCO consultation on a ‘model regulatory framework for the digital content platforms to secure information as a public good’ (the Framework). They will attend the global conference in Paris later this month, “Internet for Trust”, which promotes guidelines for regulating digital platforms for information as a public good. The team, which includes Dr Mark Leiser of the ALTI Center at VU, researches the impact of digital technologies on individuals and societies.
Funded by the Volkswagen Foundation, researchers from the Max Planck Institute for Human Development and VU Amsterdam are identifying ways individuals can reclaim their autonomy and address the imbalance between human decision-makers and platforms by integrating insights from the behavioural sciences into evidence-led policy and regulation. The team investigates the effects of social media on democratic values, attitudes towards digital architectures, and strategies to combat online misinformation while respecting human rights; it made an impact in 2020 with its Technology and Democracy report for the Joint Research Centre of the European Commission.
The research group recommends that UNESCO take a behavioural science approach to designing and regulating digital platforms to empower users and ensure accurate and reliable information is available. The European Union’s Digital Services Act serves as an example of a framework that embraces behavioural sciences to protect users while encouraging innovation in the delivery of services. The United Nations recognizes the importance of information as a fundamental public good through the Windhoek Declaration, acknowledging that access to information and free expression are vital for upholding human rights such as non-discrimination and gender equality. However, the violation of these rights has been exacerbated by the actions of digital platforms, which have become gatekeepers of information through their content moderation policies.
This report examines the ways in which these platforms may distort fundamental rights in the areas of transparency, user empowerment, and accountability. Despite attempts to improve regulatory control through transparency requirements in regional regulations, these efforts have produced voluminous and frequently incomprehensible legalistic text that does little to help individuals make informed decisions.
Instead, the report proposes a shift towards meaningful and effective transparency, achieved by empowering users to personalize their own content moderation policies through interoperability requirements. The report also highlights the importance of dynamic personalization and responsive regulation informed by evidence-led insights from the behavioural sciences.
International accountability systems should require platforms to provide evidence of their efforts to establish transparent operations that effectively encourage user engagement. The United Nations has a crucial role in promoting legislation that reflects the principles of platform accountability, measurable transparency, and user empowerment, resulting in a mutually beneficial international environment.
A copy of the Report is available here.
Photo by Gilles Lambert