By Lujain Ibrahim

Roundtable Summary - Navigating Global vs Local


Technology platforms have long operated with minimal friction across national borders online, bringing billions of people around the world onboard as users. Today, this dynamic is being challenged. Countries around the world are putting forth laws aimed at holding platforms legally accountable for how they operate and treat their users in different national jurisdictions. Platforms must negotiate with individual states over how they interpret and comply with these laws, navigating a fraught space where economic, national security, and geopolitical interests converge. As they balance global and local imperatives, platforms take on a number of different ‘faces’, with real implications for Internet users.

From India to the EU, Singapore to China, how are companies reconfiguring themselves? What are the implications for users?

A transcript and video recording are available. Our key speakers for the panel were: Urs Gasser, Dean of the TUM School of Social Sciences and Technology; Katherine Chen, Professor at National Chengchi University and Facebook Oversight Board Member; Jack Qiu, Shaw Foundation Professor in Media Technology at Nanyang Technological University; and Udbhav Tiwari, Head of Global Product Policy at Mozilla.

This summary captures key themes, ideas, and policy recommendations made during the event.



Background & History

The past 25 years have produced a murky regulatory field for platforms, resulting in a maze of laws, regulations, and policies targeted at them. The birth of platform regulation was triggered by events such as the Snowden revelations of 2013 and the Cambridge Analytica scandal of 2018, which shifted the emphasis toward the role of government in regulating platforms. Issues prioritized around the globe include online privacy, platform responsibility, state surveillance and surveillance capitalism, market power and antitrust, and social media content. Legal and regulatory attempts vary across geographies and political economies in their regulatory strategies, techniques, oversight mechanisms, and enforcement schemes. Approaches follow national idiosyncrasies, and which agency is tasked with oversight depends on pre-existing institutional conditions. This global heterogeneity ultimately reflects differences in values, democratic norms, and the like.

The trajectory suggests far more government involvement than in the early days of the internet and the World Wide Web; interventions such as the DSA and content moderation initiatives are prime examples.

Some of these interventions may even strengthen the platforms themselves, elevating them into policy actors in their own right. Historically, it is important to remember that Section 230 of the Communications Decency Act in the US gave platforms a ‘liability shield’ and enabled them to become what they are today. The result is a chessboard: a web of norms, responsibilities, and enforcement schemes.

Facebook Oversight Board

Katherine Chen, a Facebook Oversight Board member, discussed the purpose and process of the board. The Oversight Board is an independent organization that makes final and binding decisions on what content Facebook and Instagram should allow or remove. The board is composed of diverse global members who speak to the complex problems facing users around the world. Chen explains how the board's independence from the company is central to its work, as it operates autonomously from the economic, political, and reputational interests of the company. Since its inception in October 2020, the board has taken cases from all over the world and examined a range of Facebook's community standards, including hate speech, violence, incitement, and health misinformation. The board has published 35 case decisions, covering issues such as the Russian invasion of Ukraine and LGBTQ rights, and made 186 recommendations to Facebook, many of which have already been implemented.

The board has examined the nuances and complexities of hate speech, violence and incitement, COVID misinformation, protest-related content challenging national leaders, and the limits of automated detection. The board considers the context of each case, including the rights of the users posting content (not only freedom of expression, but also other human rights such as the rights to non-discrimination and health) and the potential harm of leaving such content on the platform. The board aims to hold Facebook accountable by giving greater scrutiny to its initial decisions and bringing further transparency to what has been a very opaque and inconsistent process.


On China, Jack Qiu discusses the false dichotomy between the US and China in terms of data governance and sovereignty. He explains how popular discourse views these two countries as having taken different approaches to data governance, with the US adopting a market-based, multistakeholder approach, and China adopting a sovereignty-based, multilateral approach. However, he points out that this dichotomy is misleading, and that there are many nuances to the issue. The conversation also explores the historical context of data governance, tracing it back to the Cold War era and highlighting the role of international cooperation and multilateralism in shaping policies related to information and technology.

Qiu then turns to the current state of data governance in China, questioning the effectiveness of China's top-down approach to information management, including censorship and surveillance.

He argues that China's attempts to project an image of data sovereignty have been ineffective, as demonstrated by recent political protests in Hong Kong and the failure to prevent the ‘white paper’ protests against zero-COVID policies. Qiu calls for a deeper analysis of how Beijing's talk of sovereignty is translated into effective policies, particularly in relation to data flows within the country.

Section 230, the DSA, and more

Udbhav Tiwari discusses the regulation of online platforms and the different approaches taken by governments. He cites the Digital Services Act (DSA) in Europe as an example of platform accountability legislation: it mandates accountability for content moderation decisions, data sharing, and independent audits for large online platforms. He believes the DSA is likely to do for platform accountability what the General Data Protection Regulation (GDPR) did for data protection. India's intermediary liability rules and its yet-to-be-announced Digital India Act appear to borrow heavily from the DSA's concepts and broader principles. In contrast, in the United States, Section 230 provides broad protection for platforms from liability for content moderation decisions, which many believe needs to be reformed. Tiwari notes that American-based platforms tend to use Section 230 as a negotiating position. He also mentions that browsers have not been a major focal point for regulation, but that this could change.

Tiwari discusses the different approaches taken by states in the Asia-Pacific region, noting that most tend toward a very ‘paternalistic state control based approach rather than that necessary process and procedure based approach’. Singapore's POFMA, Bangladesh's Digital Security Act, and India's Over-the-Top (OTT) regulations are all examples of this approach.

Using India as an example, Tiwari observes that platforms are moving closer to the government and becoming more willing to comply, choosing to sacrifice freedom of expression for market access.

Inter-government Cooperation

The UK is set to pass an Online Safety Law, and the Office of Communications (Ofcom) has been designated as the regulator to enforce it. Ofcom has been actively speaking with regulators from other countries, including Australia, the European Union, Fiji, and Singapore, to share insights and best practices for platform accountability. This collaboration is beneficial as it can lead to a global minimum standard for regulation, which can reduce compliance costs.

However, the speakers also suggest that a multilateral approach, such as a treaty, may not be in the users' best interest, based on past experiences in the internet governance space. Instead, voluntary bilateral and multilateral cooperation to set global minimum standards could be a recommended alternative.

Policy Recommendations

The Computer Emergency Response Teams (CERT) model

Urs Gasser suggests that collaboration and cooperation on international issues, such as mis- and disinformation, may require fresh thinking and inspiration. He proposes a model similar to the Computer Emergency Response Teams (CERTs) in cybersecurity, called an Information Quality CERT (IQ CERT), to share threat information and provide resources and technical assistance to educate and support stakeholders in combating mis- and disinformation. This would involve public-private partnerships and a network architecture to facilitate collaboration among all actors involved. Once trust is built, the IQ CERT could eventually coordinate responses to misinformation campaigns. Gasser acknowledges that this would be a challenging task, but suggests that information sharing and capacity building could be a good starting point.

Legislation Watch

  1. Digital Services Act (DSA)

  2. United States Communications Decency Act (1996), Section 230

  3. UK Online Safety Law

  4. Digital India Act

  5. Bangladesh Digital Security Act

  6. Singapore Protection from Online Falsehoods and Manipulation Act (POFMA)

Final words

The challenges technology platforms face in navigating global vs local imperatives and complying with different national laws are becoming increasingly complex. This roundtable discussion addressed the role of the Facebook Oversight Board in holding platforms accountable; the nuances of data governance and sovereignty, particularly in China; and the different approaches governments take to regulating online platforms, with the DSA in Europe and Section 230 in the United States as examples. Finally, the idea of an Information Quality CERT was proposed as a potential model for international cooperation in combating mis- and disinformation. As technology continues to evolve and shape our lives, it is important to consider these issues and work toward a more transparent and responsible digital landscape for all users, locally and globally.


Platform Futures, a project by Digital Asia Hub, convenes a network of academics and experts to create a space for dialogue on opportunities, challenges, and governance best practices across the Asia-Pacific region.


