Global Digital Pandemic Responses: Some Reflections on Four Country Case Studies

Student contribution

Over the past couple of months, we’ve studied digital responses to COVID-19 in countries around the world, focusing on proximity tracing apps, and produced four country-specific briefings for Germany, Singapore, Switzerland, and Taiwan. There are vast differences between some of the countries’ responses, not necessarily in terms of the technology, but in terms of culture, public perception, and legal frameworks. Some of the trends we’ve seen are easy to explain, while others have left us with thought-provoking questions as countries continue to use digital tools in response to the pandemic. The following is a discussion of the main insights from our briefings, as well as important questions looking forward.

Privacy precautions do not guarantee uptake.

For the voluntary contact tracing app approach, gaining public support is essential to making these apps effective, and privacy concerns seemed to be among the biggest roadblocks to gaining this support. Our briefings illustrate how some governments took a number of precautions to address these concerns and protect user privacy. For example, Switzerland and Germany used the decentralized DP-3T protocol for their apps to protect user anonymity. They also made their apps open source to promote transparency and invite feedback, and collaborated with research institutions, private companies, and data protection authorities to optimize security. Yet despite these painstaking efforts to gain the public’s trust, voluntary app downloads have remained below the levels needed to be effective: only around 20% of Germany’s population has downloaded its app, and in Switzerland, only around 15% of the population is actively using the app. Although there is a great deal of uncertainty surrounding the download rate needed for these apps to be effective contact tracing measures, the current rates are less than ideal. While it remains an open empirical question whether lingering privacy concerns are at least partly responsible for the relatively low adoption rates, our observations so far suggest that even in high-trust environments, a “privacy by design” approach is a necessary but not sufficient condition for uptake.
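To make the decentralized idea concrete, here is a minimal sketch of DP-3T-style matching in Python. The key-derivation scheme, ID sizes, and rotation count below are illustrative assumptions rather than the actual DP-3T specification; the point is that exposure matching happens on the phone itself, so the server never learns who was exposed.

```python
# Simplified sketch of decentralized (DP-3T-style) proximity tracing.
# The derivation scheme and parameters are illustrative, not the real spec.
import hashlib

def next_day_key(day_key: bytes) -> bytes:
    """Each day's secret key is derived by hashing the previous one,
    so publishing one day's key reveals nothing about earlier days."""
    return hashlib.sha256(day_key).digest()

def ephemeral_ids(day_key: bytes, per_day: int = 96) -> list[bytes]:
    """Short-lived broadcast IDs derived from the day key; rotating
    them prevents long-term tracking of a single device."""
    return [hashlib.sha256(day_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(per_day)]

# Each phone broadcasts its current ephemeral ID over Bluetooth and
# records the IDs it hears nearby.
heard_ids: set[bytes] = set()

def check_exposure(published_day_keys: list[bytes]) -> bool:
    """Infected users (with consent) publish only their day keys.
    Matching runs locally on each phone -- the server never learns
    who was exposed, which is also why officials cannot tell whether
    warnings are actually reaching people."""
    return any(eph in heard_ids
               for key in published_day_keys
               for eph in ephemeral_ids(key))
```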

Other important barriers may exist at the technical level. Many of the apps require newer iPhone or Android models, which not everyone has access to. Estimates in Singapore, for example, show that around 20% of the population does not have the technology needed to use the TraceTogether app. This is especially worrisome given that the digitally excluded groups in Singapore, including elderly and poor communities, are among the most vulnerable to the pandemic, which is a main reason why Singapore pivoted towards its wearable token devices. Thus, it is particularly important for countries to think in terms of accessibility, and not just public trust, if they wish to improve their technological responses to COVID-19.

Adherence is shaped by various behavioral factors.

A second insight from our briefings is that many of the proximity tracing apps rely on human action beyond downloading the app, and if these actions are not performed, the digital response can falter. In Germany, for example, the Corona-Warn-App requires a user to manually toggle a switch in the app to indicate they tested positive for COVID-19 and to validate the test result either with a QR code or by manually entering a government-issued code. Similarly, in Switzerland, a user must enter the “covid-code” they received from the government in order to indicate that they tested positive. Further, in Singapore, an app user must upload their data from the app to a government server when they have tested positive, and a token user must physically turn in their token. These examples demonstrate that the contact-tracing functionality might not work as designed, for instance, if users choose not to report that they have tested positive.
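As a rough illustration of this dependency, the following Python sketch models the reporting step. The class, field names, and single-use code behavior are hypothetical simplifications, not any country’s actual implementation; what it shows is that no exposure data is published unless the user acts and a government-issued code validates.

```python
# Illustrative model of the user-gated reporting step. The names and the
# single-use code behavior are assumptions, not an actual implementation.
from dataclasses import dataclass, field

@dataclass
class ReportingFlow:
    issued_codes: set[str]                      # codes issued by health authorities
    uploaded_keys: list[bytes] = field(default_factory=list)

    def report_positive(self, user_consents: bool, code: str,
                        day_keys: list[bytes]) -> bool:
        """Upload day keys only if the user acts AND the code validates."""
        if not user_consents:
            return False                        # no user action -> no upload
        if code not in self.issued_codes:
            return False                        # invalid code -> upload rejected
        self.issued_codes.remove(code)          # treat codes as single-use
        self.uploaded_keys.extend(day_keys)
        return True

# If the user never toggles the switch, exposed contacts are never warned:
flow = ReportingFlow(issued_codes={"ABC-123"})
assert not flow.report_positive(user_consents=False, code="ABC-123", day_keys=[])
assert flow.report_positive(user_consents=True, code="ABC-123", day_keys=[b"k1"])
```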

Beyond an app user’s individual behavior, institutional actors must also complete certain actions that shape outcomes. For instance, as part of Singapore’s token plan, volunteers manually distribute tokens to recipients. These volunteers physically go to recipients’ homes to deliver the tokens and provide instructions about the technology -- without this, recipients who do not have mobile devices may remain without access to the technology. Ultimately, this makes contact tracing apps especially vulnerable to human decision making -- the technology might work well, but the system can fail without certain user behavior. Without further automation or integration, these systems will remain dependent on such behavioral factors.

Culture is especially influential.

A third observation from our briefings is the way in which culture has shaped both a government’s actions and public perception of a country’s digital response, especially with regard to contact tracing. This is perhaps best illustrated by the German case, a country with a history of public skepticism towards government surveillance measures. Knowing this, the government specifically tailored its actions to accommodate these concerns, for instance, by switching from a centralized to a decentralized storage method despite extra costs, and by publishing the app’s source code online. This is vastly different from a country with high levels of government trust, like Taiwan. Even though Taiwanese citizens are sensitive to overreach into civil liberties because of Taiwan’s history, the belief in a “necessity for the public good”, a very community-based sentiment, allowed the Taiwanese government to implement less privacy-conscious tools. Based on our initial observations, this type of mandatory, invasive response would not be possible in countries where concerns about civil liberties are more predominant. Thus, cultural norms shape the type of digital response that is even possible in a given country, and in some instances have also shaped the legal norms protecting civil liberties.

The above insights are just a few of the general trends we’ve seen so far through our studies of proximity tracing in four countries. However, the uncertainty surrounding COVID-19 and the novelty of proximity tracing apps have left us with many outstanding questions about the interaction between technology, the law, and human behavior. A comprehensive research agenda written by researchers in Switzerland puts forth a detailed discussion of proximity tracing apps and their efficacy, as well as other technical and legal considerations, such as privacy in light of data collection. We explore a few of these questions below, mainly from a big-picture perspective grounded in our own research, in an effort to identify areas worth studying further.

Has emergency legislation reinforced or overwritten existing privacy protections?

Our findings have left us with a number of questions as we look towards the future of digital tools in the emergency response space. First, have laws enacted during the public health emergency overwritten traditional expectations of privacy protections? The interaction between legislation enacted in a moment of crisis and default legislation is murky, for instance with respect to how emergency norms are governed by pre-existing data protection laws. The GDPR in the European Union, for example, provides a legal basis for public health authorities to collect data during a pandemic and gives further detail as to what that actually allows. Many countries also have specific laws that trigger special provisions in the event of a public health emergency -- like the Epidemics Act in Switzerland or the Infectious Diseases Act in Singapore -- and others have adopted new or modified legislation in response to COVID-19, like Taiwan’s Special Act. When triggered, these laws confer broad centralized power on the government to combat the health emergency at hand. Since many of these countries also have personal data protection laws in place, the question becomes whether emergency government action must adhere to those privacy protections or whether the emergency nature overrides them out of necessity. For example, Taiwan claims its Electronic Fence System falls within the “necessity to furthering the public interest” exception to its Personal Data Protection Act, but it does not provide further detail or justification, falling short of preventing the collection of personal data that the law had previously safeguarded.

Another tension can arise between powerful social norms regarding privacy and any emergency-related legislation that affects the level of legal protection of privacy rights. In Switzerland, for instance, the amendments to the Epidemics Act that gave the app response a legal basis are being challenged in a referendum due to a perceived tension between the emergency legislation and well-established privacy protections. Fears of unfettered surveillance power, re-identification, and misuse of data, as well as broad terms in the legislation, are all reasons why public groups may want to challenge legislation enacted in moments of emergency.

How do architectural choices and behavioral traits interact with each other and impact the efficacy of a country’s digital response?

This brings us to our second question. The net effect of digital technology in response to the pandemic is not shaped by laws alone, including the level of privacy protection they provide during such a crisis. The efficacy of any digital response is also governed by a complex interplay between the choices made by the designers of the technology on the one hand and the behavior of the users interacting with it on the other. The power of “code” as a constraint on behavior becomes visible when comparing the European-style decentralized architectures of proximity tracing systems, like those in Switzerland and Germany, to the surveillance-heavy Taiwanese digital pandemic response. In the decentralized approach, health officials might be able to tell how many people have downloaded an app, but because of user anonymity there is no way of knowing whether exposed people are actually being warned. Another example showcasing the impact of technical architecture is Taiwan, with its mandatory, real-time location tracking and exceptionally low infection rates and numbers of quarantine violations. It is interesting to observe how such “hard-coded” norms interact with behavioral aspects, like public support and wide-scale technological accessibility, and together shape a technology’s efficacy.

Singapore’s approach also demonstrates the effects of a complex set of design choices. Singapore chose a centralized app, even though centralized approaches forgo complete anonymity vis-à-vis the government. At the same time, such apps can still employ some privacy protections, like using encrypted IDs for contact encounters and deleting stored data once it is no longer useful, and they do not have to be compulsory, which can help maintain public support while remaining effective thanks to the government’s role in performing contact tracing with the app data. Beyond this, our observations suggest that choosing a technology well-suited to a country’s demographics also plays a role in efficacy. In Taiwan, where an estimated 99% of the population has a mobile device, a mobile-based response can be effective, whereas in a country with fewer mobile users, like Singapore at around 80%, a mobile-only approach would leave gaps. Thus, Singapore has begun rolling out tokens to reach non-mobile users. A series of factors including voluntariness, government involvement, and accessibility all appear to weigh into a digital response’s efficacy.
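For contrast with the decentralized sketch earlier, the following Python sketch illustrates the centralized trade-off. The registry layout, retention window, and method names are illustrative assumptions, not TraceTogether’s actual design; the point is that the server itself can resolve who was exposed and reach out to them, at the cost of anonymity toward the government.

```python
# Sketch of a centralized design. Registry layout, retention window, and
# method names are illustrative assumptions, not TraceTogether's design.
import time

RETENTION_SECONDS = 25 * 24 * 3600    # assumed retention window

class CentralServer:
    def __init__(self) -> None:
        # Unlike the decentralized design, the server can map encrypted
        # contact IDs back to real contact details.
        self.id_registry: dict[bytes, str] = {}    # encrypted ID -> phone number
        self.encounters: list[tuple[bytes, bytes, float]] = []  # (case, contact, time)

    def ingest(self, uploaded: list[tuple[bytes, bytes, float]]) -> None:
        """Store encounter logs uploaded, with consent, by a positive case."""
        self.encounters.extend(uploaded)

    def purge_old(self, now: float | None = None) -> None:
        """Delete encounters once they are no longer epidemiologically useful."""
        now = time.time() if now is None else now
        self.encounters = [e for e in self.encounters
                           if now - e[2] < RETENTION_SECONDS]

    def notify_contacts(self, case_id: bytes) -> list[str]:
        """The server resolves exposed contacts itself and can reach them
        directly -- useful for manual tracing, but anonymity is forgone."""
        exposed = {contact for case, contact, _ in self.encounters
                   if case == case_id}
        return [self.id_registry[c] for c in exposed if c in self.id_registry]
```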

Across these areas, one important question is who makes the architectural choices that have the potential to shape the efficacy of digital technologies used in response to the current pandemic. One example observed in the country case studies is the important -- and, from a democratic legitimacy perspective, not unproblematic -- role large technology companies play. While a country’s central government may seem like the main entity in charge of its digital response, two large technology companies, Apple and Google, have heavily influenced the available proximity tracing technologies. Due to the power these two companies hold in the mobile phone space, countries were strongly pushed to adopt the decentralized Apple/Google API or risk technical incompatibility -- Germany switched from a centralized to a decentralized design, and other countries faced the difficult choice of scrapping their plans or, like Singapore, sticking to home-grown methods and risking technical issues. It remains an open and contested question how these and related power dynamics between governments and technology companies can and should be addressed, in a moment of crisis like this but also in the long run.

Can ethics help chart the path towards best practices?

Given the complex interactions among laws, privacy protections, and public trust in a moment of unprecedented crisis, and the uncertainty that comes along with it, best practices for a digital pandemic response have yet to emerge. While proximity tracing apps can build upon a more general set of design practices and implementation experiences, many of the design questions are open-ended given the novelty of the concept. Since governments began designing and using digital tools in their response to COVID-19, a number of groups have shared what they think is most important in shaping these tools -- principles that, at a high level, tie into conversations about what is ethical.

To name a few we’ve observed through our research, there are the German Chaos Computer Club’s best practices, the proposed Coronavirus (Safeguards) Bill in the United Kingdom, the WHO’s ethical considerations guide, and, even more comprehensively, a book written by the Johns Hopkins Project on Ethics and Governance of Digital Contact Tracing Technologies. Many of these documents are privacy-centric. Collectively, they discuss user anonymity to prevent re-identification; transparency in the form of open source code to allow for auditing and comment; voluntariness with no penalty for non-use; limited data collection and storage; testing standards; oversight committees; and limits to central government involvement in order to protect the public against expansive surveillance power. At a high level, all of these center on the ethical dilemmas countries might face while designing a proximity tracing app, especially when it comes to protecting personal freedoms while collecting data. A number of the countries we’ve studied have adhered to some of these practices; for instance, Germany and Switzerland kept their apps voluntary, which protects civil liberties and bolsters public trust, but potentially at the cost of effectiveness -- an interesting ethical dilemma.

All in all, these best practices are an aspirational set of ideas and standards, but since they are not law and thus not legally binding, a country can choose to implement or ignore them. Some practices may be better suited to certain countries than others, based on culture and access to technology. Also, some practices extend far beyond technical suggestions into much larger discussions about the role of ethical frameworks in protecting the public while managing a pandemic. Given the uncertainty, it will be interesting to see how countries address these ethical questions in the coming months as they continue to work towards effective responses to the pandemic.

What is the role of digital inclusivity in a country’s digital pandemic response?

Perhaps the most challenging, and underexplored, question when it comes to the design and use of technology during a health emergency is that of inclusivity. What approaches, if any, are countries implementing to make their digital pandemic responses inclusive? The Singapore case is an important reminder that countries must acknowledge pre-existing inequalities in access to technology, not only for their response to be effective (for instance, through high download rates), but also to prevent deepening existing inequalities or creating new gaps and divides.

The pandemic has had a disproportionate effect on various communities, including racial and ethnic minority groups, as well as elderly, poor, and rural communities, to name several. Within the context of our own research, our four country case studies highlight the vulnerability of elderly and poor communities to digital inequalities, which can stem from disinterest, lack of technological literacy, or cost, making it important for governments to consider these groups when deploying a solution. In countries where a significant percentage of people do not have mobile devices, or do not have the newer models required by the Apple/Google API, many will be left without access to helpful tools, expanding the digital divide. Further, any data collected through these tools would be underinclusive -- an incomplete representation of a country’s population. Thus, as countries continue to respond to the pandemic, it is important that they consider their demographics in an effort to design inclusive, widely applicable solutions.
