Digital Contact Tracing and Trustworthiness: What Building Trust Means
People’s trust will play a major role in the success of a voluntary digital contact tracing (DCT) app. Without that trust, we may see DCT with too little uptake to be effective (an estimated 60% use rate is required), no DCT at all, or DCT forced on citizens by authorities. Sitting at the intersection of technology and social life, DCT falls squarely into the middle of a trust conflict zone that must be resolved.
Read the original version on Medium here.
The DCT app is an extrapolation of a phone call with our friends and loved ones. One might call family to 1) share personal information such as “I am sick” and “here is who I have been in contact with” and 2) help them take proper precautions. Simply put, we trust and take care of one another. Except now, with DCT, the context has changed:
- The object in question is a technological device or program.
- The personal information in question is being digitally stored via an app.
- The electronically transmitted “social contact” is being controlled by an external entity.
- There is a feeling that someone or something else (potentially unknown) tracks our status.
Now things get tricky with trust. And the fact is, individuals are less likely to engage in social projects such as DCT when they don’t perceive trustworthiness.
And skepticism is a good place to start: it opens the door to innovation becoming truly trustworthy and mitigates the risk of innovative projects running amok (picture, for example, China-style forcible DCT playing out in Switzerland). In other words, we should place our trust in what is actually trustworthy by assessing whether it fulfills certain requirements.
We can imagine those requirements as building blocks that are used in the construction of trust and that increase or decrease levels of trustworthiness. Here are six of these building blocks that seem particularly important to the current discussions surrounding a possible DCT app in Switzerland.
1st, the values that underlie a DCT implementation must be made explicit. For example, if autonomy is highly valued, DCT may need to be voluntary. We are more likely to perceive trustworthiness where we see our own values reflected; they are integral to trust relationships.
2nd, we need to feel we can rely on the app. Even within Europe, experts promote dissimilar approaches to DCT, as reflected in the controversy over centralized versus decentralized storage. Given the multitude of voices and opinions, some now question how average citizens can be expected to rely on the final version doing what it claims: helping us slow the virus without violating our privacy.
3rd, people can gain a sense of trust through other trusted partners. In Switzerland, information regarding the push in favor of DCT is coming from scientific communities, drawing on the most up-to-date sources. This is coupled with the basic assumption that scientists are neutral rather than pursuing a political objective. Their neutrality is a fundamental component of scientific trustworthiness. If we trust scientists – medical, tech, and social – to make well-informed decisions or recommendations for the development of the DCT app, then its trustworthiness increases as a result.
4th, we are more likely to trust a system if there are assurances or contingency plans in case of failure. This creates the necessary two-way relationship that is often missing in technological trust contexts. What if, for example, there is a data leak? Or, what regulations ensure that we won’t become an Orwellian surveillance state?
5th, individuals need access to clear information about their potential use of the app in order to make well-informed decisions. Transparency is critical on both the technical and the social side of the app. Yes, this means the app runs the risk of not being used if users do not agree, but asking for blind trust is an even greater risk.
6th, an ethos of trust is critical. The more we trust our fellow citizens, the more we can extend trust to other situations – such as using DCT. For example, one of the big promises of DCT is that individuals can practice social distancing efficiently: if notified that they were in contact with an infected individual, they can go into isolation. We therefore have to trust others to do the same AND trust that there is enough social support to make this a viable option (it has been pointed out that self-isolation is a privilege of the rich).
In the current crisis, the importance of an ethos of trust is seen in the interplay between three players: civil society, government authorities, and the scientific community. Scientific assessments (epidemiological, virological, and technological) have been given tremendous political weight; civil society is pressuring political decisions to be based on solid scientific findings. This can increase the public’s confidence in the proportionality and effectiveness of the measures. Perhaps the high trust stakes involved in getting it wrong are one reason why the Federal Council has been reticent about concrete measures surrounding the DCT app. What is currently evident is that the interaction between government, science, and the public presupposes an ethos of trust as much as it shapes it in real time.
The building blocks of trust are crucial for the useful implementation of a DCT app that remains voluntary. Of course, we wouldn’t need this discussion if there were 100% certainty that nothing could possibly go wrong. Given the innovative nature of the app and the surrounding crisis context, that scenario is unlikely. Trust helps us bridge and confront those uncertainties. We need to think carefully about how this program will be communicated and constructed if it is to be trusted and successful.