COVID-19: Who Should I Trust?

Most of us aren’t epidemiologists, so we have to rely on others to tell us what to believe about the novel coronavirus and COVID-19, the respiratory disease it causes. And misinformation about it abounds.

This misinformation comes in several forms, including false remedies; conspiracy theories; memes that downplay or politicize the threat, or trivialize it by comparing it to well-known, treatable diseases; and conflicting testimony from authority figures. These forms of misinformation all have serious consequences, and they can fool even those who are in a position to know better.

Furthermore, this misinformation often stems from a basic psychological need to help each other understand what’s going on. University of Washington professor Kate Starbird calls this “collective sensemaking.” The problem, of course, is that when everyone is pitching in on this effort, the sensible explanations can be drowned out by the uninformed ones.

What is to be done about this? Perhaps there are things that large entities, like Facebook and Twitter, can and should do. But what can we do, as individuals?

As a philosopher, I’m tempted to say, “Learn to think critically!” And while this is good, important advice, there are several problems with this response. First, it takes years. We’re in a pandemic; we don’t have years. Second, it won’t necessarily shield you from disinformation. As Washington State University Vancouver researcher Mike Caulfield points out, bad actors can pretty easily overcome all the tips and tricks that a critical thinking approach might teach you about learning to recognize misinformation. And third, teaching people to research things for themselves won’t solve the problem of deciding which sources of information to trust in the first place. People need to be able to tell which sources are credible before their research can even begin.

Likewise, we can’t expect everyone to track down every new meme or piece of information they encounter through a site like Snopes, FactCheck.org, NewsGuard, or BuzzFeed. Such sites are useful in particular cases, but they are not a solution to the misinformation problem.

To solve that, there’s only one fix, and unfortunately there’s no shortcut. We have to accept what the experts say. All the critical thinking and tech-savviness in the world won’t help us if the “experts” we trust aren’t actually experts.

So how do we know who we can trust for issues like COVID-19? Fortunately, there’s no need to reinvent the wheel here. Society has longstanding structures and norms in place for determining who knows what they’re talking about. Here are a few:

· Experts tend to have formal credentials from reputable educational institutions. Note that not all degrees are created equal. A decent (but not foolproof) guide here is to favor the sorts of institutions that mainstream professional organizations look to for guidance and from which they want to hire.

· Experts have experience working with the issue at hand or relevantly similar issues. Note that expertise in one area does not entail expertise in another, even if to a layperson the differences may not be clear. For example, a general practitioner is not automatically an authority on a new viral disease like COVID-19. They, too, must listen to the epidemiologists.

· Experts generally do not benefit personally from getting individuals to believe certain things (though exceptions might include public health emergencies).

· Experts have a strong reputation among other experts.

Of course, these are not perfect indicators. After all, many fringe communities have their own forms of credentialing. And there’s a worry about circularity with the last point: if in order to recognize an expert, I must already be able to tell who the experts are, then I’m in a bind. But these are merely general guidelines. While none of them is sufficient individually, where they all overlap, we can be reasonably confident we’ve found a genuine expert.

Put somewhat crudely, expertise comes down to time spent thinking about something in a critical environment. When it comes to a public health crisis like COVID-19, those who have the most relevant formal training, experience, and reputation are epidemiologists. And fortunately, their take on the coronavirus is unanimous and forceful.

Quite simply, if you’re not an epidemiologist, then your epistemic responsibility — i.e., your responsibility as a knower — is to accept what they tell you.

But what about those friends (we all have them) who are not open to hearing from genuine experts, perhaps because they’re already the victims of misinformation or because they live in an echo chamber? A couple of tips come to mind: (1) Be kind to them. People in such situations are very unlikely to transfer their allegiance from one authority to another if they feel threatened or marginalized. Approach them as you would approach a loved one in a cult (the epistemic situation is actually eerily similar). (2) If possible, appeal to sources that they are more inclined to trust. For example, many who might think that the coronavirus is a hoax or overblown also have a high respect for the U.S. military. Pointing out that the military is taking the threat seriously might be useful.

At the end of the day, however, we’re all responsible for shepherding our own beliefs. Let’s all do our best to care for one another, both physically and with the information we choose to share.

Originally published at https://stories.marquette.edu on April 5, 2020.