S3-Episode 2: Trust and truth: Navigating the age of misinformation
Misinformation has been called a defining issue of our time, eroding trust in academia, science, and other key pillars of our society. Commissioner Kosseim speaks with Dr. Alex Himelfarb of the Council of Canadian Academies about what we can do about it, and why access to trustworthy, evidence-based information matters more than ever.
Notes
Dr. Alex Himelfarb is the chair of the Council of Canadian Academies’ Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation. He chairs the board of The Narwhal, is a member of the boards of the Atkinson Foundation and the Public Service Foundation and of the Advisory Committee of the Auditor General, and is a fellow of the Broadbent and Parkland Institutes.
- Choosing to lead the expert panel on science and health misinformation [2:30]
- Misinformation has become a defining issue of our time, why? [3:56]
- Social media, declining trust, and the quest for certainty [4:17]
- Fault lines in modern society [7:08]
- Socioeconomic impacts of science and health misinformation [8:57]
- Impact of misinformation on vulnerable and marginalized communities [11:00]
- With the rise of AI, what does the future hold? [12:36]
- Telltale signs of misinformation [14:29]
- Impact of misinformation on democracy [16:00]
- The role of government transparency and access to information in fighting misinformation [19:02]
- How individuals can fight back against misinformation [22:04]
- Building critical thinking, numeracy and media literacy into curriculum in schools [25:20]
- Communicating information more accessibly [26:14]
- Encouraging proactive disclosure by government institutions [28:13]
Resources:
- Fault Lines (Report of the Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation, Council of Canadian Academies, January 26, 2023)
- Verified (United Nations project to improve access to accurate information)
- IPC Transparency Showcase sheds light on open government projects (IPC news release, May 11, 2023)
- Ontario Information and Privacy Commissioner calls on public institutions to join the Transparency Challenge (IPC news release, September 28, 2023)
- IPC Strategic Priorities 2021-2025
Info Matters is a podcast about people, privacy, and access to information hosted by Patricia Kosseim, Information and Privacy Commissioner of Ontario. We dive into conversations with people from all walks of life and hear stories about the access and privacy issues that matter most to them.
If you enjoyed the podcast, leave us a rating or a review.
Have an access to information or privacy topic you want to learn more about? Interested in being a guest on the show? Send us a tweet @IPCinfoprivacy or email us at @email.
Transcripts
Patricia Kosseim:
Hello, I’m Patricia Kosseim, Ontario’s Information and Privacy Commissioner, and you’re listening to Info Matters, a podcast about people, privacy, and access to information. We dive into conversations with people from all walks of life and hear real stories about the access and privacy issues that matter most to them.
Hello listeners, welcome to another episode of Info Matters. In today’s digital age, misinformation has become a serious problem. A recent report by the Council of Canadian Academies has called it a defining issue of our time. Misinformation can take many forms, from misconceptions and misleading facts that people believe are true, to outright lies, rumors, and distortions of facts that people deliberately share, knowing them to be untrue. Left unchecked, misinformation can spawn confusion and mistrust, spreading like wildfire. Fueled by social media platforms, users can be reinforced and rewarded for sharing information frequently, whether it’s accurate or not. Misinformation can have serious and far-reaching consequences, as we found during the COVID-19 pandemic. False information about the virus led people to pursue dangerous and ineffective treatments, or none at all, contributing to its spread.
Over time, people can also become skeptical of information from legitimate sources, thwarting efforts to address climate change and undermining trust in public institutions and democracy itself. In this episode, we’ll be talking about the dangers of misinformation and what governments, legislators, media, social media platforms, and individuals themselves can do to curb its spread. My guest is Alex Himelfarb of the Council of Canadian Academies, who led their expert panel on the socioeconomic impacts of science and health misinformation. Alex, welcome to the show.
Dr. Alex Himelfarb:
Thanks so much, Commissioner. Pleased to be with you.
PK:
So Alex, you’ve had quite a stellar career. You were a professor of sociology, a senior executive with the federal government for 30 years, and Clerk of the Privy Council, serving three prime ministers. So tell us, what led you to chair the expert panel on science and health misinformation?
AH:
Since leaving public service and academia, my focus has been on social justice and environmental justice, and it became increasingly clear that one of the obstacles to progress in the things I cared most about was misinformation. That misinformation was an obstacle to collective progress. It’s interesting that when the WHO was talking about public health in the midst of the COVID pandemic, they also talked about how their fight against COVID had to be matched by an equally vociferous fight against an infodemic, the massive misinformation that inhibited people from protecting themselves and one another and was creating huge pressures on health workers and frontline workers. Just months later, the Intergovernmental Panel on Climate Change wrote that one of the major obstacles to progress on climate action was misinformation, and politically endorsed misinformation, which is particularly powerful because it saps the will to do the hard things that sometimes we need to do collectively to meet our challenges. So yes, misinformation stands in the way of individual and collective decision making that matters profoundly to our health and wellbeing.
PK:
So as you said in your intro to the report, misinformation isn’t really a new phenomenon. You wrote, “Myths, conspiracy theories, and deliberate deceit are probably as old as human communication itself.” Yet misinformation, as you say, has become a defining issue of our time. So how did things get so bad?
AH:
It’s not new, but something has changed. In 2016, Oxford Dictionaries made post-truth the word of the year. Just last year, Merriam-Webster made gaslighting the word of the year. Something is afoot. So what’s happened? The first factor, I think, is the rise of social media and individual messaging platforms as a major way in which people get their information. A recent survey suggested that about 90% of Canadians during COVID got their information from social media or messaging apps. And what that means is that they are exposed to vast amounts of information, but also vast amounts of misinformation, almost entirely without mediation, without guardrails, without signposts, without people helping to guide them through what’s true and what’s not true. It also, as you said, creates incentives to create bubbles of self-affirming information. The algorithms and incentives built into social media platforms make conflict, clash, and misinformation much more popular and fast spreading.
Add to that the second big factor, which is decades of declining trust in public institutions and government, but also in the media, in universities, and in private institutions as well. There’s just a declining trust in one another, a declining social trust as well as a declining political trust. Research shows that people haven’t really lost trust in the concept of science. What they’ve lost trust in is the institutions they used to rely on to get scientific information. So they don’t believe government the way they used to. They don’t believe public agencies the way they used to. They don’t even believe universities the way they used to. And people will tell you they don’t believe mainstream media the way they used to.
So each of them finds their own sources, often, again, part of these self-confirming bubbles, so unmediated and untrusted. Now add to that that we live in an age of layered and multiple crises. We have crises of democracy, a pandemic crisis, crises of breakdown in the social fabric, and in times of crisis, people want a degree of certainty that science and scientific knowledge doesn’t provide, or doesn’t provide quickly enough. Certainty is hard to come by, but they want certainty, and they often want somebody to blame. And that makes us really ripe for conspiracy theories and really ripe for misinformation. You put those three factors together and you have a perfect storm.
PK:
The title of the report is Fault Lines. Just in your introduction there I can sense a number of fault lines, but let me ask you in your own words, what’s the significance of the title? What fault lines are you referring to?
AH:
You can name them. Fault lines of class, racial fault lines, the lingering consequences of colonialism, gender fault lines, the fault lines that people with disabilities experience, sexual orientation and sexual identity. Misinformation feeds off those fault lines and deepens them, turns cracks into crevices. One of the real collective consequences of misinformation is to break down social cohesion. Misinformation has become tied up with identity and ideology. It used to be that we would disagree about things over dinner. Now we throw things at each other, and there are disturbing and venomous attacks now against politicians and journalists and frontline health workers, especially female and racialized workers.
That’s a result of this changing nature of misinformation, where it’s now part of our identity: you’re with us or you’re against us. It’s important in that respect that we not demonize people who, say, don’t want to vaccinate themselves. We shouldn’t be feeding these fault lines. There are a number of reasons why people might distrust information, and we have to understand their starting point. There are also many reasons other than misinformation for people to behave in the ways they behave. So we shouldn’t make assumptions that only exacerbate those fault lines.
PK:
So the panel was asked to examine the socioeconomic impacts of science and health misinformation on the public and public policy in Canada. So what ultimately did the panel find?
AH:
We provided quite a lot of evidence of how misinformation about vaccines made us vulnerable to preventable diseases, to hospitalization, and to death. I wouldn’t dwell too much on the numbers, but there are calculations about how much that has cost in terms of personal suffering, but also in terms of collective costs. So those examples, I think, are obvious. If people don’t take the preventive measures that could have helped them, or take medicines or drugs that are actually not intended for the purposes they’re taking them for, they’re going to get themselves in trouble. And that’s happened, sadly, to thousands of people. The report is filled with those kinds of examples of individuals who failed to prevent bad things from happening, or actually brought bad things upon themselves, that could have been avoided had they had the right information, or at least given some thought to the risks of acting on the information they did have.
The spillover or secondary costs are things like costs to the health system. The health system got overloaded because people who could have avoided getting sick didn’t, and that meant other people weren’t getting treated. And we know the burden that created for health workers, not just a work burden but an emotional burden. And then there are, of course, some of the collective costs we’ve already talked about, the kinds of social upheaval, the breakdown of the social consensus that’s necessary if we’re going to take collective action. Misinformation has slowed down progress on climate change. The debates about whether it was real truly slowed it down, and the debate keeps shifting, but at the base of it there’s always misinformation. It’s hard enough to do the hard things necessary to solve our problems; it’s almost impossible if we can’t agree about where we are. And if we can’t agree about where we are, it’s hard to imagine that we’ll ever agree about where we ought to go.
PK:
Let me just ask you to elaborate a little bit on what the panel found, and particularly that racialized and underserved communities were disproportionately affected by misinformation. Can you elaborate a little bit on the vulnerability of marginalized communities to the impacts of misinformation?
AH:
It’s not surprising that marginalized, vulnerable communities pay the highest price for misinformation. They pay the highest price for everything that goes wrong in our world, and this too. Part of it resides in their lack of access to good information. Access to good information and scientific information isn’t equal, isn’t even, so that’s part of it. Part of it is that there is a legitimate and understandable distrust of the public institutions that might be the source of help or good information, and partly it’s the lack of alternatives, the lack of resources to choose effective alternatives. And when there are collective consequences, they always hit the most vulnerable hardest, even if they’re on board, even if they’re well-informed. So there’s an array of reasons, and the evidence shows over and over again that they’ve paid the highest price.
PK:
I don’t know if the panel had a chance to think about this in the context of artificial intelligence, but with the burgeoning field of AI, we’re seeing some very, very real-looking images, of the Pope in a puffy white coat, or of former President Donald Trump in an orange jumpsuit being taken away in handcuffs after his arraignment, which of course did not happen. But the images are so incredibly real-looking, and it seems like they’re only going to get better, which is only going to make misinformation worse. How pessimistic or optimistic should we be about this situation getting worse or better over time?
AH:
I think there are real risks, and they’re not in some distant future, they’re happening now. As you say, even before the growth of popularly available AI, it was pretty easy to fake things on the internet. And yes, I think AI will make it much easier, much faster. It’s not unusual that technology outpaces law and culture, and that we have trouble catching up with technological innovation. And I think we will catch up. I’m seeing some newspapers already saying they will not use AI, even where it might make sense, because they don’t want to jeopardize trust. And let me be clear, there are obviously some very pro-social uses for AI. It has to be managed, and it has to be managed within a legal and cultural framework and an ethical framework. We’ll have some catching up to do, but I’m confident we’ll catch up. In the meantime, I think we’ll have to be extra vigilant, and I think we’re going to have to be more willing than we have been in the past to regulate social media.
PK:
In the report, the panel talked about some telltale signs of misinformation that individuals can be attuned to, or should be alive to. Can you give us some of those indicators that should make us suspicious or skeptical of information we see online?
AH:
Beyond the telltale signs of deliberate misinformation, and yes, there are a lot of indicators of that, including certain language it uses, I think even more generally we have to be very wary of any report of a single study that purports to provide the definitive answer on any issue that matters. And you don’t have to look at deception on social media; even mainstream media will sometimes be incented to overdramatize a study because it’s sort of sexy and will sell copy and will get attention. And even scientists may sometimes be incented to exaggerate their study’s import so it’ll get published and be visible. We have to learn never to believe anything that’s a single study, and we have to learn not to believe anything that’s a single expert. We have to be looking increasingly to what stands as the best scientific consensus at this moment, recognizing that even that is uncertain, and when somebody is selling certainty, run for the hills.
PK:
The panel looked at misinformation in the context of science and health information, but the issue is far more pervasive than that, chipping away at the pillars of our democracy. Just consider the recent example of Fox News admitting to making false claims that Dominion Voting Systems rigged the 2020 US presidential election. It shows just how pervasive misinformation can be, particularly in the case of one of the most watched news channels in the US, and it cost the organization almost 790 million dollars in a settlement. What are your views on how misinformation can impact something as fundamental as elections and the underpinnings of our democracy?
AH:
The panel’s focus was away from political misinformation and the deliberate political use of misinformation, but you can’t look at science and health misinformation separately from the political, because one bleeds into the other. You’ve seen the politics of vaccine misinformation and how you can build a political constituency around embracing conspiracy theories. The more profoundly people have become committed to a conspiratorial view of the world, the more available they are to be part of a political conspiracy, and so politics and misinformation can be self-reinforcing. So yes, it is profoundly affecting our political life, and it is, in my view, a significant threat to democratic principles and democratic institutions. And that’s because in authoritarian countries, misinformation is a deliberate tool to undermine any chance of protest and to maintain control. Hannah Arendt wrote that the first sign of the growth of authoritarianism is the breakdown of truth.
And when you need to rely on powerful figures as your source of truth, rather than traditional institutions, it’s really threatening to democracy. It undermines democracy. In a sense, democracy depends on our having information to make informed collective decisions, just like our individual wellbeing requires us to have information to make individual decisions. One of our panelists talked about this as a democratic right, the right to have the information to make informed decisions collectively about our future. Misinformation undermines that, and therefore undermines democracy. But it also feeds autocracy, feeds authoritarianism, and there’s a rich literature that says that’s not some lunatic fear. There are powerful and profound instances where misinformation fed, nurtured, and maintained autocracies.
PK:
So it’s interesting, because you were saying one of the antidotes to misinformation is to encourage a plurality of sources of information and countervailing information. I want to know what the panel found, but also, from your own personal experience, having worked in senior public service for over 30 years and served three different governments, what role does government play? What role do transparency and proactive disclosure play in filling the vacuum and providing some of those countervailing sources of truth?
AH:
I think that’s a really important point, especially for the Information and Privacy Commissioner. In a time of misinformation, where there is no trust, not revealing just intensifies the pressure to come up with explanations; it feeds conspiratorial thinking. It’s messy. A lot of what government does and the decisions we make are messy, and sometimes when we expose them we get hit for it, but we get hit even harder if we hide it. When you do spring cleaning and you get the information out, the dust flies and sometimes you pay a price, but if you don’t clean, things are much worse. And so it’d be naive to think that exposing our information will build trust overnight. In fact, it creates doubts as well, but nothing is worse than hiding the truth. So I think government has a huge role in being proactively transparent and relying less on freedom of information requests.
The other thing I think is really important is how we treat information even when it’s public. Everybody loves to talk about evidence-based policy. How could you not like evidence-based policy? We like policy, we like evidence. Evidence-based policy is like apple pie and ice cream. But we often use it to hide behind science, and science doesn’t tell us what the right thing to do is. It tells us what the costs of various options are; moral and value judgments, along with evidence, will tell us what the right thing to do is. We shouldn’t hide behind science, and we should have those moral arguments, because otherwise we do a disservice to science. Let’s treat science as the knowledge that we use to assess risks and rewards, recognize that when we’re assessing risks and benefits and rewards, we’re doing that on a values basis, and let’s be explicit about what those values are. So I think yes, proactive disclosure, even when it hurts, and don’t hide behind knowledge, have the moral discussion, and be the moral leader that government’s got to be.
PK:
I know that you considered, as a panel, other players too. In addition to government, you looked at what legislators can do, social media platforms, the role of journalism and media, and even individuals themselves, you and me. We all have a role to play in busting myths and deconstructing misconceptions and misinformation. Can you elaborate on some of those leading practices? What can we and others do?
AH:
On an individual basis, just slow down. Before you click to retweet something, read it and then ask yourself, is that a good source? Do I know about this source? Do I really want to engage other people without knowing more? Have I checked? Just slow it down. Ask yourself questions. I think it’s really important to take seriously that every time you retweet, you’re engaging other people in what might be a mistake. Retweeting something is a responsibility. Do you know what this magazine is? Have you ever heard of this scientist? Do you know what that person’s standing is in the community? Do you have a purpose for retweeting? So I think we can all be more careful in that respect. There are three categories of intervention that I think have been shown to be effective. The first is to take on misinformation directly, to identify it, to label it, and to reduce the amount of it.
And in that respect, there’s a wonderful program run by the UN, with 19 countries involved, called Verified. What the UN does is monitor the internet for areas of misinformation that they fear could do damage to people’s wellbeing. They label it, and they flood that part of the internet, that social media platform or whatever it is they’ve identified, with really good information in various media forms, really accessible. So that’s part of the answer: deal with it directly, identify it, label it, and then debunk it. And as part of that, in my view, we ought to be a little bit braver about regulating the social media environment. All of us on the panel, and I’m sure you, are committed to preserving freedom of expression, and we don’t want to be dictating what’s true from some government department of truth. But what we can demand is that social media platforms take seriously their role in moderating content and tell us how they do it, that they take seriously the need for transparency about how their algorithms work and what they’re doing to make sure that they are not furthering misinformation.
And by doing that, by forcing them to be more transparent, we’ll also learn much more about what’s going on and how we might address it. I would also regulate how they use private information, so they can’t keep targeting misinformation to me because they know my vulnerabilities. There are things we can do that are totally consistent with preserving freedom of expression, that don’t pretend to pronounce from some platform what is or isn’t true, and still make progress. The second category is to equip people, individuals, to be better at sorting through misinformation, and that means building into the curriculum critical thinking skills, media literacy skills, numeracy skills. And here I would point people to a program in Finland that’s built into the school system, that is directly focused on teaching kids how to identify misinformation and how to ensure that they’re not vulnerable to it, and it has proved to be extremely successful.
So in the short term, we should be telling people what to look out for. That’s one of your questions: what are the signs? It’s really easy, for example, for sites to pretend they’re a legitimate media site and to give themselves labels and names. Have a look, double check, make sure that it’s real. And the final thing is something you and I have already talked about, and that is how to present really good information better so it’s more accessible, and that also means being honest about the level of uncertainty. There are two ways in which science is communicated: one is to persuade, and one is to build trust. We should stop communicating science to persuade, and we should increasingly communicate science to build trust. And that means be honest about the qualifications, be honest about the level of uncertainty, and correct yourself when the evidence requires it.
Making good information accessible, making it transparent, making it honest, giving it integrity. All of that is key. And part of that, I think, is that knowledge institutions have to build relationships with the communities they serve. They have to build trust not at a time of emergency but day after day. They have to engage the community in their work, help the community understand what their work is, bring the community to them and be in the community, embed themselves. So those are the three streams: identify, label, and debunk; prepare people better to deal with it on their own; and communicate good information better than we have.
PK:
Obviously, as a senior public servant, you’ve had to deal with freedom of information. You’ve dealt with privacy commissioners and information commissioners. Let me ask your advice on what you think we, as the Information and Privacy Commissioner of Ontario, could do to contribute to the remedies you lay out in your report, some of the solutions. Our office, of course, is mandated with protecting privacy but also promoting transparency. And so, from both sides of the coin, what can we do better or more of?
AH:
When I was in government, I was not always the model for access to information. It’s labor-intensive, and it’s sometimes harder than the deadlines allow. I’m not making excuses; we got better over time. But if you could encourage departments not to wait for the request, to find systems for proactively making their information available, for letting people know that it’s available, and for making it not only available in some remote way that you have to be an expert to find, but truly accessible, I think that would be, over time, a real trust building exercise. And what office is better equipped to do that? It’s actually, in some ways, a kind of less punitive, less “enforcement” approach to your mandate, which is to say, look, don’t always wait for them to ask, because we know it can be onerous. So just do it.
PK:
It’s interesting, because I think I mentioned to you that we launched a Transparency Challenge a little earlier this year to reveal some great examples of open data initiatives, open government initiatives, and proactive disclosure in a Transparency Showcase. We hope that by seeing these examples, others will be encouraged to say, we could do this too, and we could be more courageous in being proactive with the information we have and making it accessible in some really neat ways that people can understand and make use of. And I’d be interested in getting your views, Alex, on whether you think we’ve hit the mark in at least moving the dial a little bit in terms of encouraging or promoting openness.
AH:
For what it’s worth, I think it’s a great initiative. It’s exactly the kind of thing that your commission can do that would actually make a difference.
PK:
Well, thank you so much, Alex. That’s certainly an encouraging note to end on. You’ve been extremely informative and insightful. I’ve learned a ton just speaking with you and reading the report, which I encourage our listeners who are interested to get a copy of and read. Now, it is a long report, but the executive summary does a fantastic job of synthesizing the key messages. So thank you once again, Alex.
AH:
My pleasure. Thanks for having me.
PK:
For listeners who want to read the Fault Lines report, you can find it on the CCA website, or look for a direct link in the show notes to this episode. For listeners who want to learn more about privacy and transparency, you can visit our website at ipc.on.ca. You can also call or email our office for assistance and general information about Ontario’s access and privacy laws. I want to thank everyone for listening, and let me reassure you, this episode of Info Matters is real and I truly did speak to Alex Himelfarb. Thank you, and until next time.
I’m Patricia Kosseim, Ontario’s Information and Privacy Commissioner, and this has been Info Matters. If you enjoy the podcast, leave us a rating or review. If there’s an access or privacy topic you’d like us to explore on a future episode, we’d love to hear from you. Send us a tweet @IPCinfoprivacy or email us at @email. Thanks for listening and please join us again for more conversations about people, privacy, and access to information. If it matters to you, it matters to me.