Podcast

S3-Episode 10: Best of season three


Tune in for riveting conversations about people, privacy, and access to information as Info Matters revisits its most thought-provoking moments from season three. This recap includes a variety of guests talking about cybersecurity, misinformation, genetic data, artificial intelligence, neurotechnology, women’s access rights and more … have a listen!

Notes

A round-up of key moments from season three of the Info Matters podcast, winner of the 2023 Canadian Podcast Award for Outstanding Technology Series.

Resources:

Info Matters is a podcast about people, privacy, and access to information hosted by Patricia Kosseim, Information and Privacy Commissioner of Ontario. We dive into conversations with people from all walks of life and hear stories about the access and privacy issues that matter most to them.

If you enjoyed the podcast, leave us a rating or a review.

Have an access to information or privacy topic you want to learn more about? Interested in being a guest on the show? Submit a comment to @IPCinfoprivacy or email us at @email.

Transcripts

Patricia Kosseim:

Hello, I’m Patricia Kosseim, Ontario’s Information and Privacy Commissioner, and you’re listening to Info Matters, a podcast about people, privacy and access to information. We dive into conversations with people from all walks of life and hear real stories about the access and privacy issues that matter most to them.

Welcome to a special retrospective episode of Info Matters. As we close out Season 3, we’re taking time to reflect on some of the most noteworthy conversations we’ve had with various guests on the show. Today, you’ll hear a curated selection of memorable clips that have resonated deeply with me and our listeners. So let’s journey back and relive some of those pivotal discussions.

As the Information and Privacy Commissioner of Ontario, I’m both fascinated and concerned about how rapidly our world has transformed, with technology blurring the lines between science fiction and our current reality. Dr. Christopher Parsons, former associate at the University of Toronto Citizen Lab, and now Manager of Tech Policy at the IPC, joined me on the first episode of the season, for an insightful dive into how policing tools and methods have leapt decades in just a few short years.

You’ve spoken about how we’re already living in a kind of sci-fi world when it comes to technological capabilities of law enforcement, somewhat similar to what was portrayed in the movie Minority Report that I just mentioned a couple of minutes ago. Tell us, in what ways has policing entered the science fiction world?

Chris Parsons:

So 30 years ago, if someone wanted to figure out what you had been writing to someone else, they'd either have to intercept and steam open letters, which is actually legally fraught in this country. Or if you wanted to tail someone, where are they going in a city? Where are they traveling? You'd need whole teams of officers, either in the security services or law enforcement, to track that person, which means handoffs and cars and planning and hotel stops. There's a lot that goes into it.

Our diaries were discrete things that you could access by breaking into someone's home. Safes were where we held the most important documents. Law enforcement interceptions meant clipping alligator clips to telephone lines. Move to today, and law enforcement can undertake many of those activities with far fewer human resources and much more quickly. We store huge amounts of data in the cloud, encrypted and non-encrypted both. So that means, if you've got a personal diary, there's a better than zero chance it's sitting on a server somewhere. You might have a blog or a website where you post personal thoughts that once were much more private. Smart sensors are proliferating everywhere, which means a lot of our mobility data is available. Smartphones themselves are attaching geolocation information to our photos and other things. So that's another source of information, which means you don't need 30 agents figuring out where someone is going.

We live in a world today where the most advanced law enforcement actors, the most well-resourced law enforcement actors, can do things that really were science fiction just 20 or 30 years ago.

Patricia Kosseim:

Chris then went on to describe predictive policing. For example, how algorithms are used to create heat maps by assessing which neighborhoods are likely to experience higher levels of crime, or how algorithms help determine bail assessments based on factors that predict whether an individual is more or less likely to violate their bail conditions.

But what happens when those algorithms are trained on historical data, usually involving racialized Black and Indigenous communities already overrepresented in the criminal justice system? Chris went on to explain how the predictions in those cases might lean a certain way and actually perpetuate bias and discrimination in the system. It was a fascinating discussion, definitely worth a listen.

In the second episode, I spoke with Dr. Alex Himelfarb about an expert panel he chaired for the Council of Canadian Academies, examining the rise and real-world impacts of misinformation. Here’s a short excerpt from that conversation that helped set the scene.

So as you said in your intro to the report, misinformation isn't really a new phenomenon. You wrote that myths, conspiracy theories, and deliberate deceit are probably as old as human communication itself. Yet misinformation, as you say, has become a defining issue of our time. So how did things get so bad?

Alex Himelfarb:

The first factor, I think, is the rise of social media and individual messaging platforms as a major way in which people get their information. A recent survey suggested that about 90% of Canadians got their information during COVID from social media or messaging apps. And what that means is that they are exposed to vast amounts of information, but also vast amounts of misinformation, and almost entirely without mediation, without guard posts, without signposts, without people helping to guide them through what's true and what's not true.

It also, as you said, creates incentives to create bubbles of self-affirming information. The algorithms and incentives built into social media platforms make conflict, clash, and misinformation much more popular and fast spreading.

Add to that the second big factor, which is decades of declining trust in public institutions and government, but also in the media and in universities and private institutions as well. There’s just a declining trust in one another, a declining social trust as well as a declining political trust.

Research shows that people haven’t really lost trust in the concept of science. What they’ve lost trust in is the institutions they used to rely on to get scientific information. So they don’t believe government the way they used to. They don’t believe public agencies the way they used to. They don’t even believe universities the way they used to, and media people will tell you, they don’t believe mainstream media the way they used to.

Patricia Kosseim:

I then went on to ask Dr. Himelfarb what governments can do to address the flood of misinformation out there and help fix the public trust deficit.

So it's interesting, because you were saying one of the antidotes to misinformation is to encourage a plurality of sources of information and countervailing information. I want to know what the panel found, but also, from your own personal experience, having worked in the senior public service for over 30 years, having served three different governments, what role does government play? What role do transparency and proactive disclosure play in filling the vacuum and providing some of those countervailing sources of truth?

Alex Himelfarb:

I think that's a really important point, especially for the Information and Privacy Commissioner. In a time of misinformation, where there is no trust, not revealing information just intensifies the pressure to come up with explanations and feeds conspiratorial thinking. It's messy. A lot of what government does and the decisions we make are messy. And sometimes, when we expose them, we get hit for it. But we get hit even harder if we hide it.

So when you do spring cleaning and you get the information out, the dust flies, and sometimes you pay a price. But if you don’t clean, things are much worse. And so it’d be naive to think that exposing our information will build trust overnight. In fact, it creates doubts as well. But nothing is worse than hiding the truth. So I think government has a huge role in being proactively transparent and relying less on freedom of information requests.

Patricia Kosseim:

In episode three, I sat down with Philippe Dufresne, Privacy Commissioner of Canada. He and I had a chat and exchanged insights into our respective mandates and our strategic priorities. We talked about the challenges our offices face, particularly some of the concerns arising from technological advancements, their implications for people’s privacy, and the urgency to establish proper guardrails for responsible use.

And I guess I want to know candidly from you, what keeps you up at night about all of this?

Philippe Dufresne:

Great question. And I think, in terms of what keeps me up at night, it’s interesting. I would say, I don’t think they keep me up at night. I sleep very well. I’m grateful for that. But they keep me perked up during the day, I think I would answer that question with that slight twist.

And really, what keeps me perked up during the day is, for the moment, the three operational priorities that I have as my focus. One is the accelerating pace of technology, in particular generative AI, and all that it can do, both the good and the risks to privacy. So that is something I'm very much focused on, wanting to make sure that we can keep up with that, that we can stay ahead of that in terms of regulatory principles, making sure that we can highlight what the guardrails should be.

Patricia Kosseim:

This conversation between Philippe and me was just one of many more we've had since, discussing how our offices can work closely together, along with our other federal, provincial, and territorial colleagues from across the country, to tackle the many privacy and access challenges that cross jurisdictional boundaries.

We went on to adopt four FPT resolutions in 2023, covering access to information, employee privacy, the best interests of the child, and generative AI, probably, I'd say, one of our most productive and prolific years ever as an FPT community.

As director of the McGill Center for Genomics and Policy, Dr. Bartha Maria Knoppers is a true luminary in the field of health law and bioethics. I was honored to welcome Bartha as our fourth guest this season. Bartha and I talked about the clinical use of genetic information, to discover actionable, treatable diseases, including through newborn screening programs and the right of the at-risk child to be found. We also spoke about the importance of biobanks for health research. We talked about the individual right to privacy, but also the responsibility to act in solidarity with other citizens and the necessary conditions for building public trust. Have a listen.

So why is it a problem when some people don’t want to be in a biobank, or don’t want to give their consent to have their samples or other personal data included?

Bartha Knoppers:

Well, that's their right. For whatever reason, they may decline. The problem is that, often, the refusal of individuals is because there's a mistrust in the medical establishment. But the problem with not participating is that your given community, your given familial socioeconomic circumstances and so on, and your own origins, where you came from, where your grandparents came from, are not recorded. And so these data sets are not representative of modern, what we call heterogeneous as opposed to homogeneous, populations. And so then people say there's not enough diversity in your data sets. Well, yeah, we need more people who are not of Caucasian or European origin.

In Canada today, according to Statistics Canada's 2022 figures, 24% of Canadians are of non-European or non-Caucasian origin. We need data on these people. You can't get the right drugs, the right devices, the right treatment, the right understanding if they're not in the data sets, or if they're not in the biobanks. So there needs to be, I think, more genomic literacy, if I can call it that, or health literacy, because everyone benefits from these resources.

Patricia Kosseim:

You talked about public trust. What are some conditions that would enhance public trust when it comes to participation in research biobanks?

Bartha Knoppers:

I would start with transparency. What is so difficult about putting out a bulletin once every three months, publicly available, understandable, in plain English, French, whatever 14 languages you want to put it out in: what's the news? Where are we going? Why? And then three months later, what's the latest news? Where are we going, or still going or not going? Why? We don't have transparency. We don't have sufficient visible governance. And we don't have visible accountability for where mistakes are made.

So I think, once we have that and people know where to go look for stuff, why are we going to Google? Why aren’t we going to the Ministry of Health’s quarterly bulletin to see what’s happening? That kind of thing. I think those are the three elements. Visible governance, transparency, and then accountability.

Patricia Kosseim:

Being host of Info Matters gives me the rare opportunity to talk with fascinating people from all walks of life. In August, I was profoundly touched to welcome Betty-Lou Kristy, Chair of the Minister’s Patient and Family Advisory Council.

Betty-Lou shared with us her deeply personal story that turned her into the passionate, dedicated champion she is today, advocating for patient and family rights. We talked about her mission to improve patient care in Ontario, by putting patients and families at the center of policy making. She also gave me some great advice on how my office can advance our goal of building trust in digital healthcare, by fostering greater transparency, and ensuring all stakeholders have a voice in today’s digital health landscape.

One of my office’s strategic priorities is trust in digital health. And our goal is to promote confidence in the digital healthcare system, by guiding health providers to respect the privacy and access rights of Ontarians, and also to support the use of personal health information for research, to the extent that it serves the public good. So in your opinion, what advice would you have for my office to make advances on that goal?

Betty-Lou Kristy:

I think it probably goes back to that building relationships, bonding piece. It’s a process where the only way we’re going to get to the heart of that is when we actually see a lot of councils or a lot of advisories and really good partnership and engagement and co-design with patients, caregivers and family. And that’s the crux of it. And it sounds so simple, but it’s not, because to do that properly takes a lot of process and a lot of time and a lot of engagement, particularly when you ripple out to equity deserving communities and you’re also trying to mitigate historical harms and historical mistrust, while you’re trying to build trust.

So I think one of the most important pieces is that we actually stand up more seasoned councils, that we build capacity in Ontario for this type of engagement and co-design, that we build all the tools and everything we need to move forward on this, and that there's a genuine interest, intent, and heart-soul connection, that the healthcare system really wants to do this and do it properly. And I think that goes for your office too: visiting some of these councils, coming to some of these councils for advice.

So I think, whatever ways and means there are to build a relationship, but also to share information and build literacy and understanding around what it is you do, what your office does, what it is you're trying to do. Because like I said, the general agreement is, data for good is a good thing.

Patricia Kosseim:

In episode six, I sat down with Laura Neuman, Senior Advisor at the Carter Center in the US, to talk about women's rights to information, and the many obstacles they face in trying to access that information. Laura shared some valuable insights into the real-world impacts on women when they lack the information they need to avail themselves of their right to benefits and services.

Can you tell us some examples of the kinds of obstacles women face, real concrete examples that you found in your study?

Laura Neuman:

In many cases, it was an issue of time. So while women understood the value of information, they didn't have time to get to public agencies. They didn't have mobility. Mobility is a massive issue. Around the world, women are incredibly insecure on public transportation and in trying to get from their homes to the public agency to seek information.

There were also issues of a lack of awareness. Women didn't know about this right. They didn't know how to ask for information. And then just a host of normative and cultural issues. So it wasn't appropriate for women to be asking questions of men. It wasn't appropriate for women to be going and seeking information. There were also issues of fear, where women were just afraid. They were afraid to go to the public agencies. They weren't welcomed.

We placed researchers in over 130 public agencies, and they visited each one three times, on three different days in three different weeks. They saw women trying to come in and get information, to get services, and they were often ignored. They were derided. "Why would you want this? Why do you need information? Who are you to be asking me for information?" And so once that happens, the opportunity for women to go back is going to be pretty limited.

Patricia Kosseim:

I then asked Laura whether these obstacles are specific to developing countries, or if we're seeing some of them play out closer to home. Laura spoke about a much more generalized phenomenon. She talked about the expansion of their Inform Women, Transform Lives campaign, which now involves over thirty-six major cities around the world, including several US and European cities. Notably, no Canadian city has yet signed on. But hopefully, that will change after hearing more about this exciting and critically important campaign.

Laura Neuman:

The campaign is based, again, on this idea that women aren't getting information, and because of it, they're not getting their rights. They're not able to benefit from the services that, in this case, municipal governments provide.

So each city identifies a service that they have that they believe would be transformative. It's a service that, if women had access to it, would be life-changing, for them, for their families, for their communities. They identify that service. They identify the target population of women. In some places, it's just whoever lives in their jurisdiction. In other places, they focused on younger women. Some have focused on older women. Some have looked at socioeconomic disparities. But they identify their target population of women, and then they use information as a bridge.

So we have this awesome service. We have these women who need it. How can we get that to them using information, raising awareness of the type of services that are being provided? Where to go to get it? What are the requirements for application? How long does a service last? What happens if you don’t receive it? All those pieces of information are exactly what the right to information is about. It’s just really getting that meaningful information into the hands of women, and then seeing what happens.

And it’s been unbelievable. We’ve had incredible success stories coming out of these cities, and it’s just being intentional. It’s being innovative, and it’s being intentional, and it’s being committed.

Patricia Kosseim:

Jason Besner, cybersecurity expert and Director of Partnerships at the Canadian Center for Cybersecurity, joined me for episode seven of the season. Jason shared some of the concerning trends we're seeing in cybersecurity, and also some of the practical strategies for organizations and individuals to navigate the ever-expanding digital frontier. Our exchange serves as a reminder of the importance of vigilance, preparedness, and collaboration in the face of cyber threats that impact all of us.

In very simple terms, Jason, why does cybersecurity matter?

Jason Besner:

Cybersecurity matters because it's a greater risk than most people realize, both to personal information and to organizational information, and to the critical services on which we rely. It matters because there are constant attempts by malicious actors looking to breach our systems, looking to steal information, whether it's through impersonation, through fraud, through cyber attacks. This is a very lucrative business. It is generating a lot of money. It has grown into a very sophisticated business. There's an entire underground marketplace where you can hire people to do this kind of work. So it's a big concern because, as you said, one attack getting through can be absolutely devastating for an organization.

If I can just give you a quick number of detected and reported incidents, there were about 2,000 in 2022-23. And on average, we block up to 5 billion malicious attempts on Government of Canada systems a day. So roughly, about a billion-to-one win-loss ratio. But that one can absolutely wreak havoc on an organization, not only on its current business, but its future business, its reputation, and on all of the data that it holds on its clients and partners.

Patricia Kosseim:

I want to turn now to focus in on individuals, people like you, me, people listening to this podcast. What in very practical terms can individuals do to better protect themselves against cyber threats?

Jason Besner:

The single most popular entry vector for ransomware and other cyber attacks remains phishing, so that is using an email or communication designed to trick the recipient into clicking a link or downloading an attachment that will introduce malware into the system.

So the way that phishing succeeds is by catching someone off guard, catching someone distracted. We saw a huge explosion in these attempts during the pandemic, because, certainly, threat actors knew we were working from home. We were juggling a lot of things. People were concerned. And so you saw an increase in this, and you saw individuals or users who wouldn't normally click on something click on it because they were doing too many things.

So the advice that I would give is: think about how you're receiving communication, and how a partner with whom you have a business or personal relationship would be asking you for information. Does your bank normally ask you to provide this type of information in this medium? Do you normally get a text from the Canada Revenue Agency asking you to verify credentials? These are not typical ways that mature organizations will act or try to receive information.

There are certain red flags, no matter how well written or how well crafted it is. If it doesn't feel right, ask questions and investigate. If you're working for an organization, report it to your IT department. Ask them to take a second look at it because it looks suspicious.

The other thing that I would point Canadians to is all the resources at getcybersafe.gc.ca. These resources are really designed for anybody to be able to use and to put into action. And as I said before, the basics will thwart the majority of malicious cyber attempts. So this is: how do you use your mobile phone securely? Using a VPN to encrypt your traffic if you're traveling, instead of using public Wi-Fi, for example. Paying attention to where you download your apps from, making sure that they're secure. Making sure that you don't say no when an app asks you to use multi-factor authentication. These are all very user-friendly guides that can take you through the basics and really just make you a harder target.

The great mass and volume of these attacks are not terribly sophisticated, so with these few steps that you can take to protect yourself, you just want to encourage an actor to move on.

Patricia Kosseim:

The famous scientist, Stephen Hawking, once said, “We are all now connected by the internet, like neurons in a giant brain.” Well, I had the chance to discuss the brain, or more specifically, neurotechnologies that interface with the brain, with Jennifer Chandler, a law professor at the University of Ottawa, affiliated with the Center for Health Law, Policy and Ethics.

Jennifer and I discussed various applications of neurotechnologies, in the health, employment, and law enforcement contexts. We talked about some of the amazing benefits, but also some of the significant legal, ethical and privacy issues these give rise to. I asked Jennifer about an interesting debate going on internationally around the need to formally recognize novel human rights associated with our brain and minds, rights that we never had to think about until now.

There are a lot of people internationally thinking about the things that you've just raised, and some are calling on international bodies to begin to recognize novel neural rights or cognitive rights, to begin to enshrine things like the right to mental privacy, the right to personal identity or personality, the right to free will. What's happening internationally in terms of codifying some of these novel neural rights? What can you tell us about that?

Jennifer Chandler:

So it's a really active discussion at the moment, and it's going on at multiple levels. There are international organizations like the UN Human Rights Division that are looking at this actively at the moment. A report is expected in fall 2024. There are other multinational organizations and regional bodies that have been looking at this as well. And even some countries have started to modify their laws to put in protection for certain aspects of these neural rights. Chile is an example, where the government has decided to modify its legislation.

There are different lists of proposed rights, depending upon who you ask, but some of the key ones, in my view, are this issue of cognitive liberty, which would include the right to alter your own mental states, as well as to protect them against forcible intervention.

Another key proposed novel neural right is this mental privacy issue, and that would be to protect the privacy and integrity of brain data and mental experiences associated with it. Another big right that’s mentioned is fair access to the kinds of brain interventions or augmentations, and the concern here is that some people will have access to neuro-enhancements, others not, and this will exacerbate inequality that already exists in the world. So these are three key ones that are being discussed.

This is a big debate that’s occurring between legal scholars and everybody else who has an interest in this topic. We’ll see where we end up with this. There’s tremendous activity right now, trying to figure this all out, and I think a fairly strong sense that something has to be done.

Patricia Kosseim:

In the final episode of the season, I was joined by professors Jane Bailey and Valerie Steeves of the University of Ottawa, to talk about their research and insights into the unique challenges young girls face in the digital world. They highlighted the crucial role of education in navigating and overcoming these challenges, moving beyond simple rule setting, to more informed and meaningful conversations among young people, educators, and policymakers. In this segment, Professor Steeves shares her perspectives on the need for a nuanced, democratically grounded approach to education, emphasizing understanding to support and empower young girls in the digital age.

Valerie Steeves:

So often, what we do in these types of situations is we give kids rules. Don’t post your name on the internet. Don’t post a photo. Don’t do this, don’t do that. And they kind of look back at us and say, “As if that’s a solution.” First of all, it’s not even the problem, but it’s also not the solution. So I think, a lot of this really requires the deepest form of education, which is grounded in informed dialogue between people.

So we do a lot of education, and we look at these issues and try to create an environment, particularly in schools, but also in community groups, where young people can talk about an example that's distanced from their own lives, and raise the concerns that they have about that particular issue. So we made a twenty-minute film about the experiences of a young girl in a near future who's using a kind of Google Glass technology to improve her math grade. And it ends up having all sorts of social and political consequences for her.

And so we create those kinds of exemplars so we can go into a classroom and young people can use that to talk about the issues and have a discussion among themselves, but also with their teachers and with their parents and the adults in their lives, to bridge that gap so adults can begin to understand exactly what these kids are facing.

Patricia Kosseim:

We also went on to discuss my office's Draft Digital Privacy Charter for schools, which contains high-level principles and commitments to protect children's online safety, but also to promote their digital literacy and empowerment. I asked Jane for her thoughts on the tangible benefits for Ontario schools and school boards of signing on to the digital charter.

How would you describe the benefits of signing on to a charter like that, among the schools and school boards in Ontario?

Jane Bailey:

I think a big part of it is to signal to the public that you are a rights-respecting institution, that you are someone who centers the very people that are meant to be centered in the education system, which is young people themselves. And so to be a leader on that issue, I would think is sort of ideally what every educator wants.

And I suppose the trick to it is that once you've taken the position, you've got to walk the walk. You can't just say you're committing to it, because when you do commit to it, then you're held to it, and that's really important. That doesn't mean that you're always going to get it right. Probably not. You're probably not going to always get it right. But it centers you as an institution that says, "We're so committed to this that we're willing to come out and publicly endorse it, knowing that it opens up conversations about how to get this right, and to recognize that we're going to make mistakes and accept that."

Patricia Kosseim:

So there you have it, folks, just a small sampling of the fascinating conversations we had through season three of Info Matters. We hope you enjoyed this journey back through some of our greatest hits. And if you did, you can always go back and listen to the particular episode that interested you most.

We look forward to having you join us for more Info Matters, as we prepare to launch an exciting lineup of new episodes in season four. Thanks for listening, everyone, and until next time.

I’m Patricia Kosseim, Ontario’s Information and Privacy Commissioner. And this has been Info Matters. If you enjoy the podcast, leave us a rating or review. If there’s an access or privacy topic you’d like us to explore on a future episode, we’d love to hear from you. Send us a tweet at @IPCinfoprivacy, or email us at @email. Thanks for listening, and please join us again for more conversations about people, privacy and access to information. If it matters to you, it matters to me.

Disclaimer
The information, opinions, and recommendations presented in this podcast are for general information only. They should not be relied upon as a substitute for legal advice. Unless specifically stated otherwise, the IPC does not endorse, approve, recommend, or certify any information, product, process, service, or organization presented or mentioned in this podcast, and information from this podcast should not be used or reproduced in any way to imply such approval or endorsement. None of the information, opinions, and recommendations presented in this podcast bind the IPC's Tribunal, which may be called upon to independently investigate and decide upon an individual complaint or appeal based on the specific facts and unique circumstances of a given case.