S4-Episode 11: The best of season 4

In this special retrospective episode of Info Matters, Commissioner Patricia Kosseim revisits season four’s standout conversations. Highlights include junior high students' views on privacy, Cynthia Khoo on facial recognition, and Robert Fabes on how people experiencing homelessness perceive privacy. Dr. Devin Singh explores AI in health care, while Priya Shastri from WomanAct discusses information sharing in safety planning for survivors of intimate partner violence. The episode also covers the use of digital educational tools in the classroom, mediation in access appeals at the IPC, conversations about the IPC’s Transparency Showcase, and IPC health privacy cases involving cyber attacks and abandoned records.
Notes
- Students from Westboro Academy discuss what privacy means to them [00:52]
- Cynthia Khoo, a technology and human rights lawyer, explains the privacy risks of facial recognition technology [3:37]
- Robert Fabes of The Ottawa Mission shares insights on the barriers people experiencing homelessness face and how to provide access to essential services while respecting their privacy and dignity [8:07]
- Dr. Devin Singh of Toronto’s Hospital for Sick Children speaks about balancing the benefits and risks of the use of Artificial Intelligence technologies in health care [11:33]
- Priya Shastri, Director of Programs at WomanAct, provides insights from the front lines on information sharing, building trusting relationships with victims and survivors, and taking a collaborative, trauma-informed approach to combatting intimate partner violence [14:42]
- Commissioner Kosseim speaks with Shaun Sanderson, a mediator at the IPC, about how mediation works, what parties should do to prepare, and what they can expect to get out of the process. [17:07]
- Commissioner Kosseim shines a light on the innovative projects submitted by Ontario’s public institutions as part of the IPC’s Transparency Challenge 2.0. [21:24]
- Jeff Ward, CEO of Animikii, discusses the longstanding connection between technology and culture, and how incorporating Indigenous values and principles into the development of new technologies can empower communities. [29:25]
- Anthony Carabache, a staff officer in the Professional Development Department at the Ontario English Catholic Teachers’ Association, sheds light on the opportunities and challenges for educators adopting technology in the classroom. [32:35]
- Commissioner Patricia Kosseim delves into significant health privacy cases of 2024 with her colleagues from the IPC. [36:20]
Resources:
- Digital Privacy Charter for Ontario Schools
- Privacy Pursuit! Lesson Plans (free IPC lesson plans to teach kids about privacy)
- Facial Recognition and Mugshot Databases: Guidance for Police in Ontario (IPC guidance)
- Sharing Information in Situations Involving Intimate Partner Violence: Guidance for Professionals (IPC guidance)
- Code of Procedure for Appeals under the Freedom of Information and Protection of Privacy Act and the Municipal Freedom of Information and Protection of Privacy Act (IPC resource)
- IPC Transparency Showcase
- Niiwin data platform (Animikii)
- Privacy and Access in Public Sector Contracting with Third Party Service Providers
- Responding to a Health Privacy Breach: Guidelines for the Health Sector
Info Matters is a podcast about people, privacy, and access to information hosted by Patricia Kosseim, Information and Privacy Commissioner of Ontario. We dive into conversations with people from all walks of life and hear stories about the access and privacy issues that matter most to them.
If you enjoyed the podcast, leave us a rating or a review.
Have an access to information or privacy topic you want to learn more about? Interested in being a guest on the show? Post @IPCinfoprivacy or email us at @email.
Transcripts
Patricia Kosseim:
Hello, I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and you're listening to Info Matters -- a podcast about people, privacy, and access to information. We dive into conversations with people from all walks of life and hear real stories about the access and privacy issues that matter most to them. Welcome to a special episode of Info Matters. As we wrap up season four, we're taking a moment to look back at some of the season's most thought-provoking conversations. Today's episode is a collection of small snippets of those conversations that resonated with our listeners and offered unique perspectives. So let's dive into some of the most memorable highlights from season four. We kicked off season four with an inspiring conversation with junior high students from Westboro Academy who shared their personal views on privacy in a digital world. Their reflections on what privacy means to them were fascinating and refreshingly candid. So a lot of people say that kids don't care about privacy anymore. What do you think about that?
Henrik:
I don't think it's as much that they don't care about privacy. I think it's more that they don't know and they don't think about it. When they're online, that's not what they're thinking about. So if they were taught about it or reminded about it, that could help.
Patricia Kosseim:
What about you, Isaac? Do you care about privacy?
Isaac:
Yes, I actually do care a lot about privacy and like Henrik said, I think that it's not that children don't know about privacy or don't care about it, it's that they don't fully understand the consequences of how they act online.
Patricia Kosseim:
Isaac, do you know anybody who's ever posted something online that they later think, "Hmm, kind of regret having done that"? And how easy do you think it is for somebody to take information down once they've posted it?
Isaac:
Well, yes, I have known people who have regretted what they posted online, and I think that technically it's easy to get rid of something, you just press the delete button. But in the real world, it's actually hard to erase it from everything because people can take screenshots or repost it to other people and things like that.
Patricia Kosseim:
Gosh, you're really smart. That's really good to know. And you're right, you can't just erase it and think it's gone forever. There's so many other places it could have gone. So Gaja, what do you think?
Gaja:
I've known a lot of people that regretted texting someone something. It was usually a bad thing, and they've apologized, they've done all they can. They tried to make sure that no one else had it, that everyone knows both sides of the story, but still it did hurt someone's feelings and it is on there forever and people did take screenshots and it's just you probably should think twice before you post something or text something or do something just to make sure that it is the right thing to do.
Patricia Kosseim:
These students reminded us that young people are open to learning about privacy and capable of making good privacy choices, but it's up to us as parents, educators, and regulators to give them the right guidance and support. In episode two, we were joined by Cynthia Khoo, research fellow at the University of Toronto Citizen Lab and senior associate at the Center on Privacy and Technology at Georgetown Law in Washington DC. Cynthia helped us unravel the intricacies of facial recognition technology and its far-reaching implications for privacy and human rights. Her deep dive into the biases inherent in some of these systems highlighted the importance of oversight and legal guardrails. So Cynthia, listeners may not be familiar with facial recognition technology. Can you unpack that a little bit for us and explain to us how does facial recognition technology work?
Cynthia Khoo:
Broadly speaking, facial recognition technology is used to identify people based on metrics that are taken from their face, whether it is the space between your eyes, the shape of your chin. Different metrics that are taken from different set points on your face that are then used to create a face print that is then used to match up with the face prints of other faces. Whether it is in a large database or whether it is a previous photo that you have submitted to the person who is running the system, for example.
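[Editor's note: as an illustrative aside, the matching step Cynthia describes, turning measurements from facial set points into a "face print" and comparing it against a database, can be sketched in a few lines of Python. This is a toy model, not any real system's method: production systems use learned embeddings rather than hand-picked landmarks, and the landmark-distance representation and the 0.05 threshold below are assumptions made for illustration.]

```python
import numpy as np

def face_print(landmarks: np.ndarray) -> np.ndarray:
    """Turn (x, y) coordinates of facial set points into a 'face print'.

    Uses normalized pairwise distances between the points, so the print
    is invariant to the overall size of the face in the photo.
    """
    n = len(landmarks)
    dists = [np.linalg.norm(landmarks[i] - landmarks[j])
             for i in range(n) for j in range(i + 1, n)]
    v = np.array(dists)
    return v / np.linalg.norm(v)

def best_match(probe: np.ndarray, database: dict, threshold: float = 0.05):
    """Nearest-neighbour lookup in face-print space; None if nothing is close."""
    best_id, best_d = None, float("inf")
    for person_id, stored_print in database.items():
        d = np.linalg.norm(probe - stored_print)
        if d < best_d:
            best_id, best_d = person_id, d
    return best_id if best_d < threshold else None
```

A privacy-relevant property falls straight out of the sketch: once a database of face prints exists, identifying a stranger is a cheap nearest-neighbour lookup, which is exactly why the scale and oversight of these databases matter.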
Patricia Kosseim:
As you know, many people say this, and I don't believe it, I know you don't either, but I'd love for you to refute it in your own words: that old adage, "if you have nothing to hide, then what's the problem?" Why do we worry about surveillance technologies like facial recognition when we're in public spaces? What should it matter, and why should we care?
Cynthia Khoo:
That's a great question. Having anonymity in public is an essential privacy right. The Supreme Court of Canada has recognized this, so it's constitutionally protected and a lot of our privacy laws were created on the basis of assuming that people generally do enjoy privacy of anonymity when they're walking around in public. When that idea was formulated when you were out in public, people did not have the ability to take a photo of you and then instantly know everything about you. Say someone has survived and escaped a situation of intimate partner violence or family violence, and then someone is able to track them down after because they were randomly spotted at a cafe and then they were identified and it somehow got back to the person they're trying to escape. Undercover intelligence agents, for example, could potentially be identified if you got into a random altercation with someone.
So imagine a typical road rage scenario where normally it would happen and then you drive away and generally one hopes that it's done. But what if the person who was in the state of road rage could take a picture of you and follow you home or know where you worked and then if they happen to be a vindictive person, pursue you there? There's no natural end to it anymore because physical space is no longer a limitation. When it comes to civil liberties, think of not being able to attend a protest without the police or government instantly being able to know who you are or your employer, people potentially being able to retaliate against you for speaking out and what you believe in.
The second point is that you don't have anything to hide for now. That can change very quickly. You don't know what will happen in the future. Maybe a government will come into power that disagrees with how you live your life or a part of who you are, and so it's not always necessarily the fact that you have nothing to hide. It's more about who is in power and how might they use that power against you, which is why we need these protections in place for everyone.
Patricia Kosseim:
Cynthia's insights reveal the powerful reach of facial recognition and why it's essential to protect our right to anonymity in public spaces, to safeguard vulnerable individuals and to prevent misuse by bad actors. She highlights how this technology's impact goes beyond expediency, touching on crucial issues of privacy and civil liberties. Her perspective reminds us that while facial recognition can serve valuable purposes, it demands thoughtful oversight to prevent unintended consequences. One of our most impactful episodes featured Robert Fabes of the Ottawa Mission who offered insight on the informational barriers facing people experiencing homelessness, their challenges in obtaining and managing government ID and how their access to essential services hinges on it. His reflections were both eye-opening and deeply moving. What does privacy mean for people experiencing homelessness and how important is it to them in relation to other things in their life?
Robert Fabes:
It's a good question. It signals to me and to the population that your office is looking at this from their perspective, and that's so fundamental to making meaningful change in the area and addressing some of the problems that you raised in your introduction. When you had asked me to do this podcast, I went to the clients at the mission and we talked about this, right? We had a discussion about what privacy means to them and for them, they are much more focused on control of their personal information. They want to be able to have access to it. They're much more concerned with their image being used. They are very sensitive to being identified as a shelter resident. They're sensitive to the stigma that brings, especially when they try to access certain services. Their experience is that in most cases, as soon as somebody sees an address of a shelter as their address, they start being treated differently.
Patricia Kosseim:
Our office is the first to recognize that privacy isn't an absolute right. There's always trade-offs to be made in terms of the personal information that you provide in order to obtain services, for instance. So, for this particular population of persons experiencing homelessness who obviously have basic needs at stake, how can we strike a balance between providing them with the access to the services they desperately need while also protecting their personal information?
Robert Fabes:
For them, the access to their own information in order to be able to get the government service is much more important than the protection of that particular information. Healthcare number, social insurance number, tax records, birth certificate. Because for them, and it goes to what you said, Pat, having that information allows them to meet their basic needs.
Patricia Kosseim:
Rob offered an important perspective into the lives of people experiencing homelessness, showing that for many, privacy is about controlling how their information is used, in a world where stigma associated with a shelter address can deeply affect how they're treated, where a lack of valid ID can impede their access to services, and where privacy becomes a matter of human dignity. His insights remind us that for vulnerable populations, balancing privacy and access to essential services starts by listening and understanding their unique needs and voices. In episode four, I was really pleased to welcome Dr. Devin Singh of Toronto's SickKids Hospital who discussed the transformative potential of artificial intelligence in healthcare and the ethical responsibilities that come with it.
Dr. Devin Singh:
During this journey, particularly over the last five years when we were building out these technologies, we recognized that there needed to be governance, rigor, and thoughtfulness. How do we take these technologies that we're building and actually scale them across the country? That's really the mission to truly make a global impact on the way we transform pediatric care. In order to do that, it's actually not necessarily the technology that's the hardest part. It's thinking through the risks around privacy, around ethics, around patient safety and governance. From a practical perspective, we spent about four years designing something called a Get AI framework that really ensures the way we build and develop technologies internally goes through the necessary checks and balances at every single stage from day one. None of these projects get off the ground without thorough regulatory assessment, which considers things like impact on privacy, impact on patient safety, and an assessment around what are some of the potential unintended consequences that may happen to the system as we start to move the technologies forward.
Patricia Kosseim:
How secure are these technologies in terms of say their interface with commercial vendors or their vulnerability to bad actors and cybersecurity risks?
Dr. Devin Singh:
We have to make an assumption that one day there could be a hack, and if there was a hack, everything has to be encrypted. So no sensitive patient data ever is compromised. It's just a minimum bar standard, but it doesn't necessarily mean every single player in the space does this. I would say to the community, challenge those vendors with questions around what happens if the data is hacked? Tell me all the elements that aren't encrypted. Why aren't they encrypted? How do you control who looks at my data? Do you ever use data for anything other than what we are contracting for and why? And what is it that you're using it for? And can I disagree on behalf of all my patients that we don't consent to you using data in that way? Right? As a health community, we need to challenge and ask those questions so it's crystal clear the measures that are put into place, and then we need to hold our technology companies, and that may be holding our hospitals as well, to these really high standards of maintaining cybersecurity, encryption, data privacy governance and ethics.
Patricia Kosseim:
This episode provided concrete examples of how AI can be deployed to improve health and healthcare services and offered many good takeaways for responsible deployment of AI that prioritizes safety, roots out bias, and puts the patient and their family at the center of decision-making. In episode five, we were joined by Priya Shastri, Director of Programs at WomanAct, a Toronto-based organization dedicated to ending violence against women. Priya shared her years of experience working in the area of intimate partner violence, providing insight into the systemic challenges facing women survivors, and the critical role of information sharing in creating safety plans.
Priya Shastri:
The crucial part of supporting safety for a survivor is holistic support or wraparound services such as housing, health, legal, employment, education, child services, so all these services working together to share information to meet survivors safety needs. And what that looks like is each of these service providers has a piece of information about the survivor and the perpetrator. Perhaps a health provider has some information about that current physical and mental health state of the survivor and the perpetrator. The education system might have some information about the child's safety, so bringing these different community partners and service providers together to share a piece of the puzzle so you get a full picture and you use that full picture then to create a safety plan.
The challenge becomes if you're creating safety plans on a puzzle that isn't really complete. So for example, if you're sitting around the table and there's information that a perpetrator potentially is going to be released, but the survivor might not be aware of that and neither might any of the partners supporting the survivors, and it would be important as part of the safety plan to know that a perpetrator is being released. But sometimes what will happen is there will be a gap in the information and safety plans aren't as effective when there is missing information, and it decreases safety for the survivor. And so that is essentially why it's so crucial and important.
Patricia Kosseim:
Priya highlighted the systemic barriers victims and survivors face, from intergenerational trauma to gaps in support systems, emphasizing the need for wraparound services built on collaboration and transparency. She also emphasized how information sharing, when done thoughtfully and with a trauma-informed approach, can be a powerful tool to ensure safety without causing further harm. In episode six, we were joined by Shaun Sanderson, a skilled mediator in our office who brings a wealth of experience in resolving disputes under Ontario's access to information laws. Shaun takes us inside the room, so to speak, to explain the mediation process and share real-life success stories.
Shaun Sanderson:
At the beginning of receiving an appeal, our office makes sure that we have all of the relevant documentation and the records at issue at the intake stage of the appeal process, and then the file is transferred to mediation. The mediator receives the file, they obviously review all of the documents and correspondence in the file, and then basically the mediator determines how the process will unfold as each appeal that we do receive at our office is unique. Every file has its own special circumstances and issues, and the mediator decides how to proceed with the mediation, and so they will reach out obviously to both parties. And the role really of the mediator is to educate the parties about the process. So we do go through the whole mediation process with them and we go through the FOI legislation with them. We work with the parties to clarify the issues in dispute.
Our job really, and our goal is to try to settle all of the issues in the appeal, or if we're not able to settle them completely, to narrow and clarify the issues that may proceed to adjudication, which is the next stage of the appeal process if the file doesn't settle in mediation. It's important also to point out that mediators are neutral. We are neutral third parties so we can provide opinions, but we don't make decisions.
Patricia Kosseim:
Can you give us an example of where mediation successfully resolved a freedom of information appeal?
Shaun Sanderson:
I had a file recently where it was a request to a township from an individual looking for particular information. They were seeking information about the activities of the chief building officer over a long period of time, including every site visited, the actions and decisions resulting from those visits. Then they were looking for multiple years of township expenditures, including dates, amounts, and purposes of payments. And all of that together, the township claimed that that request was frivolous and vexatious. So we had a situation where the appellant appealed that decision. There was a lot of mistrust between the parties. It was a very acrimonious history. There was a lot of preliminary groundwork that was done before we actually had the teleconference. I did share our frivolous and vexatious fact sheet with the parties, previous orders with the parties. I worked with the appellant to really get at their underlying interests.
What was it that they really were looking for? What kinds of questions were they concerned about? And I was able to share that ahead of time with the township so that they could be prepared to answer those questions during the teleconference. Then we did have a teleconference with all of the parties, and I think the key to this success was that they had the right parties at the table. In addition to the CAO, which most municipalities will bring, they also had the people who were experts in the subject matter of the request. They were able to not only answer the appellant's questions, but to assist them with revising the requests and the timeframes to capture the appellant's interests and what they were really looking for. In the end, they were able to significantly reduce the amount of work for the institution, including the search time, the preparation time, and the overall fees that would be required. So it really was a win-win for both parties and we were able to resolve that appeal.
Patricia Kosseim:
Shaun's expertise and dedication highlight the vital role mediation plays in resolving access to information appeals efficiently and effectively. Her insights into building trust, fostering communication, and finding creative solutions showcase the impact of this important part of the process. In episode seven, we put the spotlight on three projects from our Transparency Challenge 2.0, the Beauty and Benefits of Transparency. These initiatives demonstrate how transparency can foster greater public trust, improve access to vital information and drive meaningful change in our communities. Let's hear about some of these inspiring examples of transparency in action. Starting with my conversation with Steve Orsini and Josh Lovell of the Council of Ontario Universities.
Maybe you can walk us more concretely through the types of data that are publicly accessible on the platform and how they actually benefit students, policymakers and educators.
Steve Orsini:
If you go to our open data site, we have a number of categories that you can access. For example, we have information on enrollment. We have information on the number of teaching faculty, research grants awarded, the graduation and employment rates by program. So if students are interested, how are people getting jobs when they graduate? We have that information available. We also have information available with our application center. We get over 800,000 applications a year. Students will want to know what programs are people applying to, how are they getting employed afterwards, what's the graduation rates and employment rates by program? All that information's available.
Josh Lovell:
I would say there are three buckets of data on our site. The first is what universities are doing, and that's some of the stuff that Steve mentioned, the applications that they're processing, enrollment levels and the programs that students are in. It's operational data that is available on a dedicated portal. The second bucket is about how universities are doing what they're doing. So as an example, we have our financial officers portal that summarizes which revenues are coming into institutions, how they're spending their money and other elements of their day-to-day operations. And then the third bucket is what are universities achieving? What are the graduation rates, the credentials that are being distributed? What are the workforce outcomes? There are synthesized reports as well that kind of articulate some of the biggest, most important trends.
Patricia Kosseim:
I then spoke with Mike Melinyshyn and Damien Mainprize about the Town of Innisfil's Helpful Places Initiative.
Mike Melinyshyn:
As we started adopting different technologies in Innisfil, there was a strong compulsion to make sure we were doing it properly, being transparent about what we were doing, and protecting the personally identifiable information of our residents. I was drafting a data governance strategy for the Town of Innisfil, and in my research I happened to connect with Jackie Lu, who was the director of data integration for Waterfront Toronto, and she was creating an organization called Helpful Places. That is where the project started. She was trying to create a really transparent tool for municipalities to convey to residents the technologies being deployed across town.
The tool is really interesting because it's two-part. There's a signage taxonomy that really simplifies what technology is being deployed. So if it's a sensor, it's a small picture of a sensor or a CCTV camera is a picture of a camera. But the unique part of this was the implementation of a QR code. So residents could scan a QR code, it would tell what technology was being deployed, what data was being collected, where it was being stored, was it anonymized, could they access it? So it was a fully transparent tool, and that's really what I found fascinating about this project.
Patricia Kosseim:
How did you make sure that people understood what data was being collected and why? How did you do it in a simple way?
Damien Mainprize:
The DTPR program created a standard set of taxonomy symbols or icons. They're installed as a part of the signage that gets deployed or installed at or near each of the technology sites. So for us, it was at our two main parks and the garbage bins that were around those locations. The sign is meant to capture the purpose of the project, a brief explanation, but it uses these taxonomy icons to very simply capture the purpose. It also includes an icon for the type of technology being used. The last part of every sign is a QR code, and in that QR code, whenever a user or resident walks by and scans it, they're brought to a web-based application, and that application allows them to drill into almost every aspect of the technology and the program that's going on. So, you can imagine the residents that would really want to dig into it and have more questions, they can drill down into every section of that particular project and every piece of the technology being used and find a lot of those answers.
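[Editor's note: as an illustration of what Damien describes, the QR code on each sign could resolve to a small machine-readable record like the one below. The field names and values are hypothetical, chosen only to convey the idea; they are not the actual DTPR schema.]

```python
import json

# Hypothetical record a sign's QR code might resolve to.
# Field names are illustrative; the real DTPR schema differs.
sign_record = {
    "site": "Community Park (example)",
    "technology": "people-counting sensor",
    "purpose": "Measure park and garbage-bin usage to plan services",
    "data_collected": ["aggregate counts", "timestamps"],
    "personally_identifiable": False,
    "storage_location": "municipal cloud tenancy",
    "retention_days": 365,
    "access_contact": "https://example.org/data-requests",
}

# The web application behind the QR code can render this record directly,
# letting a resident drill into what is collected, where it is stored,
# whether it is anonymized, and how to request access.
print(json.dumps(sign_record, indent=2))
```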
Patricia Kosseim:
My final guest for this episode was Andrea Roberts from the Ministry of the Environment Conservation and Parks, who explained how the ministry is using creative solutions to accelerate the FOI process and make it easier for the public to access environmental property records.
Andrea Roberts:
We definitely here at the Ministry of the Environment have a huge number of FOI requests that come through every year. Right now it's about 9,000 per year, and that represents more than 40% of the total for the entire province. So as a ministry, we are definitely overloaded with FOI requests. Over 95% of them are actually related to property records. And so what's available right now is that a member of the public, and usually it is a business that's interested in land redevelopment, but anyone can go to the website and input an address and ask how many records and what types of records are associated with that property. So they do the search, they pay the small fee, and then they get their search results back, and then that allows them to gauge what next steps do I have to take?
And over 40% of recipients get a response back that says there are no responsive records. That might sound like a not great thing, but actually it is a really good thing for the average client because what it means is that we don't have any records of issues related to the property that they're interested in. And so usually what that means is that they're done, that they don't need any more information, they don't have to file an FOI request, and the great news is they can get that information within five days.
Patricia Kosseim:
What lessons has the ministry learned from this initiative and how might the insights be applied to future services aimed at improving access to government records?
Andrea Roberts:
The first was that we set out to build a service that was directed towards a particular need where the FOI system was sort of being leveraged but not leveraged in a very effective way. So maybe the lesson learned there is that we have this particular problem, but a solution to that problem is not directly to solve that problem, it's to solve this side problem that eventually solves your original problem. And so I think the EPI tool is an example of that kind of creative thinking that my predecessors and the team that continues to work on this project solved and created. I guess the other lesson learned is that staged approach. This did look like an intractable problem, and so taking that multi-staged approach, moving the yardstick slowly but surely, I think is another lesson learned when the problem looks insurmountable.
Patricia Kosseim:
Open data platforms at Ontario universities, innovative technology tools in Innisfil, streamlining access to environmental records, all these projects show the true beauty and benefits of transparency. I was deeply honored to engage in a critical and timely discussion about the need to integrate principles of Indigenous data sovereignty and ethics into the development of technologies. My guest, Jeff Ward, founder and CEO of Animikii, shared his expertise and unique perspective on how technology can empower Indigenous communities while respecting their cultural values and data rights. What are some of the ways that Animikii incorporates Indigenous traditions and values into its technological solutions?
Jeff Ward:
It's a great question. We at Animikii have adopted the seven sacred teachings from my culture, and so those values are love, humility, respect, wisdom, courage, honesty and truth. And so as a technology company, weaving love for example into technology, that's something that we think about and talk about on the day to day. As an organization, as a tech company, we think about how do we connect those values at all levels from our mission, vision, strategic plan, our company goals, even individual goals that our team members make here even impact measurement. So we think holistically about these values. They're not just words on our site, but they are the values that we measure and hold ourselves accountable to, and we bake that into the tech and the software development process from day one.
Patricia Kosseim:
Listening to you today, I can't help but think that we need more voices at the table. So in your view, how can the principles of Indigenous data sovereignty inform our conversations about data ethics and governance in a digital world?
Jeff Ward:
Well, I think ensuring that Indigenous data is protected, that the data remains under the control of the Indigenous peoples, and that the data benefits those peoples specifically and directly, right? I know that was talked about in the other episode with the OCAP principles, and today we've talked about the FAIR and the CARE principles as well. Using those frameworks as a way to guide your process would be important. And also just supporting and promoting Indigenous-led technology that respects our data rights and protocols, and promoting Indigenous technologists to have a voice in all of your work, especially from communities that you're working with.
Patricia Kosseim:
Jeff's insights on Indigenous data sovereignty and the intersection between technology and cultural values highlight the importance of ethical and inclusive approaches to data governance. As Jeff reminded us, the frameworks for responsible action already exist. It's now up to all of us to embrace them and ensure that technology serves the collective good. In episode nine, we explored a topic that's increasingly shaping the future of education -- technology in the classroom. From online learning apps and AI-driven tools to the growing presence of third-party platforms, technology is transforming how students learn and how teachers teach. I was joined by Anthony Carabache, a staff officer with the professional development department at the Ontario English Catholic Teachers Association, and a valued member of the IPC Strategic Advisory Council.
With the increasing use of technology platforms, commercial apps, and other third-party tools in the classroom, we're seeing more and more privatization seeping into the public education system. Is that something you're seeing as well, Anthony?
Anthony Carabache:
Third-party providers are already embedded in our education system, and I witnessed that firsthand in the early 2000s, when vendor after vendor would come knocking on the school board's door looking for an opportunity to license their third-party application. At that time it made sense, because it literally was a tool. But now it's not just that we have a lineup of third-party applicants coming in to sell their wares. They're knocking down the door offering solutions, where what they call a solution isn't really a problem for us. So we're sold a problem, and then we're sold the solution. This is really opening up a gateway of updates and more software coming into our public school system that is quite invasive.
Patricia Kosseim:
Recently, my office released guidance to help public sector organizations identify privacy and access considerations when they're selecting and contracting with third-party vendors, or in your case providers of digital education tools in the classroom. How can associations or sectors or boards have greater negotiation power and influence when entering into these contractual relations with third-party vendors?
Anthony Carabache:
I think this is a very important point to discuss, Patricia. What you've released is, in my opinion, an excellent framework to guide public offices and public institutions through procurement. Outside of regulation, procurement is really the second most important and empowering place for our school board officials to protect our most vulnerable and their staff. Everybody benefits from that. So those guidelines, and that understanding of what is safe and what is not, how to revalue privacy, how to revalue skill development, that happens during the procurement process.
Patricia Kosseim:
Anthony helped shed light on the incredible opportunities that technology offers in the classroom, while also reminding us of the importance of safeguarding student privacy and well-being. As we continue to navigate the digital landscape, it's clear that collaboration among educators, policymakers, and institutions will be key to striking the right balance between innovation and protection, guided by the best interests of our students. In episode 10, I spoke with my colleagues at the IPC about some of the most important health privacy cases we worked on in 2024. First, I spoke to Jennifer Olenick, an adjudicator in our Tribunal and Dispute Resolution Division. Jennifer shared insights from PHIPA Decision 249, a case involving the devastating consequences of cyber attacks on healthcare providers, and offered practical strategies for health information custodians to strengthen their cyber security defenses.
Well, Jennifer, let's start with you. You rendered a decision known as PHIPA Decision 249, that involved a privacy breach at a medical imaging clinic. Can you provide a bit of background about what happened in that case?
Jennifer Olenick:
This is a situation that is sadly becoming all too common, because what we were dealing with in this case was a ransomware attack. This was a medical imaging clinic, and a hacking group encrypted the files on its electronic medical record servers and its file-sharing servers so that the clinic couldn't access them. They also exfiltrated, or took, those records. The hackers also deleted the backup systems, so the clinic wasn't able to simply restore from backup. The result was that a little over half a million patient records were taken: contact information, names, part of the health card number, and dates of birth, so information that's significant to people. The other result was that the clinic was essentially not able to function for about two weeks while it worked to resolve the situation, which it eventually did by paying the ransom and getting a decryption key from the hackers, as well as assurances that they wouldn't do anything further with that personal health information.
Patricia Kosseim:
What are some of the key takeaways that other institutions listening to this can learn from this particular case?
Jennifer Olenick:
You have to pay attention to the basics. So things like monitoring for dormant accounts: you can't just lose track of them, because something can happen in the future. Also, make sure that only those who need the privileges actually have them. Don't give regular staff more privileges on the systems than they actually need; that can reduce the effect of a breach if one occurs. You also need to have proper policies in place so that employees understand that they need strong passwords: passwords that can't be easily guessed and that they haven't used before in other circumstances. A final lesson here is that you should keep an eye on your backup systems and ensure that you always have one that a hacker won't be able to access.
Patricia Kosseim:
Next I turned to Linda Chen, a lawyer in the IPC's Legal Services Department, who discussed the high-profile cyber attack at LifeLabs that affected millions of patients and the ensuing legal battle over privilege claims. This case established an important legal precedent for what regulated entities should expect in the context of a privacy investigation by a regulator, or in this case, two regulators. Linda, can you describe the kinds of documents that LifeLabs was claiming privilege over?
Linda Chen:
There were a number of documents that LifeLabs was claiming privilege over. These included a forensic report from an IT consultant about the causes of the cyber attack, what systems were affected, and essentially how to remediate the problem and prevent cyber attacks of this nature in the future. There was also LifeLabs' internal data analysis, which LifeLabs did in order to figure out which of its customers were affected, and as you mentioned, it was 8.6 million Canadians who were affected by this particular data breach. They also claimed privilege over the correspondence that their consultant company had directly with the cyber attackers. These were negotiations with respect to the ransom, and they claimed privilege over these as well. In addition, there were a couple of documents that were responses to questions put to LifeLabs by the commissioners, but because LifeLabs routed them through lawyers, they took the position that these documents were also subject to privilege.
Patricia Kosseim:
And what did the Ontario Divisional Court ultimately decide?
Linda Chen:
The Ontario Divisional Court ultimately decided that the claims of privilege made by LifeLabs with respect to the facts that were ultimately included in the commissioner's investigation report were not supported by evidence. The court noted that solicitor-client privilege does not extend to protect facts that are required to be produced by the regulated party, in this case LifeLabs, pursuant to a statutory duty or obligation. In this case, under PHIPA and under the Personal Information Protection Act in British Columbia, LifeLabs was required to investigate and remediate the cyber-attack. They were required by law to do that. And so this information that was related to that investigation and remediation could not be privileged with respect to the commissioner's investigation. The court also found that LifeLabs couldn't just avoid these responsibilities by placing facts about the privacy breaches inside documents and then claiming privilege over them.
Patricia Kosseim:
I then spoke to Alanna Maloney, one of our investigators, about PHIPA Decision 260. This case highlights the importance of annual privacy training and confidentiality undertakings by all hospital staff, including physicians, the need for monitoring mechanisms to ensure their completion, and the need for clear policies on the use of personal health information for educational purposes.
Alanna Maloney:
The hospital determined that the physician had accessed just under 4,000 patient files without authorization, and these unauthorized accesses were brought to the physician's attention by the hospital. He actually admitted to accessing the patient files that the hospital had identified as unauthorized. What he explained was that he was doing it for educational purposes; he believed that accessing the hospital's electronic health records of patients to educate himself was an authorized use of personal health information. As part of the investigation, the hospital determined that his accesses weren't targeted. He didn't search for the patients, and the physician didn't have a personal affiliation with the patients whose records he accessed. The physician ended up providing an apology. He was retrained by the hospital, required to sign confidentiality agreements, and his access was monitored by the hospital. They found no further accesses by this doctor, after this breach, to records of individuals he did not provide care to.
Patricia Kosseim:
What are the key takeaways you would say arise from this case?
Alanna Maloney:
So I think there are really four key takeaways that came out of this case. For me, the most important one is that health information custodians must provide privacy training for all staff members, including physicians, upon hire and on an annual basis. It is inadequate for a health information custodian to have different expectations of its physicians than of its other staff members. My second takeaway is that health information custodians really must provide clear guidance on the use of personal health information for education purposes. Third, they must have privacy policies in place that provide clear guidance on expectations and requirements for privacy training and the signing of confidentiality agreements. Finally, health information custodians really need to implement tracking systems to ensure that all of their staff, including physicians, complete privacy training annually and sign and renew confidentiality agreements on an annual basis, to make sure that they're in compliance with their policies and the expectations of PHIPA.
Patricia Kosseim:
One of the most unsettling privacy challenges we face at the IPC is the issue of abandoned health records. These situations can arise when a healthcare provider retires, passes away, or closes their practice without making arrangements for their patients' records. I spoke with Fida Hindi, an IPC lawyer specializing in health privacy law, about a particularly complex case where a medical clinic closed up shop, leaving behind hundreds of patients' medical records that were almost permanently destroyed, with no backup. This case serves as a stark reminder of the importance of having a clear succession plan to protect sensitive personal health information. Have a listen.
Fida Hindi:
It was a particular case where a medical clinic ceased operations on a property because a creditor took possession of the property and then sold it. So there were abandoned records left on the property, some of which were moved to a storage company by the property management company that was hired by the creditor. That's how the file started, and the property management company had notified our office of the abandoned records. During the investigation, the property management company actually threatened to instruct the storage company to destroy the records if they weren't picked up within a very tight timeline of 24 hours. Because of that, the adjudicator issued an interim order to preserve those records and to stop the property management company from directing the storage company to destroy them. A final order was then issued, PHIPA Decision 230, in which findings were made with respect to certain requirements under the act and the custodian was permitted to go and retrieve those records.
Patricia Kosseim:
So very practically then, what should a provider do to put a contingency plan together in advance to avoid abandoning records containing personal health information such as upon death or retirement or say bankruptcy?
Fida Hindi:
Yes, I think it's very important for a health information custodian to have a succession plan covering what will happen to those records of personal health information, whether there is a bankruptcy, a death, or they simply cease operations. And that will depend on a number of legal obligations that the health information custodian would need to consider. If you are a professional regulated by a college, there may be specific retention requirements that you would need to abide by with respect to those records. So speaking to legal counsel and determining how long records need to be retained is very important. Once you have that understanding, then create a succession plan in terms of how long the records will be kept, who will keep them, in what state, and how notice is going to be provided to the individuals so that they may retrieve the records or obtain a copy of them if they wish to do so.
Patricia Kosseim:
So whether it's responding to cyber attacks, preventing unauthorized access, or managing abandoned health records, the lessons learned from these cases can help healthcare providers and custodians strengthen their privacy practices and uphold their obligations under the law. So as we wrap up this best of season four episode, I hope you've been as inspired as I have by the incredible work being done to advance transparency, privacy, and innovation in our communities. If there are one or two topics that piqued your curiosity, I encourage you to listen to the full episodes.
I want to thank all our guests who joined the Info Matters Podcast this past season to share their in-depth experience and highlight the power of commitment, collaboration, and creativity to make a real difference. I also want to thank all of our listeners for joining us today and throughout the season for your ongoing interest in these vital issues. If you'd like to learn more about the IPC and our work, visit our website at ipc.on.ca. And as always, we welcome your thoughts, feedback, and ideas on how we can continue to champion privacy and access rights together.
I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and this has been Info Matters. If you enjoy the podcast, leave us a rating or review. If there's an access or privacy topic you'd like us to explore on a future episode, we'd love to hear from you. Send us a tweet @IPCinfoprivacy or email us at @email. Thanks for listening, and please join us again for more conversations about people, privacy and access to information. If it matters to you, it matters to me.