Commissioner's Blog


The power of PETs: Building trust through privacy enhancing technologies

Blog by Patricia Kosseim

On January 28, 2025, the IPC had the pleasure of hosting Ontarians at a public event in celebration of Data Privacy Day. The theme was the Power of PETs: Privacy Enhancing Technologies. A recording of the event is available on our YouTube channel and my opening remarks, for those of you interested, can be found on our website.

Here are a few highlights and key takeaways from the event.

Unlocking the power of privacy enhancing technologies

As organizations collect and share more data, the risk of personal information being hacked or misused increases. Privacy enhancing technologies (PETs) can help mitigate these risks by reducing the need to use data that contains personal information in the first place.

PETs include tools and methods that protect personal information or avoid it altogether to allow valuable data analyses to take place in privacy protective ways. In short, PETs can help organizations keep personal information safe while still unlocking the huge value of data to support research, innovation, and improved public services. 

On Data Privacy Day, I was joined by a panel of experts from the public, private and health sectors, academia, and law to discuss privacy enhancing technologies and how they can be used by organizations to address privacy challenges.

Our first panel discussed different types of PETs, the importance of PETs to support responsible innovation, and some of their associated risks.

Luk Arbuckle (Global AI Practice Leader, IQVIA Applied AI Science) explained that PETs work like an air traffic control system, helping manage different levels of data transformation, as well as the surrounding context. He described the process of de-identification, highlighting the Five Safes framework — safe projects, safe people, safe settings, safe data, and safe outputs. The Five Safes emphasize important thresholds one must think about when transforming the data itself; but just as importantly, one must carefully manage the context in which the data are likely to be used by multiple parties for different purposes to minimize the risk of re-identification. Ethical considerations must also be taken into account when operating these technologies, Arbuckle added.

Luk noted how innovation grows from constraints and limitations that force us to find new ways of doing things. Having worked in both regulatory and industry sectors, he stressed the need for greater collaboration and how innovation and privacy can complement one another. 

Jules Polonetsky (CEO, Future of Privacy Forum) introduced the concept of differential privacy, a mathematical framework allowing organizations to share aggregate patterns while minimizing privacy risks. Using a coin-flipping analogy, he explained how differential privacy enables valid statistical outputs without exposing individual data points. Heads, a person answers a sensitive question truthfully. Tails, they answer at random, based on a second flip. Because the share of random answers is known, analysts can subtract that noise from the aggregate without ever knowing which individual answers are genuine. Differential privacy is about adding noise to the data to protect individual privacy, while still getting a valid statistical output from the data as a whole. This technique is particularly useful for encrypted data sets and collaborative learning models.
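The coin-flip idea described above is known as randomized response, one of the simplest mechanisms with a differential privacy flavour. As a minimal sketch (the survey question and the 30% "true" rate are invented for illustration), each respondent's answer is noised individually, yet the population rate can still be recovered:

```python
import random

def randomized_response(true_answer: bool, rng) -> bool:
    """One respondent's noisy answer.

    First flip heads: answer truthfully.
    First flip tails: answer yes/no at random based on a second flip.
    """
    if rng.random() < 0.5:        # first coin flip: heads -> tell the truth
        return true_answer
    return rng.random() < 0.5     # tails -> random answer, hides the truth

def estimate_true_rate(noisy_answers) -> float:
    """Debias the aggregate: observed = 0.5*p + 0.25, so p = 2*observed - 0.5."""
    observed = sum(noisy_answers) / len(noisy_answers)
    return 2 * observed - 0.5

# Simulate 100,000 respondents where 30% truly answer "yes".
rng = random.Random(42)
truths = [rng.random() < 0.3 for _ in range(100_000)]
noisy = [randomized_response(t, rng) for t in truths]
print(round(estimate_true_rate(noisy), 2))  # close to 0.30
```

No single noisy answer reveals what a person actually said, but the known mixing rate lets analysts recover the overall statistic — the essence of adding noise while keeping valid aggregate outputs.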

Jules added that there’s a lot of data people want to learn from, but it's crucial to be aware of the legal and privacy risks involved and to understand the rigorous process that PETs entail. The data also needs to be sound – free of cultural or other bias, advised Jules.

Dr. Khaled El Emam (IPC Scholar in Residence, Professor at the School of Epidemiology and Public Health, University of Ottawa, and Senior Scientist, Electronic Health Information Laboratory, CHEO Research Institute) described synthetic data as a form of generative AI that learns from real data patterns to create new, artificial data with the same statistical properties as the original data but that belongs to no real individuals. This technique allows data sharing for medical research while preserving privacy. Although there are some risks, such as potential inferences from the synthetic data, automation makes it easier for analysts to work with complex data sets. (Learn more about synthetic data in my conversation with Khaled in Season 2, Episode 1 of the Info Matters podcast). 
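Real synthetic data generators use far more sophisticated generative models, but the core idea Khaled describes — learn the statistical properties of real data, then sample artificial records that belong to no real individual — can be sketched in miniature. Here the "patient ages" and distribution choice are invented for illustration:

```python
import random
import statistics

def synthesize(real_values, n, rng):
    """Toy synthetic-data generator (illustrative only): fit a normal
    distribution to the real values, then sample n artificial records
    with the same mean and spread, tied to no real individual."""
    mu = statistics.fmean(real_values)
    sigma = statistics.stdev(real_values)
    return [rng.gauss(mu, sigma) for _ in range(n)]

# Hypothetical "real" measurements (e.g., patient ages) and their synthetic stand-ins.
rng = random.Random(0)
real_ages = [rng.gauss(50, 12) for _ in range(1_000)]
synthetic_ages = synthesize(real_ages, 5_000, rng)
```

The synthetic values preserve the aggregate statistics needed for analysis while containing no actual records — though, as the panel noted, inferences from synthetic data remain a residual risk that production systems must assess.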

Khaled stated that organizations still using older methods of manually de-identifying information must shift from the status quo to more modern and sophisticated technologies. This shift requires time, investment, changes to established practices, and retraining of technical professionals. But if we want to continue to share and use data for the greater good, we must start adopting PETs; the complexity and scale of today's data demand it. Technology has to be part of the solution.

How organizations are using PETs

Our second panel spoke about how they are seeing PETs being used by organizations in practice and shared what they felt the role of the IPC (and other regulators) should be in helping to encourage the adoption of PETs by Ontario’s institutions.

Mohammad Qureshi (Corporate Chief Information Officer and Associate Deputy Minister, Ontario Ministry of Public and Business Service Delivery and Procurement) kicked off the second panel by highlighting how organizations in the public sector are using PETs. He identified three “buckets” where the public sector leverages data that’s been collected to: 

  1. deliver government services,
  2. better inform public policy, and
  3. advance economic development, prosperity, and innovation.

He spoke about the need for public sector innovation, stressing that throughout, institutions must remain focused on maintaining public trust as their north star.

Pam Snively (Chief Data and Trust Officer, TELUS) shared how PETs are being used in the health sector. She spoke about uses of artificially generated information that mimics real patient data, but still preserves privacy and doesn't contain any actual patient information. For example, synthetic data can be used to advance research into rare conditions and enhance clinical risk prediction modeling.

She also spoke about the concept of federated learning — a privacy preserving approach that enables multiple players to collaboratively train AI models or share insights by exchanging only the models themselves, rather than the underlying data. By sharing information in this way, organizations can also learn from one another. Pam emphasized that effective regulatory guidance can help remove uncertainty, foster trust among citizens, and inspire innovators to move ahead without fear of doing the wrong thing.
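The key mechanism behind federated learning is that participants exchange model parameters, never raw records. A minimal sketch of the aggregation step (the hospitals and the two-parameter toy model are hypothetical, and real systems use weighted averaging over many training rounds):

```python
from typing import List

def federated_average(client_weights: List[List[float]]) -> List[float]:
    """Average the model parameters contributed by each client.

    Only trained weights cross organizational boundaries;
    the raw data never leaves each client's own environment.
    """
    n = len(client_weights)
    return [sum(column) / n for column in zip(*client_weights)]

# Three hypothetical hospitals each share only their locally trained weights.
hospital_models = [
    [0.9, 1.1],
    [1.0, 1.0],
    [1.1, 0.9],
]
global_model = federated_average(hospital_models)
```

Each participant benefits from a global model trained on everyone's data, while no patient record is ever pooled centrally — the "sharing the models, not the data" pattern described above.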

Adam Kardash (Co-Chair, Privacy and Data Management, National Lead, AccessPrivacy) described how the legal landscape is evolving both in Canada and internationally. Some jurisdictions are moving away from the idea of personal information as a black or white concept, to an understanding of identifiability on a spectrum. More and more, regulators are taking into account different levels of identifiability in regulating risks associated with data use and are advancing the adoption of privacy enhancing technologies as a reasonable means of mitigation. 

Adam spoke about how private sector, health sector, and broader public sector clients often look for guidance on leveraging large amounts of data, while still respecting privacy. He stressed the critical role of regulators and the importance of normalizing de-identification methods as central to those discussions, noting that otherwise, the misuse of data can lead to the erosion of consumer trust.

Key takeaways from Data Privacy Day

There is no question that with vast amounts of data and exponential advances in digital technologies, we have tremendous opportunity to improve our health, our economy, and our society for present and future generations. But without protections and safeguards in place, the risks to our privacy might be too great a cost, particularly in a context where public trust is riding low. 

Privacy enhancing technologies offer innovative ways for organizations to extract valuable insights from data while protecting our personal information or avoiding its use altogether. By embracing this kind of innovation, we can enable responsible uses of data that deliver real benefits — without compromising our privacy rights.

Overall, this year’s Data Privacy Day event highlighted for me that PETs are no longer remote, complex technological tools that only large-scale, sophisticated players can afford to experiment with. They have become essential safeguards that must be made more readily accessible to organizations large and small to protect privacy in a data-driven world. In short, they have become a critical part of the practical solution we need.

The IPC is committed to continuing to raise awareness about privacy enhancing technologies and helping guide and support organizations along their PETs adoption journey. Stay tuned as we update our award-winning guidance, De-identification Guidelines for Structured Data, later this year.

As regulators and privacy advocates, we must meet innovation with innovation if we stand any chance of being effective and relevant. It behooves all of us to work together in search of the kinds of creative, practical, and innovative solutions that PETs can offer.

