Privacy and Transparency in a Modern Government

Our goal is to advance Ontarians’ privacy and access rights by working with public institutions to develop bedrock principles and comprehensive governance frameworks for the responsible and accountable deployment of digital technologies.

Our work to further this goal includes:

S4-Episode 11: The best of season 4 (Podcast)
Topics: Artificial Intelligence, Children and Youth in a Digital World, Next-Generation Law Enforcement, Privacy and Transparency in a Modern Government, Trust in Digital Health

In this special retrospective episode of Info Matters, Commissioner Patricia Kosseim revisits season four’s standout conversations. Highlights include junior high students’ views on privacy, Cynthia Khoo on facial recognition, and Robert Fabes on how people experiencing homelessness perceive privacy. Dr. Devin Singh explores AI in health care, while Priya Shastri from WomanAct discusses information sharing in safety planning for survivors of intimate partner violence. The episode also covers the use of digital educational tools in the classroom, mediation in access appeals at the IPC, conversations about the IPC’s Transparency Showcase, and IPC health privacy cases involving cyberattacks and abandoned records.

Ensuring secure disposal of health records: Out of sight is not out of mind! (Case of Note)
Topics: Health, Privacy and Transparency in a Modern Government, Trust in Digital Health

Case of Note: PHIPA Decision 266

Background

A complaint was brought to the Information and Privacy Commissioner of Ontario (IPC) alleging that a health clinic had failed to securely dispose of records of personal health information (PHI). To support the allegations, photographs of patient records found discarded in an unsecured recycling bin were provided.

The IPC wrote to the clinic to inquire into the allegations. The clinic provided a report to the IPC that raised additional concerns, and the IPC initiated an investigation into the matter.

The IPC investigator took custody of the records retrieved from the recycling bin. Despite many of the records being shredded or torn by hand, the investigator was able to recover some sensitive information. This included dates of patient visits, a self-reported health history, a patient’s date of birth, and six other complete patient names associated with the clinic.

During the investigation, the clinic explained that staff began disposing of records to make more space. Some records were shredded; others were torn by hand to avoid noise from the shredding machine that might disturb patients during appointments. The discarded records were picked up by cleaners biweekly and placed in a dumpster in a locked garage area of the plaza where the clinic is located. From there, the garbage was picked up weekly by the local garbage collector.

The clinic acknowledged that the cleaners would have had access to the insecurely destroyed material. It recognized that further steps should have been taken to ensure secure disposal of this information. The clinic also advised that it lacked written policies or procedures for record retention and secure record destruction or disposal. Instead, staff relied on verbal instructions, which the clinic admitted were insufficient.

The clinic notified affected patients of the breach by sending an initial notification letter. Subsequently, the clinic sent another notification letter to nearly 500 patients who may also have been affected.

Findings

The investigator found that at the time of the breach, the clinic was not in compliance with several sections of the Personal Health Information Protection Act (PHIPA). These include legal requirements for health information custodians to:

  • have reasonable safeguards in place to protect personal health information (s. 12(1)),
  • securely handle and dispose of records (s. 13(1)),
  • have proper information practices in place (s. 10(1)), and
  • follow those practices (s. 10(2)).

The investigator concluded that the clinic’s lack of measures and safeguards resulted in its failure to ensure that records of PHI in its custody or under its control were retained and disposed of in a secure manner.

To address the investigator’s concerns, the clinic created and put in place policies and training. This included a new privacy policy to address how the clinic routinely collects, uses, modifies, discloses, retains or disposes of PHI. The clinic also created a client records policy outlining specific measures to be taken to safeguard and securely dispose of client records.

All staff were required to review the new policies and submit a written acknowledgement of their understanding and willingness to comply with them. Two training sessions were also held to familiarize staff with the updated privacy practices and the clinic committed to conducting biannual training going forward. The clinic also updated its employee handbook with additional resources related to its obligations under PHIPA, including a PHIPA training video and links to the entire statute and other resources.

The investigator concluded that these remedial steps brought the clinic into compliance with PHIPA.

Lastly, the investigator found that the unsecured disposal of PHI constituted a loss of PHI, triggering the obligation to notify all affected individuals. While the clinic did provide notice, the investigator identified a deficiency in the initial notification letter (which was remedied) and found that notice should have been provided more quickly. Overall, however, the investigator was satisfied that the clinic provided the notification required by section 12(2) of PHIPA.

Key takeaways

  1. Health information custodians (HICs) must ensure that PHI of their patients is secure at all times, including during the record disposal process.
  2. HICs must have privacy policies in place that address how they collect, use, modify, disclose, retain or dispose of PHI. These policies should specifically address measures to be taken to protect the security of patient records and the secure disposal of these records.
  3. Procedures for secure record disposal depend in part on the storage media used. If dealing with paper, as in this case, records should not simply be torn by hand. They should be properly shredded using a cross-shred or micro-cut shredder to ensure that the records cannot later be reconstructed. This can be done on-site, or, if using an outside agency, a formally signed contract or agreement should be in place. The agreement should address the need to ensure security and confidentiality of records during the disposal process and indicate the specific disposal method to be used.
  4. HICs should provide all staff with regular training on privacy policies and practices and the secure disposal of client records. Staff should receive training annually and be required to submit signed attestations acknowledging that they have read and understand the privacy policies.
  5. HICs must notify affected individuals when personal health information in their custody or control is stolen, lost, or used or disclosed without authority. Unsecured disposal of PHI constitutes a loss of PHI, triggering the obligation to notify affected individuals.

Additional Resources

The power of PETs: Building trust through privacy enhancing technologies
Topics: Privacy, Technology and Security

On January 28, 2025, the IPC had the pleasure of hosting Ontarians at a public event in celebration of Data Privacy Day. The theme was the Power of PETs: Privacy Enhancing Technologies. A recording of the event is available on our YouTube channel and my opening remarks, for those of you interested, can be found on our website.

Here are a few highlights and key takeaways from the event.

Unlocking the power of privacy enhancing technologies

As organizations collect and share more data, the risk of personal information being hacked or misused increases. Privacy enhancing technologies (PETs) can help mitigate these risks by reducing the need to use data that contains personal information in the first place.

PETs include tools and methods that protect personal information, or avoid using it altogether, to allow valuable data analyses to take place in privacy-protective ways. In short, PETs can help organizations keep personal information safe while still unlocking the huge value of data to support research, innovation, and improved public services.

On Data Privacy Day, I was joined by a panel of experts from the public, private and health sectors, academia, and law to discuss privacy enhancing technologies and how they can be used by organizations to address privacy challenges.

Our first panel discussed different types of PETs, the importance of PETs to support responsible innovation, and some of their associated risks.

Luk Arbuckle (Global AI Practice Leader, IQVIA Applied AI Science) explained that PETs work like an air traffic control system, helping manage different levels of data transformation, as well as the surrounding context. He described the process of deidentification, highlighting the Five Safes framework — safe projects, safe people, safe settings, safe data, and safe outputs. The Five Safes emphasize important thresholds one must think about when transforming the data itself; but just as importantly, one must carefully manage the context in which the data are likely to be used by multiple parties for different purposes to minimize the risk of reidentification. And ethical considerations must be taken into account when operating these types of technologies, added Arbuckle.
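To give a concrete feel for the kind of data transformation Luk described, here is a minimal, deliberately simplified sketch of two classic de-identification steps: suppressing a direct identifier and generalizing quasi-identifiers. The field names and thresholds are hypothetical, and a real program would pair such transforms with the contextual controls the Five Safes emphasize.

  def generalize_record(record: dict) -> dict:
      """Toy de-identification: suppress direct identifiers and
      generalize quasi-identifiers (illustrative only)."""
      out = dict(record)
      out.pop("name", None)                        # suppress the direct identifier
      out["age"] = f"{record['age'] // 10 * 10}s"  # e.g., 47 -> "40s"
      out["postal"] = record["postal"][:3]         # keep only the forward sortation area
      return out

  patient = {"name": "Jane Doe", "age": 47, "postal": "M4B 1B3", "diagnosis": "flu"}
  print(generalize_record(patient))
  # {'age': '40s', 'postal': 'M4B', 'diagnosis': 'flu'}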

Luk noted how innovation grows from constraints and limitations that force us to find new ways of doing things. Having worked in both regulatory and industry sectors, he stressed the need for greater collaboration and how innovation and privacy can complement one another. 

Jules Polonetsky (CEO, Future of Privacy Forum) introduced the concept of differential privacy, a mathematical framework allowing organizations to share aggregate patterns while minimizing privacy risks. Using a coin-flipping analogy, he explained how differential privacy enables valid statistical outputs without exposing individual data points: heads, a person tells the truth; tails, they lie. Aggregated across many respondents, the random answers cancel out statistically, without anyone being able to tell which individual responses were truthful. Differential privacy is about adding noise to the data to protect individual privacy while still getting a valid statistical output from the data as a whole. This technique is particularly useful for encrypted data sets and collaborative learning models.
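To make the coin-flip analogy concrete, here is a minimal sketch of randomized response, a classic mechanism that satisfies differential privacy. Note that in the standard formulation, a respondent who flips tails reports a second coin flip rather than lying outright, which is the variant sketched below; all numbers are purely illustrative.

  import random

  def randomized_response(truth: bool) -> bool:
      # First flip: heads, answer truthfully.
      if random.random() < 0.5:
          return truth
      # Tails: flip a second coin and report it instead of the truth.
      return random.random() < 0.5

  def estimate_true_rate(responses: list[bool]) -> float:
      # Observed "yes" rate = 0.5 * true_rate + 0.25, so invert:
      observed = sum(responses) / len(responses)
      return 2 * observed - 0.5

  # Example: 30% of 100,000 respondents truly answer "yes".
  population = [random.random() < 0.3 for _ in range(100_000)]
  noisy = [randomized_response(t) for t in population]
  print(f"Estimated true rate: {estimate_true_rate(noisy):.3f}")  # close to 0.300

Each individual answer carries plausible deniability, yet the aggregate estimate remains statistically valid, which is exactly the trade-off differential privacy formalizes.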

Jules added that there’s a lot of data people want to learn from, but it’s crucial to be aware of the legal and privacy risks involved and to understand the rigorous process PETs entail. The data also needs to be sound – free of cultural or other bias, advised Jules.

Dr. Khaled El Emam (IPC Scholar in Residence, Professor at the School of Epidemiology and Public Health, University of Ottawa, and Senior Scientist, Electronic Health Information Laboratory, CHEO Research Institute) described synthetic data as a form of generative AI that learns from real data patterns to create new, artificial data with the same statistical properties as the original data but that belongs to no real individuals. This technique allows data sharing for medical research while preserving privacy. Although there are some risks, such as potential inferences from the synthetic data, automation makes it easier for analysts to work with complex data sets. (Learn more about synthetic data in my conversation with Khaled in Season 2, Episode 1 of the Info Matters podcast). 
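As a toy illustration of the concept (real synthetic data generators of the kind Khaled described use far more sophisticated generative models), one can fit a simple statistical model to real data and then sample artificial records that preserve its statistical properties. The variables and values below are invented for the example.

  import numpy as np

  rng = np.random.default_rng(42)

  # Stand-in "real" patient data: age and systolic blood pressure,
  # correlated (illustrative values only).
  real = rng.multivariate_normal(mean=[55, 130],
                                 cov=[[120, 45], [45, 90]], size=1000)

  # Fit a simple generative model: estimate the mean vector and
  # covariance matrix of the real data...
  mean_hat = real.mean(axis=0)
  cov_hat = np.cov(real, rowvar=False)

  # ...then sample brand-new records that share the same statistical
  # properties but correspond to no real individual.
  synthetic = rng.multivariate_normal(mean_hat, cov_hat, size=1000)

  print("real mean:     ", real.mean(axis=0).round(1))
  print("synthetic mean:", synthetic.mean(axis=0).round(1))

The risk Khaled flagged, that inferences can sometimes still be drawn from synthetic data, is why generation is typically paired with a privacy evaluation before release.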

Khaled stated that organizations still using older methods of manually de-identifying information must shift from the status quo to more modern and sophisticated technologies. This shift requires time, investment, changes to practices, and retraining of professional technical skills. But given the complexity and scale involved, if we want to continue to share and use data for the greater good, we must start adopting PETs. Technology has to be part of the solution.

How organizations are using PETs

Our second panel spoke about how PETs are being used by organizations in practice, and shared what panelists felt the role of the IPC (and other regulators) should be in helping to encourage the adoption of PETs by Ontario’s institutions.

Mohammad Qureshi (Corporate Chief Information Officer and Associate Deputy Minister, Ontario Ministry of Public and Business Service Delivery and Procurement) kicked off the second panel by highlighting how organizations in the public sector are using PETs. He identified three “buckets” where the public sector leverages data that’s been collected to: 

  1. deliver government services,
  2. better inform public policy, and
  3. advance economic development, prosperity, and innovation.

He spoke about the need for public sector innovation, stressing that throughout, institutions must remain focused on maintaining public trust as their north star.

Pam Snively (Chief Data and Trust Officer, TELUS) shared how PETs are being used in the health sector. She spoke about uses of artificially generated information that mimics real patient data, but still preserves privacy and doesn't contain any actual patient information. For example, synthetic data can be used to advance research into rare conditions and enhance clinical risk prediction modeling.

She also spoke about the concept of federated learning — a privacy preserving approach that enables multiple players to collaboratively share insights or train AI models, including by sharing only the models, rather than actual data. By sharing information in this way, organizations can also learn from one another. Pam emphasized that effective regulatory guidance can help remove uncertainty, foster trust among citizens, and inspire innovators to move ahead without the fear of doing the wrong thing.
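A minimal sketch of the federated averaging idea behind Pam’s description: several participants train on their own private data and share only model parameters with a coordinating server, never the raw records. The toy linear model and the three simulated "hospitals" below are illustrative assumptions, not any organization’s actual setup.

  import numpy as np

  rng = np.random.default_rng(0)

  def local_train(weights, X, y, lr=0.1, epochs=5):
      """One participant's local training (gradient descent on a
      linear model). Only the updated weights leave the premises;
      the raw data never does."""
      w = weights.copy()
      for _ in range(epochs):
          grad = 2 * X.T @ (X @ w - y) / len(y)
          w -= lr * grad
      return w

  # Three simulated "hospitals", each holding private data drawn from
  # the same underlying relationship y = 3*x1 - 2*x2 (plus noise).
  true_w = np.array([3.0, -2.0])
  clients = []
  for _ in range(3):
      X = rng.normal(size=(200, 2))
      y = X @ true_w + rng.normal(scale=0.1, size=200)
      clients.append((X, y))

  # Federated averaging: the server broadcasts the global weights,
  # each client trains locally, and the server averages the results.
  global_w = np.zeros(2)
  for _ in range(10):
      local_models = [local_train(global_w, X, y) for X, y in clients]
      global_w = np.mean(local_models, axis=0)

  print("learned weights:", global_w.round(2))  # close to [ 3. -2.]

Only the weight vectors cross organizational boundaries; each participant’s record-level data never leaves its premises.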

Adam Kardash (Co-Chair, Privacy and Data Management, National Lead, AccessPrivacy) described how the legal landscape is evolving both in Canada and internationally. Some jurisdictions are moving away from the idea of personal information as a black or white concept, to an understanding of identifiability on a spectrum. More and more, regulators are taking into account different levels of identifiability in regulating risks associated with data use and are advancing the adoption of privacy enhancing technologies as a reasonable means of mitigation. 

Adam spoke about how private sector, health sector, and broader public sector clients often look for guidance on leveraging large amounts of data, while still respecting privacy. He stressed the critical role of regulators and the importance of normalizing de-identification methods as central to those discussions, noting that otherwise, the misuse of data can lead to the erosion of consumer trust.

Key takeaways from Data Privacy Day

There is no question that with vast amounts of data and exponential advances in digital technologies, we have a tremendous opportunity to improve our health, our economy, and our society for present and future generations. But without protections and safeguards in place, the risks to our privacy might be too great a cost, particularly in a context where public trust is running low.

Privacy enhancing technologies are innovative ways that allow organizations to extract valuable insights from data while protecting our personal information or avoiding its use altogether.  By being innovative, we can help enable responsible uses of data in ways that can deliver us benefits — without compromising our privacy rights.

Overall, this year’s Data Privacy Day event highlighted for me that PETs are no longer these highly remote and complex technological tools that only large-scale, sophisticated players can afford to experiment with. They have become essential safeguards that must be made more readily accessible to all organizations both large and small for protecting privacy in a data-driven world. In short, they have become a critical part of the practical solution we need. 

The IPC is committed to continuing to raise awareness about privacy enhancing technologies and helping guide and support organizations along their PETs adoption journey. Stay tuned as we update our award-winning guidance, De-identification Guidelines for Structured Data, later this year.

As regulators and privacy advocates, we must meet innovation with innovation if we stand any chance of being effective and relevant. It behooves all of us to work together in search of the kinds of creative, practical, and innovative solutions that PETs can offer.

 

Toronto District School Board cyberattack: Recommendations for improved security (Letters)
Topics: Privacy and Transparency in a Modern Government, Technology and Security

A social engineering attack at a TDSB high school led to unauthorized access to personal information belonging to current and former students, parents, and staff across several schools. The threat actor obtained a school Vice-Principal’s login credentials through the social engineering attack, then retrieved the credentials for the Vice-Principal’s OneDrive account from a connected browser cache, gaining unauthorized access to the affected schools’ systems. Following the breach, the IPC made several recommendations to help the TDSB improve its security posture.

Upholding Ontarians’ privacy and access rights in 2024: Not only the what, but the how
Topics: Artificial Intelligence, Privacy and Transparency in a Modern Government

As philosopher John Dewey once said: “We do not learn from experience … we learn from reflecting on experience.”

Every December provides us with an opportunity to reflect on the experiences of the past year. Typically, I have reflected on the work we’ve done to advance our four strategic priority areas: 1) Privacy and Transparency in a Modern Government; 2) Children and Youth in a Digital World; 3) Next-Generation Law Enforcement; and 4) Trust in Digital Health.

This year, I would like to reflect on our experience of 2024 through a different lens, using our four cross-cutting approaches. These are the ways in which we’ve undertaken to do our work: basically the “how.”  

  1. We will consider accessibility and equity issues

Throughout 2024, we continued our advocacy efforts to protect the privacy and access rights of the most vulnerable.  

For example, in the context of Bill 194, we strongly recommended that an amendment be made deeming children’s personal information to be sensitive personal information.

We provided our best advice on Bill 188’s proposed amendments to the Children, Youth and Family Services Act (CYFSA) to protect the privacy and access rights of those currently or formerly involved in Ontario’s child welfare system. We recommended that any ministerial efforts to enhance the safety and protection of vulnerable children, youth, and families under the CYFSA must be transparent, subject to appropriate public scrutiny, and matched by a proportionate level of robust privacy protection and oversight.

We called for greater protection of children in schools as well. In October, the IPC launched the Digital Privacy Charter for Ontario Schools encouraging school administrators and school board officials to take the pledge! The charter outlines twelve voluntary commitments that schools can make to uphold students’ privacy best interests. This includes protecting students’ personal information when using digital education tools and technology platforms. It also includes empowering students to understand and exercise their own privacy rights by, for example, providing age-appropriate notices about educational technology tools and services and guidance on how to set privacy controls. 

In a recent podcast episode of Info Matters, I spoke with Anthony Carabache from the Ontario English Catholic Teachers’ Association. He underscored the opportunity for schools to adopt the digital charter against a backdrop of rapidly increasing technology being used in the classroom. We discussed the real risks that children face when accessing commercial websites or apps that may nudge user behaviour or incorporate deceptive design patterns.  

In response to a recommendation of the Chief Coroner of Ontario, our office stepped up to address the widespread impact of intimate partner violence (IPV), particularly on women and girls. In May, the IPC released guidance for professionals on responsible information-sharing practices to help prevent IPV. Our publication explains when Ontario's privacy laws permit the sharing of personal information where there's a risk of serious harm to health or safety. 

I spoke about this important topic with Priya Shastri, Director of Programs at WomanAct, in another recent episode of Info Matters. WomanAct was a key partner in helping us develop our IPV guidance. They convened focus group discussions with victims and survivors of IPV so we could hear firsthand from them about the importance of information sharing, building trusting relationships with victims and survivors, and taking a collaborative, trauma-informed approach to combatting IPV, particularly among marginalized communities.

Throughout 2024, we continued to explore Indigenous concepts of privacy and data sovereignty. Jonathan Dewar, Chief Executive Officer of the First Nations Information Governance Centre, was invited to address the annual meeting of federal, provincial and territorial (FPT) information and privacy commissioners and ombuds, reminding us of our collective responsibility to reach out to relevant communities and to advance the Truth and Reconciliation Commission of Canada’s calls to action. 

Recently, I had the honour and privilege of delivering a keynote address to the Association of Native Child and Family Services Agencies of Ontario, where I had occasion to hear, and see, concrete evidence of the systemic, disparate and intergenerational impacts on Indigenous youth overrepresented in the child welfare system. 

I also had an inspiring and eye-opening conversation with Jeff Ward, CEO of Animikii, in another recent Info Matters podcast episode about the longstanding connection between technology and Indigenous culture. Jeff spoke of the ethical imperatives of recognizing data sovereignty and community interests in privacy, and how incorporating Indigenous values and principles into the development of new technologies can help empower communities. 

Several other Info Matters episodes this year have been dedicated to privacy and access issues affecting vulnerable groups and communities. For example, in one episode with Rob Fabes of the Ottawa Mission, we explored the privacy, access and identity challenges of people facing homelessness. 

In another episode, University of Ottawa professors Jane Bailey and Valerie Steeves discussed the hostile environment of social media and technology-facilitated violence inflicted particularly on young women and girls.

  2. We will be bold, but pragmatic

We began 2024 by issuing guidance on the use of administrative monetary penalties (AMPs) under Ontario’s health privacy law that came into force January 1. The guidance addresses the criteria for AMPs and how the IPC will determine penalty amounts. In keeping with our bold, but pragmatic approach, we signaled our intention of being proportionate in our response to privacy violations, favouring education, guidance, and recommendations wherever we can to achieve compliance, and reserving AMPs for only the more severe cases. 

Also, in 2024, we launched the IPC’s Transparency Showcase 2.0. The Beauty and Benefits of Transparency is an online exhibit of several open data/open government projects and initiatives launched by public institutions to improve the day-to-day lives of Ontarians through greater transparency. Understandably, regulators hesitate to be seen as endorsing certain data practices that may one day become the subject of complaints or appeals that must be impartially investigated. That said, there is also tremendous power and influence in showcasing concrete best practices to inspire others to do better, which is why we took this bold but pragmatic step of encouraging positive compliance, with all the necessary provisos and disclaimers.

On the legislative front, it has been a challenging year to keep pace with an ever-changing landscape. We’ve noticed a worrisome trend as of late, with new laws being rushed through without the necessary time and opportunity for public consultation and debate. Nonetheless, we continue to speak up boldly and offer pragmatic recommendations on how the government could still achieve its policy objectives, while respecting Ontarians’ access and privacy rights. 

For example, as I wrote in my last blog, the Ontario government passed Bill 194, the Strengthening Cyber Security and Building Trust in the Public Sector Act, as is, despite the IPC’s 28 recommended amendments on how protection and oversight could be strengthened in areas of cybersecurity, artificial intelligence, digital technologies aimed at children, and data privacy.

Schedule 2 of the Reducing Gridlock, Saving You Time Act introduced an amendment that effectively shields certain records related to priority highway projects from freedom of information requests. The IPC strongly recommended removing that amendment and reverting to the well-established criteria of the Supreme Court of Canada for assessing confidential commercial information, which have stood the test of time. But that recommendation was not heeded either.

Schedule 6 of Bill 231, the More Convenient Care Act, proposes amendments to the Personal Health Information Protection Act to introduce the use of Digital Health IDs as a way of opening up individuals’ access to their Electronic Health Record (EHR) and other digital health services. The IPC issued recommendations on how the policy objectives underlying the bill could be achieved while providing more meaningful access rights and being clearer and more practical to implement and enforce. The Ontario legislature rose last week before Bill 231 could be debated in committee.

Despite these setbacks and disappointments, we will keep advocating for the access and privacy rights of Ontarians and give the legislature our best expert advice, always.  

  3. We will be consultative and collaborative

Throughout the year, any guidance we issued — from third-party contracting, to information-sharing in the context of IPV, to automated licence plate recognition systems — was the result of targeted consultations with relevant interested parties who provided thoughtful input into the development process. Members of the IPC’s Strategic Advisory Council also provided invaluable feedback that greatly improved the end result.

In October 2024, we had the immense honour of hosting the annual meeting of federal, provincial, and territorial (FPT) privacy regulators and ombuds in Toronto. This was a significant opportunity to discuss key issues, enhance collaboration among jurisdictions, and reaffirm a shared commitment to protecting the access and privacy rights of all Canadians, together.

In November, we released a joint FPT resolution about identifying and mitigating harms from privacy-related deceptive design patterns (DDPs). We committed to collaborating with governments and other interested parties to modernize design standards, reduce the presence of DDPs, and champion privacy-friendly design patterns that respect user autonomy.

We also built on our earlier IPV-related guidance by calling on our FPT colleagues to leverage and elevate our work at the national level. The result of our collaboration was a joint resolution issued by privacy regulators and ombuds across the country to guide the responsible disclosure of personal information in situations involving IPV.

To round off the year, we joined with Canada’s information commissioners and ombuds to issue a third joint resolution calling for enhanced transparency in government services. It calls for transparency to be built into the early design and implementation of new systems, administrative processes, and governance models. All of these joint resolutions demonstrate our common resolve, stronger than ever, to collaborate on privacy and access issues of national interest. 

Also, on November 25, together with our BC counterpart, we finally published our 2020 joint investigation report into the 2019 cyberattack on LifeLabs’ computer systems that affected millions of Canadians. The company’s long-time bid to stop its publication abruptly came to an end when the Ontario Court of Appeal refused leave to appeal the decision of the Ontario Divisional Court. The divisional court upheld our finding that the information contained in LifeLabs’ breach investigation report was not subject to solicitor-client or litigation privilege. The lower court also upheld our office’s statutory authority to share information with our BC colleagues and cooperate in a joint investigation, reaffirming our commitment to collaborate with other regulators on enforcement matters wherever we can and where it makes sense to do so.

  4. We will develop knowledge, skills and capacity both internally and externally

On December 11, the IPC created a new Research and Innovation Hub featuring in-depth research reports that provide original insights and comprehensive analyses of privacy and access issues. These research reports, developed in partnership with leading researchers and academics, are intended to leverage expertise and advance knowledge on emerging technologies and innovative approaches to help shape the future of privacy and access. 

Our first publication, co-authored by Dr. Teresa Scassa and Elif Nur Kumru of the University of Ottawa, was funded by the Social Sciences and Humanities Research Council and developed in partnership with the IPC. It provides valuable insights on how privacy regulatory sandboxes can be used to support the development, testing, and validation of new products or services under a regulator’s supervision to ensure compliance before they are deployed. 

We’ve supported and/or commissioned other research reports on specialized topics such as employee privacy, remotely piloted aircraft systems (or drones), and neurotechnology. These, too, will be posted on our Research and Innovation Hub in the new year.

Another way we help develop knowledge and capacity on cutting-edge issues is through our Privacy Day event every year that attracts thousands of participants online and in person. In 2024, our Privacy Day event was on AI in the public sector, which is still available to watch on our YouTube channel. Our upcoming Privacy Day event on January 28, 2025, will focus on the topic of Privacy Enhancing Technologies. There’s still time to register here! 

Conclusion

At this time of year, many dictionaries and press outlets have already selected their word of the year for 2024. For Oxford University Press, it’s “brain rot,” and for The Economist, it’s “kakistocracy.” Meanwhile, Merriam-Webster has chosen “polarization” and Cambridge picked “manifest.”

It’s difficult to identify a unifying theme amongst these. However, if I were to pick a word of the year, mine would be “awakening.” While 2023 marked the year when consumer-facing tools like ChatGPT were publicly released and massively distributed for individuals to experiment with — literally at their fingertips — 2024 marked the year when reality began to sink in. Citizens, businesses, and governments around the world have awoken to the power of AI to make everyday things better and easier, but also its potential impacts on privacy and human rights, not to mention education, jobs, arts and culture, safety and world order. Technology’s ability to profoundly change life as we know it has entered the world’s consciousness. 

As we look ahead to the new year, we have our work cut out for us, particularly as Bill 194 starts to unfold here in Ontario and regulation-making activity begins in earnest to set out actual rules with respect to AI, cybersecurity, and children’s digital privacy. We will press forward with advocating for a strong, coherent, and fit-for-purpose regulatory regime with robust protections for access and privacy rights across all sectors in Ontario. And hopefully, next year’s word will be “action.”  

In the meantime, I wish you and your families all the very best for a happy holiday season and look forward to continuing our important work together in the new year.

Exploring the Potential for a Privacy Regulatory Sandbox for Ontario (Papers)
Topics: Privacy and Transparency in a Modern Government

Innovators, public institutions, and regulators are continually challenged by rapidly emerging technologies, such as artificial intelligence, and understanding how privacy laws apply to ensure compliance. This report, funded by the Social Sciences and Humanities Research Council, was co-authored by Dr. Teresa Scassa and Elif Nur Kumru of the University of Ottawa, in partnership with the IPC. It provides valuable insights on how privacy regulatory sandboxes can be used to support the development, testing, and validation of new products or services under a regulator’s supervision before they enter the market.

Explore more of our in-depth research reports by visiting our Research & Innovation Hub.

Transparency by default: Canada’s Information Commissioners and Ombuds issue joint resolution calling for enhanced transparency in government operations
Topics: Privacy and Transparency in a Modern Government

Toronto, ON – December 10, 2024 – In a joint resolution, Canada’s Information Commissioners and Ombuds are pressing their respective governments to adopt a new standard of transparency by default in the development, management, and delivery of government services. This federal, provincial, territorial resolution calls for transparency to be built into the early design and implementation of new systems, administrative processes, and governance models.  

Government information belongs to the public, and public institutions should proactively make information accessible to the people they serve. In today’s digital age, this has never been more necessary to counter misinformation and disinformation, say Canada’s information regulators.

A culture of transparency and accountability must be the norm, firmly embedded in the day-to-day operations of government at all levels. Such a culture must be encouraged among all public service staff, including political staff conducting government business.

“Transparency is not just a nice-to-have in delivering government services, it’s an integral part of the service itself,” said Patricia Kosseim, Ontario’s Information and Privacy Commissioner. “Government services designed and delivered with transparency in mind are inherently better services. When transparency is built into the very design of public services, it enhances the robustness of government decisions and actions made in developing those services, demonstrates confidence in their quality, and ultimately engenders public trust and adoption of those services.”   

Learn more:
Transparency by default – Information Regulators Call for a New Standard in Government Service  

Media contact:
@email

Bill 194: Ontario’s missed opportunity to lead on AI
Topics: Artificial Intelligence, Privacy and Transparency in a Modern Government

What if the most transformative technology of our time — artificial intelligence — was already impacting Ontarians’ lives without the protections we deserve?

Ontario’s Strengthening Cyber Security and Building Trust in the Public Sector Act, arguably the most consequential bill of the current legislative session, was adopted last Monday.

Bill 194 regulates some of the most significant digital issues of our time: cybersecurity, artificial intelligence, and children’s digital information. Yet it leaves all the critical rulemaking to future regulations set by a government overseeing its own public institutions. The lack of transparency, explicit independent oversight, and democratic process around this bill should be a concern for all Ontarians.

AI is already transforming public services in Ontario, shaping decisions in health care, education, and social services. Done right, AI can enhance efficiency and improve outcomes. Done wrong, it can cause serious harms and have discriminatory impacts. Bill 194 was Ontario's chance to set clear statutory guardrails for public sector use of AI. Unfortunately, that chance has come and gone, leaving Ontarians without the certainty and protections they deserve.

When AI systems influence decisions that touch people’s lives, we must demand that they respect the fundamental principles we all value as a society. 

To be trustworthy, AI systems must be valid and reliable. They must undergo meticulous testing, with human review, to verify that they’re functioning reliably for the purpose for which they were designed, used, or implemented, under real-world conditions.

AI must be safe and designed to protect our lives, physical and mental health, property and economic security, and the environment. This requires robust monitoring and cybersecurity measures.

AI must be developed using a privacy-by-design approach, with safeguards built in right from the start to minimize data collection, reduce privacy and security risks, and ensure personal information is used only when necessary. 

Institutions must be transparent about their use of AI by adopting accessible policies and practices that clearly explain to Ontarians how they are using AI and supporting their access to information rights. 

They must also set clear rules and processes to manage every stage of AI development — from its creation and use to any changes or retirement of AI systems. 

AI-enabled decisions must be traceable — institutions must clearly explain how automated decisions are made and take responsibility for the outcomes. People must be provided with ways to challenge AI decisions, and there must be independent oversight to hold institutions accountable.

Most importantly, AI must affirm the human rights of individuals and communities and actively address historical biases to ensure that decisions made or assisted by AI are fair, non-discriminatory, and respectful of human dignity. 

These are foundational principles. Yet Bill 194 mentions none of them. Instead, it authorizes the minister to set out eventual rules by way of regulation. Regulations are easier to make and change as the technology evolves. This need for flexibility may make sense at the level of technical detail, but not at the level of principle. 

Can you imagine a world where we would not want AI to be valid and reliable, safe, privacy-protective, transparent, accountable and human rights-affirming? 

These globally recognized principles should have been codified in Bill 194 to signal a clear government commitment to stand and live by them. Public institutions seeking to use Ontarians’ data in AI systems or other applications should be bound by these principles as a non-negotiable part of the social contract. Principles as fundamental as these should not be left to the whim of a murky regulation-making process. 

Moreover, these principles cannot exist in a vacuum — they require independent oversight to ensure compliance and hold public institutions accountable for potential misuse or harm. Bill 194 provides no clear or direct avenue for individuals to file privacy complaints with my office if they are legitimately concerned about the overcollection, misuse, or inaccuracy of their personal information and the consequential decisions made about them, including through AI.

Without statutory guardrails and explicit independent oversight, Bill 194 missed the opportunity to secure Ontarians’ trust in AI’s promise to deliver a prosperous digital future for them and their children. 

But continue forward we must. For my part, I will continue to advocate for stronger protections, clearer accountability, and independent oversight throughout the regulation-making process to ensure AI is used to serve Ontarians, not the other way around. 

Federal, provincial, and territorial privacy regulators address responsible information sharing in situations involving intimate partner violence (News Releases)
Topics: Privacy and Transparency in a Modern Government

Toronto, Ontario, November 27, 2024 — Privacy authorities across Canada have issued a joint resolution to guide the responsible disclosure of personal information in situations involving intimate partner violence (IPV). Finalized at their October annual meeting, hosted by the Information and Privacy Commissioner of Ontario, the resolution aims to empower organizations and their staff to make informed decisions about privacy, confidentiality, and public safety.

IPV is a pervasive problem in Canada, primarily harming women and gender-diverse individuals. In 2023, there were 123,319 victims (aged 12 years and older) of intimate partner violence reported to police. While alarming, this statistic very likely underrepresents the true number of IPV incidents nationwide, as many cases go unreported.

Professionals working in the justice, health care, and social services sectors play an important role in reducing or eliminating IPV harm. Private-sector actors can also help identify and take necessary and reasonable steps to prevent potential IPV-related harm to clients and employees. A critical component of IPV prevention and mitigation includes the timely and responsible disclosure of personal information. Effective information sharing could mean the difference between life and death.

In recent years, Canadians have seen a number of public inquiries and inquests involving IPV, which highlighted misconceptions about Canada’s privacy laws. Organizations and their staff reported feeling conflicted about how to respond to an IPV situation due to concerns around their obligations of confidentiality and the risk of infringing privacy rights.

Canada’s privacy regulators collectively affirm that Canada’s privacy laws generally permit the disclosure of personal information if there is a risk of serious harm to health or safety. The resolution calls for a collective effort from governments and organizations to develop privacy-compliant governance frameworks for responsible information-sharing in cases involving risk of serious harm to life, health, or safety when certain conditions are met.

The resolution urges governments to work with their respective privacy regulator or ombuds to ensure organizations develop clear privacy policies around permissible disclosures, conduct public education campaigns, develop culturally sensitive and trauma-informed tools to support organizations serving at-risk communities, and proactively disclose IPV-related data, statistics, and trends to help inform and improve policymaking on this issue.

The resolution also calls on public institutions and private sector organizations to:

  • establish corporate policies on permissible disclosures,
  • require staff training,
  • adopt culturally sensitive and trauma-informed approaches, particularly among marginalized, racialized, or vulnerable groups, and consider the unique experiences of Indigenous communities,
  • be transparent up front about potential disclosures and document them when they occur,
  • ensure privacy and security safeguards are in place, and
  • respect data minimization principles.

For their part, Canada’s privacy regulators commit to working collectively to clarify permissible disclosures under their respective privacy laws by engaging with governments and other key interested parties to educate professionals, affected individuals, and the public on when and how personal information can be disclosed in IPV situations. Together, they aim to provide ongoing policy guidance and support for the responsible disclosure of personal information to help prevent situations of IPV.

“Intimate partner violence is a deeply disturbing issue, affecting individuals and communities here in Ontario and across the country,” said Patricia Kosseim, Information and Privacy Commissioner of Ontario. “This joint resolution is a collective affirmation by federal, provincial, and territorial privacy regulators that privacy laws are not a barrier to disclosing information when there is a risk of serious harm to someone’s health and safety. Together, regulators, governments, and organizations can help support responsible data sharing in situations involving intimate partner violence to protect victims and survivors and help keep our communities safe.”

Learn more: 

Responsible information-sharing in situations involving intimate partner violence 

Information and Privacy Commissioner of Ontario hosts annual meeting of federal, provincial, and territorial information and privacy commissioners and ombuds 

Sharing Information in Situations Involving Intimate Partner Violence: Guidance for Professionals

Info Matters podcast S4-Episode 5: Addressing intimate partner violence: Information sharing, trust, and privacy

Media contact:
@email 
 

The IPC joins regulators from around the world at the 2024 Global Privacy Assembly (News Releases)
Topics: Privacy and Transparency in a Modern Government

The Office of the Information and Privacy Commissioner of Ontario (IPC) joined data protection and privacy authorities from more than 130 countries at the 46th Global Privacy Assembly (GPA) to address universal data-related challenges facing all of us around the world. 

Held in the Bailiwick of Jersey, Channel Islands, the 2024 meeting explored the theme of the power of information. Regulators discussed how to respect and balance the power of information with the need for citizens to have power, control, and dignity over their personal information.

Ontario’s Information and Privacy Commissioner, Patricia Kosseim, led a panel discussion on Education from the Ground Up: The Societal Impact of Privacy Education. Panel participants included Baroness Beeban Kidron, Member of the UK House of Lords; Bertrand du Marais, Commissioner of the Commission Nationale de l’informatique et des Libertés (France); Joyce Lai, Assistant Privacy Commissioner for Personal Data, Office of the Privacy Commissioner for Personal Data, Hong Kong; Leanda Barrington-Leach, Executive Director, 5Rights Foundation; and Matthew Johnson, Director of Education, MediaSmarts.

The discussion emphasized practical steps regulators and other interested parties can take to help educate, engage and empower young people to protect their digital privacy rights and encourage responsible digital citizenship. 

Commissioner Kosseim also participated in a Capacity Building Session on Priority Setting, highlighting the IPC’s strategic plan and the importance of setting strategic priorities to better focus the IPC’s energies and resources on key access and privacy areas where we are likely to have the greatest positive impact.

The assembly passed resolutions on key issues such as neurotechnology, data free flow with trust, certification mechanisms, and the rules and procedures of the GPA.

You can read the resolutions on the GPA website.
