Privacy and Transparency in a Modern Government

Our goal is to advance Ontarians’ privacy and access rights by working with public institutions to develop bedrock principles and comprehensive governance frameworks for the responsible and accountable deployment of digital technologies.

Our work to further this goal includes:

S4-Episode 3: No government ID: Navigating homelessness, identity, and privacy (Privacy and Transparency in a Modern Government | Podcast)

Most of us take our government issued ID for granted. If we lose it, it’s a minor inconvenience. But for people experiencing homelessness, not having valid ID or a fixed address to obtain these documents is a much more serious challenge. Robert Fabes of The Ottawa Mission shares insights on the barriers people experiencing homelessness face and how to provide access to essential services while respecting their privacy and dignity. 

Statement from the Information and Privacy Commissioner of Ontario on Supreme Court decision not to release mandate letters (Privacy and Transparency in a Modern Government | News Releases)

Patricia Kosseim, Information and Privacy Commissioner of Ontario, issued the following statement in response to the Supreme Court of Canada’s decision that the government does not need to disclose its mandate letters:

"Access rights exist to ensure the public has the information it needs to participate meaningfully in the democratic process and that government institutions remain accountable to the people they serve. The Supreme Court’s decision to allow the Attorney General for Ontario’s appeal from the decision of the Ontario Court of Appeal on the IPC’s Order PO-3973 is an important ruling that clarifies the limits of the public’s right to access government information subject to cabinet confidence. Although Ontarians have the fundamental right to know how their governments are operating, there are limited exemptions from this right of access to protect the government’s legitimate confidentiality interests related to specific types of information. We respect the Supreme Court’s decision and will be considering the significance of this precedent and its broader implications for future cases involving access to cabinet records."

Media inquiries:

@email

Artificial Intelligence in the public sector: Building trust now and for the future (Artificial Intelligence, Privacy and Transparency in a Modern Government)

On January 24, 2024, the IPC had the pleasure of welcoming Ontarians to a public event in celebration of Data Privacy Day. The theme was Modern Government: Artificial Intelligence in the Public Sector. If you weren’t able to attend in person or online, the webcast is available on our YouTube channel.

Here are a few highlights and key takeaways from the event.

Exhilarating promises of AI

AI technologies offer tremendous opportunities to improve public services. They can be used to fast track the processing and delivery of government benefits, inform decision-making by policymakers, and improve communications and engagement with citizens.

There is also a growing use of AI technologies to enable earlier diagnosis of complex health conditions, improve public safety, and respond to global emergencies.

Simply put, AI has the potential to transform the world as we know it today.

A 2023 survey by Global Government Forum found that more than one in ten Canadian public servants say they have used artificial intelligence tools such as ChatGPT in their work. This figure is likely to keep rising throughout 2024 as these technologies rapidly advance and become more commonly integrated into day-to-day work.

Associated risks and potential harms

While the opportunities of AI are promising, we know that there are risks. AI is not infallible and can lead to costly mistakes and unsafe outcomes for people.

Flawed algorithms can perpetuate biases embedded in the data used to train them, exacerbating the adverse impacts experienced by vulnerable and historically disadvantaged groups.

AI often relies on very large volumes of personal information or data sets that may not be properly protected and may not always be lawfully collected at source. The lack of transparency around the use of AI, and the inexplicability of the decisions it produces, can lead to unfair outcomes for individuals and erode public trust.

Ever since generative AI tools like ChatGPT were publicly released and became readily accessible at mass scale, concerns have been growing about how they can be used to create and spread misinformation. Sometimes spoofs can be funny and quite benign. Other times, not so. Cyber thieves are already simulating CEO voices and using them to dupe employees into transferring money through increasingly sophisticated phishing attacks. “Deepfakes” are being used to mislead the public by fabricating false statements attributed to political leaders, undermining our democratic processes. Deepfakes can also wreak havoc on financial markets and gravely harm individuals by ruining their reputations or creating false sexual images of them.

Where the magic really happened

We were very privileged to discuss these opportunities and risks with a blue-ribbon panel of experts from fields including philosophy, history, political science, economics, law, social psychology, and technology. Each of them brought a unique perspective to the table based on their deep knowledge and experience.

But hearing them in discussion with one another is where the real magic happened! Together, their contributions were rich, insightful, and engaging, and they helped advance the dialogue around the responsible use of AI in the public sector.

What is your word cloud when it comes to AI?

As a conversation starter, we asked each panelist the following question:

Considering each of you spends much of your day thinking and talking about AI in your respective roles, if we were to create a word cloud above your head, what would be your top three words?

For Melissa Kittmer, Assistant Deputy Minister, Ministry of Public and Business Service Delivery, those were: trustworthy, transparent, and accountable. She spoke about the Ontario government’s Trustworthy AI Framework, which has been under development since 2021 as part of Ontario’s Data and Digital Strategy. This risk-based framework is grounded in three principles: 1) No AI in secret; 2) AI use that Ontarians can trust; and 3) AI that serves all the people of Ontario.

Melissa highlighted the importance of identifying and managing AI risks. These include potential discrimination and violation of human rights, privacy infringements, misuse of intellectual property, and spread of misinformation. She stressed the responsibility of public servants to mitigate those risks when leveraging the benefits of AI in their work.

Stephen Toope’s three words were: excitement, worry, and complexity. As President & CEO of the Canadian Institute for Advanced Research (CIFAR), Stephen spoke about CIFAR’s pan-Canadian AI Strategy. The strategy was launched in 2017 to build AI research capacity here in Canada, while ensuring responsibility, safety, equity, and inclusion. Today, Canada has become a powerhouse in terms of talent. We rank first among G7 countries in the growth and concentration of AI talent, and first in the world in the percentage increase of female AI talent. Canada is also first in AI publications per capita. Canada used to rank fourth on ‘AI readiness’ in terms of investment, innovation, and implementation, but we’ve dropped to fifth, partly due to our lack of access to supercomputing power. Whereas other countries are building major computing platforms, Canada lags in comparison. So, while Canada’s story is one of success, it is a contingent success that requires continued investment in infrastructure and an improved ability to protect our intellectual property.

Stephen added that as we deepen our understanding of AI, we also need appropriate guardrails in place to address discrimination, among other risks. Although some have called for a global AI pact, he thinks that is unlikely to happen. Rather, we should be looking to local and national frameworks, and maybe even regulatory coalitions, to ensure harmonization of high standards and avoid a race to the bottom.

The IPC’s own Manager of Technology Policy and Analysis, Christopher Parsons, chose fast-paced, nuanced, and noisy. Chris spoke about how AI is being used to enhance national security and law enforcement. He noted the rapid growth of surveillance technologies, the plummeting cost of computing power, and enhanced access to analytical capabilities for extracting insights from data, all of which are now being leveraged for public security purposes. While this can be positive in some respects, for cybersecurity and automated defense systems, for example, there can also be significant impacts on our privacy and human rights, and ultimately on public trust.

Chris emphasized concerns about the obscurity of these practices, many of which happen in secret, and the mass collection of personal information, sometimes from unlawful sources. Inferences derived from these data are largely invisible and may not always be accurate, yet they can feed into life-impacting decisions. This can lead to people being wrongfully identified and accused without the ability to understand how they are being drawn into the criminal justice system. It could further exacerbate bias and discrimination, undermine the right to due process and a fair trial, and chill people’s freedom of expression and association.

Interestingly, Colin McKay, former Head of Public Policy at Google, chose similar words to Chris. Colin took a historical and contextual look back at technology development over the past 25 years. Back then, technology companies did not have the internal teams to clearly communicate to the public or to regulators how they were collecting and using personal information, nor the accountability frameworks within which to operate. This created a legacy of mistrust in the use of technologies that naturally frames the context in which we consider consumer applications of AI today.

Colin highlighted the opportunity for companies, large and small, to leverage their past experience with technology development generally. He suggested they could do this by broadening their teams of specialized experts, including technologists, privacy lawyers, data security specialists, and ethicists, to explain and communicate publicly about the complexities of AI in a more nuanced manner. The private sector can play a key role in advancing the debate around data cleanliness and process optimization to reduce bias and improve outcomes. He also urged the development of sustainable AI governance frameworks, supported by key investments across industry, to ensure the clear, focused, and ethically responsible use of AI technology.

For Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa, risk, regulation, and governance were top of mind. She pointed out that legislative and policy frameworks could be aligned across the country, following the lead of the federal government’s Artificial Intelligence and Data Act. Nonetheless, there are still normative spaces for provinces to fill given Canada’s federal reality. One of these important spaces is the provincial public sector, including health care and law enforcement.

There are fundamental governance questions Ontario needs to ask itself before deploying AI, such as: What kinds of problems are we trying to address? Is AI the appropriate tool to solve those problems? If so, what kind of AI, designed by whom, what data should feed it, and who will benefit from it?

In fulfilling their regulatory role, provinces should strive for alignment with the laws and policies of other jurisdictions, both nationally and internationally, and draw from their practical experience implementing them. Teresa also emphasized the need to empower and resource existing regulators, like privacy and human rights regulators, to address AI issues that arise in their respective areas of competence.

The three words for Jeni Tennison, Founder and Executive Director of Connected by Data in the U.K., were power, community, and vision. Jeni discussed some of the challenges and opportunities around transparency of AI. She spoke about the need for AI developers to be transparent for different purposes and at different levels. This includes transparency to the public to enhance public trust, to those procuring AI systems so they can do their due diligence, to intended users of AI so they can carry out their professional obligations with confidence, and to regulators for audit and accountability purposes. A certain level of transparency is also needed to enable fair competition in the market, which is particularly important in a public sector context to avoid government getting locked into a relationship with a single vendor.

Jeni also stressed how important it is to explain how an AI-based system comes up with a given result, so that affected individuals and their representatives can understand what is happening behind closed doors. This knowledge can help challenge any biases, inaccuracies, and unfairness.

Jeni described why transparency is needed not only with respect to algorithmic models and the development process, but also with respect to the results of impact assessments and the number and outcomes of complaints received. These insights are important for communities to understand when and where things may go wrong, a key point for rebalancing relationships of power and remedying the public trust deficit.

Finally, Jeni emphasized the need to enhance capacity and computing power. This is important not only for innovators and developers, but also for civil society, academia, regulators, and other organizations whose role it is to challenge developers and hold them to account for their use and deployment of AI.

Need for guardrails and limits

Governments in countries around the world are developing laws to address these and other issues associated with AI.

The European Parliament and the Council of the EU have reached a provisional agreement after lengthy negotiations over the EU’s proposed AI Act. This act takes a risk-based approach to regulating AI and supporting innovation, but with greater transparency, accountability, and several backstops. These include prohibitions against cognitive behavioural manipulation, the scraping of facial images from the internet, and the use of social scoring and biometric categorization to infer sensitive data.

In California, the AI Accountability Act has been introduced with the aim of creating a roadmap, guardrails, and regulations for the use of AI technologies by state agencies. This includes requiring notice to the public when they are interacting with AI.

In Canada, the Artificial Intelligence and Data Act, part of Bill C-27, would require having measures in place to identify and mitigate the risks of harm or biased output, and to monitor compliance.

However, this federal legislation would not cover the public sector in Ontario, which is why it is so essential for us to develop our own framework here.

The Ontario government has already taken some positive steps by building various components of a Trustworthy Artificial Intelligence Framework. But Ontario can and must do more.

Moving forward with AI: Initiatives from the IPC

Raising awareness of the critical need for strong AI governance has been at the forefront of the IPC’s initiatives in recent years.

Last May, the IPC issued a joint statement with the Ontario Human Rights Commission. We urged the Ontario government to establish a more robust and granular set of binding rules governing public sector use of AI that respects human rights, including privacy, and upholds human dignity as a fundamental value.

My office also joined our federal, provincial, and territorial counterparts in releasing Principles for Responsible, Trustworthy, and Privacy-Protective Generative AI Technologies. These principles are intended to help organizations build privacy protection right into the design of generative AI tools, and throughout their development, provision, and downstream use. They’re devised to mitigate risks, particularly for vulnerable and historically marginalized groups, and to ensure that generative content, which could have significant impact on individuals, is identified as having been created by generative AI.

On the international front, the IPC co-sponsored two resolutions at the 45th Global Privacy Assembly that were unanimously adopted by data protection authorities around the world: one on Generative Artificial Intelligence Systems and the other on Artificial Intelligence and Employment. Both closely align with, and resonate with, the kinds of things we’ve been saying and calling for here at home.

The future of AI

We should be proud to know that Canada and Ontario are clearly punching above their weight globally when it comes to AI innovation. Algorithmic systems are powerful tools of measurement, management, and optimization that can help spur the economy, diagnose and treat disease, keep us safe, and perhaps even save our planet.

Ultimately, however, the successful adoption of AI tools by public institutions can only be achieved with the public’s trust that these tools are being effectively governed. To gain that trust, we need to ensure they are being used in a safe, privacy-protective, and ethically responsible manner, with fair outcomes and benefits for all citizens.

— Patricia

Ontario joins Canadian privacy regulators in passing resolutions on the privacy of young people and workers (Youth, Surveillance, Privacy and Transparency in a Modern Government | News Releases)

TORONTO, ON, October 6, 2023 — Privacy authorities from across the country are calling on their respective governments to improve privacy legislation to protect young people and employees — groups that are significantly vulnerable, each in their own way, to the growing influence of digital technologies.

Federal, provincial, and territorial information and privacy authorities met this week in Québec City for their annual meeting to discuss pressing concerns related to privacy and access to information. These discussions resulted in joint resolutions calling on governments to do more to protect the privacy rights of young people and workers.

For young people, the resolution focuses on the responsibility of organizations across all sectors to actively safeguard young people’s data through responsible measures, including minimized tracking, regulated data sharing, and stringent controls over commercial advertising. It also calls on organizations to safeguard young people’s rights of access, correction, and appeal regarding their personal data.

The employee privacy resolution addresses the recent proliferation of employee monitoring software, which has revealed that laws protecting workplace privacy are either out of date or absent altogether. In our increasingly digital work environments, robust and relevant privacy protections are needed to safeguard workers from overly intrusive monitoring by employers.

Privacy of young people

Youth have a right to privacy and all sectors, including governments and businesses, must put young people’s interests first by setting clear limits on when and how their personal information may be used or shared, the privacy authorities say. They called on their respective governments to review, amend or adopt legislation as necessary to ensure that it includes strong safeguards, transparency requirements, and access to remedies for young people. They also called on government institutions to ensure that their practices prioritize a secure, ethical, and transparent digital environment for youth.

The resolution notes that while the digital environment presents many opportunities for young people, it has also brought well-documented harms, including the impact of social media on physical and mental health. Regulators say that special protections are essential for younger generations, because their information can live online for a long time, and may become a life-long reputational burden.

The resolution also calls on organizations to adopt practices that promote the best interests of young people, ensuring not only the safeguarding of young people’s data, but also empowering them with the knowledge and agency to navigate digital platforms and manage their data safely, and with autonomy. Initial steps include identifying and minimizing privacy risks at the design stage. Other recommendations include making the strongest privacy settings the default; turning off location tracking; and rejecting deceptive practices and incentives that influence young people to make poor privacy decisions or to engage in harmful behaviours.

Privacy in the workplace

With the shift towards increased remote work arrangements and use of monitoring technologies in this digital world, the privacy authorities called on governments to develop or strengthen laws to protect employee privacy. They also urged employers to be more transparent and accountable in their workplace monitoring policies and practices.

Employee monitoring has expanded substantially in its use, technological capabilities, and application in recent years. Many employers have accelerated the use of monitoring technologies as they seek new ways of tracking employees’ performance and activities on-premises or remotely, whether during work or off hours.

Although some level of information collection is reasonable and may even be necessary to manage the employer-employee relationship, the adoption of digital surveillance technologies can have disproportionate impacts on employees’ privacy. It can also significantly affect an employee’s career and overall well-being, including heightened stress levels and other adverse mental health effects, not to mention reduced autonomy and creativity.

The resolution calls for a collective effort from governments and employers to address statutory gaps, respect and protect employee rights to privacy and transparency, and ensure the fair and appropriate use of electronic monitoring tools and AI technologies in the modern workplace.

“Digital technologies offer employees new ways of working and give young people countless opportunities to connect, learn, and grow,” said Patricia Kosseim, Information and Privacy Commissioner of Ontario. “Through technology, we are able to engage in ways that were previously unimaginable. The future is here and it’s critical that governments and organizations act now to protect the privacy of workers and young people, particularly in Ontario, where statutory gaps leave them vulnerable to the risks of digital surveillance.”


For more information:

@email

Ontario joins Canada’s information regulators’ call to modernize the access to information regime (Privacy and Transparency in a Modern Government | News Releases)

Today, Canada’s federal, provincial, and territorial Information Commissioners and Ombuds signed a joint resolution aimed at reinforcing the public's right to access government-held information.

Recognizing the urgent need for change, Canada’s information regulators are calling upon their respective governments to modernize legislation, policies and information management practices to help restore trust in institutions through the preservation and dissemination of our documentary heritage. To deliver on its promise towards greater transparency, accountability and reconciliation, Ontario must make urgent improvements to its access to information regime, supported by critical investments in resources and technological innovations.

“Government-held information is a valuable source of accurate and truthful facts about present and historical events. It’s an antidote to the increasing spread of toxic misinformation, and disinformation, that erodes trust in our democratic institutions,” said Patricia Kosseim, Information and Privacy Commissioner of Ontario. “This joint resolution urges our respective governments to strengthen access to information legislation, promote stronger information management practices and summon the courage it takes to build a culture of openness and transparency through proactive disclosure.”

Building on a joint resolution issued in 2019, today’s resolution was adopted during the annual meeting of federal, provincial, and territorial Information and Privacy Commissioners and Ombuds in Quebec City.


Contact:

@email

RTKW 2023: Why access to information matters more than ever! (Privacy and Transparency in a Modern Government)

For a topic that doesn’t often get as much media attention as its privacy counterpart, access to information has been making a lot more headlines this year. Many are urging the government to improve access to information legislation, and some are even taking it a step further — calling for a complete overhaul of the freedom of information (FOI) system.

Recently, the Globe and Mail ran a series, Secret Canada, highlighting many of the barriers to access to information and the many challenges facing FOI offices in ministries and departments across the country. Commendably, the Globe also developed a database of hundreds of thousands of FOI request summaries filed in Canada, as well as a detailed guide on how to file requests and navigate the system.

The Secret Canada series homed in on the critically important reasons why access to information and government transparency matter, and why we need to fiercely protect and uphold access rights as a central tenet of our democracy. As part of the series, the reporters interviewed former Chief Justice of the Supreme Court of Canada, Beverley McLachlin. In her words:

“… a democracy just can't work without the people having information. That is key to making decisions around how you vote. It's key to making informed decisions. We're in this age of social media where people are substituting opinions for facts. Facts are absolutely basic to good democratic governance and accountability.”

Her quote captures the very essence of why, in our modern digital world, having timely access to accurate facts is critical. Providing information from reliable sources is an effective antidote to all of the misinformation out there — and even disinformation — especially in the age of generative artificial intelligence, when it is becoming so much more difficult to distinguish legitimate sources of information from fabricated stories or lies.

My office has long been advocating for updates to Ontario’s access to information legislation, and as I mentioned in my recent appearance on The Agenda with Steve Paikin, the FOI system can certainly use some legislative improvements. That said, there is so much that governments can do non-legislatively as well. For example, institutions can greatly advance public transparency and trust by:

  • allocating additional resources to support over-strained FOI offices;
  • streamlining processes and gaining greater efficiencies by leveraging new automation tools and technologies;
  • proactively disclosing more meaningful information Ontarians care about, without waiting to be asked; and
  • strengthening a culture, and courage, of openness among Ontario’s institutions, where transparency is normalized and disclosure of information becomes the default.

But as a modern and effective regulator, the IPC has to do its part too. We need to renew our own commitment to the cause and speak in a united voice with our counterparts across the country and internationally, which you’ll be hearing more about during RTKW and in the weeks to follow.

As an office, we also need to streamline our appeals processes, facilitate the participation of the parties before our Tribunal, and render more timely access decisions. To this end, we’ve made it easier for people to file and pay for access appeals using our convenient and secure online service. As we head into RTKW 2023, you’ll also learn more about the amendments we’ve made to our Freedom of Information and Protection of Privacy Act (FIPPA) and Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) Code of Procedure to reflect updates to our tribunal processes and procedures and enhance our capacity to provide timely resolution of access appeals. And you’ll hear about the work we’ve undertaken to codify our past decisions into practical and actionable Interpretation Bulletins to help FOI coordinators on the ground when they receive an access to information request.

As a modern and effective regulator, our role is not only to call out non-compliant behaviour when we see it but to promote and encourage good transparency practices, too. Over the past year, we curated some great examples of how several Ontario institutions have succeeded in releasing data to the public in a way that is meaningful, readily accessible, and free of charge. We displayed these in a Transparency Showcase, a virtual exhibit of open government and open data initiatives. In case you missed it, take time during RTKW 2023 to visit the showcase and have a look around for some ideas and inspiration on how your institution can become more transparent, too!

You may also want to carve out some time to listen to a new Info Matters podcast episode being released as part of RTKW 2023. In this episode, my guest, Laura Neuman of the Carter Center, talks about how access to information, or rather the lack thereof, can greatly exacerbate the inequities of a gender divide that continues to undermine women’s rights, not only in developing countries but in developed countries alike. Laura also describes the Center’s Inform Women, Transform Lives campaign, which aims to empower women by helping them access essential information from their local governments to receive benefits or services, support their families, and engage in civic life.

In fact, while you’re at it, you might want to make yourself a whole FOI playlist in celebration of Right to Know Week. You’ll definitely want to add this recent episode to your line-up, Trust and Truth: Navigating the Age of Misinformation, where I speak with Dr. Alex Himelfarb, chair of the Council of Canadian Academies’ Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation, about how important it is for governments to provide legitimate sources of information to fill a void that otherwise gets too readily filled with so-called facts and theories that aren’t true and can in fact be harmful. Misinformation and disinformation not only adversely affect individuals, but can also destroy social cohesion in communities, with disproportionately negative impacts on marginalized groups and vulnerable populations.

And you might wish to round out your FOI playlist with this earlier Info Matters episode featuring best-selling author and community activist Dave Meslin in Power to the People! Access, privacy and civic engagement. When it comes to open data and access to information, Dave says transparency is everything. Access to information is one of those fundamental building blocks in this great arena we call democracy where every citizen should have an active voice and a role to play in bringing about societal change for the better, starting with their own school or neighborhood.

As we head into Right to Know Week, I encourage you to reflect on the importance of access to information and how it contributes to the well-being of our communities and to the health of our democracy.

My office has some interesting things planned to put the spotlight on access and transparency throughout #RTK2023. Follow the hashtag and our Instagram, LinkedIn, and X (formerly Twitter) accounts for the latest access initiatives from across Canada and around the world.

It’s going to be a great week, and I encourage you to join in the celebration of information rights! Access to information matters. It underpins the very foundations of our democracy and our fundamental freedoms. Let’s not take it for granted.

— Patricia

IPC letter on record-keeping concerns raised in Greenbelt report (Privacy and Transparency in a Modern Government | Advice and Submissions, Letters)

IPC response to Leader of the Official Opposition Marit Stiles’ concerns about government record-keeping practices outlined in the Greenbelt report

Commissioner Kosseim on "Leadership in the Digital Enterprise" (Privacy and Transparency in a Modern Government | Podcast)

Commissioner Patricia Kosseim appeared on “Leadership in the Digital Enterprise” on April 6, 2023. The title of this episode was “A conversation with Ontario's Information and Privacy Commissioner, Patricia Kosseim.”

How to Protect Against Ransomware (Privacy and Transparency in a Modern Government | Fact Sheets)

This fact sheet from the IPC discusses how ransomware has become an increasingly dangerous threat to the security of electronic records and provides guidance on how public institutions and healthcare organizations can protect themselves against it.

Right to Know Week 2022 (Access, Open Government, Access Request Process, Privacy and Transparency in a Modern Government)