Podcast

S4-Episode 8: Indigenous-led innovation: Aligning technology with community values


Technology can expand opportunities for Indigenous communities, but it can also raise ethical concerns about data sovereignty and community interests in privacy. Jeff Ward, CEO of Animikii, discusses the longstanding connection between technology and culture, and how incorporating Indigenous values and principles into the development of new technologies can empower communities. 

Notes

Jeff Ward is the CEO of Animikii and a member of the Global Partnership on Artificial Intelligence working group on responsible AI. His areas of expertise encompass the development of technology solutions tailored for Indigenous communities and organizations and the integration of traditional Indigenous values and culture into modern business practices. 

  • The “move slow and empower people” philosophy [2:00]
  • Animikii’s day-to-day work and the development of the Niiwin platform, which can be used to support Indigenous data sovereignty [5:07]
  • Incorporating Indigenous values and traditions into business practices — the seven sacred teachings [6:09] 
  • Indigenous people have always been data experts [9:08]
  • Data Back, a free e-book sharing Animikii’s perspectives on Indigenous data sovereignty and principles like OCAP and FAIR [10:32]
  • Frameworks for Indigenous data governance [10:49]
  • Working with the Survivors’ Secretariat, focusing on data related to the Mohawk Institute Residential School [13:09]
  • The extractive nature of artificial intelligence and its environmental impact [16:26]
  • The Global Partnership on Artificial Intelligence (GPAI) working group [20:00]
  • How the principles of Indigenous data sovereignty can inform conversations about data ethics and governance in a digital world [23:14] 

Info Matters is a podcast about people, privacy, and access to information hosted by Patricia Kosseim, Information and Privacy Commissioner of Ontario. We dive into conversations with people from all walks of life and hear stories about the access and privacy issues that matter most to them. 

If you enjoyed the podcast, leave us a rating or a review. 

Have an access to information or privacy topic you want to learn more about? Interested in being a guest on the show? Post @IPCinfoprivacy or email us at @email

Transcripts

Patricia Kosseim:

Hello, I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and you're listening to Info Matters, a podcast about people, privacy, and access to information. We dive into conversations with people from all walks of life and hear real stories about the access and privacy issues that matter most to them.

Hello listeners, and welcome to another episode of Info Matters. I'm recording this episode from Ottawa, built on the unceded Algonquin Anishinaabeg territory. The Anishinaabeg Algonquin Nation has called this territory home for thousands of years. Their culture and presence have nurtured and continue to nurture this land in ways that remind us of the deep connection between people, knowledge, and the environment. Technology, especially today, holds immense power to transform lives and strengthen communities. It can create new ways for Indigenous peoples to preserve their languages, share their stories, and expand economic opportunities.

But if not implemented thoughtfully, these technologies can also present risks, such as undermining cultural knowledge systems, perpetuating biases, or eroding data sovereignty. In this episode, we'll explore how technology, and artificial intelligence in particular, can both empower and challenge Indigenous communities. My guest is Jeff Ward, the founder and CEO of Animikii, a values-driven Indigenous technology company. Jeff is a web designer, software developer, author, and speaker, and he's recognized for his practical application of Indigenous data sovereignty principles, ensuring that Indigenous communities maintain control over their data, and how it's used in technological solutions. Jeff, welcome to the show.

Jeff Ward:

Thanks for having me.

PK:

So you previously worked in Silicon Valley, a place where many entrepreneurs live by Mark Zuckerberg's famous motto, "Move fast and break things." Now you're leading a technology company where the philosophy literally is, "Move slow and empower people." So what inspired you to found Animikii?

JW:

Well, again, thanks for having me, and by way of introduction, I'll greet you in my language. I am originally from Treaty 1 territory, Manitoba, where I grew up, and I'm a member of Sandy Bay Ojibwe Nation. I'm Ojibwe and Métis on my dad's side, and English and Ukrainian on my mom's side. I am chatting with you from Lekwungen territory, colonially known as Victoria, which is also where Animikii Indigenous Technology is headquartered.

Happy to share some of my experience working in Silicon Valley, which is what I was doing just prior to starting Animikii. I was there for the whole Silicon Valley boom around the 2000s and the subsequent bust. It was an interesting time to be there, and when you say, "Move fast and break things," there were a lot of things breaking around that time, including the entire bust of Silicon Valley.

So coming through that experience as a young Indigenous technologist and coming back to Canada, I really wanted to use my skills as a technologist to support the communities I was raised within, and to give back through technology by getting communities, organizations, and individuals online. I did that as a web developer and a software developer, really wanting to build tech in a different way. Many years later, we've developed concepts like "Move slow and empower people," as one example, as you alluded to there. But yeah, really wanting to see how we can weave data and web-based storytelling together to share these stories in a safe and respectful way.

Fast-forward more than 20 years, and we're still doing the same thing, still supporting communities, but the context has evolved and the language has evolved. We've had a lot of interesting movements in this country, including the Truth and Reconciliation Commission, modern-day treaty negotiations from a rights-based approach, and the adoption of the United Nations Declaration on the Rights of Indigenous Peoples. Putting all of that context together with data sovereignty is how we come to the work at Animikii: from that rights-based approach, respecting Indigenous voices, and asking how we might bring the Indigenous technologies that we're building with communities to the world.

PK:

How would you describe what your organization does on a day-to-day basis?

JW:

We work with communities. We build web-based data platforms. So if a community has a need to work with data, to get their data all into one place, to be able to utilize that data for the collective benefit of its peoples, or to create innovations from an Indigenous perspective, we'll build web-based platforms for that. It started out in the early days with building websites, and a lot of the work we do now is all data-focused: online, web-based applications. All of this work over the last 20 years culminated in building our data platform, which is called Niiwin. It takes all the wise practices that we've learned in community over many years of developing these web-based data platforms and turns them into something that can be used to support Indigenous data sovereignty.

PK:

What are some of the ways that Animikii incorporates Indigenous traditions and values into its technological solutions?

JW:

It's a great question. At Animikii, we have adopted the seven sacred teachings from my culture. Those values are love, humility, respect, wisdom, courage, honesty, and truth. So as a technology company, weaving love, for example, into technology is something that we think about and talk about day to day. As an organization, as a tech company, we think about how we connect those values at all levels: our mission, vision, strategic plan, our company goals, even the individual goals that our team members set here, and even impact measurement. We think holistically about these values. They're not just words on our site; they are the values that we measure and hold ourselves accountable to. As a social impact-focused organization, we release annual impact reports, and we do this as a requirement of being a B Corp, a third-party social enterprise designation.

We report annually. So when we say, "We're a social enterprise," we have to prove that. We hold ourselves accountable, and all those values are really woven into everything that we do. So thinking holistically, but even in the tech that we build. Before we write a line of code for ourselves or with our partners, we go through a process we've developed that we call "Pathfinding," which is a way to really think about how community values, traditional values, and Indigenous data governance are brought into the development of digital technologies.

Again, before a line of code is written, we talk about rights-based frameworks like the United Nations Declaration on the Rights of Indigenous Peoples, the TRC calls to action, or the MMIWG calls for justice, and how we can bring these principles into technology. We've all been called upon through the Truth and Reconciliation Commission, but specifically in technology, how do we do this in a way that eliminates or reduces harm in the tech that we build, and brings those Indigenous values to that tech? So yes, these are our values at Animikii that we bring to the tech, but with communities that maybe are not from my culture, we also talk about what their values, traditions, and governance are, and we bake that into the tech and the software development process from day one.

PK:

Could you just repeat those seven principles?

JW:

The seven teachings are love, humility, respect, wisdom, courage, honesty, and truth.

PK:

Wow, those are values we should all live by.

JW:

Absolutely.

PK:

I'd be interested to know how Animikii upholds the concept of Indigenous data sovereignty in its projects and client relationships.

JW:

As an Indigenous tech company, we're trying to shift the narrative. We're trying to let people know that Indigenous peoples have always been data experts. Data is embedded in our stories, teachings, and languages. Data is in the waters, it's in the land. It's documented orally and physically. This word, "data," is really embedded deep in cultures, in all cultures, and technology and culture have always been tightly interwoven. You can't have culture without technology, or technology without culture. You see this in our ceremonies, you see this in our regalia, and you see it everywhere. So yeah, that's the first thing that we want to change: this notion that technology is new for Indigenous people. In fact, we've been bringing Indigenous tech to the world for millennia.

In your opening, you mentioned that Indigenous peoples have been here for thousands of years, and you can't live and thrive that long without technology. When it comes to Indigenous data sovereignty, though, in our work at Animikii, of course the context is software and web-based systems. So if you're talking about digitizing and archiving traditional knowledge, for example, or storing confidential information about community members and their territories, it requires sovereign data systems to house that data.

We wrote a book called Data Back. It's a free e-book, and it shares in a bit more depth and detail our perspectives on Indigenous data sovereignty as practitioners over the last 20 years. Some of the principles there are the OCAP principles, and I know an earlier guest of yours, from the First Nations Information Governance Centre, talked a lot about the OCAP principles, which are amazing and great. There are other frameworks as well, maybe more globally recognized standards. There are the FAIR principles, F-A-I-R: Findable, Accessible, Interoperable, and Reusable.

But then the CARE principles came out of a group called the Global Indigenous Data Alliance, and those are the CARE Principles for Indigenous Data Governance. They kind of take what FAIR started off with as a more general standard and bring it further for Indigenous communities. So CARE is Collective benefit, Authority to control, Responsibility, and Ethics. That Data Back book and some of our other work really share a lot of these frameworks.

So with the CARE principles in particular, the C stands for "Collective benefit." We think about Indigenous data as, by and large, collective data; we also have individual data and data from non-human relations like the land and the waters. Frameworks like CARE are something that we really weave into, for example, the data platform that we're building, or the work that we do in communities. The E is Ethics: how might we minimize harm and maximize benefit to the communities that the data is about?

So it's all in alignment with OCAP, and I think there are opportunities to go further with more of a rights-based approach. The United Nations Declaration on the Rights of Indigenous Peoples really is the framework for reconciliation. There are articles that specifically mention technology, decision-making, and using information for that. So we think collectively there are local standards and international standards, and they're all rights-based. So again, back to that pathfinding process: before we write a line of code, we talk about all of this and make sure that the technology we build is responsible and ethical.

PK:

I'm wondering if you can share with us an example of a project where Animikii's work and philosophy had a significant, positive impact on an Indigenous community?

JW:

Sure. I'll share a story from Ontario, actually. We're currently doing work with the Six Nations Survivors' Secretariat, which is a secretariat focused on the impacts of the Mohawk Institute, a residential school that operated for a very long time and actually predates Canada as an entity. Many communities from the region, some from quite far out west, were brought to the school.

So in this project, you have many years of data that impacts survivors, their kin, their families, their communities. And this data is in many different systems: provincial data systems, federal systems, the churches, the RCMP, so many data stores everywhere. So the community is really actioning data back: how can they get the data that's about them, the data that's sharing these stories and these truths, back into their community, rather than having these truths live in other systems? From that data sovereignty perspective, we're working with them, building a web-based platform using our platform, Niiwin, to get all that data into one spot so that it can benefit the community the data is about.

PK:

That's a great reminder and illustration of this powerful concept of community interests in data. When we talk about privacy rights, traditionally, they're always individually-based. I think more and more we're recognizing the importance of community interests in information and privacy about them, and information that belongs to them. So thank you for all that great work you're doing, particularly in Ontario. We're going to follow that with a lot of interest. Let me zero in if I can now on artificial intelligence. The world, as you know, is witnessing the game-changing power of AI as it's unfolding in real-time. It's in the headlines every day. My office is focused very much on this area, and we're advocating for strong guardrails for the use of AI technologies to ensure that they're used in a responsible, transparent, and ethical manner. There's that, "Ethical," word again. How is Animikii approaching AI from an Indigenous perspective?

JW:

Well, I'd say we're approaching it very cautiously and with a lot of intention. Again, it's that "move slow and empower people." So we're taking the time to assess how Indigenous communities are responding to AI, and it's varied. Through our work, we're learning some things, as we're all learning in this AI context, this world that is changing because of AI. I mentioned our Data Back book; the next edition of the book is about AI in an Indigenous context. So we're going to be sharing a lot of our thoughts in great detail, with even some recommendations for Indigenous communities. In the context of data, in the work that we've been doing building our data platform, AI needs a lot of data. And so AI is very extractive in nature as well, especially when you're thinking about the large language models that just need lots and lots of data.

That extractive nature is starting to be discussed a little more broadly. And of course, Indigenous peoples have a unique position, or perspective, on being extracted from through this whole colonial project. So how might we protect Indigenous data from further extraction? It's more of the same: now they're after our data, kind of thing. How might Indigenous data sovereignty protocols be respected in some of these new AI models, and is that even possible? But one thing that is rapidly becoming understood more broadly is the environmental impact of AI. AI obviously needs a lot of data centers, and data centers need water, they need cooling, and they need land. I think a lot of times we think of data and AI as these invisible things in space and in the "cloud," things that don't really have a physical manifestation, but they really do. Data centers often take prime real estate along waterways.

If you look at the growth of data centers, it's almost like, I'm not sure if you or the listeners have played SimCity, where you're placing down buildings along the water. It's just plop, plop, plop, plop. This is happening right now, and it's happening at an alarming rate. These data centers also need chips and materials, and those are extracted from the earth somewhere, from Indigenous territory somewhere. These data centers need an inordinate amount of energy. The emerging research and conversation we're reading about, even these past couple of weeks, is that AI and its energy needs are set to undo many energy targets and set so much of these movements back.

So we're working with a community in the United States where many of the data centers are, where a lot of the data that flows through North America is. I was sitting with one of the elders, one of the tribal leaders, and she said, "We are literally fighting data centers." Cool water is brought in to cool off the data center, and warm water is put back into the waterways. So there are real environmental effects.

So you're asking about our thoughts on AI in an Indigenous context. We think about how these data centers are being placed on Indigenous lands that have been stolen through the legacy of colonization. When we think about Indigenous guardianship, or Indigenous stewardship over the lands, as we have practiced for thousands of years, it really illustrates the need for Indigenous voices and Indigenous technologists to be at the table in all of these discussions in this era of AI.

My note earlier around extraction: it's not a new phenomenon for us. We've been extracted from for years. But also, if you think on the cosmic scale, the internet, everything is still very early, and there's the opportunity to bring Indigenous voices to the table and start leading in some of these discussions. We have a bright, young, up-and-coming group of technologists. It was a lonely place to be as an Indigenous technologist 20 years ago. It's less lonely now, and there are so many bright minds that I'm meeting every week who are ready and willing to have these hard conversations.

PK:

Well, you certainly pioneered the path for many other inspiring people and technologists to work alongside you and follow in your footsteps on some of these important issues. Hearing you speak really takes the word "extractive" to a whole other level when you think about not only data but the environment as well. I know you've been active at the international level as part of an expert working group associated with the Global Partnership on Artificial Intelligence. I was hoping you could tell our listeners a little bit about your international-level work with this global partnership?

JW:

GPAI, as we call it, is an international multi-stakeholder collaboration. It's hosted out of the OECD, and it's really there to help guide the responsible development and use of AI, grounded in a rights-based approach: human rights, inclusion, diversity, innovation. There are many different streams through GPAI, and one of them is responsible AI. As an expert appointed by Canada to that working group, I get to work on interesting projects. The project that I'm working on, and this is my first year working with GPAI, is on digital ecosystems that empower communities. Our work right now is to identify digital ecosystems that rely on data, or community data, or data acquisition from other sources, using data stores responsibly, and then to look at how AI or other digital technologies, like digital twins or ways to simulate environments, workflows, or processes, can be used to inform decision-making and support communities.

The result of those kinds of projects could be anything from data storytelling to immersive experiences. When we say digital ecosystems, these projects, that's really broad. So we're actually looking for communities and case studies that we can feature as examples of communities using AI responsibly to build out their ecosystems. I will take the opportunity to say we do have a call open, so if you know of any communities that would be good for us to consider and put a spotlight on, maybe I can send that over for the show notes as well. But yeah, why I'm interested: GPAI is a global initiative, but from an Indigenous data sovereignty perspective, I'm trying to bring that voice, those examples from the communities that I work within, to the global stage.

To really showcase that, "Hey, there are Indigenous communities doing some really amazing things in safe and responsible ways." I'm interested in bringing data sovereignty into those global and international conversations, like the CARE principles we've talked about already here today: the collective benefit aspect of data, and that responsibility to treat the data ethically. So in that whole context of AI and a global community, I'm really excited that I get to bring those voices there.

PK:

Fascinating. I'm hearing a lot of potential lessons and learnings for our office. As you know, one of my office's strategic priorities is privacy and transparency in a modern government. Our goal there is to advance Ontarians' privacy and access rights by working with public institutions and others to develop bedrock principles and comprehensive governance frameworks for the responsible and accountable deployment of digital technologies. Listening to you today, I can't help but think that we need more voices at the table. So in your view, how can the principles of Indigenous data sovereignty inform our conversations about data ethics and governance in a digital world?

JW:

Well, I think it's about ensuring that Indigenous data is protected, that data remains under the control of Indigenous peoples, and that data benefits those peoples specifically and directly, right? I know that was talked about in the other episode on the OCAP principles, and today we've talked about the FAIR and the CARE principles as well. Using those frameworks as a way to guide your process would be important. Also, just supporting and promoting Indigenous-led technology that respects our data rights and protocols, and promoting Indigenous technologists to have a voice in all of your work, especially from the communities that you're working with.

Yeah, look at the calls to action, the calls for justice, and the United Nations Declaration on the Rights of Indigenous Peoples for how organizations like your office may have responsibilities and have been called to action. The Truth and Reconciliation Commission has called on all of us, and there's something in there for everybody.

The frameworks are there, and a lot of people have put their hearts into that work. A lot of this work comes from, yes, the reconciliation movement, but I think more importantly, from survivors' voices. The Truth and Reconciliation Commission wasn't just a government idea, right? It literally came from the voices of communities. When I think about how we got to this place where we're even talking about reconciliation, and now about Indigenous data sovereignty, and having these important conversations in venues like this one, it's really been through a lot of hard work from community. So lean on those frameworks, lean on those outcomes. Lastly, just centering sovereignty: pushing for policies that acknowledge our data as part of our sovereignty and self-determination, from a rights-based approach. I think that's what I would recommend.

PK:

Well, thank you so much. It's thanks to leaders like you and others we've had the pleasure and honor of speaking with on Info Matters and in other venues that we are becoming increasingly alive to many of these issues, these values, these principles. In fact, my office just hosted an annual meeting of the Federal, Provincial, and Territorial Information and Privacy Commissioners from across Canada, and we spoke about Indigenous data sovereignty as an important part of our agenda. Again, it's thanks to leaders like you who raise the importance and the visibility of conversations like this. So thank you so much, Jeff, for taking the time to speak with us today.

JW:

Thank you so much, miigwech. If there's anything else that I can do to support your work, or anybody listening, please reach out, I'm available on LinkedIn. There's all the materials that I've shared today. Hopefully, there's something there for everybody. Yeah, I just wish you all the best in your work, and thank you for doing this very important work.

PK:

Thank you, miigwech, and we will continue to follow your inspiring work at Animikii.

JW:

Thank you.

PK:

Well, that was truly an insightful, thoughtful, and humbling conversation, and it's clear that the future of Indigenous tech is not just about innovation, it's about respect, data sovereignty, and creating sustainable systems that align with cultural values and community needs. For listeners out there who want to learn more about the work of Animikii, there are links to resources in the show notes. You can also find a reference to Data Back, the book that Jeff mentioned. And if you'd like to dig deeper into the First Nations principles of OCAP, I encourage you to tune into our episode on this topic with Jonathan Dewar of the First Nations Information Governance Centre. It's episode seven from season one, and there's a link to that episode in the show notes as well. To learn more about access and privacy topics, I encourage you to visit our website at ipc.on.ca. You can also contact our office for assistance and general information about Ontario's access and privacy laws. Well, that's a wrap, folks, on this important episode. Thanks for tuning in, and until next time.

I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and this has been Info Matters. If you enjoy the podcast, leave us a rating or review. If there's an access or privacy topic you'd like us to explore on a future episode, we'd love to hear from you. Send us a tweet, @ipcinfoprivacy, or email us at @email. Thanks for listening, and please join us again for more conversations about people, privacy, and access to information. If it matters to you, it matters to me.
