Podcast

S3-Episode 9: Empowering young women and girls in the digital world

Online spaces can empower young people but can also be hostile to young women. The University of Ottawa's eQuality Project aims to create a safe and equitable online environment.

Info Matters

Networked spaces offer young people countless opportunities to connect and to share ideas and information like never before. But for young women and girls, the online world can be a hostile place, fuelling self-consciousness, doubt, and fear. Jane Bailey and Valerie Steeves, professors at the University of Ottawa, discuss the eQuality Project. Its mission is to help young people create a networked environment where they can participate as equals, free from surveillance and identity-based harassment.

Show notes

I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and this has been Info Matters. If you enjoy the podcast, leave us a rating or review. If there's an access or privacy topic you'd like us to explore on a future episode, we'd love to hear from you. Send us a tweet @IPCinfoprivacy or email us at @email. Thanks for listening, and please join us again for more conversations about people, privacy, and access to information. If it matters to you, it matters to me.

Professors Jane Bailey and Valerie Steeves co-lead the eQuality Project. Its mission is to help young people create a networked environment where they can participate equally, free from surveillance and identity-based harassment.

  • What led to research investigating the societal and cultural impacts of the internet on teens and particularly girls [3:06]
  • Goals of the eQuality Project, roots in the eGirls Project [5:36]
  • Impacts of social media on teens [8:32]
  • Defining technology-facilitated violence [9:57]
  • Impacts of technology-facilitated violence on young women and girls [12:01]
  • Young people self-censoring on social media [13:29]
  • The role of education in helping young women and girls participate equally in the digital world [14:41]
  • Recognizing and honouring the rights of young people through deliberative dialogue [18:18]
  • How to meaningfully engage young people in discussions about privacy in the digital environment [22:49]
  • Value of educators and schools committing to upholding the privacy rights of young people [23:28]


Info Matters is a podcast about people, privacy, and access to information hosted by Patricia Kosseim, Information and Privacy Commissioner of Ontario. We dive into conversations with people from all walks of life and hear stories about the access and privacy issues that matter most to them.

If you enjoyed the podcast, leave us a rating or a review.

Have an access to information or privacy topic you want to learn more about? Interested in being a guest on the show? Send us a tweet @IPCinfoprivacy or email us at @email.

Transcript

Patricia Kosseim:

Hello, I'm Patricia Kosseim, Ontario's Information and Privacy Commissioner, and you're listening to Info Matters, a podcast about people, privacy, and access to information. We dive into conversations with people from all walks of life and hear real stories about the access and privacy issues that matter most to them.

Hello listeners, and thanks for tuning in to Info Matters. We are living in a time like no other in history. The internet has made it possible for young people around the world to share information and ideas like never before. Networked spaces have given our younger generation the opportunity to make new friends, build relationships, explore new ideas, learn and grow. A young person's digital identity, formed by what they put out there and what others may post about them, has a huge impact on their present and future selves. This raises concerns around their privacy, identity, and equality in our increasingly networked world. For young women and girls in particular, the digital environment can be an unfriendly place, sometimes even harmful. Online platforms make it easy to anonymously criticize or judge others without fear of real-life consequences. Commercial organizations may prey on their sense of insecurity to nudge them into buying products or services they don't need, and some predators capitalize on their vulnerability to lure them into dangerous situations.

Sexualized harassment, voyeurism, and other harmful behaviors, aided by technology, may prompt some girls and young women to self-hate at a time when many of them are just beginning to form their identities and self-esteem. Some may even turn to more desperate measures of self-harm. Others may turn away from the use of social media altogether, denying themselves their right to participate as equals in the digital world. In this episode, we'll be talking about how young women and girls in particular experience the online world and how we can promote healthy relationships and respect for equality online. My guests are professors Jane Bailey and Valerie Steeves from the University of Ottawa. They co-lead the eQuality Project, a partnership of scholars, policymakers, educators, community organizations, and youth whose mission it is to help young people create a networked environment where they can participate equally, free from surveillance and identity-based harassment.

Jane and Val, welcome to the show and thank you so much for joining us today.

Valerie Steeves:

Thanks. It's great to be here.

Jane Bailey:

Wonderful to be here. Thank you.

PK:

So let's begin by hearing more about your work and how you became interested in researching the societal and cultural impacts of the internet on teens and particularly girls. So Jane, why don't we start with you? What led you down this road in your law career?

JB:

Being a feminist has been a central part of who I am, how I think, and how I act for as long as I can remember, even when I didn't know what that word meant. I was practicing as a civil litigator at the outset of my career, and I had a chance as junior co-counsel to be on the first internet hate speech case that went before the Canadian Human Rights Tribunal. That piqued my interest in the way that digital technologies were, and I felt would continue to, repeat and deepen pre-existing marginalization of equality-seeking communities. That became my master's research project, and I got hired at uOttawa's Tech Law group 21 years ago and met amazing researchers like Val, who were just as concerned as I was about what the digital environment would mean for human rights.

PK:

And Val, what about yourself? You're a professor in the Department of Criminology. What led you to focus on the impact of new tech on human rights and, in particular, on the right to privacy among young people?

VS:

I always like to say that I started to think about privacy because I had five children, and when they were growing up, I didn't have any privacy. Four of them are girls, so I always had a bit of a focus group in my house. From oldest to youngest, the changes were phenomenal, the pressures were incredible. So part of it is just lived experience. Professionally, privacy has always been an important issue for me because I used to practice criminal law and it's such a central part of how we negotiate democratic relationships between the citizen and the state. And watching these technologies roll out, it's been fascinating to see how corporations have become part of that.

After teaching for a number of years, I actually got a PhD in communications specifically so I could think through the theoretical issues around privacy, data protection, and the kinds of solutions that we rely on in law because there's this gap between the way adults talk about these issues and what they're concerned about and what's actually happening in young people's lives. It's really incumbent upon us to think really carefully about the types of policy solutions that we need that are going to really reflect kids' interests and perspectives and lived experiences.

PK:

I wanted to ask you to tell us a little bit more about the eQuality Project, which I understand builds on your earlier eGirls Project. What were some of its goals when you started out?

JB:

So when I look back, I see this clear line between eGirls and eQuality, because the girls and young women we'd spoken to in the eGirls Project had sent a pretty clear message, really in no uncertain terms, that they were very tired of being told what to do and what not to do, while the data-hungry corporations they saw feeding them stereotypes and setting them up for conflict and harassment seemed to be attracting so little attention. And I remember a conversation among Val, me, and our co-researchers about where we would go after eGirls, and we talked about the goal of creating the digital environment that kids wanted and deserved. And we knew, based on what girls and young women had told us, that corporate practices really had to be a central component of that. The prior tendency to focus on what girls were doing, what parents were doing and so forth, wasn't going to lead us to the environment that we would hope for and that kids deserve.

VS:

One of the things that really became apparent when we were doing eGirls was that the corporate design of the sites was setting kids up for this form of judgment where it was leading them into conflict with each other, but also creating issues for them when it came to just navigating through their social lives. So it really became important to interrogate the commercial design of the sites. It was really important to sit down and say, well, actually the environment that they're in is designed to encourage that, that it's there to vacuum up all of this data. So I would say one of the biggest changes over time is watching young people's perspective shift as they became aware of this process. They realized that they were under observation, they realized corporations were interested in them. They felt like all of this stuff is happening, now it's happening in school.

It's not just on social media. Now I'm being required to use all these technologies at school and at work, and even with my family relationships to communicate with family members, and there's nothing I can do about it because I can't stop being pulled into this corporate relationship that I really don't want to be in. I would say in the last couple of years when we've been in the field, I think it's moved beyond that. I think they're angry about it now, especially in schools. We've done a lot of work on personalized learning applications and the collection of kids' data and how it's fed back to them and what they feel about the surveillance they're experiencing in education. And a lot of it is what is wrong with you guys? Leave us alone. We don't want to participate in the corporatization of everything. So I would say that's probably the biggest shift over that period of time.

PK:

A couple of years ago, I'm sure you'll remember, there was a former Facebook employee, Frances Haugen, who testified before a US Senate subcommittee after leaking internal research showing that the company knew about Instagram's negative impact on some teens, including the toxic risks to their mental health. Did that testimony surprise you, or did it resonate with your own findings about social media platforms in general?

VS:

Young people have been telling us that for years. We have spent a lot of time going through terms of use agreements and privacy policies and trying to track the data because it's so non-transparent, and everything that we found as we undertook that research indicated that this is intentional. It's not intentional in the sense that you've got a bunch of people sitting in a room saying, let's create an environment that's really bad for kids. It's intentional in the sense that you've got a group of people sitting in a room saying, if we can get access to this, we can commodify it in ways that will lead to a lot of profit. So you need to make it sticky. Addictive is a good thing in a marketing world because it makes it sticky. You keep kids in these environments, they'll drop more information and you'll collect more information. It's designed to do those things, not to create the harm to young people, but to create the profitability for corporations.

PK:

So Jane, I want to ask you a question that looks at both the many important social and cultural advances of digital technologies, but also their dark side, and in particular something that's been called technology-facilitated violence. What does that mean to you? What is technology-facilitated violence?

JB:

When I use the term technology-facilitated violence, as opposed to talking about cyberbullying or cyberviolence or online violence, it's in large part to get away from the idea that this isn't real or is happening in some other space. It's not just the things that the public often thinks and hears about. Of course it includes online harassment, non-consensual disclosure of intimate images, and hate speech by individuals and by groups. But I also tie the idea of tech-facilitated violence, as we have in the project, very clearly to this corporate structuring of the environment: the way the environment itself sets young people up for conflict with each other, and so helps to perpetuate these forms of violence. It also engages in perpetration of violence through, for example, algorithms that sort and profile and embed discriminatory stereotypes.

Those algorithms bring certain kinds of search results to the top and, as Val has said in her work, wallpaper the worlds of young people with this kind of imagery, and the impact of what is being surfaced and distributed winds up influencing things. So it's important for me to say that it isn't just about what individuals are doing, though of course that's part of it, but it is tied to this corporate model.

PK:

And the impacts on girls and women in particular?

JB:

There's lots of interesting evidence and research coming out about that, right? Girls and young women, when we talked to them in the eGirls Project, told us: look, we have an environment where we're constantly being encouraged to disclose as much data about ourselves as possible. That in turn is used to profile us, assign us to categories, and feed stereotypes back to us based on the category, which we're then encouraged to emulate in our own self-representations, because that's the way to get likes and friends, the digital environment's markers of success. It created this sort of perfect storm. Even as far back as then, though understandings of exactly how the profiling was happening were really unclear, they gave a very clear sense of how things were tying together and creating this super exhausting environment that was fomenting competition, self-consciousness, self-doubt, and fear of the impact of the permanent record being created in the digital environment as it was shaped by the kinds of corporate practices that we've both talked about.

VS:

I think there's also a really subtle form of violence that we've been able to track as these technologies have become more invasive. We sat down with young people and talked to them about how they make decisions about the privacy of their photos. And what really came out of that research was that there's this sharp divide between who I actually am, what I actually think, and how I actually feel, and how I represent myself, especially in technologically invasive spaces. I started one interview with a girl by saying, hey, I guess you like horses. And she said, why would you think that? And I said, well, because for the past two weeks all you've done is send me pictures of horses that you've been putting on the internet. And she said, oh, I can't stand horses. They just make a really good Instagram theme. Nobody actually posts their actual interests on the internet. That's dangerous, that's terrible, that could put you at risk, you could be judged.

And one young boy loved baseball, he said the same thing. One young girl loved anime, she did everything she could to hide her actual interests from her peers, from her family, from the general population out there because being real is somehow going to set me up for judgment.

PK:

There's a lot of self-censoring going on among young people from your research findings, which is a concerning realization that in order to protect themselves, they actually have to hide who they are and be less real. Certainly not what we want to encourage among our younger generations. So let me turn to a very complex issue of education and what role education can play in helping young girls and women to protect themselves, but also to participate fully and equally in a digital online world.

VS:

I think the first group we have to educate is policymakers and adults, actually, teachers and parents. Adults are really concerned about these issues. But as I said before, there's this gap between what they think the problem is and what the problem actually is. So often what we do in these types of situations is we give kids rules: don't post your name on the internet, don't post a photo, don't do this, don't do that. And they kind of look back at us and say, as if that's a solution. First of all, it's not even the problem, but it's also not the solution. So I think that a lot of this really requires the deepest form of education, which is grounded in informed dialogue between people. So we do a lot of education, and we look at these issues and try to create an environment, particularly in schools but also in community groups, where young people can talk about an example that's distanced from their own lives and raise the concerns that they have about that particular issue.

So we made a 20-minute film on the experiences of a young girl in a near future that's using a Google Glass kind of technology to improve her maths grade, and it ends up having all sorts of social and political consequences for her. And so we create those kinds of exemplars so we can go into a classroom and young people can use that to talk about the issues and have a discussion among themselves but also with their teachers and with their parents and the adults in their lives to bridge that gap so adults can begin to understand exactly what these kids are facing. We're trying to mobilize different kinds of methodologies to really take that logic forward to create spaces where young people can express their concerns and talk to adults about their concerns as equal members at the table.

So we've used workshops where young people have been able to talk amongst themselves and then create projects which really make these things visible, which has been a phenomenal educational experience, but we've also looked at using deliberative dialogue where we've brought policymakers like yourself into a room with young people to have this informed discussion so we can begin to bridge that gap. I think we need this much broader democratically grounded form of education where young people and adults can have conversations about really, really difficult issues, the way gender plays out online, the way race plays out online, the kinds of conflicts that they're seeing in these spaces.

PK:

So Jane, I've heard you say in the past that we shouldn't put all of the onus on young people and that there are many other players that have to come to the table as Val said. What are your thoughts on the importance of education and all of the stakeholders, all of the people who have to be involved in supporting young people, in particular girls and young women, to partake safely and fully and equally in the online world?

JB:

So it's understanding young people as rights holders. At the heart of the whole thing is that young people have rights and they include the kinds of rights that we often think about freedom from violence and they include rights to privacy and they include rights against discrimination, and there's a whole panoply of internationally recognized rights that children have. And so a critical part of everything that we've tried to do is to make everyone involved understand that, first and foremost, children are rights holders and that whatever it is that we do has to respect and center their lived experiences and the obligation to create environments that are rights-respecting and rights-affirming environments. And part of that is through the kinds of methods or strategies that Val has talked about. If you bring young people into a room with policymakers in a deliberative dialogue and you tell policymakers you're listening and young people are leading, you open spaces for recognition that these people are rights holders, they're knowledge holders, they are people with expertise.

At the same time, we've always tried to be really careful to say that when we're telling policymakers, you have to open up space to hear, really hear young people's experiences, that's not the same thing as saying to them, now kids are responsible for coming up with the solutions. Let's bring some kids into a room and ask them what the solutions are. No, it's up to adults. It's our responsibility. Decades of policy have put us in the position that we're in, and that's on adults and we have to take the responsibility. But if we do it in situations, as we have in the past, where the policy decision-making is not informed, we're just going to continue to fail, because we're going to address problems that, from young people's perspectives, aren't the real problems. And we're going to propose solutions like more surveillance that they've also told us are problems in the first place. Surveillance is the problem. It's not the solution. So monitoring me and responsibilizing me to do this and not do that, that's just more of the same.

I don't want to say that any of this is easy, in the sense that we've set up an environment that says this digital industry is critical to our economic well-being. And so when we actually hear young people and their experiences telling us, yeah, but the way this environment is set up is really not good for us or our health, then policymakers get into this really difficult position. On the one hand, we said this is critical to economic success. On the other hand, all indications are that the model we're using is not going to get us where we want to go, and it's not going to respect young people as rights holders.

PK:

You did invite me to participate in one of these deliberative dialogues with your focus groups of young kids and young adults involved in your project, and I found it incredibly eye-opening. And you're right, I was there as a listener and really felt enlightened and much better informed by kids leading the discussion and really sharing their experience firsthand. When we set out to create a youth advisory council, we reached out to both of you and I want to say thank you so much for giving us your insights and your experience on how to create a council of engaged youth. So going forward, what's your advice on how we can really meaningfully continue to engage our youth advisory council and bring them into these kinds of debates and discussions, and give them a meaningful seat at the table?

VS:

Well, thank you back for doing it because it's a wonderful initiative. So building on the success that you've already had with that, sometimes when you create advisory councils, you'll say, okay, I need some advice on the work I'm doing. Here's my work. What do you think? And then they give you feedback on it. And I think that's sort of the entry level of getting feedback from kids on privacy issues. I think actually what's much more powerful is developing these relationships with young people who'll bring their own concerns to you, and then you are in a position where you can both amplify that by taking that forward and giving voice to those concerns. You're also in this unique position where you can put those young people into policy rooms where they can articulate it for themselves as well. I think that's really an important part of the process. It's important to hear how it feels to be in that environment and still own the responsibility for making sure that we make it better.

PK:

Let me turn now to our role as a commission and my role as a commissioner in Ontario. We've adopted children and youth in a digital world as one of our strategic priorities. Jane, you're on our strategic advisory council. As you know, we've adopted and co-sponsored international and national FPT resolutions to advance the rights of children and their best interests in the online world. And we recently launched lesson plans to accompany our Privacy activity book that we encourage schools and teachers and parents to use as a fun way of engaging kids on questions of privacy.

And one most recent thing we've done is issued a draft digital privacy charter for Ontario schools really hoping to get feedback on the principles we set out in that charter, and eventually our hope is to have schools and school boards sign on to the charter, which is really just a restatement of many of the obligations that exist in law. But it's a form of public commitment to their students, to parents, to their communities that they stand by those principles. How would you describe the benefits of signing on a charter like that among the schools and school boards in Ontario?

JB:

I think a big part of it is to signal to the public that you are a rights-respecting institution, that you are someone who centers the very people that are meant to be centered in the education system, which is young people themselves. And so to be a leader on that issue, I would think, is ideally what every educator wants. And I suppose the trick to it is that once you've taken the position, you've got to walk the walk. You can't just say you're committing to it, because when you do commit to it, then you're held to it. That makes you accountable, and that's really important. That doesn't mean you're always going to get it right; probably not. It positions you as an institution that says, we're so committed to this that we are willing to come out and publicly endorse it, knowing that it opens up conversations about how to get this right, and to recognize that we're going to make mistakes and accept that. But it's inviting that necessary dialogue that Val talked about.

PK:

Great advice. Jane and Val, thank you so much for joining us on the show and sharing your insights from your research, which I, for one, have been following for years. Your work sheds light on what we can all do to help foster healthy relationships and respect for equality online. For listeners who want to learn more about the Equality Project, there are links in the show notes to this episode. We've also included links to other Info Matters episodes about privacy and access issues facing children and youth. The IPC has free activity books and lesson plans to teach kids about privacy that are available on our website at ipc.on.ca. All of these resources are a must-have for parents, teachers, and kids, and we really hope they help spark some interesting conversations in schools and around the dinner table at home.

Disclaimer
The information, opinions, and recommendations presented in this podcast are for general information purposes only. They should not be relied upon as a substitute for legal advice. Unless otherwise indicated, the IPC does not endorse, recommend, or certify any information, product, process, service, or organization presented or mentioned in this podcast, and the information in this podcast may not be used or reproduced in any manner that implies such endorsement or approval. None of the information, opinions, and recommendations presented in this podcast binds the IPC's Tribunal, which may be called upon to independently investigate and decide an individual complaint or appeal based on the specific facts and unique circumstances of a given case.