WENDY WONG - Author of We, the Data: Human Rights in the Digital Age

Does privacy exist anymore? Or are humans just sets of data to be traded and sold?

Wendy H. Wong is Professor of Political Science and Principal's Research Chair at the University of British Columbia, Okanagan. She is the author of two award-winning books: Internal Affairs: How the Structure of NGOs Transforms Human Rights and (with Sarah S. Stroup) The Authority Trap: Strategic Choices of International NGOs. Her latest book is We, the Data: Human Rights in the Digital Age.

THE CREATIVE PROCESS · ONE PLANET PODCAST

There is a prediction, or at least the possibility, that we'll have superintelligence within a decade, and without proper guardrails that scares me. It's not as though AI were some entity that isn't managed; it's owned by companies and by the people who own those companies. So while we still can, what should we be doing to strengthen those reins?

WENDY H. WONG

One of the things that we need to remember is that we are data stakeholders, not data subjects. We're often called data subjects: that's the way legislation is written, and it's how tech companies talk about the users of their technology.

Being a subject casts this sort of "you can't help but have this happen to you" effect. But we're actually data stakeholders, because data cannot be created without us. If companies were incentivized to follow data minimization, for example, where they only collect the data they need, that would change the way we interact with digital technologies.

THE CREATIVE PROCESS · ONE PLANET PODCAST

On the other hand, there's data collection for the environment. So what are some of those positives? And how can we embrace them while still keeping a critical eye on other forms of data collection?

WONG

I do think that the environment is a place where having more data will help us create better models for thinking about how climate change is going to affect life on Earth. And I agree with you: we should be thinking about the now, about life on Earth today, and about not doing harm going forward. It's important to live now, not in a projected future, whether that's the "killer robots" scenario with AI or some of the horrible projections people have put out there about what might happen if we don't mitigate carbon production.

So let's focus on creating solutions for today. How are we going to get to net zero by 2050, for example? In some ways, data minimization as a standard or a norm is really on my mind. But when we think about other fields, climate science, for example, I don't know that I'd follow a data minimization model, because I think we benefit from having a lot of data.

Earlier this year, there was a lake in Ontario where they were able to pull out some really important sediment samples marking the dawn of the Anthropocene. I think that's really important. It's a great discovery for thinking about the effects of human-driven climate change, and it also creates lots of data, more data for us to understand the process.

THE CREATIVE PROCESS · ONE PLANET PODCAST

What’s troubling is that the big technology companies are agnostic about the effects of their technologies. They have tremendous reach but don’t have the same responsibilities as governments.

WONG

Meta reaches between three and four billion people every day through its platforms, right? That's far more people than any government can legitimately claim to govern. And yet this one company, with four major platforms that many of us use, is able to reach so many people and make decisions about content and access that have real consequences. Its platforms have been shown to have fueled genocide in places like Ethiopia and Myanmar. And I think that's exactly why human rights matter: human rights are obligations that states have signed on to, and they're supposed to protect human values. From a human rights perspective, it's important to argue that we shouldn't be collecting certain types of data because it's excessive. It violates autonomy. It starts violating dignity. And when you start violating autonomy and dignity through the collection of data, you can't just go back and fix that by making the data private.

THE CREATIVE PROCESS · ONE PLANET PODCAST

How can the humanities be integrated into the creation and governance of new technologies before they're brought to market? How can political scientists, philosophers, and people from the arts and elsewhere be more involved in the creation of new technologies and data management?

WONG

The majority of people working in tech companies do not have a background in the social sciences or humanities, and I think that's the real problem. People working in computer science, data science, and computer engineering who want to work on AI should really be taking some social science or humanities courses. Any sequence of social science or humanities courses will teach students to think differently, and I think that's what we want. We want people working on technologies that have effects on humanity to understand what kind of effect they can have, both positive and negative.

I don't know if that's a fix, though. Whose problem are you fixing? One of the things that has been really lacking is the normalization of requirements around understanding humanistic topics. Sometimes you see in different programs that there are classes on tech ethics.

THE CREATIVE PROCESS · ONE PLANET PODCAST

As of August this year, the largest digital platforms in Europe are subject to the Digital Services Act, which is designed to get rid of non-transparent practices and take illegal content off social media, search engines, and other major websites. Not to mention Biden's new executive order. Do you think that leaving it all without guardrails for almost 20 years was wrong?

WONG

To say it's wrong or right, I think, is casting a moral judgment that we probably couldn't have made even 10 years ago. The problem with AI, unlike other computer applications, at least the way we use AI, is that it depends on data, and not just any old data: data about people, data taken from people. And I don't think the creators of AI today place the same emphasis on the humanity in those data that I do as a social scientist. The reason is that, as a social scientist, I have never dealt with data that didn't come from people.

In my past work, I've written about how NGOs have lobbied and advocated for social and political change at the international level. That requires talking to people, which means going through a very onerous ethical process at the university level to consider every single facet of what could happen to those data, to the people whose data I'm taking through interviews. If the data got out, what are you going to do to safeguard them? How do you secure consent? How do people de-consent, how do they take themselves out of your study, if during the interview they decide they don't agree with you anymore or don't want to be part of the study? These are all things we think about when working with human subjects, and I think that kind of perspective is really important when we think about the applications of AI.

THE CREATIVE PROCESS · ONE PLANET PODCAST

As you point out in the book, we've always collected different kinds of data, but it was very analog, never this intense amount of data. You wrote about people whose data has been used to “resurrect” them after they pass. What are your reflections on how you would want your data to be treated after you pass?

WONG

There are also other technologies out there that don't really require any sort of personal connection in life to resurrect someone, so to speak, using data they generated throughout their life. You can bring them back in ways that they may not have agreed with. And this goes to how we treat human beings with dignity, as beings with worth. If we can take the data that describe our activities in life and use them to create digital alternatives or digital resurrections, what does that really mean for how we think about what that person did in life and how we can treat that person once they're gone? There are a lot of different questions here.

One of the other themes of this particular chapter is the idea of discretion and choice, which goes back to autonomy. As human beings, we have discretion over how we interact with people. Think about this in real life: we don't talk to the police the same way we talk to our friends, and that's because we're exercising discretion, even though we're the same person. So I think what all these data out there about us do is effectively eliminate our ability to exercise discretion, if someone did choose to bring us back, so to speak, using data that describe so much of what we did when we were living.

This interview was conducted by Mia Funk and Aaron Goldberg with the participation of collaborating universities and students. The Associate Interview Producers on this episode were Sophie Garnier and Aaron Goldberg.

The Creative Process is produced by Mia Funk. Additional production support by Katie Foster. Mia Funk is an artist, interviewer and founder of The Creative Process & One Planet Podcast (Conversations about Climate Change & Environmental Solutions).

 