
Technology as a tool of liberation, but also of domination? With Renata Ávila

Interview with Renata Ávila, human rights lawyer specialising in Intellectual Property and Technology.


Renata Ávila is a Guatemalan human rights lawyer specialising in Intellectual Property and Technology. She is currently the CEO of the Open Knowledge Foundation. She worked as one of the lawyers representing the Guatemalan Nobel Peace Prize Laureate Rigoberta Menchú and, more recently, has provided legal advice to WikiLeaks and other whistleblowers and publishers. Involved in Internet and human rights research since 2006, Renata has worked with the Web inventor Sir Tim Berners-Lee and more than 125 organisations from the global south in an effort to uphold human rights in the digital age.

She serves as a Board Member of Creative Commons and is an active advisory member for different initiatives, from the Whistleblower Network in Germany and Coding Rights to the Data Activism Project at the University of Amsterdam and the Municipality of Barcelona’s BITS initiative, aimed at reducing surveillance and empowering citizens with privacy tools. She is also Executive Director of the Fundación Ciudadano Inteligente (Smart Citizen Foundation), a contributor to the international bloggers’ community Global Voices, and co-author of Women, Whistleblowing, WikiLeaks (OR Books, 2017). She is currently writing a book on digital colonialism.

Can you please explain how and why you first became interested in this discipline and why human rights are so important in the digital age?

I’m a Guatemalan human rights lawyer with a deep interest in technology. I became interested in it when I was in law school, so I have been advocating for a long, long, long time. It was the early days of the Internet, so you would see, for example, that the first censorship was happening, and it was very simple censorship. It was just a blocked website. But it occurred to me that we were seeing the transformation of our rights, because these were not the usual violations that you would see, like torture or people beaten up in the streets. It was technology both as a tool of liberation, but also emerging as a tool of domination by powerful actors. And so it was only logical for me to explore these, my two passions, and to merge them in my career.

As the world becomes increasingly digital, we’re producing more and more data and becoming more and more predictable and controllable. Algorithms in our apps, in public services, in social media are influencing our behaviors, often in ways we don’t know about. Why do we continue to give away our data freely in exchange for convenience?

When you ask the question about data and why people are just clicking here, accepting and giving away all their personal data, and giving away so much power to companies, I would say that there’s some deep disempowerment. It is not that people do not know. There’s a lack of public support for people to fight for respectful and viable options. And so resisting is really costly.

I have a lot of sympathy for people, because it is not that they don’t care about their rights. The actor that they have in front of them, the data extractor, is so powerful, and there’s no powerful actor on the side of the people. And so, to me, it is the absence of the public sector, the absence of the state, not defending and standing by citizens but standing on the other side, by the data extractor, making sure that they can have their business, without realising that by doing so they are becoming less and less relevant and less and less powerful as governments.

Why is there so much opacity around algorithms and how can we ensure technology is designed with our wellbeing in mind?

I would say there’s a lot of opacity in algorithms by design. There’s a lot of opacity in cars today. As technology becomes more complex and sophisticated, we cannot expect that everybody will understand what’s happening inside, but we need to be able to trust what happens inside. So there’s advocacy for the transparency of algorithms, but, for me, transparency is not enough. And transparency is a big ask for citizens. I mean, not even the engineers themselves can understand what’s going on inside. What I demand is a high degree of accountability for the results and constant evaluation at different stages of the process.

AI often betrays the most vulnerable or marginalised. How can we future-proof technologies so that our data does not exacerbate existing inequalities?

I don’t think that we should obsess and focus too much on technology. We should focus on society, because what AI is doing is taking data sets from our society today, and our society today is racist and sexist. What is happening is that when you feed the systems with these data sets, you’re magnifying the problem instead of wiping it out. So what I am advocating for in the Alliance for Inclusive Algorithms is precisely that: how can we create systems and models that, instead of replicating and magnifying what is bad in society, accelerate the process of gender inclusion, or of a more inclusive and tolerant society? And the starting point is creating the right data sets, making visible the invisible, counting those who are excluded, and having people participate in the design of those systems.

You’re writing a book on digital colonialism. What do we understand by the term digital colonialism?

It is very interesting because many people think that it is only about, you know, poor countries and rich countries. And the thesis that I’m proposing in my book is that it’s not about that. It’s about companies more powerful than ever before on one side and citizens on the other. Even European citizens from rich countries, because that degree of domination, of being predictable and subject to this unprecedented control, of depending for every aspect of our lives on a few companies that dictate everything and impose rules on us, is a new form of colonialism that transcends just the rich-poor dynamic.

It is penetrating each and every country in the world. And the effects are unpredictable because it is the rise of a very different form of corporate power. We cannot vote out corporate power. You can vote out the government, you know, you can replace the system, but this is very different. All the globalisation before and all the trade agreements paved the way to make these companies very resilient to changes in governments, to changes in laws.

If you look at the lobbying numbers in the European Union and in Washington, they are the ones writing the laws, lobbying the governments. So that’s the new form of domination.

Is it possible to resist this type of exploitation?

It is absolutely possible to act against digital colonialism, and I think that the work that Digital Future Society, amongst others, is doing is the first step, because to tackle the system, the first step is to understand it from a global perspective and to analyse the topics from a social, economic, and political perspective. Then we need to design strategies to really decentralise power and to unlock the power of creation. Innovation has stagnated, and it is really, really important to unlock the power of creation for the next generation.

The GDPR is considered to be one of the most advanced regulations in the world regarding data protection. Is Europe the answer to digital colonialism?

Many people celebrate the GDPR as the gold standard of privacy and data protection. But I would say that is questionable, because the business model remains intact, and the problem is the business model. It fixes privacy and data protection problems, but it doesn’t fix the innovation and competition problem, because if only one actor has all the data and, you know, finds a way to get the consent of citizens, it is so far ahead of the game that small companies cannot compete, because nobody else has access to that data. And we need access to certain data sets, to a common pool of data, so everybody has access to the same level of innovation.

I don’t think this regulation on privacy and data protection is the answer. I’m very hopeful about what’s going on with competition and antitrust law, because antitrust law gives you mechanisms to split up companies and to force them to share resources. And in this case, it would be very interesting if one of the antitrust processes came up with some forced data-sharing mechanism to also enable other companies.

Is the power of big tech outgrowing that of nation states?

We can see that after the pandemic, they were the only ones winning. They were the only ones winning, and not winning just a little bit. You know, they doubled their profits without increasing the number of jobs, without even paying taxes. So it feels like this power is beyond the state and above the law. It’s very dangerous.

Government surveillance used to trace the spread of Covid-19 sparked major privacy concerns worldwide. According to historian Yuval Noah Harari, we face a "choice between totalitarian surveillance and citizen empowerment". What do you think the future of surveillance will look like after Covid 19? Do we have a choice?

Of course it will increase. I fear an economic crisis is coming. And, you know, with an economic crisis protests increase. With protests, surveillance on protesters increases. So I tend to agree with Harari that we are heading towards a very, very dangerous digital dictatorship and it’s the combination of the power of technology with the power of the states.

And I will add that citizen participation can be meaningless if we don’t have a plan that is actionable. We need leadership. In the case of surveillance, what I am noticing is that it is going to be less visible and more powerful, because instead of tracking wherever I go, it is going to try to predict what I’m thinking, and not as an individual but as a collective.

And governments are very well aware of it. For example, President Biden recently passed legislation with this worrying list of topics that will be monitored as US internal security threats, which could criminalise protesters against globalisation. So imagine that. I mean, if you think back in time, 20 years, the protests in Seattle wouldn’t be possible. You know, things that shaped our world wouldn’t be possible. It is hypocrisy to criticise China while the West is doing exactly the same.

It is a combination of precarity, repression and technology. I saw what happened after the Arab Spring. You know, citizen participation alone is not enough. We need to restore multilateralism. We need to restore global institutions, shielding what happens locally, and global solidarity. Only with a combination of many things is change possible. Citizen participation alone is not, and it has been fetishised so much as the only possibility.

Smart cities are built on sensors, data and analytics. Can a city be “smart” and simultaneously protect citizens’ rights to privacy? What kind of tools are needed?

A city can be smart, can be green and can protect the rights of citizens; it can actually enable the rights of citizens. I have been writing with my co-authors about how a City Data Commons is our hope against climate change, because much of the pollution happens in the inner city, and most of the problems with heating, emissions, housing and the energy transformation are happening in cities.

Let’s get surveillance to surveil the problems instead of surveilling the people. And let’s empower people with data to understand the problems surrounding them.

But we don’t have the data to enable us to understand that and to act. So our proposal is that cities as a collective, especially progressive cities, make an unprecedented effort to produce the data commons that we need to analyse the situation and deploy fixes very quickly. So absolutely, it doesn’t need to be abusive or intrusive. We need to monitor what’s happening in the city, in the infrastructure, with the quality of the air, with the quality of the water.

AI looks like magic to people outside of the digital world. Only an elite few have access to or know how to use the data we are producing. You are a Global Trustee of Digital Future Society. To what extent do you think organisations like DFS are important when it comes to making technology more equitable and inclusive?

As a global trustee of Digital Future Society, what fascinates me is the access to the actors in charge of the decisions. We can whisper into the ear of really powerful people and suggest ways out of the mess that we are in. You know, in my experience, Digital Future Society has managed to create a safe space where we can say to the powerful what we think, without filters. But I also think that it is only possible because it is not just activists shouting: the evidence-based research that the think tank is producing is very valuable, because it enables us to come to the table and, for example, discuss gender inclusion, climate aspects or labour aspects with the evidence in our hands. It also connects us with the private sector and governments. And that combination you don’t get in many places.

Why is it so important that we ensure great diversity in the field of data science and what role do you think women have to play in the digital revolution?

We have neglected the presence of women and a more diverse crowd in data science and in computer science. When I talk about women, it is not only, you know, the scientists. I think women, intersectional women, have a lot to contribute to the process. Women have been connectors in society since ancient times; we have been the builders of society. We know how to build spaces of care, spaces of trust and spaces where each and every family member feels welcome. You can feel that in technology; that’s precisely what is missing.

We are doing a project called the Tech We Need. What we are doing is inviting women from the slums and from the popular markets, street vendors and so on, to tell us what technology they need, why, and what they would like to do with it. You don’t need to spend five years at university; you need precisely what you need. So local knowledge is very, very important.

You once said that “we live with the same or more inequalities than in the past, even though we were promised that the Internet was going to change everything.” Yet the internet, when used in the right way, can be a powerful tool for empowering citizens. How can we return the power of the internet to the people? Are you hopeful for the future?

I am very hopeful for the future, even if I said in the past that I felt betrayed because the Internet of creation disappeared and was replaced by an Internet of surveillance and control. I am hopeful because I see how the new generations are taking the bad technology that we made available to them and hacking it somehow, more than coding it, hacking it socially.

What gives me hope is that this generation doesn’t see the Internet or technology as the promise; it is just a tool to make their dreams come true, and a tool that they appropriate.

So for us, it was the future. For them it is just an element of a future that they want. We had a world to create and this generation has a planet to save. So it’s a very different scenario. And we owe them a lot. So we need to listen to them and collaborate.