Q&A

Cindy Cohn

“We probably cannot future-proof technology. We need to future-proof society.”



Cindy Cohn is the Executive Director of the Electronic Frontier Foundation, the leading nonprofit organization defending civil liberties in the digital world. In 2018, Forbes included Ms. Cohn as one of America's Top 50 Women in Tech. We talked with her about surveillance and data privacy.

Is an increase in surveillance inevitable?

I don’t think it’s inevitable, but it’s going to take focused, hard work to ensure it is not our reality in the future. Frankly, it’s also going to take work to undo where we are even now.

And how do you think governments can strike a balance between privacy and protection for their citizens?

I think you have to make sure you give the government the tools that it needs (although not all that it wants) without infringing on people’s basic rights. I generally take a human rights-based approach to governance.

Can a city be “smart” and simultaneously protect citizens’ rights to privacy?

I think so, but it takes much more careful work than what we’ve seen done so far and a much better recognition of citizens’ rights to privacy and how privacy protects freedom of expression.

Do you think such changes are possible in the near future or will it be a more gradual evolution?

Let me say this: the technology makes all of this possible, whether it’s protecting privacy or not. The question is whether we have the political will to implement these things in the most privacy-protective way, because it’s really all about us and our decision making. The technology doesn’t care; it can be implemented in a privacy-protective way or not. The decisions that matter here aren’t technological; they’re human.

How can we as citizens protect our own privacy and our own right to privacy?

Well, I think the biggest thing that citizens can do right now is to advocate for this in public policy arenas. That means you have to get involved. You have to let your representatives know that this is something you care about and something you’re going to vote about. You need to show up when these questions are being debated, whether it’s in a parliament or a local city government. Make your voice heard.

There are steps you can take as an individual to protect yourself, but they’re all different depending on the kind of surveillance you’re talking about and your personal threat model. So for instance we’re seeing right now in Hong Kong that the protesters are wearing masks because they’re trying to protect themselves against facial recognition, a type of technology that has been deployed in Hong Kong.

When you go online or use digital platforms, encrypted services like Signal or Tor can help in many circumstances. It depends on what you’re doing, who you’re doing it with, and on your personal objectives. It’s not one size fits all. That’s part of the problem, and that’s why we need to make sure that nobody can deploy a surveillance technology without paying attention to your privacy, rather than forcing all of us to come up with individual protective solutions for each individual surveillance technique.

In terms of privacy consciousness and data ethics, do you have an example of a city or government body that is working in an ethical way and protecting citizens' rights?

Well, again, I think there are different places doing good things in different areas. Illinois has a good biometric privacy law. San Francisco has banned police use of facial recognition. Then there is the GDPR, the European data protection regulation; that’s probably the biggest attempt to create privacy protections in the world today, and it’s still very early days.

Is forfeiting your personal data and privacy for greater efficiency a fair trade?

No, in many circumstances it is definitely not a fair trade. I think many people feel resigned and accept the trade, though, because they think there is no other way. People believe you either have to basically drop out of modern society or resign yourself to the trade, because we need digital technologies for our jobs, our social interactions, and so on.

In some cases, for example in health situations where someone has a rare disease, providing their health data for further analysis may be a good tradeoff for them, and for society as a whole if it leads to better healthcare and medicine.

But the truth is that much of the time our data is gathered and used, it is neither justified nor beneficial for citizens. And citizens are definitely not in control of, or informed about, what is happening and all the possible ramifications for them and others.

The purpose of surveillance is often not to search for identified targets but to gather mass data and decide who should be a target. Do you think this is justified in a democratic society?

No. I have been suing the NSA (National Security Agency) for 11 years now for the improper gathering and misuse of information and for betraying people’s privacy in this way. I think they are gathering mass data in an abusive way. Under the international framework, surveillance must be both “necessary” and “proportionate.” This mass gathering of data is neither.

In an increasingly datafied world, the digital devices we rely on often betray the most vulnerable or marginalised. How can we future-proof technology so that our data does not exacerbate existing inequalities?

We probably cannot future-proof technology. We need to future-proof society. We need a society that is engaged and ensures that the technologies that hurt the most vulnerable are found out (transparency), held accountable (through litigation or regulation), and required to reform. This is an ongoing process. We also need to make sure the digital revolution and the development of technology reach everyone, so that at least the majority can benefit from them and no one is left behind because of their race, economic status, etc.