Q&A

Kade Crockford, preserving privacy in the era of face recognition

"A technology to track every place every person goes, with whom, and when is the perfect tool of social control".

Tags: Face surveillance, Kade Crockford, Privacy

Kade Crockford is the Director of the Technology for Liberty Program at the ACLU of Massachusetts. Kade researches, strategizes, writes, lobbies, and educates on how systems of surveillance and control impact not just society in general but especially their primary targets: people of color, Muslims, immigrants, and dissidents. Kade writes for The Nation, The Guardian, The Boston Globe, WBUR, and for her own blog, privacysos.org/blog.

Can you give us an overview of your work?

I research, write, do public education, lobby, and organize towards the end of protecting civil rights and civil liberties in the digital age.

How is face recognition present in our everyday life?

It is present in two main ways: through corporations and through governments. Some businesses are starting to use face surveillance in stores to track people’s movements and habits, and to look for specific people whom they may suspect of shoplifting. Other companies use face recognition as an authentication tool; think of the iPhone, which lets you use your face to unlock the device. Governments, on the other hand, are using the technology for at least two purposes: to identify and to track. In the first case, police use face recognition to run an image of an unknown person against a database of known people, to try to identify the person in the photograph.

In the second case, governments (in China, for example) use face surveillance technology in connection with surveillance cameras to track and catalogue people’s public movements, habits, and associations.

What are its upsides and downsides?

No one—not governments nor corporations—should use these technologies absent legislative protections to guard civil rights and civil liberties.

Unfortunately, most places in the world do not have comprehensive laws protecting our biometric privacy. But that hasn’t stopped companies or governments from rushing headfirst to adopt this dystopian technology in our communities—often in secret. People in free societies should retain a right to anonymity in public, and this technology fundamentally threatens it. It ought to be rejected by free people.

Is there a way we can protect ourselves from it?

We must pass laws prohibiting governments from building face surveillance networks that enable the tracking and cataloguing of people’s movements in public spaces. Any surveillance system that is constructed will be abused and misused. The only way to prevent this abuse is to prevent the government from building the surveillance architecture in the first place.

As security expert Bruce Schneier has said, it is bad civic hygiene to allow a government to build a system that can eventually be used to create a police state. That’s face surveillance in a nutshell.

Will facial recognition reshape the way we see others?

Not if we stop it.

Which basic reforms should be implemented worldwide to ensure new tools do not create a dystopia?

We should ban governments from using the technology in concert with surveillance cameras to track people’s locations and habits, either in real time or on stored video data.

Your work aims to protect and expand core First and Fourth Amendment rights and civil liberties in the digital 21st century. Those protect freedom of religion, speech, assembly and the right to privacy, among others. How are they being threatened by face recognition?

If the government can use a piece of technology to track every place every person has gone, and with whom, and when, the government has a perfect tool of social control. If local police want to find out who attended a protest, this technology makes it as easy as pressing a button. If a mayor wants to know which person in his administration has been speaking to the press, he can use this technology to comb through hours of video to find the staffer who visited the newspaper office.

If a corrupt cop wants to use the technology to find out who has been getting treatment for alcoholism, and then use this information against someone for personal reasons, this technology makes it simple. If the security state wants to track how people pray, and how often, and where, this technology automates the ability to do that at every house of worship across a country. Privacy is a must in a free society, because it gives us the space to live freely, without being monitored and controlled by the state. This technology fundamentally threatens those rights.

Experts are warning that face recognition algorithms are biased with respect to gender and skin tone. How is that, and what does it imply?

See “Gender Shades,” a study from MIT. [NoE: “Recent studies demonstrate that machine learning algorithms can discriminate based on classes like race and gender. […] The substantial disparities in the accuracy of classifying darker females, lighter females, darker males, and lighter males in gender classification systems require urgent attention if commercial companies are to build genuinely fair, transparent and accountable facial analysis algorithms” (Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, Feb 4, 2018).]

Can we recognise the technology before it recognises us?

We must expose the secrecy surrounding the adoption of the technology by pushing government agencies to disclose how and where they are using it, so we can work with legislative bodies to prohibit its use.