
Big data, little recourse: Sorine’s Story

Tags: Data ethics, Digital trust, Personal data, Privacy


Reading Time: 8 minutes
  • Data breaches and privacy abuse scandals have sparked renewed awareness of what personal data we give away, to whom, and for what purpose.
  • But how do data-driven technologies, markets and economies affect the autonomy, agency, and privacy of citizens like Sorine?

Meet Sorine

The year is 2030, and Sorine is digging through her purse for an ibuprofen. Only a few more stops until she gets home from the first day of her new graphic design job. Clutching her middle, Sorine pops a painkiller and gratefully settles into a newly liberated seat on the driverless tram. “It’s that time of the month again,” she thinks wearily, Facebook-logging into her period tracking app and inputting the date and a few symptoms. Cramps, fatigue, irritability, cravings for salty food. Sorine flips back to scrolling through Instagram for the rest of the ride home, eyes peeled for a healthy recipe.

Finally reaching her shared flat, Sorine greets her roommates and starts to prepare dinner, throwing YouTube on in the background. As she waits for the water to boil, Sorine checks her email. Surprised to see the name of the period tracker app she was just using, she opens the message to discover that her account has been compromised. Startled and annoyed, Sorine quickly deletes the app from her phone and resolves to find a more secure period tracker later. For now, it’s time to eat.

As Sorine curls up on the couch in front of her laptop, the next video in her queue is promptly interrupted by an irritatingly chipper female voice. “Sorine, share your magical news this spring, like @finally_a_mom did, using our new digital pregnancy test! Get unmistakably clear results in words!” Sorine’s abdominal cramps are replaced by a sinking feeling. Did that ad just speak to her about sharing “magical news”? Narrowing her eyes, Sorine grabs her phone and re-checks the email. She only just found out about the breach – how could this have happened so quickly? Who else has her data by now, and what do they know?

Sorine cannot be sure that her personal health data will not be sold to insurance companies. It has already been sold to marketers – she knows that because pregnancy test ads keep showing up on her YouTube videos and as she scrolls through Instagram. And if her employer should somehow get hold of it, could it put her new job at risk?

Questioning accountability: a crisis of trust

One might argue that Sorine should take a chill pill – after all, what’s the harm if a few companies know about her menstrual cycle symptoms? If she has nothing to hide, she has no reason to worry. Right?

The problem with the “nothing to hide” argument is that lots of individually benign bits of data add up to something much more sinister. In fact, it is not the data about Sorine’s menstrual cycle that these companies are after, but rather the “digital debris” she leaves behind when she reads articles, watches healthy living videos, posts about her new job or even looks up recipes. This metadata is used to make inferences that, combined, create a profile of Sorine – her “digital twin” – whose potential behaviour is carefully analysed and tagged accordingly. Companies betting on how “digital Sorine” will behave in these markets for predicted behaviour are profiting from Sorine without her knowledge or consent. And it is these secondary uses of metadata over which Sorine – and the rest of us using apps and “free” online platforms and services – have no control.

Image credit: Panoptykon Foundation
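To make the “digital twin” idea concrete, here is a deliberately simplified sketch in Python of how a profiler might fold individually harmless signals into a single marketable inference. Every signal name, weight and threshold below is invented for illustration – real ad-tech profiling systems are proprietary and vastly more complex – but the basic move is the one described above: scoring a person from metadata crumbs and selling the resulting tag.

    # Illustrative sketch only: invented signals and weights showing how
    # individually benign metadata can combine into a sensitive inference.
    signals = {                                # metadata crumbs Sorine left behind
        "installed_period_tracker": 1.0,
        "watched_healthy_living_videos": 0.4,
        "searched_salty_recipes": 0.2,
        "recently_changed_jobs": 0.3,
    }
    weights = {                                # hypothetical marketer weights per signal
        "installed_period_tracker": 0.5,
        "watched_healthy_living_videos": 0.2,
        "searched_salty_recipes": 0.1,
        "recently_changed_jobs": 0.2,
    }

    # A weighted sum turns the crumbs into one score for "digital Sorine".
    score = sum(weights[s] * value for s, value in signals.items())

    # Above an arbitrary threshold, the profile is tagged and the tag is
    # sold to advertisers - which is how a pregnancy-test ad finds her.
    if score > 0.5:
        print(f"Tag 'digital Sorine' as prospective parent (score={score:.2f})")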

Reconsidering recourse

So what can Sorine do now that her digital trust has been breached? Simply clicking an “I don’t want to see this anymore”, “This ad is not relevant to me” or “Why am I seeing this?” button doesn’t solve the problem of inferences about her being sold without her permission. Even if no data breach had occurred in the first place, this kind of “free-for-all” data scraping and predictive analysis is common industry practice, as a recent IBM case demonstrates, and citizens like Sorine have little recourse. Sometimes the results can be sensitive, unethical and even traumatising, as in the case of journalist Gillian Brockell, who continued to receive ads for newborn baby products after giving birth to a stillborn child.

Sorine could simply stop using apps like the period tracker or even quit social media altogether through a “digital detox” of sorts. This is easier said than done, as one journalist found out after spending six weeks trying to completely give up the “frightful five” tech giants (Google, Apple, Facebook, Amazon, Microsoft). Sorine could also try to “switch” to alternative equivalents of each platform, but that doesn’t seem feasible either. We have network effects to thank for that, as Sarah Jeong noted after swapping Twitter for its open-source, decentralised cousin, Mastodon:

“You aren’t on Mastodon because your friends aren’t on Mastodon. Your friends aren’t on Mastodon because you’re not on Mastodon. And I wouldn’t be on Mastodon, either, if I hadn’t promised my editor to write an article about it.”

But wait. Why is Sorine forced to clean up the mess the period tracker app and Facebook made? Who is really accountable for the potential consequences of Sorine’s data being revealed and bet on without her consent? Is the onus really on her to figure out and control exactly what data she is giving away, and to understand every detail of whether the trade-off is worth it?

In search of people-centered privacy policy

One might point to better regulation in the form of stronger data privacy laws like the General Data Protection Regulation (GDPR) to protect people like Sorine and Gillian. The GDPR was designed to give European citizens greater control of their data and to impose fines on companies that fail to comply. While we are seeing unprecedented public interest in digital privacy laws, especially with the growing exposure of privacy scandals and data breaches, the main focus of these laws is on data collected by corporations and mediated through contracts.

“This hyperfocus on protecting data skews our attention away from the fact that people are the entities that need protection,” argues writer and tech education advocate Tyler Elliot Bettilyon. When it comes to the idea of privacy itself, critics argue that we need to think far beyond traditional notions of privacy and consent and consider how technology impacts a person’s autonomy, power and individual agency. This is especially true for vulnerable populations, whose “expectation of privacy is very much influenced by the amount of economic or political power” they hold, according to Automating Inequality author Virginia Eubanks.


Data is just one avenue of many for abusing people’s privacy, and government regulation is just one piece of the puzzle. Even perfect privacy laws on paper cannot fully ensure effective outcomes. While the GDPR is an important step toward this change, it took seven years to become a reality. Moreover, producers of products, services and applications have no direct obligation under the GDPR to actually carry out data protection by design. In the meantime, public, private and civil society organisations must be involved in the design of digital systems to fill the gap left by regulation, which always lags behind technological advancement. The cost of violating user privacy has been essentially zero since the inception of the internet, and entire markets have been built on that very idea. This has to change – but the global and deeply technical nature of the problem leaves most lawyers and policymakers ill-equipped to table legislation that protects people instead of just the data itself. So what might that look like?


Privacy by design: my data, my rules

Imagine another future scenario where a new and necessary understanding of privacy and data ethics has taken hold in business and across society. Organisations apply privacy by design to their products and services, collecting as little data as possible – only the minimum needed. Citizens like Sorine can visit a consolidated platform not only to see which companies have collected their data, but to actually cancel and erase unwanted transactions. While technically daunting, this type of tool would play an important role in fighting the surreptitious tracking, collection and sale of metadata and other unethical data handling practices. There is also an option to visualise where the data has gone and the value of its having been shared or sold, together with the ability to seek recourse and even compensation – enforced by a global data ethics council, similar to the European Union Agency for Network and Information Security (ENISA), which would act like a consumer protection bureau for data ethics and privacy issues.
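As a thought experiment, the sketch below imagines the smallest data model such a consolidated platform might expose: a per-citizen ledger of “data receipts”, each of which can be revoked (cancelled and erased) on request. The class names, fields and revoke semantics are all hypothetical – invented here to make the idea tangible, not drawn from any existing system – and a real platform would face the much harder problem of verifying erasure downstream.

    # Hypothetical sketch of a citizen-facing "data receipt" ledger for the
    # consolidated platform imagined above; all names here are invented.
    from dataclasses import dataclass, field

    @dataclass
    class DataReceipt:
        company: str           # who collected or bought the data
        data_category: str     # e.g. "menstrual cycle symptoms"
        purpose: str           # declared use, e.g. "ad targeting"
        revoked: bool = False  # True once the citizen cancels the transaction

    @dataclass
    class CitizenLedger:
        citizen: str
        receipts: list = field(default_factory=list)

        def revoke(self, company: str) -> None:
            # Mark every transaction with this company as cancelled; a real
            # system would also have to trigger and verify erasure downstream.
            for receipt in self.receipts:
                if receipt.company == company:
                    receipt.revoked = True

    # Usage: Sorine sees the unwanted sale in her ledger and withdraws it.
    ledger = CitizenLedger("Sorine")
    ledger.receipts.append(DataReceipt("PeriodTrackerCo", "cycle symptoms", "ad targeting"))
    ledger.revoke("PeriodTrackerCo")
    print([(r.company, r.revoked) for r in ledger.receipts])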

Forward-thinking policymakers are taking the first steps to make such a scenario a reality in different places around the world. Take New York City: the first city in the United States to pass an algorithmic transparency law. This landmark legislation addresses the “black box” problem of explainability (or lack thereof) by focusing on data transparency: not only making data publicly available but interpreting the decisions made using the data so that those affected might understand and seek recourse for the consequences. In addition to passing tougher privacy regulations, the governor of California is studying the concept of a “digital dividend” policy that would enable consumers to share in the wealth created by their data.

Using data the “right” way is not about what is technically possible, but rather about what is desirable from society’s point of view. Given the near impossibility of doing anything online without leaving a digital shadow or trail, citizen and consumer rights that are embedded and upheld in “offline” agreements must be equally applicable and enforceable in the online world. Still, efforts to fully preserve privacy and deploy ethical data solutions will require deeper thought and more imagination than simply tightening regulation or hoping technology companies will “self-regulate”.

As consumer awareness grows about how much we are worth to the companies collecting and claiming ownership of our data, a new conception of privacy becomes necessary for the digital age. Perhaps more effective paths to recourse can be found in collaboration between experts who understand data ethics and privacy concerns well enough to address them properly and policymakers in the political sphere. Building new communities and modes of inquiry, connecting them to the public interest and galvanising action are the essential next steps in shaping a better digital future society.

 

Sorine’s story brought to you by Digital Future Society

Although Sorine is a made-up persona, her story emerged from a workshop of the Digital Future Society think tank, a group of academics, researchers, policymakers, entrepreneurs, corporate leaders and civil society experts who have come together in the spirit of collaboration to explore the impacts of technology on society.

Working under the theme of digital trust and security, this particular working group has been exploring the data (and metadata) ethics landscape with a focus on privacy, seeking practical solutions that lie beyond access, ownership and consent – especially in a post-GDPR world. In the coming months, the working group will publish a report detailing what policymakers need to consider when designing and enforcing effective data ethics and privacy regulations, and suggesting concrete actions to tackle ethical challenges effectively – especially when it comes to marginalised groups like low-income families, small businesses and the digitally excluded.

 

Learn more

Keen to find out more about this topic? Check out the following resources:

Tools

  • The Electronic Frontier Foundation has created a set of free privacy protection tools designed to make your experience of the internet safer.
  • Switching.social – ethical, easy-to-use and privacy-conscious alternatives to social media and other platforms
  • Data detox kit – an 8-day guide to help users take control of their digital footprint