Q&A

Governing data for the benefit of all, by Christoph Steck

Christoph Steck, Telefónica's Public Policy Director: "I use and greatly value many digital services and applications, but would often be happier to pay with money, and not with my data."

Tags: 'Christoph Steck' 'Data ethics' 'Data Governance' 'Digital Rights' 'ethics' 'Telefónica'


Reading Time: 7 minutes

Christoph Steck is the Director of Public Policy and Internet at Telefónica. In that role, he leads advocacy and defines Telefónica's Public Policy positions on Internet Policy, Governance and other digital policy issues.

"We believe that there is a chance to use the disruption the pandemic has caused to make our societies fairer, more sustainable and more inclusive through digitalization. However, this will not happen if we do not set the right Public Policies to make sure that the digital transformation ultimately improves all our lives, leaving no one behind, and helps us fight climate change," he says.

How will the use of Big Data change the analysis of the world, the way business and social decisions are made?

Data is transforming everything, allowing us to move at a pace we could never have imagined. It is an essential tool for making better decisions, not only in the business sector, but also for improving people's quality of life and promoting the progress of society, e.g., towards smarter, more sustainable industry, production and agriculture.

The total amount of data created, captured, copied, and consumed in the world is expected to continue the massive growth we have seen over the last years, reaching 149 zettabytes in 2024, according to recent research. To put that in context: just six years ago, in 2015, we created only around 15 zettabytes worldwide, roughly a tenth of that.

The rise of the Internet of Things (IoT) will rapidly increase the number of connected objects with embedded sensors, creating new ways to improve our world through data-driven insights. Advances in automation and Artificial Intelligence (AI) will use such data and revolutionize current industrial processes, supply, and value chains.

We already have access to a greater volume of information than ever before, and to possibilities to make our world more efficient, better informed, and better managed. Consequently, data has become a valuable resource that can enrich user experiences and generate new opportunities, benefit businesses, and facilitate the progress of society at large.

We got a glimpse of that in recent months during the pandemic and the fight against COVID-19. Data-driven applications have helped keep people safer and better informed, and have supported public administrations in evaluating the effectiveness of their measures in real time. I think we have only started to understand how we can use data to improve our public administrations, public transport and services, but also how we produce goods, provide services or even grow and harvest crops in agriculture.

How important is an ethical framework for data extraction and usage?

Data usage is and will be an important part of our lives. Of course, this also makes keeping data private and secure a cornerstone of any sustainable digital society and economy.

Given the enormous value of data-driven services for people and society, we need to achieve an ethical and value-based usage of data that includes accountability and new governance rules, for example transparency and choices for people. We need to think about better ways to provide people with control over their data.

Actually, I think this is the essence of an ethical usage of data: that people's digital rights are respected, and their personal data is seen as a reflection of themselves as self-determined, dignified human beings. I believe this could be a positive vision we should work towards as democratic societies, and a sharp contrast to other dystopian, totalitarian scenarios.

It is said that users are constantly signing implicit contracts with certain providers without reading the privacy and data disclaimers. Do you think we are aware of the implications of accepting the terms and conditions of these services? Is there a data governance model that enables greater user choice?

A few years ago, researchers at Carnegie Mellon University calculated that it would take a person an average of 76 days to read all the privacy policies and notices we come across in just one year. I suppose today it would be even more if we think about all the apps we have installed on our mobile phones.

Obviously, this volume of information and data is unmanageable for any user and people end up accepting all privacy terms and conditions without really reading or understanding what that “tick in the box” implies. Although in Europe the GDPR has clearly improved data protection and is rightly setting a precedent for other parts of the world to follow, it has not really changed much in that regard. That shows that only relying on data protection rules is not enough, and a new model of data governance is needed.

That means that businesses should do their part to restore users' trust by developing new models with a fair exchange of value and better control for users. For example, users should have better possibilities to manage and control their personal data, which implies enabling more dynamic and user-friendly access to their data and to additional information on the risks and benefits associated with its management.

Additionally, users should be offered real choices, on a more granular basis, about how their data is used, avoiding the currently prevailing "accept-all-or-nothing" decisions that are often imposed by digital services. Finally, all data must be kept safe, and the privacy of users preserved throughout the whole process of data handling.

How important do you consider a global debate on digital ethics to be?

I am convinced that improving these aspects and putting people and their rights at the center when designing digital services will, over time, improve trust, which in turn will help the adoption of data-driven services.

It is important to have initiatives like Digital Future Society because we also need more, and broader, debates and reflections on these issues. The pandemic has accelerated digitalization of our lives and I am convinced that one day future historians will identify 2020 as the year when we finally became digital societies.

At the same time, our policymakers have been rightly focused on fighting the virus and our policies and regulations are simply not up to speed with the level of digitalization of our lives.

So, the need to discuss, define, and modernize policies is greater than ever before and needs to happen as fast as possible or this rift between our digital realities and outdated, analogue policies will create major issues and problems.    

Which data governance best practices are in place in companies like Telefónica, and is there any associated regulation?

We have several instruments that work together to safeguard our commitment to our customers' privacy. One is our Global Privacy Policy, which not only establishes the general guidelines for compliance with the legal systems of each country in which we operate, but also configures a common, general standard for all our businesses in terms of privacy.

Additionally, we also have the Data Protection Governance Model, which defines the strategic, organizational, and operational framework for the management and protection of personal data. In compliance with the EU's General Data Protection Regulation, Telefónica also has a designated Data Protection Officer, who oversees and is responsible for the group's Data Protection Governance Model and reports directly to the Board of Directors.

Apart from data protection, for me one of the most relevant issues is how we can ensure that we use Artificial Intelligence in ethical ways and avoid, for example, unjustified biases. I am proud to say that we were one of the first companies in the world to establish our own Artificial Intelligence principles to ensure that this amazing technology has a positive impact on society. Since 2018, these principles have been applied to the design, development and use of all our company's AI services.

Based on five key principles, we make sure that AI delivers fair results without discriminatory impacts and that its use is transparent and explainable, which means that users know when they are interacting with an AI system, which data is being used and for what purpose. Such internal policies and principles are very important to ensure that technology is human-centric and that we build trust in our services, which is essential for the overall acceptance of digitalization.

What are the economic and social implications of the concentration of power in the tech sector?

In the European Union, competition policies have historically also been guided by the idea of fairness and equity when markets are kept open for competition and innovation for everyone.

Apart from economic reasons, there is a value-based thought that it is good and fair when the famous “two-guys-in-the-garage” can innovate and challenge established market orders.

Technology markets in particular have been full of such disruptive success stories since the 1950s, but as researchers like Mariana Mazzucato have shown, many of the underlying leaps in technology innovation were in fact supported by government intervention, by regulation and policies, and not only by private companies.

Silicon Valley arguably would not exist today without the antitrust interventions that forced AT&T in the middle of the 20th century to first invest parts of their monopoly profits in research & development, and afterwards to open up the patents of these famous “Bell Labs” to other companies.

Today, we are only at the very beginning of having digital, data-driven economies and societies and we need to find the adequate 21st century policies and regulations that balance the different values and rights at stake. Current regulatory frameworks clearly have been overtaken by the speed of digitalization and have not been adapted quickly enough to the new market realities of two-sided markets and digital, data-driven business models.

Policymakers have only recently woken up to the challenge of creating a fair level playing field for both traditional and new services with regard to taxation, privacy, security, and consumer protection. While we should avoid "regulation by outrage" and need to make sure that the many benefits that digital services have given our societies are preserved, we need to tackle this issue now, as a priority.

Maybe we can say that it is important to create fair and equal conditions for everyone from a moral point of view, but I believe that it is even more relevant to make sure that people everywhere perceive that they have the possibility to use, innovate with and ultimately benefit from digitalization. That perception will be key to the acceptance of the changes that are happening and the many that will happen soon, for example regarding the future of work.


Finally, how do you manage your own personal data and the data you give away, and is there any recommendation you would like to make?

I would say that I try to manage my personal data and privacy as well as I can. For example, I have personalised, restrictive privacy settings for my apps and social media accounts.

However, sometimes the only way to have privacy is to keep the information from going online in the first place. For example, I have never posted any photo of my children and have also asked friends to respect that decision.

In the past that has sometimes created animated debates, but my experience is that today most people respect and understand such individual wishes for privacy, even if they do not share them. I think that, as societies, we are collectively on a huge learning journey, and I am convinced that in the future we will see much more differentiation in data usage as well.

There are many ways to offer digital services, and data usage and privacy can go together. To me, it doesn't make much sense that people want personalized sneakers or clothes but are forced to accept one and the same level of privacy for digital applications and services.

It is obvious that in this regard, too, "one size does not fit all", because there is no such thing as "the user"; there are many. I am convinced that businesses are coming to understand this better and, in the future, will offer people much more differentiated models of data usage and privacy.

I, for my part, use and greatly value many digital services and applications, but would often be happier to pay with money, and not with my data. And I think I am not alone in that view.