Q&A

Kate O’Neill, tech humanist

"Digitalization helps fight climate change because it allows greater interconnection between humans."

Tags: 'Future of work' 'Human centered technology' 'Kate O’Neill'



Kate O'Neill is the founder and CEO of KO Insights, a strategic consultancy committed to improving human experience at scale by guiding business and civic leaders to be both successful and respectful with human-centric data and technology, and by helping people better understand the human impact of emerging technologies. She has more than 20 years of experience in entrepreneurship and in leading innovation across technology, marketing, and operations. She was one of the first 100 employees at Netflix, where she created the first content management role and helped implement innovative dynamic e-commerce practices that became industry standard. Kate also developed Toshiba America's first intranet and has held leadership and advisory positions in a variety of digital content and technology start-ups.

Could you give us an overview of your work?

I am an author, speaker, and “Tech Humanist,” which is also the title of my most recent book. I like to say that I am helping humanity prepare for an increasingly tech-driven future. Some of that entails writing and appearing in the media to discuss and help shape the debate around how emerging technology impacts humanity across a wide variety of topics, like the future of work, facial recognition, privacy, and so on. My company, KO Insights, is committed to making human experiences more meaningful at scale. Most of my work consists of advising executives through keynote speaking, workshops, and private consultations around digital transformation and future-ready business models that treat human data with respect and center the human experience in strategy.

Some people refer to big data as the fuel of the 21st century. Does that mean that we are the wells companies are eager to drill? How do they drill us?

I’ve never been a fan of the “data is oil” metaphor because it flattens out an important emotional nuance: the majority of the meaningful data companies work with comes from our human interactions and transactions in this world. But certainly our data is being harvested for profit-seeking uses across every imaginable source and touchpoint, whether individually or collectively, from our communications to our purchases to our medical histories and everything in between.

Some of that is understood to be a fair exchange of data for access, for convenience, or for mutual value. But a lot of it is happening in ways that most people have no concept of, and might feel uncomfortable with if they understood.

In one of your conferences you said that our data is sometimes used in ways we didn't opt into. Can you give us some examples?

Well, to begin with, most people don’t read the user agreements they accept when they sign up for a service, a product, an app, or a game. The agreement may specify that their data can be used for advertising, for targeting offers, or for resale to third parties who may use it in just about any way they see fit. So while people may legally have opted in to those uses of their data, they didn’t expect them. And that’s not the overtly malicious kind of data abuse; that’s just the commonplace sort of misuse and neglect that happens throughout the digital ecosystem. The malicious kind, like the Cambridge Analytica-style game that harvests useful data for manipulating political campaigns and elections, is easier for people to recognize after the fact.

But the everyday collecting of human data across our movements, behaviors, interactions, purchases, near-purchases, relationships, and communications is far more complex and much harder for people to comprehend.

GDPR is a start, but is it enough to protect privacy on its own?

GDPR is indeed a start, but it deals with data collection and privacy as we’ve known them historically, and not how we’re coming to treat them going forward. Unfortunately its implementation has also left the web experience clunky and clumsy. I would love to see solutions emerge from that. Moreover, we need to advance our understanding of what privacy is in a world of surveillance capitalism, ever-present social media, selfie culture, and more. We need a contemporary understanding of what people should be able to expect and demand from business and government.

How can we turn big data into something we all profit from?

I’ve seen data ownership movements and other empowerment movements that seem well intended. There may yet be collectivist approaches to the management and maximizing of personal data and big data that could work. I look forward to encountering more of those.

Can we, as individuals, work to humanise the technology we are using? How?

As individuals, our responsibility is to choose and use technology mindfully. We can choose some of the technology that surrounds us, and we should do that with as much savvy and critical thinking as possible, with as holistic a view as possible of the consequences of those technologies. For example, we don’t have to accelerate the adoption of surveillance technologies in our homes and private spaces, especially when we consider how these technologies, and the lack of precedent around the data they collect, can capture the data of other people who have reason to come into our space: surveillance doorbells record delivery people, and smart speakers record guests in our home. In some cases that data collection can have risky or dangerous consequences for those people, who never elected to have their data captured that way at all.

We also need to hold companies and governments accountable for creating policies and standards that protect human data. And we need to participate thoughtfully, with savviness and sophistication, in online communities and culture, and not fall prey to every meme or game that asks us to share our personal data.

How can the digitalization of society help fight climate change?

On one level, the digitalization of society helps fight climate change because with it has come a level of interconnectedness between humans on earth that allows us to have greater awareness of one another, greater connection with one another, and greater empathy for one another. There are certainly tradeoffs and not all of digitalization has had noble results, but I think a better ability to feel empathy for people halfway around the globe is a net positive and can help us all feel more urgency when we see humans suffering due to climate-related catastrophes.

On a larger scale, we need to consider how the increasing technologification of society is going to help fight climate change. I keep a collection of headlines and articles that relate to emerging technologies, especially machine learning and intelligent automation, and how they’re being tested against one problem or another, such as land use, water sanitation, and so on. I have a slide that I sometimes show in my talks that lists some of these headlines and color-codes each to match the Sustainable Development Goal with which it aligns. It’s a powerful visual, I think, that illustrates how much good emerging technology can do if we harness its capacity and power for good. On the other hand, the carbon footprint of many of these solutions, from machine learning to blockchain, is dauntingly high. I imagine those footprints will come down as the logic and the algorithms, as well as the hardware, continue to become more efficient. But it’s a difficult equation, and one we need to keep our eyes on.

Technological waste is currently an issue for third-world countries, which are used as dumpsites by first-world countries. Is the digitalization of society viable at this rate of techno-pollution?

This is another reason why I say we need to choose and use our technology mindfully and with a holistic view.

The macro view of this is that with the right guidance, we may be able to change the equation there too and use technology to make the progress we need while also using it to help solve the problems it creates.