Q&A

When the welfare algorithms forget women with Cristina Pombo

“Seatbelts, headrests and airbags in cars have been designed mainly based on data collected from car crash dummy tests using the physique of men”

Tags: 'Cristina Pombo' 'Digital gender gap' 'Gender equality' 'Gender-inclusivity'


Cristina Pombo is an advisor in the Social Sector at the Inter-American Development Bank (IDB). She is an expert on operational issues and special strategies in education, health, social protection, labour markets, gender, diversity, and migration.

Pombo also leads the digital transformation strategy for those areas in Latin America and the Caribbean. In this capacity, she spearheads the Social Digital and Data cluster, which includes initiatives that leverage digital technologies to improve social services, such as fAIr LAC, the first public-private alliance for the responsible use of artificial intelligence in the region.

What do we understand by the Digital Welfare State?

In a digital welfare state, as described by Philip Alston of the UN, systems of social protection and assistance are increasingly driven by digital data and technologies, with an emphasis on the associated risks, such as a digital dystopia.

However, for us at the IDB, a digital welfare state can be considered “intelligent” if the definition extends and strengthens the preventive role of the state using digital technologies and artificial intelligence to make and support public policy decisions.

Furthermore, it is the state that makes responsible and ethical use of digital technologies and data tools to deliver public services at scale, weighing risks and opportunities.

According to an article in The Guardian, Britain's welfare state has been taken over by shadowy tech consultants. How big a risk is this in the process of welfare state digitalisation worldwide?

It is a real risk. Nowadays, we are seeing how governments are using more technology, like Artificial Intelligence, to provide social services such as cash transfers. From our perspective, these tools can be very helpful for citizens but only if they are designed considering key principles such as ethics, privacy, transparency and, maybe the most important one, user-centered design. Digital social services cannot be designed with the sole principle of making the job easier for governments, state departments or public servants. These tools must be designed keeping the citizens’ interest in mind, and by this we mean considering students’ needs in the education sector, patients’ needs in the health sector, and so on.

Is there a gender gap in the digitalisation of welfare systems and why is that?

There are several gender gaps when we talk about digitalisation of welfare systems. Some of them are in digital access, ownership of digital devices, digital fluency, and there is even a gap in the capacity to make meaningful use of the access to technology.

One factor to consider is affordability, but significant socio-cultural norms also restrict women's access, as does location. We recently published a study with IICA titled “Digital gender inequality in Latin America and the Caribbean”, and what we found is that rural women are the group least connected to Information and Communication Technologies (ICT) in most countries of Latin America and the Caribbean. Moreover, in 17 of the 23 analysed countries of the region, fewer women than men reported owning cell phones. The study also shows that women with low education who live in rural areas are the least “connected”.

Additionally, there are other phenomena to take into consideration, such as bias in data collection and the low representation of women in the design of these tools.

Many argue that one promising way of ensuring that historical gender bias is not amplified and projected into the future is to increase diversity of thought by raising the number of women in tech. According to the World Economic Forum, only 22% of AI professionals globally are female, compared to 78% who are male.

What are the potential ethical risks of this?

Covid-19 has accelerated digital changes in the way food and food supplies are produced in rural areas, and also in the way these products are marketed. Currently, women are disconnected and are therefore being left out of this important transformation. The main consequence of these gaps is that women are falling behind in opportunities to improve their lives.

But this is not the only problem. We have also seen that gender gaps in digital inclusion, if not properly addressed, are likely to lead to gender inequalities in many other areas, such as labour markets and even financial inclusion. This is because digital technologies are omnipresent, and digitalisation affects all areas of our lives. Moreover, the more basic services are delivered through digital technologies, the bigger the gender gaps in welfare could become. If only one group in society is able to harness the opportunities of engaging in the economy, the consequence will be increasing inequality. It will widen the welfare gap between those with connectivity, access and knowledge of technology, and the digitally excluded.

Algorithms used by public authorities in the US today in judging re-offending risk or in re-assessing disability benefits show that algorithms there already have a direct impact on the realisation of human rights. Are women more affected by the biases of algorithms?

Yes. This is because of a historical practice of feeding systems that also serve women with data collected from men. The design of many algorithms ignores the sex and gender dimension and its contribution to differences in health and disease among individuals.

Take the concrete example of seatbelts, headrests and airbags in cars, which have been designed mainly based on data collected from car crash dummy tests using the physique and seating position of men. Women’s breasts and pregnant bodies do not feed into the “standard” measurements. As a result, women are 47% more likely to be seriously injured and 17% more likely to die than a man in a similar accident, explain Caroline Criado-Perez, author of Invisible Women, and Lauren Klein, co-author of Data Feminism, in a recent BBC interview.

If artificial intelligence is based on biased data sources predominantly from men and/or based on male profiles, terrible things can happen.

In addition to gender biases, AI faces other diversity biases challenges, including race and ethnicity. This can be of concern, for example, when AI technology is supposed to diagnose skin cancer for which the accurate detection of skin-color and its variances matter.

Consider the case of facial recognition algorithms, which were studied by Algorithmic Justice League founder Joy Buolamwini. She found that the input images on which various facial recognition algorithms were based consisted of 80% images of white people and 75% male faces. As a result, the algorithms had a high accuracy of 99% in detecting male faces, but their ability to recognise black women was significantly lower, at only 65%. Thus, focusing on gender alone is not likely to solve other intersectionality issues in AI.

Also, according to Foreign Policy, a municipality in Denmark has been experimenting with a system that would use algorithms to identify children at risk of abuse, allowing authorities to target the flagged families for early intervention that could ultimately result in forced removals. Can we trust machines with decisions such as this and fully leave human intervention out of the equation?

Our experience so far is that decisions in social services cannot be taken entirely by machines. There is a great example from Washington DC. You can read about it here.

We must use technology as a tool to help humans do their jobs better. Imagine a doctor who can use an algorithm to detect a specific problem in a radiography; it would give the doctor more time to spend with the patient and to design the best treatment for him or her. We do not believe in tech substituting humans, but in it complementing their jobs and skills to improve them.

How can we make sure the digital welfare state is egalitarian in terms of gender?

At the IDB, we support programs that close these gaps by using digital social services. A good example is the Prospera program in Mexico, one of the oldest conditional cash transfer programs. It promotes maternal and infant health.

The digital component of the program was aimed at Prospera beneficiaries who were pregnant or had an infant. They received a mobile solution where they could learn and track the different stages of their pregnancies. This solution also offered them health advice on areas such as nutrition and physical activity and ensured that they could identify risk factors that could potentially lead to birth complications or death.

The results of this pilot program proved its effectiveness in improving the baby’s health indicators, such as height and weight at birth, as well as other indicators such as length of the pregnancy and number of prenatal consultations, both of which are key elements in preventing birth complications. The pilot program also proved effective in helping women adopt mobile technology in new and useful ways, indicating that pertinence is a key element in promoting adoption.

Finally, which good news could the digital welfare state bring for women?

If it accelerates women’s adoption of technology, it could be a pillar in narrowing the gender gap.