Q&A

Digital welfare services and gender stigmatisation with Kristin Heffernan

"With the tech industry dominated by young white men, I believe these biases will persist."

Tags: child welfare



Kristin Heffernan is an expert in gender biases in welfare systems. She received her Master of Social Work from Fordham University in 1996 and is currently a full professor at Brockport College, State University of New York. She was a member of the Massachusetts Coalition for Sexual Assault and Domestic Violence, which is when she first took a serious interest in women's rights. She went on to earn her PhD in Social Work at Boston College and, after graduating, left the US to teach at Royal Holloway, University of London from 2003 to 2009, developing a more global perspective on gender biases in welfare systems.

What are the most pervasive biases in welfare research and practice?

Gender and race biases in welfare research and practice, stemming from historical beliefs about gender, or 'doing gender', as they play out in contemporary policy, are complex and overlapping. They can be found in legislation related to divisions of labor as well as conditions of employment. Looking just at the United States, the Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) replaced Aid to Families with Dependent Children, which was an entitlement for those eligible. Its replacement, Temporary Assistance for Needy Families (TANF), is not an entitlement program and as such makes it much more difficult to access the benefits it provides, limiting the number of people who receive support. The majority of people who apply for TANF are women.

Work requirements are not just an issue in the U.S.; many capitalist nations provide social supports based on participation in the workforce. Because women often interrupt workforce participation in order to provide care, whether for children or for older adults, they frequently find themselves limited in the support they receive. Social controls or regulations over traditional public provision and social support often disadvantage women and other minorities. Additionally, in all families, but especially in single-mother families, when work is required in order to receive welfare benefits, other family legislation addressing maternity and/or parental leave, as well as access to affordable, quality child care, is often lacking (e.g., Gabel & Kamerman, 2006; Mahon et al., 2012; Olsen, 2007).

Issues such as poverty, hunger, and violence disproportionately affect women (Gurr, 2000). In many societies we see the effects of gender biased policy as they relate to violence, care giving as well as rights of the body. There is no one area that is the ‘most pervasive’, as these biases are systemic and affect income programs, social services, and protective legislation.

Do they persist in the digital welfare state? With the tech industry dominated by young white men, could these biases increase?

There is no doubt that many of these same issues persist in the digital welfare state era as they are not necessarily related to becoming a digital welfare state, but rather stem from historical, as well as contemporary patriarchal government systems. Global patterns in which discrimination is fostered by beliefs about the inferiority of women, minorities, and various other social categories continue to play a significant role in how women are treated (Zhao & Lounsbury, 2016).

With the tech industry dominated by young white men, I believe these biases will persist.

How do these biases impact child welfare?

While I do not believe all technology is bad or harmful, I do believe that overhauling all welfare systems to make them digital has many disadvantages, especially for those seeking social service benefits. Such systems make assumptions about the capability of the person filing for services to use and access a computer, as well as about the capacity of the software system designed to collect the data needed to make the appropriate decision about receiving services. While most individuals may indeed own a smartphone, this does not mean that they can or will use their phone to access information on how to apply for benefits. Furthermore, moving to digital-only forms may preclude someone who is not tech savvy from applying. Needing a computer to file for benefits may make it more difficult for some to apply, and even more challenging to legally challenge adverse decisions.

Additionally, while it is supposed to be easier to gather information from different sources to put together a person's history digitally, information can still get missed or wrongly input into the system. This could raise red flags where none exist or, just the opposite, miss real ones, leaving a person in danger, or a danger to others, in the process.

The same is true for systems that are put in place to collect data in order to make key decisions about funding for services. Any system created without ensuring that its use is barrier-free has the potential to disadvantage one group over another. It is about more than just owning a computer; it is about having time and being aware of, or educated about, the need to fill out the survey information. Furthermore, missing or incorrect data could mean that some programs do not get funded or refunded while others do, leaving those most in need without services.

Do we need a female and feminist perspective on gender bias in digital welfare?

Gender in and of itself is a social construct responsible for dictating masculine and feminine roles within society (Amato, 2018). Traditional feminine roles hold that women are responsible for the care and rearing of children and are therefore to blame if things go wrong (Risley-Curtiss & Heffernan, 2003; Park et al., 2015; Russo, 1976). These gender-biased beliefs continue to manifest in child welfare research, practice and policy.

In fact, mother blaming and shaming is rampant in the language and actions taken within research on this topic. Mothers accused of neglecting their children experience an immense loss of confidentiality and a very public intrusion when Child Protective Services (CPS) becomes involved (Sykes, 2011).

Failure to protect (FTP) laws have also been said to favor men, holding women accountable for failing to protect children at disproportionate rates relative to men (May-Chahal, 2006; Strega et al., 2013). Henry et al. (2020) found that workers' gender-stereotypical beliefs play a significant role in FTP cases. The same study concluded that the gender expectations used by workers to justify substantiation of a child protection case "reflect long standing social beliefs that women can and should protect their children from harm that is inflicted by others".

I strongly echo the sentiment that child welfare, while intended on the one hand to protect and assist children, is on the other hand punitive in the way it disciplines and shapes the behavior of a racialized and gendered poor population (Woodward, 2019). Having said all of this, the exclusion of fathers in child welfare is an issue that needs to be addressed.

How can we make sure the digital welfare state is egalitarian in terms of gender in child welfare?

The use of technology in child welfare is an ever-growing industry, and like all new endeavors it presents challenges as well as successes to be built upon. As such, more research is needed on the impact of technology and on how agencies or staff in child welfare systems are ethically and effectively managing these emerging challenges (Breyette, 2014). Critical analysis and clear outcome measures need to be developed in order to understand the pros and cons of the use of technology in child welfare. Agencies need to have up-to-date technology and system programs in order to access information in a timely manner. The use of technology, including mobile applications, may improve the timeliness and accuracy of documentation, as well as ensure access to critical information for the child protective caseworker while in the field, but there are ethical considerations regarding privacy and confidentiality that need to be adhered to.

We know that the federal government has placed increasing pressure on states to involve fathers in children's cases, including the identification and involvement of nonresident fathers (Child Welfare Information Gateway, 2016; Sciamanna, 2011). So is there a way to design technology to help ensure that fathers are not overlooked in the process?

How can we make sure the digital welfare state is egalitarian in terms of gender?

While ensuring that more women are involved in the development of technology is important to building a more egalitarian digital welfare state, the involvement of women in all decision-making systems is just as important. I would also argue that a continued and focused dialogue on the benefits of gender equality is needed. Through such conversations, those in leadership positions could help shift the culture of acceptance of, or indifference to, gender inequality by raising others' curiosity about what could be if there were gender equality. Perhaps this dialogue would empower everyone to be part of the solution.

I believe there needs to be accountability for the lack of gender equality. However, I also believe that conversations about why it makes sense to have more than just a male perspective need to be opened up in order to gain buy-in and to talk in terms of win-win situations.

Reframing how we talk about gender equality is important. As such, conversations that highlight the benefits of gender equality are essential. Otherwise, the message that reaches men is that obtaining gender equality means losing male privilege or giving up power, rather than gaining new insight.

Lastly, I believe that we cannot underestimate the power of gender ideologies. We live in a world that has been conditioned to believe in very specific gender roles for men and women. Even among people who are more open to challenging masculine and feminine roles, implicit bias may contribute to actions taken, or not taken, that do not always match the language a person uses. Both words and deeds can perpetuate gender inequality without our knowing or meaning to do so. This is why it is important not only to ensure that women are physically present and represented in all aspects of society, but also to have actual conversations about how the inclusion of women in all aspects of society is, and can be, beneficial for all.

Finally, which good news could the digital welfare state bring for women?

At its best, with no errors, with ethical regulatory standards, and with confidential information kept confidential, the digital welfare state could bring improved welfare provision. For example, it could drive the development of new and innovative ways of meeting the needs of the most vulnerable in our society, including more efficient delivery of services in a nonjudgmental way. However, the research to this effect is still wanting.