The Women in Tech Series: Xiaoting Wu

Confronting gender inequality in AI

“Unless data collection and algorithmic development are carefully monitored, AI can be a dangerous catalyst for perpetuating bias and allowing history to repeat itself in undesirable and antisocial ways.”

Daniela Braga, founder and CEO of DefinedCrowd

As we lead up to International Women’s Day on March 8, we’re exploring the issue of gender bias in AI. We’ve asked five DefinedCrowd professionals to give us their thoughts on bias in AI, the representation of women in the industry, and how to address the gender imbalances present in both AI products and the workforce.

Today, we chat to Xiaoting Wu, a software engineer at DefinedCrowd, who is part of a team responsible for managing the way we provide our services to enterprise members. A mathematics lover at heart, Xiaoting studied engineering at university and naturally fell into software engineering. “I like things that are abstract, and I like dealing with data, so I’m in the perfect industry,” she said. Here’s more with Xiaoting:

What’s it like working as a woman in a male-dominated industry?

So far, I would say it has been a good experience. I get along well with my male colleagues, and I think that comes down to the fact that, as technical people, we have a lot in common.

The World Economic Forum’s Global Gender Gap Report shows that only 22% of professionals working in data and AI are female. Why do you think this disparity exists? 

I guess there is this universally held belief that women are bad at mathematics or at working with machines. We also tend to gender-stereotype jobs, which shapes our expectations about whether a man or a woman is a better fit for a certain role. These stereotypes definitely influence a lot of women when they’re deciding what career to choose.

The tech industry also requires a certain level of education or training, and there are many women around the world who are not in a position to receive this level of education. Gender inequality exists in so many places and at so many levels that it infiltrates all parts of society. The gender gap in the data and AI industry is a symptom of a greater social problem.

How do we solve it?

There is no single solution; you have to address it on multiple levels. I think the public sector has a big role to play in providing education, because education shapes the future.

What advice do you have for young women looking to enter the tech industry?

You’ve got to be assertive and true to yourself. Although this applies to everyone, it applies particularly to women. Sometimes we compare ourselves to our male colleagues and feel inferior. I’m not a geek, and I’m not into high-tech gadgets; basically, I am not your “typical” software engineer. But as time went by, I realized that I could do a good job without being a geek. If I had been more assertive in the beginning, I would have suffered less!

Have you faced any gender-specific challenges in your career so far?

As a woman in the tech world, it can be more difficult to integrate socially, which affects your work. For example, a lot of professional communication is done in a more casual way – around the water cooler, for example. These are not formal meetings; they are discussions about work, but within a casual environment. This happens more often between male colleagues. They naturally gravitate towards each other, form bonds and provide each other with support when needed. A lot of communication that is crucial to the workings of the business happens in these impromptu social meetings, and as a woman, you tend to miss out on these discussions. So you end up in a more passive situation, because you are not in a position to contribute. It’s not done maliciously, but it can be a problem. However, it is such a difficult thing to detect or address.

Do you think there is gender bias in AI? If so, how do we fix it?

It’s not a matter of me thinking there is gender bias in AI – it’s a proven fact. AI products are developed by human beings, so it’s very natural that biases are unwittingly brought into AI as it’s being developed.

AI is a work in progress. It’s continually evolving. Although AI was first proposed as a concept in the 1940s, its application to industry only began about 50 years ago, so there are a lot of things we have to address and improve. It would be a good idea for machine learning teams to keep up to date with the latest research, continually increase their knowledge, and learn from past mistakes. If you want to avoid gender bias, you need to make sure you are aware of the issue. You’ve got to be a better person yourself – you’ve got to face and challenge your own biases.

Why do you think it’s important to confront gender bias?

It’s as important to address gender bias in AI as it is to address it in our society. The movement for gender equality has been ongoing for centuries. As long as this issue exists in our society, it will penetrate all aspects of our lives. AI is very quickly becoming part of our lives and society, so it’s something that we need to confront, for sure.

However, we also need to distinguish between bias and discrimination. If you think about it, gender bias is part of the fabric of society. I don’t think we can remove bias and stereotypes altogether, as they are essentially statistics, and they exist because they reflect part of the truth.

For example, women get better deals on car insurance. The logic behind this is that women are considered safer drivers than men. In this case, there is also a gender bias, but one that works to women’s advantage. So bias will always exist – you can’t completely remove it. Biases are a way for us to understand how the world works, to process data, and to make decisions. They may not always be correct, but I don’t think bias is the same as inequality.