
“Ultimately, AI is a mirror. If we feed our systems with data embedded with our biases, those systems will reflect them right back at us and, worse, out into the world.” 

Daniela Braga, founder and CEO of DefinedCrowd

Does AI have a gender bias problem? Considering that AI relies on algorithms which learn from real-world data, it seems unavoidable that AI applications will reflect existing biases present in our society. In fact, according to Gartner, by 2022, 85% of AI projects will deliver erroneous outcomes due to bias in data, algorithms or the teams responsible for managing them.  

There is also the worry that the over-representation of men in the design and development of AI technologies may end up creating more exclusive than inclusive models. According to the 2019 World Economic Forum’s Global Gender Gap Report, only 22% of professionals working in the AI and data industry are female. Technology was meant to be the great equalizer, removing social, geographic and physical barriers while promoting connectivity, financial inclusion and access to services. However, the lack of diversity in the technology sector can compound society’s problems rather than address them.  

As we lead up to International Women’s Day this year, we’ll be speaking to five DefinedCrowd professionals to get their thoughts on gender bias in AI and how to confront it. 

An Interview With Elmira Hajimani

A machine learning engineer at DefinedCrowd, Elmira got her love of mathematics from her father. Her interest in AI began while she completed her degree in software engineering. She took an optional course in AI and was tasked with designing a model that could recognize handwritten digits. When she saw how it worked, she was instantly smitten. “It was a big motivation for me to explore the field and see what I could do with data and techniques,” she said. She’s been immersing herself in AI ever since! Here’s more with Elmira. 

What does your job entail? 

I conduct research based on the clients’ needs and then implement my findings. I also work on the automatic identification of crowd quality: removing low-quality annotated data from the dataset we’re going to deliver to the client, and so on.
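As a rough illustration of that kind of quality filtering (the column names, scores and threshold below are hypothetical, not DefinedCrowd’s actual pipeline):

```python
import pandas as pd

# Hypothetical annotated data: each row is one crowd-sourced annotation,
# with a quality score assigned by an automatic quality model.
annotations = pd.DataFrame({
    "item_id": [1, 2, 3, 4],
    "label": ["positive", "negative", "positive", "neutral"],
    "quality_score": [0.92, 0.41, 0.88, 0.30],
})

# Keep only the annotations whose score clears a chosen threshold
# before the dataset is packaged for delivery.
QUALITY_THRESHOLD = 0.8
deliverable = annotations[annotations["quality_score"] >= QUALITY_THRESHOLD]
print(deliverable)
```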

What’s it been like working as a woman in a traditionally male-dominated field?

The gender of the people I’m working with is not important to me. Everyone I work with shares a common passion – we’re all here for the same thing. I don’t think too much about it. I have been lucky enough not to face any gender discrimination in my career so far.  It’s also not been a challenge I’ve had to face at DefinedCrowd, as 42% of the tech employees here are women.  

There are far more men working in the data and AI field than there are women. Why do you think this disparity exists? 

I think some women believe it is too difficult to enter a math-related field; they are somehow afraid of attempting a career in the industry. Perhaps they think they won’t be successful. Others might think it will be boring because no one has explained what the job entails. But the important thing is the applications. If you want to motivate someone to consider a completely new field, show them what the data can do – how it is applied in real-life situations. It’s important to make it tangible to people.

There has been much said about gender bias in AI. Is this a problem, and how do we fix it?

Bias in AI systems is caused by the data used to train them. It’s like teaching a child to detect, recognize and make decisions: if you provide inputs that are biased with respect to gender, age or race, the output of the model will be biased too. Machine learning teams must make sure the data they use is balanced.

Take a model that analyzes credit risk, for example. If the data used to train the model is skewed towards men, the model is going to be biased towards that gender. But if the data is balanced equally between male and female examples, the model will not regard gender as an important feature. Data must be equally distributed between the genders.

The trick is to consider the bias factors in advance. First, list the possible factors that could cause bias, then make sure the data is balanced across each of them. When we provide input data, we make sure it is completely balanced with respect to any field you wouldn’t want the model to treat as an important feature. This is the only way to make sure your data represents all the different people, voices and accents in the areas where the models will operate.
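A minimal sketch of the balance check and rebalancing Elmira describes, assuming a hypothetical pandas DataFrame with a sensitive gender field (the data and column names are invented for illustration):

```python
import pandas as pd

# Hypothetical credit-risk training data with a sensitive field we don't
# want the model to treat as an informative feature.
data = pd.DataFrame({
    "income": [52_000, 61_000, 48_000, 75_000, 58_000, 67_000],
    "defaulted": [0, 1, 0, 0, 1, 0],
    "gender": ["male", "male", "male", "male", "female", "female"],
})

# Step 1: list the fields that could introduce bias and inspect how
# each is distributed in the training set.
print(data["gender"].value_counts(normalize=True))  # male ~0.67, female ~0.33

# Step 2: rebalance by downsampling every group to the size of the
# smallest one, so each gender is equally represented.
min_count = data["gender"].value_counts().min()
balanced = data.groupby("gender").sample(n=min_count, random_state=0)
print(balanced["gender"].value_counts())  # male 2, female 2
```

Downsampling is only one way to balance a field; since it discards data, teams often prefer to collect more examples from the under-represented groups instead.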

What do you love about your job? 

It’s such a challenging field – you’re trying to extract something useful from huge amounts of data and uncover insights. I love creating completely new applications that can simulate human behavior, like recognizing images, understanding voices, responding and translating. Basically, all the tasks that need a lot of intelligence. I think it’s cool to develop these applications!

What advice do you have for women looking to forge a career in the industry? 

The best thing for women to do is to complete some internships, so they can learn about the real challenges. Immerse yourself in the work. It’s also important to keep up to date, as this is a very fast-paced industry: new ideas, papers and research are presented every day, so stay abreast of those developments.