Passion. Proficiency. Pride.  

“By understanding bias, the sources from which it originates, and the processes by which it infiltrates data, we can actively design systems to avoid, minimize, and eliminate it as much as is humanly and mechanically possible.”

Daniela Braga, founder and CEO of DefinedCrowd. 

In a world where thirty-something professional women are being targeted by adverts for diapers and pregnancy tests, it’s natural that we should start to question the bias that infiltrates technology.

We conclude our #WomeninTech series with an interview with Julie Belião, a senior technical product manager in machine learning and crowdsourcing at DefinedCrowd, who gets understandably annoyed with the constant barrage of diaper adverts she’s exposed to on the internet.   

“I’m 32, I am a woman and I don’t have kids. But I am constantly targeted by advertisers selling diapers. It drives me crazy. It’s prejudicial: it’s assumed that because of my gender and age bracket, I want kids, or should be having kids. It’s such bad targeting, and actually has the opposite of the intended effect.” 

We chat to Julie about gender, technology, bias, and how to get more girls interested in the data and AI industry.  

Julie, what does your role at DefinedCrowd involve?

I was originally hired to oversee the work of the machine learning team, but very quickly took on the task of overseeing the product-related work of the crowdsourcing platform, which consists of five squads, working with product owners, engineering managers and UX designers. My main role is to make sure the products we develop are aligned with the needs of our clients, and with our company vision.

You work in the AI industry, but you’re not a fan of the term “artificial intelligence.” Tell us more. 

I don’t really like the term “artificial intelligence” as I think it’s more of a marketing term. The correct term would be data science or machine learning. I also don’t like what “artificial” implies. “Artificial” implies the product is inferior to the original, like artificial flavor or artificial grass, for example. What we do with AI is not something that is inferior; we are creating a new type of intelligence. We are transcribing human abilities into a machine language, so I would prefer we spoke about machine intelligence instead of artificial intelligence.

You work in an industry in which only 22% of the professionals are women. What’s it been like working in such a male-dominated field? 

I don’t like to see it in that way; I like to see myself as a human. I don’t like thinking about my gender, and I have had very few moments in my 10-year career where my gender has been an issue. In fact, women have played a key role in pushing me to where I am now, whether by mentoring me, challenging me, or inspiring me, and unlike some of my colleagues who experienced adversity from other women, I have always felt nurtured and supported. I haven’t felt patronized or discriminated against because I am a woman. Also, I don’t really feel that difference at DefinedCrowd, as a large proportion of the staff here are female, which is great. At DefinedCrowd, I feel valued as an individual, not because of my gender.

Why do you think women are so under-represented? 

I think it relates to the personal sacrifices you have to make to get to a certain level of your education and career. The way in which society is organized also doesn’t lend itself to women starting a family while also studying towards a PhD or leading a team of engineers.  
 
I think education has a role to play, but it needs to start early – when kids are at school, or even pre-school. We need to incentivize little girls to become bolder and more daring, to play with toys that are more connected to discovery, to science. It starts at home. I was struck by this when I was shopping for a Christmas gift for my stepdaughter. When it comes to toys, shops are extremely gender-segregated: blue, cars and science kits for the boys; pink, dolls and Barbies for the girls. If you give little girls the will and curiosity to explore things more related to technology, you are paving the way for them to consider more technology-related careers in the future.

Finally, I think it has to do with role models. We’re seeing more and more female role models, but twenty years ago, this wasn’t the case, or they unfortunately didn’t get the visibility they deserved, so it was difficult for girls to imagine themselves in certain positions. Things are changing and I think we need to give society time to absorb the change. I am sure in the future, women like us, like Daniela, our CEO, will be role models for little girls today. And maybe this will be the trigger for them to realize they can be anything they want to be. 

 Do you think there is gender bias in AI? 

Yes, but nowadays, with crowdsourcing, gender bias can be overcome. The bias I am more worried about is cultural bias. When you collect data through crowdsourcing, you can easily ensure you balance your data to equally represent all genders and identities. However, I think the difficulty comes in when you’re trying to represent humanity as a whole. For example, say we are conducting speech data collection in the US. We will easily collect a sample that is split equally between all genders or identities, but we will not easily get samples from people with different levels of education, or from people who live in rural areas. This is because the people who are less educated, or who don’t live in large metropolises, don’t necessarily have access to technology. I think, nowadays, cultural bias might be a more challenging thing to address than gender bias.

How do we eliminate bias from AI? 

I think one of the ways is by using crowdsourcing, as it gives you access to a large panel of individuals. Another way to address bias is to be more aware of bias and understand it, as individuals, but also as data scientists, engineers, and product managers. We’ve also got to train the humans who are running machine learning algorithms to be aware of different biases so they can actively and proactively eliminate them. If you are not trained to spot discrepancies in your data, you will find it very difficult to eliminate such discrepancies!
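Spotting such discrepancies can start with something very simple: counting how each demographic group is represented in a collected sample. As a minimal sketch (the attribute names, sample data, and 10% tolerance are all hypothetical, not DefinedCrowd’s actual pipeline), an audit like this flags any group whose share falls well below a uniform split:

```python
from collections import Counter

def audit_balance(records, attribute, tolerance=0.1):
    """Flag values of `attribute` whose share of `records` falls below
    a uniform share by more than `tolerance` (0.1 = 10 points)."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    expected = 1 / len(counts)  # uniform share per group
    flagged = {}
    for value, count in counts.items():
        share = count / total
        if share < expected - tolerance:
            flagged[value] = round(share, 3)
    return flagged

# Hypothetical speech-collection sample: gender is balanced,
# but rural speakers are under-represented.
sample = (
    [{"gender": "f", "region": "urban"}] * 45
    + [{"gender": "m", "region": "urban"}] * 45
    + [{"gender": "f", "region": "rural"}] * 5
    + [{"gender": "m", "region": "rural"}] * 5
)
print(audit_balance(sample, "gender"))  # {} -- balanced
print(audit_balance(sample, "region"))  # {'rural': 0.1} -- flagged
```

A real collection effort would compare against population statistics rather than a uniform split, but even a crude check like this makes an imbalance visible before a model is trained on it.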

Why do you think it’s important to address gender bias in AI? 

AI has so many exciting applications, from healthcare to retail. However, if your machine learning algorithm is skewed towards one gender, or one segment of the population, these applications will not work. If a system we develop is fed with data collected only from white males with five years of college, a large part of your customer base won’t relate, and your product will not sell or work. If you want to make money by developing a product for a population, you want to reach as many people in this population as possible.

Have you ever faced challenges as a woman in the data and AI field? 

Mainly self-imposed challenges. As a woman, when you are in a meeting, it can be difficult to speak over someone, or propose your ideas. When you interrupt someone, for example, or raise your voice, as a woman you are seen as rude or bossy. It’s far more acceptable behavior for a man.
 
Women are not raised to dare. We are raised to be cautious. We need to overcome society’s conditioning and share our opinions and thoughts without fear. What we have to say is as important as what men have to say, but sometimes it takes a conscious effort to believe this. 

Keen to read more from our #WomeninTechSeries?

Elmira Hajimani, Machine Learning Engineer
Xiaoting Wu, Software Engineer
Beth Malloy, VP of Finance
Rui Correia, Lead Machine Learning Engineer