How bias happens in Machine Learning

Interaction bias? Latent bias? Selection bias?

An insightful video by Google Creative Lab explaining how intelligent machines perpetuate human biases.

Just because something is based on data doesn’t automatically make it neutral. Even with good intentions, it’s impossible to separate ourselves from our own human biases. So our human biases become part of the technology we create in many different ways.

The problem of gender bias in the depiction of activities such as cooking and sports in images

The challenge is teaching machines to understand the world without reproducing prejudices. Researchers from the University of Virginia found that intelligent systems associate the activity of cooking in images far more with women than with men.

Gender bias test with artificial intelligence for the verb “cook”: the model associates the activity more strongly with women, even when the image shows a man.
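The cited paper (Zhao et al., 2017) quantifies this effect with a bias score (the fraction of images for a verb whose agent is a woman) and measures how much a trained model amplifies the dataset’s imbalance. Here is a minimal sketch of those two metrics using made-up counts; the real study computes them over the imSitu dataset and actual model predictions.

```python
# Sketch of the bias-score and amplification idea from Zhao et al. (2017).
# The counts below are hypothetical, for illustration only.

def bias_score(counts, verb):
    """Fraction of images for `verb` whose annotated agent is a woman."""
    woman, man = counts[verb]
    return woman / (woman + man)

# Hypothetical (verb -> (woman_count, man_count)) annotation counts.
train_counts = {"cooking": (66, 34)}   # dataset: 66% of cooking agents are women
pred_counts  = {"cooking": (84, 16)}   # model predictions: 84%

train_bias = bias_score(train_counts, "cooking")   # 0.66
pred_bias  = bias_score(pred_counts, "cooking")    # 0.84

# Bias amplification: the model exaggerates the dataset's existing imbalance.
amplification = pred_bias - train_bias
print(f"train bias {train_bias:.2f}, predicted {pred_bias:.2f}, "
      f"amplification {amplification:+.2f}")
```

The key point is that the predicted bias exceeds the training bias: the model doesn’t merely mirror the skew in its data, it amplifies it, which is exactly the phenomenon the paper’s corpus-level constraints are designed to reduce.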

Just as search engines – of which Google is the prime example – do not operate with absolute neutrality, free of any bias or prejudice, machines equipped with artificial intelligence and trained to identify and categorize what they see in photos do not work neutrally either.

Article on Wired.

Article on Nexo (Portuguese)

Reference: Zhao, Jieyu, Tianlu Wang, Mark Yatskar, Vicente Ordonez, and Kai-Wei Chang. “Men Also Like Shopping: Reducing Gender Bias Amplification Using Corpus-Level Constraints.” arXiv:1707.09457 [Cs, Stat], July 28, 2017.