AI Bias & Human Dignity

Mari Kussman
Chief Design Officer
February 16, 2021
“I strongly feel that this is an insult to life itself.”
Hayao Miyazaki

In a short video clip that rocked some niche corners of the internet a few years back, a Japanese deep-learning researcher shows Hayao Miyazaki (one of the world’s greatest animated storytellers, of Spirited Away fame) a simulation of a human body that lacks pain receptors. The result is a writhing zombie, slithering unnaturally across a virtual terrain. Asked for his opinion, Miyazaki is pensive as he describes a friend with physical disabilities who often strains to greet him with a high five, and how real and precious his friend’s life is. Finally, he delivers his verdict to the flabbergasted researchers: “I strongly feel that this is an insult to life itself.”

His statement can be read as a warning shot in an increasingly algorithmic world, where a lack of empathy can create untold new horrors.

Most of us are now no strangers to “AI bias”: many headlines have criticized AI-powered facial recognition technologies that made it to market without being able to recognize Black faces. And when these systems do recognize minorities, Black and Asian people are orders of magnitude more likely to be misidentified than white people, a point of grave moral concern when, for example, experts rely on recognition technology to identify insurrectionists from the recent raid on our nation’s Capitol.

And when we look to the future, this oversight gets much worse, with research suggesting that the detection systems in driverless cars are currently less accurate at recognizing Black pedestrians, making those pedestrians more likely to be hit.

How can this all happen? 

It may seem as simple as an engineering team never training its machine-learning model on sample sets with diverse representation. Simple fix, right? Improve representation in the training samples! But the problem is actually much more insidious.
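To make that “simple fix” concrete, here is a minimal sketch of the first step of such an audit: tallying how well each group is represented in a training sample. The group labels below are made up for illustration; a real pipeline would pull demographic metadata from the dataset itself.

```python
from collections import Counter

def representation_report(labels):
    """Return each group's share of a training sample.

    `labels` is a hypothetical list of demographic labels attached
    to training examples, purely for illustration.
    """
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

# A skewed sample: 80% of one group, far less of the others.
sample = ["white"] * 80 + ["black"] * 12 + ["asian"] * 8
print(representation_report(sample))
# → {'white': 0.8, 'black': 0.12, 'asian': 0.08}
```

A report like this makes the imbalance visible, but as the next section argues, counting who is in the training data is only the start of the problem.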

Our tech products are almost never vertically integrated; value creation is specialized along a long supply chain of software systems, hardware components, data sets, and people. Racial and gender bias may unknowingly make its way into a product at any point in its development.

When we optimize for efficiency rather than human dignity, we lose something in the process. Miyazaki’s disgust at the deep-learning algorithm that produced such unnatural movements came from his understanding, through his relationship with his friend with disabilities, of how real human pain can be. To create such a mockery of human life was an affront to his friend’s dignity.

To combat this, a framework known as participatory design is gaining popularity. In participatory design, the end users and those affected by the final product must be part of the design and decision-making process. This process forces designers and engineers to ask bigger questions: Will this hurt anyone physically or emotionally? Does it infringe on their human rights? Should we even be making this at all? Not only has research suggested that the process yields more innovative ideas, but it is a promising framework for combating AI bias.

Stories of technological advancement leading to the dehumanization of certain groups of people are old ones. The 1927 film Metropolis, an epic tale of technological dystopia, knew this message well. Emblazoned in its intertitles: “The Mediator Between the Head and the Hands Must Be the Heart.”

A century-old sentiment, and a lesson we no doubt should have learned long ago. Time is of the essence.
