
Diverse and inclusive organizations needn’t be rarities

Artificial Intelligence (AI) is one of the newest fields of knowledge and business. But even this exciting, cutting-edge sector is no exception when it comes to problems of diversity and inclusion (D&I). According to the ‘2021 AI Index Report’, women make up just 16.1% of all tenure-track faculty whose primary research focus is AI. The disparity is just as stark among the tech industry’s top companies: an AI Now study found that women comprise only 15% of AI research staff at Facebook and 10% at Google. If diversity and inclusion is a problem in these high-tech companies, one can well imagine its state in traditional sectors such as manufacturing. What can organizations do to solve this crucial problem?

The first step in tackling the D&I problem is to understand some fundamental, evolutionary facets of human nature. Categorization is an essential first step in the brain’s decision-making process. Humans have an innate affinity for categories perceived as similar to their own, and an aversion towards categories seen as dissimilar. No doubt, perceptions of what counts as similar or dissimilar can easily be manipulated by vested interests. Most of these biases are governed by non-conscious processes of the human brain; at a conscious level, one might not even be aware of holding them. These biological factors need to be kept in mind while designing solutions for an organization’s D&I problems.

Traditionally, to manage D&I biases at the recruitment stage, organizations have focused on masking or treating the data features that give rise to these biases. This strategy might not lead to much improvement because data is, after all, just a reflection of the biases that already exist in society. There are also second-order effects, or proxies, in data that reproduce the same biases. For example, a seemingly harmless postal code encodes neighbourhood, regional and economic information. In many cases, masking data features only transfers the problem to a different location. One might reduce bias at the stage of shortlisting interview candidates by using a masking strategy. But how do we make sure that biases do not play a role in face-to-face situations, say at the interview stage?
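To see why masking alone falls short, consider a small illustrative sketch in Python. The data, the feature names (a hiring label, an ‘experience’ score, a postal-code proxy) and all numbers below are entirely hypothetical; the point is only that a model denied the sensitive attribute can still reproduce the bias through a correlated proxy.

```python
# Hypothetical sketch: masking a sensitive attribute does not remove bias
# when a proxy feature (here, postal code) encodes the same information.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Synthetic sensitive attribute (0/1) and a postal-code proxy strongly
# correlated with it, e.g. through residential segregation.
group = rng.integers(0, 2, n)
postal_code = group * 10 + rng.integers(0, 3, n)
experience = rng.normal(5, 2, n)  # a legitimate job-related feature

# Historical hiring decisions carry a bias in favour of group 0.
hired = (experience + 2 * (group == 0) + rng.normal(0, 1, n)) > 5.5

# "Masked" model: the sensitive attribute itself is excluded from the inputs.
X_masked = np.column_stack([experience, postal_code])
model = LogisticRegression(max_iter=1000).fit(X_masked, hired)

# The bias survives: selection rates still differ sharply by group,
# because postal_code acts as a stand-in for the masked attribute.
preds = model.predict(X_masked)
for g in (0, 1):
    print(f"group {g}: selection rate {preds[group == g].mean():.2f}")
```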

Sendhil Mullainathan of the University of Chicago Booth School of Business, a leading researcher on algorithmic bias, suggests that we think a bit deeper and “decide which candidate features to make available for the algorithm”. For example, to assess a person’s creditworthiness, one normally uses traditional financial data points. But what if one were to also add quantifiable behavioural data points that portray the person’s character? After all, a person who shows a strong inclination for continuous learning has a higher chance of career success, and thus of financial stability, in future. Introducing such relevant behavioural data points helps counterbalance biases in the traditional data. And since many of these behavioural data points are based on the universality of human nature, they remain relevant across different demographic groups.
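As a hypothetical illustration of this counterbalancing idea, the sketch below adds a made-up ‘continuous learning’ signal to two traditional financial features and compares a credit-risk model trained with and without it. None of the feature names, weights or figures come from the article; they simply show how widening the feature set changes what the algorithm can reward.

```python
# Hypothetical sketch: adding a behavioural data point ("continuous learning")
# to traditional financial features when modelling creditworthiness.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

# Traditional financial features (synthetic).
income = rng.normal(50, 15, n)                # e.g. annual income, in thousands
credit_history_years = rng.integers(0, 20, n)

# Assumed behavioural signal: evidence of continuous learning, which the
# article argues predicts future career and financial stability.
learning_score = rng.normal(0, 1, n)

# Synthetic repayment outcome influenced by both kinds of signal.
repaid = (0.03 * income + 0.1 * credit_history_years
          + 0.8 * learning_score + rng.normal(0, 1, n)) > 2.5

X_traditional = np.column_stack([income, credit_history_years])
X_augmented = np.column_stack([income, credit_history_years, learning_score])

# The augmented model captures applicants whom the traditional
# financial features alone would misjudge.
for name, X in [("traditional", X_traditional), ("augmented", X_augmented)]:
    acc = LogisticRegression(max_iter=1000).fit(X, repaid).score(X, repaid)
    print(f"{name:11s} features -> training accuracy {acc:.2f}")
```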

Another suggestion for tackling D&I problems in an organization comes from the work on counteractive self-control by Kristian Ove R. Myrseth and Ayelet Fishbach of the University of Chicago, and Yaacov Trope of New York University. Their research has shown that exposure to temptations can make those temptations less tempting. So, the more D&I becomes a regular topic of discussion in an organization, the lower the chance of someone indulging in D&I-related misconduct. If discussions of these ‘temptations’ could be converted into organizational rituals, the results would be even better.

The mitigation of biases against the LGBTQ+ community holds many lessons for managing the D&I problem too. In 1988, fewer than 12% of American adults agreed that gay people should have the right to marry. By 2018, 68% of those surveyed said gay couples should have that right. According to Stanford sociologist Michael Rosenfeld, “There’s more rapid change in attitudes towards gay rights in the past thirty years in the United States than there ever has been in recorded attitudes in the United States on any issue.” The single most important factor behind such a drastic change in attitude was that many of those opposed to gay rights realized that someone in their own family or close circle was gay. Affinity can burn away even the most stubborn biases. So, if organizations consciously create teams that are truly diverse and inclusive, constant interaction among diverse members will help people understand one another better and build stronger bonds of affinity.

While trying to solve D&I problems in an organization, one place to look to is Rwanda. Less than three decades ago, this country suffered one of the worst genocides in human history. Yet today, the very communities that were killing each other live together peacefully. The strategies that helped create such a peaceful environment could surely help build more inclusive organizations. Rwanda, incidentally, also ranks among the top five countries in the world for gender equity.

Biju Dominic is chief evangelist, Fractal Analytics, and chairman, FinalMile Consulting.

 
