BIAS IN AI
Bias is now expanding its reign from humans to machines — The AI Bias
Other than the COVID pandemic, one word we have heard most in the last couple of months is bias. Biases are embedded in our societies, regions, and countries, and they extend into datasets and algorithms. Today we will talk about bias in AI. Before doing any research, I asked myself: what could AI bias actually lead to? The most I could think of was the face-detection software used by companies and governments. But bias is not limited to that; it has a much larger span and takes multiple forms. First I will talk about how biases are built, often unknowingly, into machine learning systems, and then I will show, with some examples, how these inbuilt biases in trained algorithms affect us in multiple forms.
How is bias built into a machine learning system?
More than 98% of the AI applications that exist and are commercialized today use machine learning, and machine learning models need data to train on. A model slowly starts to mirror the data fed to it. What if that data is non-representative? What if the data is corrupted, intentionally or unknowingly? If the data over-represents one type of example, the algorithm will detect that type more precisely than the others. And if one race, culture, or region is more prominent in a certain activity, and data about that activity is used to train the algorithm, then an ordinary person may be tagged as having an interest in it even if he or she does not. These activities can range from positive to very negative. Such biases exist in both computer vision and natural language processing systems.
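To make the idea concrete, here is a minimal toy sketch (all numbers and group names are invented for illustration, not taken from any real system): a simple threshold classifier is fitted on training data where one group is heavily over-represented, so the learned decision threshold suits that group and quietly degrades accuracy for the underrepresented one.

```python
# Toy illustration of representation bias: a classifier fitted to data
# dominated by one group adopts that group's optimal decision threshold,
# hurting the underrepresented group. All data here is synthetic.

def fit_threshold(samples):
    """Pick the score threshold that maximizes training accuracy.
    Each sample is (score, label), where label 1 means positive."""
    candidates = sorted({s for s, _ in samples})
    best_t, best_acc = 0.0, -1.0
    for t in candidates:
        acc = sum((s >= t) == bool(y) for s, y in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(samples, t):
    """Fraction of samples classified correctly at threshold t."""
    return sum((s >= t) == bool(y) for s, y in samples) / len(samples)

# Group A's positives score high and negatives low; group B's whole
# score distribution is shifted downward (imagine a camera or sensor
# that simply works less well for group B).
group_a = [(0.30, 0), (0.35, 0), (0.40, 0), (0.65, 1), (0.70, 1), (0.75, 1)]
group_b = [(0.05, 0), (0.10, 0), (0.15, 0), (0.35, 1), (0.40, 1), (0.45, 1)]

# Training set: group A is over-represented roughly 15 to 1.
train = group_a * 15 + group_b * 1

t = fit_threshold(train)
print(f"fitted threshold: {t:.2f}")              # 0.45
print(f"accuracy on group A: {accuracy(group_a, t):.0%}")  # 100%
print(f"accuracy on group B: {accuracy(group_b, t):.0%}")  # 67%
```

The overall training accuracy looks excellent, which is exactly why this kind of bias slips through: the error is concentrated in the group the dataset barely contains.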
Some examples of AI bias around us
Here are some examples of how bias affects our daily lives.
The surveillance problem
Recently it was in the news that a U.S.-based pharmacy chain, Rite Aid, had installed face recognition systems in its stores to predict the likelihood of shoplifting. Cameras on security guards' phones and those installed in the stores were flagging people who were…