Facial recognition software is used not only for unlocking our iPhones; it also shapes major parts of our daily life, from employment decisions to law enforcement to airport security screening. A modern civil rights issue has emerged because artificial intelligence and facial recognition software carry bias against individuals who are NOT classified as "a typical white man." The invasiveness of this technology is apparent to those aware of the issue and educated on the A.I. software used by many companies and in daily life; as a result, this bias can lead to false information, such as mistaken identifications, being spread about people of color.
What is facial recognition software?
Facial recognition software is "one of the advanced forms of biometric authentication capable of identifying and verifying a person using facial features in an image or video from a database" (What is AI, ML & How They are Applied to Facial Recognition Technology). A.I. models learn to detect a face, verify it against a single stored image (one-to-one matching), and identify it by searching across many images in a database (one-to-many matching).
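To make that distinction concrete, here is a minimal sketch of how verification and identification typically work once a model has reduced each face image to an embedding vector. The function names and the 0.6 threshold are illustrative assumptions, not any vendor's actual API:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (higher means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """One-to-one verification: is the probe the same person as the enrolled image?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.6):
    """One-to-many identification: best-scoring match in the database, if any."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name
```

Everything downstream, from unlocking a phone to scanning a crowd, is some variation of these two comparisons, which is why errors in the underlying model ripple into every application built on it.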
So… how did facial recognition bias come about?
The image datasets used to train early facial recognition systems were built around a very specific subset of the population: white men. It has become apparent that this technology recognizes and identifies dark-skinned people and women far less accurately. Most false matches involve people of color, because the models have not seen enough examples of these groups to identify them reliably.
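One way to see this failure is to break a system's error rates out by demographic group rather than reporting a single overall accuracy. Below is a minimal sketch of that disaggregated evaluation; the group labels and sample records are hypothetical, purely to show the metric:

```python
from collections import defaultdict

# Each record: (demographic_group, truly_same_person, system_said_match)
# These records are hypothetical, just to illustrate the calculation.
results = [
    ("lighter-skinned men",  False, False),
    ("lighter-skinned men",  True,  True),
    ("darker-skinned women", False, True),   # this one is a false match
    ("darker-skinned women", True,  True),
]

tallies = defaultdict(lambda: {"false_matches": 0, "impostor_pairs": 0})
for group, truly_same, predicted_match in results:
    if not truly_same:  # only pairs of different people can yield a false match
        tallies[group]["impostor_pairs"] += 1
        if predicted_match:
            tallies[group]["false_matches"] += 1

for group, t in tallies.items():
    rate = t["false_matches"] / t["impostor_pairs"] if t["impostor_pairs"] else 0.0
    print(f"{group}: false match rate = {rate:.0%}")
```

A system can look excellent on its overall numbers while one subgroup carries nearly all of the false matches, which is exactly the pattern researchers have reported.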
Where is facial recognition currently being used in daily life, and what are the consequences?
Facial recognition software is used in more places than many people realize. The software can be used in:
- Employment decisions: Recruiters can analyze a video of your face, assess your expressions, and draw conclusions about your personality traits to decide whether they believe you belong at their company.
- Surveillance: Facial recognition is used as a surveillance tool in China, producing behavioral changes such as self-censorship and avoidance of activism for fear of retribution. It is also being used in stores (e.g., Rite Aid or Kmart) in low-income neighborhoods for loss prevention and crime deterrence.
- Enforcing the law: The software is used to identify missing people or those who are suspected of being involved in a crime.
The consequences? Qualified candidates passed over for jobs they deserve. Chilled behavior and suppressed self-expression. False accusations and the incarceration of innocent Americans. The list goes on.
What can current and future data scientists do about this ethical issue?
At the Institute, we have discussed how to be ethical data scientists and which practices help us get there. Here are some action items we can take to address the issue:
- Be aware and intentional about being an antiracist data scientist (drawing on Emily Hadley's work, available as a talk or an article).
- Focus on building diverse training sets that represent race, age, gender, and other factors that past datasets left out (a starter sketch for auditing this follows this list).
- Be transparent to avoid invading privacy. Consider: How is the facial recognition being used? Who has access to the data being collected? Are people aware they are being scanned?
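As a starting point for the training-set item above, here is a minimal sketch of auditing a dataset's demographic balance before training. The column names, categories, and even-share target are assumptions for illustration, not a standard:

```python
import pandas as pd

# Hypothetical metadata for a face dataset; the columns are assumptions.
metadata = pd.DataFrame({
    "image_id": [1, 2, 3, 4, 5, 6],
    "skin_tone": ["dark", "light", "light", "dark", "light", "light"],
    "gender": ["woman", "man", "woman", "man", "man", "man"],
})

# Compare each subgroup's share of the data against an even target share.
for column in ["skin_tone", "gender"]:
    shares = metadata[column].value_counts(normalize=True)
    target = 1.0 / shares.size
    print(f"\n{column} (target share per group: {target:.0%})")
    for group, share in shares.items():
        flag = "  <-- underrepresented" if share < target else ""
        print(f"  {group}: {share:.0%}{flag}")
```

A check like this will not fix a biased model on its own, but it surfaces gaps in the data before they become gaps in accuracy.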
Facial recognition software and A.I. will continue to hold a powerful position in everyday life and will only become more prevalent over time. Addressing the bias in facial recognition as soon as possible will allow this technology to play a helpful role in society instead of a harmful one.
Information used to write this blog:
- https://www.usc.edu.au/about/unisc-news/news-archive/2022/june/stores-that-use-facial-recognition-for-loss-prevention-what-it-might-mean-for-you
- https://www.spiceworks.com/hr/recruitment-onboarding/articles/why-facial-recognition-is-a-game-changer-for-hiring/
- https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/
Columnist: Megan von Sosen