Brianna White

Administrator
Staff member
“We don't see things as they are, we see them as we are.” So wrote Anaïs Nin, rather succinctly describing the unfortunate melange of biases that accompany these otherwise perfectly well-functioning human brains of ours.
In a business context, affinity bias, confirmation bias, attribution bias, and the halo effect (some of the better-known of these errors of reasoning) really just scratch the surface. In aggregate, they leave a trail of offenses and errors in their wake.
Of course, the most pernicious of our human biases are those that prejudice us for or against our fellow humans on the basis of age, race, gender, religion, or physical appearance. Try as we do to purify ourselves, our work environments, and our society from these distortions, they still worm their way into—well, just about everything that we think and do—even modern technologies, like AI.
Critics say that AI makes bias worse
Since AI was first deployed in hiring, loan approvals, insurance premium modeling, facial recognition, law enforcement, and a constellation of other applications, critics have—with considerable justification—pointed out the technology’s propensity for bias.
Google’s Bidirectional Encoder Representations from Transformers (BERT), for example, is a leading Natural Language Processing (NLP) model that developers can use to build their own AI. BERT was originally built using Wikipedia text as its principal source. What’s wrong with that? The overwhelming majority of Wikipedia's contributors are white males from Europe and North America. As a result, one of the most important sources of language-based AI began its life with a biased perspective baked in.
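To make that concrete, here is a tiny sketch (my own, not something from the article) of how a baked-in perspective can surface in practice. It assumes the open-source Hugging Face transformers package and the public bert-base-uncased checkpoint; the prompts are illustrative only.

```python
# A rough sketch, assuming the Hugging Face "transformers" package and the
# public bert-base-uncased checkpoint (my choices for illustration).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Ask BERT to fill in the blank for two otherwise identical sentences and
# compare which words it considers most likely for each.
for prompt in [
    "The doctor said that [MASK] would be late.",
    "The nurse said that [MASK] would be late.",
]:
    top = fill_mask(prompt, top_k=3)
    print(prompt, "->", [p["token_str"] for p in top])
```

If the two prompts draw systematically different pronouns, that difference didn't come from the model's architecture; it came from the text the model was trained on.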
A similar problem was found in computer vision, another key area of AI development. Facial recognition datasets comprising hundreds of thousands of annotated faces are critical to the development of facial recognition applications used for cybersecurity, law enforcement, and even customer service. It turned out, however, that the (presumably mostly white, middle-aged male) developers unconsciously did a better job achieving accuracy for people like themselves. Error rates for women, children, the elderly, and people of color were much higher than those for middle-aged white men. As a result, IBM, Amazon, and Microsoft were forced to cease sales of their facial recognition technology to law enforcement in 2020, for fear that these biases would result in wrongful identification of suspects.
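One common remedy is a disaggregated evaluation: report error rates per demographic group rather than a single headline accuracy number, so gaps like these can't hide. The short sketch below shows the idea; the rows and group labels are entirely made up for illustration.

```python
from collections import defaultdict

# (true_identity_match, predicted_match, group) for each test image;
# the rows and group labels here are placeholders, not real benchmark data.
results = [
    (1, 1, "lighter-skinned men"),
    (0, 0, "lighter-skinned men"),
    (1, 0, "darker-skinned women"),
    (1, 1, "darker-skinned women"),
    # ... thousands more rows in a real benchmark
]

errors = defaultdict(int)
totals = defaultdict(int)
for truth, predicted, group in results:
    totals[group] += 1
    if predicted != truth:
        errors[group] += 1

# Report the error rate for each group separately, not just overall.
for group, total in totals.items():
    print(f"{group}: {errors[group] / total:.1%} error rate over {total} samples")
```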
For more on all of this, I encourage you to watch the important and sometimes-chilling documentary Coded Bias.
What if AI is actually part of the solution to bias?
A better understanding of the phenomenon of bias in AI reveals, however, that AI merely exposes and amplifies implicit biases that already existed but were overlooked or misunderstood. AI itself is agnostic to color, gender, age, and other biases. It is not vulnerable to the logical fallacies and cognitive biases that trouble humans. The only reason we see bias in AI at all is the heuristic errors and biased data that humans sometimes train it with.
Since the discovery of the biases described above (a PR disaster, I assure you), all of the major technology companies have been working to improve their datasets and eliminate bias. One way to eliminate bias in AI? By using AI itself. If that seems unlikely, read on.
Continue reading: https://www.forbes.com/sites/glenngow/2022/07/17/how-to-use-ai-to-eliminate-bias/?sh=3bfb4a331f1f
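As a taste of what "using AI to find bias" can look like in practice, here is a minimal sketch of an automated fairness audit. It assumes the open-source fairlearn and scikit-learn packages, and the labels, predictions, and "gender" attribute are toy placeholders of mine, not anything from the article.

```python
# A toy fairness audit, assuming the fairlearn and scikit-learn packages;
# the outcomes, predictions, and sensitive attribute below are placeholders.
from fairlearn.metrics import MetricFrame, selection_rate
from sklearn.metrics import accuracy_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # ground-truth outcomes
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # the model's decisions
gender = ["F", "F", "F", "F", "M", "M", "M", "M"]  # sensitive attribute

audit = MetricFrame(
    metrics={"accuracy": accuracy_score, "selection_rate": selection_rate},
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=gender,
)
print(audit.by_group)       # accuracy and selection rate broken out by group
print(audit.difference())   # largest between-group gap for each metric
```

Audits like this don't remove bias by themselves, but they turn "is this model fair?" into a number you can track, which is the first step toward fixing it.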
 
