Kathleen Martin

Guest
Battling bias. If I’ve been a little MIA this week, it was because I spent Monday and Tuesday in Boston for Fortune’s inaugural Brainstorm A.I. gathering. It was a fun and wonky couple of days diving into artificial intelligence and machine learning, technologies that—for good or ill—seem increasingly likely to shape not just the future of business, but the world at large.
There are a lot of good and hopeful things to be said about A.I. and M.L., but there’s also a very real risk that the technologies will perpetuate biases that already exist, and even introduce new ones. That was the subject of one of the most engrossing discussions of the event, by a panel that was—as pointed out by moderator, guest co-chair, and Smart Eye deputy CEO Rana el Kaliouby—composed entirely of women.
One of the scariest parts of bias in A.I. is how wide and varied the potential effects can be. Alice Xiang, head of Sony Group’s A.I. ethics office, gave the example of a self-driving car that's been trained too narrowly in what it recognizes as a human reason to jam on the brakes. "You need to think about being able to detect pedestrians—and ensure that you can detect all sorts of pedestrians and not just people that are represented dominantly in your training or test set," said Xiang.
And though the stakes are not so obviously life-or-death, Dr. Margaret Mitchell, chief ethics scientist at Hugging Face, pointed to the problems with bias in A.I. language processing:
"We find that when we have these large language models training on tons and tons of data ... most of it is sourced from the web, where we see a lot of racism and sexism and ableism and ageism,” she said. “[It’s] largely sourced from Wikipedia, which is primarily written by men, white men, between something like 20 to 30 or so, and single and PhD, higher-level education, which means that the kind of topics that are covered, that are then scraped in training the language models, reflect those knowledge bases, reflect those backgrounds.”
Continue reading: https://fortune.com/2021/11/10/ai-bias-research-women-tech-men/