Brianna White

That's according to a recent World Health Organization policy brief explaining that data used by A.I. in healthcare can be unrepresentative of older people. A.I. is a product of its algorithms, the brief explains, and can draw ageist conclusions if the data that feeds the algorithms is skewed toward younger individuals. This could affect, for example, telehealth tools used to predict illness or major health events in a patient. It could also provide inaccurate data for drug development. Ultimately, not including older adults in the development process for A.I. can make it harder to get them to adopt new A.I. applications in the future.
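To make the data-skew point concrete, here is a minimal, hypothetical sketch of the kind of representation audit a team might run before training a clinical prediction model. It is not from the WHO brief or the Inc. article; the column names, age bands, and reference population shares are placeholders you would replace with your own data.

```python
# Minimal sketch (illustrative only): audit how well each age band is
# represented in a training dataset compared with a reference population.
from typing import Optional
import pandas as pd


def age_representation_report(df: pd.DataFrame,
                              age_col: str = "age",
                              reference_shares: Optional[dict] = None) -> pd.DataFrame:
    """Compare the dataset's age-band shares against a reference population."""
    bands = pd.cut(df[age_col],
                   bins=[0, 18, 40, 65, 80, 120],
                   labels=["0-17", "18-39", "40-64", "65-79", "80+"],
                   right=False)
    observed = bands.value_counts(normalize=True).sort_index()

    report = observed.rename("dataset_share").to_frame()
    if reference_shares is not None:
        report["reference_share"] = report.index.astype(str).map(reference_shares)
        # A large negative gap in the older bands is one warning sign that the
        # data under-represents the patients the model will actually be used on.
        report["gap"] = report["dataset_share"] - report["reference_share"]
    return report


if __name__ == "__main__":
    # Toy records deliberately skewed toward younger patients.
    records = pd.DataFrame({"age": [25, 31, 29, 45, 38, 52, 67, 33, 41, 72]})
    # Hypothetical shares of the target patient population.
    reference = {"0-17": 0.05, "18-39": 0.25, "40-64": 0.35, "65-79": 0.25, "80+": 0.10}
    print(age_representation_report(records, reference_shares=reference))
```

An audit like this only flags the imbalance; addressing it still means collecting more data from older adults or reweighting, and involving them in design, as the brief's recommendations below describe.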
"To ensure that A.I. technologies play a beneficial role, ageism must be identified and eliminated from their design, development, use and evaluation," Alana Officer, Unit Head, Demographic Change and Healthy Ageing at WHO writes in a summary of the report. She added that the biases of society are often replicated in A.I technologies.
Here are eight ways to make sure A.I. doesn't discriminate against older consumers, as listed in the WHO policy brief.
1. Include older consumers in the design of A.I. technologies.
When developing any A.I. technology, make sure older people participate in focus groups and in giving product feedback.
Continue reading: https://www.inc.com/anna-meyer/ageism-artificial-intelligence-world-health-organization.html