Racist Data? Human Bias is Infecting AI Development

By John Murray

Machine learning algorithms process vast quantities of data and spot correlations, trends and anomalies at a scale far beyond even the brightest human mind. But just as human intelligence relies on accurate information, so do machines. Algorithms need training data to learn from, and that training data is created, selected, collated and annotated by humans. And therein lies the problem.
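To make that concrete, here is a minimal, hypothetical sketch (not taken from the article): a simple classifier trained on labels that human annotators have skewed against one group will faithfully reproduce that skew. All feature names, numbers and the bias rate below are invented for illustration.

```python
# A minimal sketch of how bias in human-labelled training data carries
# straight through to a model's predictions. Everything here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# One legitimate feature (e.g. a skill score) and one protected attribute
# (e.g. group membership) that should be irrelevant to the outcome.
skill = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# The "true" outcome depends only on skill ...
true_label = (skill + rng.normal(scale=0.5, size=n)) > 0

# ... but the hypothetical annotators systematically down-label group 1
# (30% of their positive cases are flipped to negative).
biased_label = true_label & ~((group == 1) & (rng.random(n) < 0.3))

model = LogisticRegression().fit(np.column_stack([skill, group]), biased_label)

# The coefficient on `group` comes out clearly negative: the model has
# learned the annotators' bias, not anything real about the world.
print(dict(zip(["skill", "group"], model.coef_[0].round(2))))
```

No engineer wrote a biased rule here; the model simply learned what the labels taught it, which is exactly the quiet mechanism the article describes.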

Bias is a part of life, and something that not a single person on the planet is free from. There are, of course, varying degrees of bias – from the tendency to be drawn towards the familiar, through to the most potent forms of racism.

This bias can, and often does, find its way into AI platforms. It happens completely under the radar, with no deliberate effort from engineers. BDJ spoke to Jason Bloomberg, President of Intellyx, a leading industry analyst and author of ‘The Agile Architecture Revolution’, about the dangers posed by bias creeping into AI.

Read the entire article at https://journal.binarydistrict.com/racist-data-human-bias-is-infecting-ai-development/?fbclid=IwAR1kd3Je4fqe35pl3zukG4KpvnD90Uuz_wYjM84qqELf4ekVKI5rj5bw0VA
