Amazon’s technology classified dark-skinned women as men 31% of the time, while lighter-skinned women were misidentified only 7% of the time.
One of the big problems with artificial intelligence, as acknowledged by most providers of this technology (from Google to IBM), is bias: the possibility that automated systems perpetuate, or even amplify, inequality and discrimination. Unsurprisingly, training algorithms on bad data sets (as well as flaws in the design of the systems themselves) means that anyone outside the ‘white man’ profile suffers the negative consequences of these practices.
The latest example of this is found at Amazon Web Services, whose artificial intelligence technology has been strongly questioned in recent times due to a report by MIT Media Lab scientist Joy Buolamwini (available here), which states that its systems consistently discriminate against minorities.
Specifically, the researchers found that Amazon’s technology classified dark-skinned women as men 31% of the time, while women with lighter skin were misidentified only 7% of the time. Meanwhile, men with darker skin had an error rate of 1%, and men with lighter skin had none.
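These figures are per-group misclassification rates, i.e. the proportion of faces in each demographic group whose predicted gender disagrees with the ground-truth label. A minimal sketch of how such an audit tallies them (the sample records below are illustrative, not the study’s actual data):

```python
# Sketch: computing per-group gender-misclassification rates.
# The sample records are made up for illustration only; they are
# NOT data from the MIT Media Lab study.
from collections import defaultdict

def error_rates(records):
    """records: iterable of (group, true_gender, predicted_gender)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, predicted in records:
        totals[group] += 1
        if predicted != truth:
            errors[group] += 1
    # Error rate = misclassified faces / total faces, per group.
    return {group: errors[group] / totals[group] for group in totals}

sample = [
    ("darker_female", "F", "M"),   # misclassified
    ("darker_female", "F", "F"),
    ("lighter_male",  "M", "M"),
    ("lighter_male",  "M", "M"),
]
print(error_rates(sample))  # {'darker_female': 0.5, 'lighter_male': 0.0}
```

The point of breaking the rate down by group, rather than reporting a single overall accuracy, is exactly what the audit demonstrates: an aggregate number can look acceptable while one subgroup bears almost all of the errors.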
“If you sell a system that has been shown to have bias on human faces, it is doubtful that your other face-based products are completely free of bias,” says Joy Buolamwini.
These data suggest that AWS is not taking the fight against bias in its artificial intelligence solutions seriously, which opens up a range of problems: not only reputational (especially given the current controversy in countries like the United States), but also legal (it is exposed to lawsuits from civil-rights associations and minority advocacy groups) and operational, affecting its day-to-day business.
And while Microsoft and Google have improved their systems to avoid these biases as much as possible, the fact that Amazon is so far behind makes us doubt the quality and precision of its technology. This is especially serious considering that it is already being used for such crucial tasks as the automated identification of suspects from photographic records in places like the Washington County Sheriff’s Office in Oregon. That, by default, being Black (and a woman) makes you more likely to be identified as the perpetrator of a crime is hardly the best selling point for the vanguard company Jeff Bezos claims to run…