Artificial intelligence, which is becoming an ever larger part of our everyday lives, often relies on pre-built algorithms that make programmers' work easier. One such software library bundles algorithms for recognizing human faces. Joy Buolamwini, a Black programmer, found in her own work that these algorithms often failed to recognize her face, so she launched an initiative called the ALGORITHMIC JUSTICE LEAGUE.

I am writing a series of articles to explore the embedded bias in code that unintentionally limits the audience who can use products or participate in research. By sharing the ongoing need for inclusive coding i.e. “InCoding” and providing practical steps to make products more inclusive, I want to move closer to a world where technology reflects the diversity of its users and creators. … Calls for tech inclusion often miss the bias that is embedded in written code. Frustrating experiences with using computer vision code on diverse faces remind me that not all eyes or skin tones are easily recognized with existing code libraries. … Seven years since my first encounter with this problem, I realize that I cannot simply move on as the problems with inclusion persist. While I cannot fix coded bias in every system by myself, I can raise awareness, create pathways for more diverse training sets, and challenge us to examine the Coded Gaze — the embedded views that are propagated by those who have the power to code systems. (InCoding — In The Beginning)
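For illustration, here is a minimal sketch of what using such an "existing code library" for face detection typically looks like, assuming the OpenCV library and its bundled pre-trained Haar cascade model; the input file name is hypothetical. The point is that the detector's behaviour depends entirely on the data it was trained on, which is where the bias Buolamwini describes enters.

```python
# A minimal sketch, assuming OpenCV (cv2) and its bundled Haar cascade model.
# "portrait.jpg" is a hypothetical input image.
import cv2

# Load the pre-trained frontal-face detector shipped with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Read the image and convert it to grayscale, as the detector expects.
image = cv2.imread("portrait.jpg")  # hypothetical path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect faces; returns a list of (x, y, width, height) bounding boxes.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")

# If the data behind the pre-trained model under-represents certain skin
# tones or face shapes, this call can silently return zero detections
# for those faces, with no error or warning to the programmer.
```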
