Apr 10, 2022
In Education Forum
Another option is to induce the system to use "fair" representations of the data, in the sense that they are not associated with the characteristics that are the source of discrimination. Or, more directly, to force the system to ignore protected attributes, such as gender or other demographic characteristics, when making a decision. However, we must be careful when designing these solutions: even if we hide certain attributes from a system, such as a person's gender or ethnic group, the correlations between those attributes and other variables will continue to exist. Remember that if there is one thing machine learning models do well, it is finding patterns and correlations.

Thus, while the academic algorithmic fairness research community has worked hard in recent years to build fair models that do not discriminate, the human factor in the design of these systems remains paramount. Although there are currently various formalizations of the concept of fairness, many of them are mutually incompatible, in the sense that it is not possible to satisfy them all at the same time; we must therefore choose which ones to prioritize.

It is not enough, then, to generate representative databases or models that are fair in some specific sense. Artificial intelligence systems are designed by people with their own views of the world, prejudices, judgments of the facts, and biases acquired throughout their life experience, all of which can filter into the design and evaluation criteria of these models. If those working groups are not diverse enough to reflect a wide variety of views, they will most likely fail to even notice that biases exist, let alone correct them.
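The proxy problem described above can be sketched in a few lines. This is a minimal illustration with synthetic data and made-up numbers, not any particular system: a protected attribute is dropped from the inputs, yet a correlated "neutral" feature (think of a postal code or purchasing pattern) still lets a trivial model recover it far better than chance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical protected attribute (e.g. group membership),
# deliberately NOT given to the model.
group = rng.integers(0, 2, size=n)

# A seemingly neutral feature that happens to correlate with the group.
proxy = group + rng.normal(0, 0.5, size=n)

# Even with `group` hidden, the proxy still carries its signal:
corr = np.corrcoef(proxy, group)[0, 1]

# A trivial threshold "model" trained only on the proxy
# recovers the hidden attribute with high accuracy.
recovered = (proxy > 0.5).astype(int)
accuracy = (recovered == group).mean()

print(f"corr(proxy, group) = {corr:.2f}")            # strong correlation
print(f"accuracy recovering hidden attribute = {accuracy:.2f}")
```

With this noise level the correlation comes out around 0.7 and the recovery accuracy around 0.84, which is the point: simply deleting the column does not delete the information.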
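The incompatibility between fairness criteria can also be seen with a short calculation. In this illustrative sketch (the base rates and error rates are assumed, not drawn from any real dataset), a classifier satisfies equalized odds, i.e. identical true and false positive rates for both groups, yet when the groups' base rates differ, its selection rates differ too, so demographic parity fails.

```python
# Two groups with different base rates of the positive outcome
# (illustrative numbers).
base_rate = {"A": 0.6, "B": 0.3}

# One classifier with the SAME error profile for both groups
# (equalized odds: identical TPR and FPR).
tpr, fpr = 0.8, 0.1

# Overall selection rate per group:
#   P(selected) = TPR * base_rate + FPR * (1 - base_rate)
selection = {g: tpr * p + fpr * (1 - p) for g, p in base_rate.items()}

for g, s in selection.items():
    print(f"group {g}: selection rate = {s:.2f}")
```

Group A is selected at rate 0.52 and group B at 0.31: equal error rates across groups force unequal selection rates whenever base rates differ, so one criterion must be traded off against the other.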