Deviant bodies resisting online: examining the intersecting realities of women of color in Xbox Live
Employing qualitative methods and drawing on an intersectional framework, which attends to the multiple identities we all embody, this dissertation examines the oppressions faced and resistance strategies employed by women of color in Xbox Live, an online gaming community. Ethnographic observation and narrative interviewing reveal that women of color, positioned as deviants within the space, face intersecting oppressions in gaming just as they do outside the gaming world. They are linguistically profiled within the space based on how they sound, and they have responded with various strategies to combat the discrimination they experience. Some segregate themselves from the larger gaming population, and many refuse to purchase games that depict women in a hyper-sexualized manner or that present people of color stereotypically. Others "sit in" on games and disrupt game flow by "player-killing" or engaging in other "griefing" activities. I analyze this behavior in the context of Black feminist consciousness and resistance and find that these methods parallel the resistance strategies women of color employ for survival in the world outside of gaming.
Artificial intelligence facial recognition programs are inherently racially biased. These programs are not necessarily created with the intent to disproportionately harm marginalized communities, but through the data-mining process by which they learn, they can become biased: the data used to train them may teach them to make biased judgments. Biased data is difficult to spot because the programming field is homogeneous, and the problem reflects underlying societal biases. Facial recognition programs do not identify minorities as accurately as they identify their Caucasian counterparts, leading to false positive identifications and an increase in encounters with law enforcement. AI lacks the human capacity for reflective, role-reversing judgment, and its use should therefore be limited until a more equitable program is developed and thoroughly tested.