Open Access

Systemic Bias in Artificial Intelligence: Focusing on Gender, Racial, and Political Biases

DOI: 10.23977/jaip.2024.070324

Author(s)

Yefan Zhu 1

Affiliation(s)

1 Department of Art and Technology, School of the Art Institute of Chicago, 1140 S Wabash Ave, Chicago, IL 60605, USA

Corresponding Author

Yefan Zhu

ABSTRACT

This paper examines systemic bias in artificial intelligence (AI), focusing on its gender, racial, and political dimensions. As AI has moved from theoretical frameworks to practical applications across social and cultural realms, from collaborative robots to natural language processing, it has advanced significantly. That transition, however, has exposed a critical tension between AI's precise algorithms and the intricate dynamics of human society, revealing how systemic biases can depart from ethical standards and perpetuate inequality. By examining these biases, this study aims to illuminate the ways in which AI can unjustly advantage or disadvantage specific groups, ultimately contributing to a deeper understanding of the ethical implications of AI technologies in contemporary society.

KEYWORDS

Artificial Intelligence; Systemic Bias; Gender Bias; Racial Bias; Political Bias; Ethical Implications; Algorithmic Fairness; Social Dynamics; Technology and Society

CITE THIS PAPER

Yefan Zhu, Systemic Bias in Artificial Intelligence: Focusing on Gender, Racial, and Political Biases. Journal of Artificial Intelligence Practice (2024) Vol. 7: 195-206. DOI: http://dx.doi.org/10.23977/jaip.2024.070324.

All published work is licensed under a Creative Commons Attribution 4.0 International License.
