Using AI Changes the Paradigm of Women's Participation in Politics


Iris - Panagiota Efthymiou
https://orcid.org/0000-0001-9656-8378
Anastasia Psomiadi
Kyvelle Constantina Diareme
Souzana Chatzivasileiou
Abstract

The effect of AI on how people are viewed and treated in society is profound. Yet a vicious cycle is maintained through the design and implementation of AI algorithms. Predictive models, machine learning systems, and AI algorithms train and test themselves on datasets; as a result, they "learn" mainly from the data fed into the model. In this context, there is a growing scientific dialogue concerning bias in AI training (Falco, 2019; Lu, 2019; Straw, 2020), as well as whether the datasets on which decisions are based represent only fractions of reality (Günther et al., 2017). The technology often captures and reproduces regulated and restrictive beliefs regarding gender and race, which are then repeatedly reinforced: gender relations are materialized in inventions and, through their enrolment in and incorporation into machinery, masculinity and femininity in turn acquire their meaning and character. As robots progress in certain cognitive functions, their comparatively weak abilities will also improve; these include creative approaches to problems, empathy, negotiation, and trust. Automation and AI will replace many of today's workers while simultaneously creating new opportunities for specialized personnel, which is why women need to enter this emerging sector and secure new jobs as their current ones are squeezed. In addition, AI may offer the ability to alter gendered epistemological assumptions: the narrative of "hard" and "soft" intelligence, for instance, is often coded as male and female respectively. The rise and development of AI is also seen as driving economic growth and strengthening political influence. In UK politics, the ambition of economic development through technological advancement still dominates.
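The feedback loop described above, in which a model "learns" only the fractions of reality present in its training data, can be illustrated with a deliberately naive sketch. The dataset and the majority-label "model" below are hypothetical and purely illustrative: a predictor trained on a skewed sample of occupation–gender pairs reproduces the skew as if it were fact.

```python
from collections import Counter

# Hypothetical toy dataset: (occupation, gender) pairs reflecting a skewed
# sample, not reality. The counts are invented for illustration.
training_data = (
    [("engineer", "male")] * 90
    + [("engineer", "female")] * 10
    + [("nurse", "female")] * 85
    + [("nurse", "male")] * 15
)

def train(pairs):
    """Memorize the majority gender observed for each occupation."""
    counts = {}
    for occupation, gender in pairs:
        counts.setdefault(occupation, Counter())[gender] += 1
    return {occ: c.most_common(1)[0][0] for occ, c in counts.items()}

model = train(training_data)
# The "prediction" is nothing more than the bias in the sample, echoed back.
print(model["engineer"])  # male
print(model["nurse"])     # female
```

Real systems (word embeddings, recruiting tools, translation models) are far more sophisticated, but the cited cases (Bolukbasi et al., 2016; Dastin, 2018; Prates et al., 2020) follow the same basic logic: skewed input, confidently skewed output.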
Jude Browne argues (in Collett & Dillon, 2019) that the UK government has yet to set up a national AI agency, equivalent to the Human Fertilisation and Embryology Authority (HFEA), that would bridge the divide between the public, experts, and government. Browne attributes this to the dominance of private interests, guided primarily by the goal of economic wealth, over the public interest. Economic growth and political influence may thus play an outsized part in shaping AI laws and policies at the expense of other, more ethically grounded motivations. Consequently, an equitable AI policy must incorporate a dual purpose: first, to ensure that the advancement of AI technology does not increase social and economic disparity; second, to call on AI to reduce it. AI must first and foremost enable us to promote our democratic liberties, enhance social harmony, and strengthen unity, rather than jeopardise our individual trajectories and networks of solidarity.

Article Details
  • Section
  • Articles
Author Biographies
Iris - Panagiota Efthymiou, Hellenic Association of Political Scientists; University of Piraeus
Iris - Panagiota Efthymiou is a Board Member and President of the Interdisciplinary Committee of the Hellenic Association of Political Scientists (HAPSc), Scientific Associate at the Laboratory of Health Economics and Management, University of Piraeus, and Board Member of Womanitee, Athens, Greece.
Anastasia Psomiadi, APSON CSR; Womanitee
Anastasia Psomiadi is Founder and President of APSON CSR and Founder and President of Womanitee, Athens, Greece.
Kyvelle Constantina Diareme, Hellenic Association of Political Scientists
Kyvelle Constantina Diareme is a Member of HAPSc, New York College, Athens, Greece.
Souzana Chatzivasileiou, University of Piraeus; Hellenic Association of Political Scientists
Souzana Chatzivasileiou is a Member of HAPSc and researcher at the Laboratory of Health Economics and Management, University of Piraeus, Greece.
References
Acemoglu, D. and Restrepo, P. (2018). Artificial Intelligence, Automation, and Work. National Bureau of Economic Research, Working Paper Series, Working Paper 24196.
Bano, M. (2018). Artificial intelligence is demonstrating gender bias – and it’s our fault. King's College London. https://www.kcl.ac.uk/news/artificial-intelligence-is-demonstrating-gender-bias-and-its-our-fault (Accessed: 11/10/2020).
BCG (n.d.). BCG GAMMA. Available at: https://www.bcg.com/en-ao/beyond-consulting/bcg-gamma/default (Accessed: 9/10/20).
Bolukbasi, T., Chang, K. W., Zou, J., Saligrama, V. and Kalai, A. (2016). “Man Is to Computer Programmer as Woman Is to Homemaker? Debiasing Word Embeddings”. Cornell University. Available at: http://arxiv.org/abs/1607.06520 (Accessed: 10/10/20).
Büchel, B. (2018). Artificial Intelligence Could Reinforce Society’s Gender Equality Problems. The Conversation. Available at: http://theconversation.com/artificial-intelligence-could-reinforce-societys-gender-equality-problems-92631 (Accessed: 9/10/20).
Courtland, R. (2018). Bias Detectives: The Researchers Striving to Make Algorithms Fair. Nature 558 (7710): 357–60.
Collett, C. and Dillon, S. (2019). AI and Gender: Four Proposals for Future Research. Cambridge. Available at: https://www.repository.cam.ac.uk/bitstream/handle/1810/294360/AI_and_Gender___4_Proposals_for_Future_Research_210619_p8qAu8L%20%281%29.pdf?sequence=1&isAllowed=y (Accessed: 18/10/20).
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. Available at: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G (Accessed: 25/10/20).
Davidson, L., and Boland, M. R. (2020). Enabling Pregnant Women and Their Physicians to Make Informed Medication Decisions Using Artificial Intelligence. Journal of Pharmacokinetics and Pharmacodynamics 47(4): 305–18.
Duranton, S., Erlebach, J., Brégé, C., Danziger, J., Gallego, A. and Pauly, M. (2020). What’s Keeping Women Out of Data Science?. Boston Consulting Group. Available at: https://www.bcg.com/en-ao/publications/2020/what-keeps-women-out-data-science (Accessed: 5/11/2020).
Efthymiou - Egleton, I. P. (2017). “Wellness”: A New Word for Ancient Ideas. UK: Xlibris.
Efthymiou I. P., Efthymiou - Egleton Th. W., Sidiropoulos S. (2020). Artificial Intelligence (AI) in Politics: Should Political AI be Controlled?. International Journal of Innovative Science and Research Technology, 5(2): 49-51.
Efthymiou, I. P., Sidiropoulos, S., Kritas, D., Rapti, P., Vozikis, A. and Souliotis. K. (2020). AI Transforming Healthcare Management during Covid-19 Pandemic. HAPSc Policy Briefs Series, 1(1): 130–38.
Falco, G. (2019). Participatory AI: Reducing AI Bias and Developing Socially Responsible AI in Smart Cities. 2019 IEEE International Conference on Computational Science and Engineering (CSE) and IEEE International Conference on Embedded and Ubiquitous Computing (EUC) (pp. 154-158). IEEE.
Frenda, S., Ghanem, B., Montes, M., and Rosso, P. (2019). Online Hate Speech against Women: Automatic Identification of Misogyny and Sexism on Twitter. Journal of Intelligent and Fuzzy Systems, 36(5):4743-4752.
Gallego, A., Krentz, M., Tsusaka, M., Yousif, N. and Taplett, F. B. (2019). How AI Could Help—or Hinder— Women in the Workforce. BCG. Available at: https://www.bcg.com/publications/2019/artificial-intelligence-ai-help-hinder-women-workforce (Accessed: 8/10/20).
Günther, W. A., Mehrizi, M. H. R., Huysman, M. and Feldberg, F. (2017). Debating big data: A literature review on realizing value from big data. The Journal of Strategic Information Systems, 26(3): 191-209.
Hinds, J., Williams, E. J. and Joinson, A. N. (2020). “It wouldn't happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies, 102498.
Hunt, V., Prince, S., Dixon-Fyle, S. and Yee, L. (2018). Delivering through Diversity. McKinsey. Available at: https://www.mckinsey.com/~/media/McKinsey/Business%20Functions/Organization/Our%20Insights/Delivering%20through%20diversity/Delivering-through-diversity_full-report.ashx (Accessed: 17/10/20).
Leavy, S. (2018). Gender Bias in Artificial Intelligence: The Need for Diversity and Gender Theory in Machine Learning. In Proceedings of the 1st International Workshop on Gender Equality in Software Engineering, 14–16. GE ’18. New York, NY, USA: Association for Computing Machinery.
Lu, D. (2019). Google's hate speech AI may be racially biased. New Scientist, 243(3243): 7.
Microsoft (2018). Changing the face of STEM. Available at: https://news.microsoft.com/europe/features/changing-the-face-of-stem/ (Accessed: 23/10/20).
Minevich, M. (2020). Women Are The Key To Scaling Up AI And Data Science. Forbes. Available at: https://www.forbes.com/sites/markminevich/2020/03/16/women-are-the-key-to-scaling-up-ai-and-data-science/ (Accessed: 15/10/2020).
Noriega, M. (2020). The Application of Artificial Intelligence in Police Interrogations: An Analysis Addressing the Proposed Effect AI Has on Racial and Gender Bias, Cooperation, and False Confessions. Futures, 117, 102510.
Norouzzadeh, M. S., Nguyen, A., Kosmala, M., Swanson, A., Palmer, M. S., Packer, C. and Clune, J. (2018). Automatically Identifying, Counting, and Describing Wild Animals in Camera-Trap Images with Deep Learning. Proceedings of the National Academy of Sciences, 115(25): E5716–25.
Orduña, N. (2019). AI-driven companies need to be more diverse. Here’s why. World Economic Forum. Available at: https://www.weforum.org/agenda/2019/07/ai-driven-companies-need-to-be-more-diverse-here-s-why/ (Accessed: 9/10/20).
Pallister, K. (2020). Why Artificial Intelligence Is Biased Against Women. IFLScience. Available at: https://www.iflscience.com/editors-blog/why-artificial-intelligence-is-still-gender-biased/ (Accessed: 9/10/2020).
Prates, M. O. R., Avelar, P. H. and Lamb, L. C. (2020). Assessing Gender Bias in Machine Translation: A Case Study with Google Translate. Neural Computing and Applications, 32(10): 6363–81.
Reese, H. (2016). Bias in machine learning, and how to stop it. TechRepublic. Available at: https://www.techrepublic.com/article/bias-in-machine-learning-and-how-to-stop-it/ (Accessed: 18/10/20).
Straw, I. (2020). The automation of bias in medical Artificial Intelligence (AI): Decoding the past to create a better future. Artificial intelligence in medicine, 110, 101965.
Tifferet, S. (2019). Gender differences in privacy tendencies on social network sites: a meta-analysis. Computers in Human Behavior, 93: 1-12.
UN General Assembly (2015). Transforming Our World: The 2030 Agenda for Sustainable Development. Available at: https://sustainabledevelopment.un.org/content/documents/21252030%20Agenda%20for%20Sustainable%20Development%20web.pdf (Accessed: 13/10/20).
van der Schyff, K., Flowerday, S. and Furnell, S. (2020). Privacy Risk and the Use of Facebook Apps: A gender-focused vulnerability assessment. Computers & Security, 101866.
Vinuesa, R., Azizpour, H., Leite, I., Balaam, M., Dignum, V., Domisch, S., Felländer, A., Langhans, S. D., Tegmark, M. and Nerini, F. F. (2020). The Role of Artificial Intelligence in Achieving Sustainable Development Goals. Nature Communications 11 (1): 1–10.
