Death by Algorithm Podcast, by Sune With

Death by Algorithm


By: Sune With
Listen for free

About this listen

A series on autonomous weapons systems, drones, and AI in the military domain. Experts from various disciplines share their research and discuss the black box, responsibility, human-machine interaction, and the future of legal and ethical frameworks for AI in war. How is war regulated? Can the ethics of war be programmed into machines? Does it change how we fight? Can war be cleaned up by technology? How can soldiers understand the systems? Will AI systems be the commanders of tomorrow? Why not just let the robots fight? Episodes combine narration and interviews and are not in chronological order.

Sune With
Politics & Government
Episodes
  • From Practice to (Auto)Norms feat. Ingvild Bode
    May 20 2025
    Practices create norms, and words can shape reality. This applies to the debate on autonomous weapons and AI in the military domain. It is significant whether the technology precedes public deliberation and regulation. It is also crucial whether we refer to AI in the military as "decision support systems", "emerging technology", "autonomous weapons", or "killer robots". Professor and recipient of the Danish Elite Research Prize 2025, Ingvild Bode describes her research project, AutoNorms, and how it tracks discourse and development on autonomous weapons. Furthermore, Ingvild shares her perspectives on the black box, meaningful human control, ethical machines, and the future of regulation.

    Shownotes:

    Producer and host: Sune With, sunewith@cas.au.dk

    Cover art: Sebastian Gram


    References and literature:

    - AutoNorms, PI Ingvild Bode (Accessed May 13, 2025)

    https://www.autonorms.eu/

    - Arai, Koki; Matsumoto, Masakazu, 2023, "Public perception of autonomous lethal weapons systems", AI and Ethics (2024) 4:451-462.

    https://link.springer.com/article/10.1007/s43681-023-00282-9

    - Bode, Ingvild, 2024, "Emergent Normativity: Communities of Practice, Technology, and Lethal Autonomous Weapons Systems", Global Studies Quarterly 4(1).

    https://doi.org/10.1093/isagsq/ksad073

    - Bode, Ingvild; Nadibaidze, Anna, 2024, "Autonomous Drones", in J. P. Rogers (ed.), De Gruyter Handbook on Drone Warfare, pp. 369-384, De Gruyter.

    - Bode, Ingvild; Bhila, Ishmael, September 3, 2024, "The problem of algorithmic bias in AI-based military decision support", Humanitarian Law and Policy, ICRC.

    https://blogs.icrc.org/law-and-policy/2024/09/03/the-problem-of-algorithmic-bias-in-ai-based-military-decision-support-systems/

    - Bode, Ingvild, 2023, "Practice-Based and Public-Deliberative Normativity: Retaining Human Control over the Use of Force", European Journal of International Relations 29(4), 990-1016.

    https://doi.org/10.1177/13540661231163392

    - Bode, Ingvild; Watts, Tom, 2023, "Loitering Munitions and Unpredictability: Autonomy in Weapon Systems and Challenges to Human Control", Odense, London: SDU Center for War Studies, Royal Holloway Centre for International Security. Link

    - Campaign to Stop Killer Robots, 2021, "Killer Robots: Survey Shows Opposition Remains Strong", Human Rights Watch (Accessed May 14, 2025)

    https://www.hrw.org/news/2021/02/02/killer-robots-survey-shows-opposition-remains-strong

    - Deeney, Chris, 2019, "Six in Ten (61%) Respondents Across 26 Countries Oppose the Use of Lethal Autonomous Weapons Systems", Ipsos (Accessed May 14, 2025)

    https://www.ipsos.com/en-us/news-polls/human-rights-watch-six-in-ten-oppose-autonomous-weapons

    - HuMach, PI Ingvild Bode (Accessed May 13, 2025)

    https://www.sdu.dk/en/forskning/forskningsenheder/samf/cws/cws-activities/projects/humach

    - IEEE Standards Association, A Research Group on Issues of Autonomy and AI in Defense Systems, 2024, "A Framework for Human Decision Making Through the Lifecycle of Autonomous and Intelligent Systems in Defense Applications", New York, NY: IEEE SA (Accessed April 2, 2025)

    https://ieeexplore.ieee.org/document/10707139

    - IEEE Standards Association (Accessed April 2, 2025)

    https://standards.ieee.org/

    - Nadibaidze, Anna; Bode, Ingvild; Zhang, Qiaochu, 2024, "AI in Military Decision Support Systems: A Review of Developments and Debates", Center for War Studies, SDU.

    - Overton Window, Wikipedia (Accessed May 13, 2025)

    https://en.wikipedia.org/wiki/Overton_window

    - Renic, Neil; Christenson, Johan, 2024, "Drones, the Russo-Ukrainian War, and the Future of Armed Conflict", CMS Report.

    https://cms.polsci.ku.dk/english/publications/drones-the-russo-ukrainian-war-and-the-future-of-armed-conflict/

    - The Overton Window, Mackinac Center for Public Policy (Accessed May 13, 2025)

    https://www.mackinac.org/OvertonWindow


    Music: Sofus Forsberg
    56 m
  • The Analytical Engine feat. Lise Bach Lystlund, Jonas Nygreen, Lauritz Munch & Joshua Hatherley
    May 16 2025

    So, why are autonomous weapons systems such a big deal? Aren't they just weapons like the rest of them? Well, the black box problem with algorithmically controlled systems raises challenges different from those of "fire and forget" munitions.

    Four AI experts explain.

    The first part of this episode clarifies what an algorithm is, when the black box appears, and why it matters.

    The second part explains how the data the algorithms rely on can be biased, and why constant maintenance and updates do not fix the problem.
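    That point about biased data can be sketched in a toy model (a minimal illustration for this summary, not anything from the episode; the labels, features, and proportions are hypothetical):

```python
from collections import Counter

def train(labelled_examples):
    """'Train' a trivial majority-class model: always predict the most common label."""
    counts = Counter(label for _, label in labelled_examples)
    majority_label = counts.most_common(1)[0][0]
    return lambda features: majority_label

# Deliberately skewed training set: 9 of 10 examples are labelled
# "threat", regardless of their features.
biased_data = [(("vehicle",), "threat")] * 9 + [(("vehicle",), "civilian")]

model = train(biased_data)
print(model(("vehicle",)))  # "threat" -- the skew in the labels, not the features, decides

# "Maintenance": retraining on twice as much equally skewed data
# changes nothing, because the bias lives in the data, not in the model's age.
updated_model = train(biased_data * 2)
print(updated_model(("vehicle",)))  # still "threat"
```

    A real system is far more complex, but the failure mode is the same: retraining on more of the same skewed data reproduces the skew.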

    Shownotes:

    Producer and host: Sune With sunewith@cas.au.dk

    Cover art: Sebastian Gram


    References and literature:

    - Algorithmic bias, Wikipedia (Accessed April 8, 2025)

    https://en.wikipedia.org/wiki/Algorithmic_bias

    - Black Box, Wikipedia (Accessed April 8, 2025)

    https://en.wikipedia.org/wiki/Black_box

    - Blouin, Lou; Rawashdeh, Samir, March 2023, "AI's mysterious 'black box' problem, explained", News, University of Michigan-Dearborn (Accessed April 8, 2025)

    https://umdearborn.edu/news/ais-mysterious-black-box-problem-explained

    - Co-Coders (Accessed April 8, 2025)

    https://cocoders.dk/

    - ExekTek (Accessed April 8, 2025)

    https://exektek.com/

    - Hatherley, J. J., 2020, "Limits of Trust in Medical AI", Journal of Medical Ethics, 46(7), 478-481.

    - Hatherley, J.; Sparrow, R.; Howard, M., 2024, "The Virtues of Interpretable Medical AI", Cambridge Quarterly of Healthcare Ethics, 33(3), 323-332.

    - Hatherley, J., 2025, "A Moving Target in AI-Assisted Decision-Making: Dataset Shift, Model Updating, and the Problem of Update Opacity", Ethics and Information Technology, 27, 20.

    https://link.springer.com/article/10.1007/s10676-025-09829-2

    - Hyperight, "The Black Box: What We're Still Getting Wrong about Trusting Machine Learning Models" (Accessed April 8, 2025)

    https://hyperight.com/ai-black-box-what-were-still-getting-wrong-about-trusting-machine-learning-models/

    - Lystlund, Lise Bach (Accessed April 8, 2025)

    https://cocoders.dk/om-os/

    - Nygreen, Jonas (Accessed April 8, 2025)

    https://www.linkedin.com/in/jonasnygreen/


    Music: Sofus Forsberg

    1 h 15 m
  • Minotaur Warfare feat. Robert Sparrow
    May 9 2025

    Robert Sparrow is a philosophical pioneer in the field of autonomous weapons. We discuss the current debate and the developments since his famous articles "Killer Robots" and "Robots and Respect" were published. We explore the notion of mala in se (evil in itself), and Rob presents his idea of Minotaur warfighting, or AI commanders.

    Rob also gives his perspectives on the black box, meaningful human control, programming ethics into machines and the value of fundamental human respect and recognition in war.


    Shownotes:

    Producer and host: Sune With, sunewith@cas.au.dk

    Cover art: Sebastian Gram


    - Dige, Morten, 2012, "Explaining the Principle of Mala in Se", Journal of Military Ethics, 11:4, 318-332.

    - Orend, Brian, 2016, "War", The Stanford Encyclopedia of Philosophy (Spring 2016 Edition), Edward N. Zalta (ed.), section 2.2.

    https://plato.stanford.edu/archives/spr2016/entries/war/

    - Scharre, Paul, 2016, "Centaur warfighting: the false choice of humans vs. automation", Temp. Int'l & Comp. LJ, 30, 151-165.

    - Scharre, Paul, 2018, "Army of None: Autonomous Weapons and the Future of War", W. W. Norton & Company.


    - SHAPE, 2025, "NATO Acquires AI-Enabled Warfighting System", NATO (Accessed April 14, 2025).

    https://shape.nato.int/news-releases/nato-acquires-aienabled-warfighting-system-

    - Sparrow, Robert, 2016, "Robots and Respect: Assessing the Case Against Autonomous Weapon Systems", Ethics & International Affairs, 30(1), pp. 93-116.

    - Sparrow, Robert, 2007, "Killer Robots", Journal of Applied Philosophy, 24(1), pp. 62-77.

    - Sparrow, Robert, 2021, "Why machines cannot be moral", AI & Society: Journal of Knowledge, Culture and Communication.

    https://doi.org/10.1007/s00146-020-01132-6

    - Sparrow, Robert; Henschke, Adam, 2023, "Minotaurs, Not Centaurs: The Future of Manned-Unmanned Teaming", Parameters 53(1), The US Army War College Quarterly, pp. 115-130.

    - Sparrow, Robert, 2012, "Riskless Warfare Revisited: Drones, Asymmetry and the Just Use of Force", in Ethics of Drone Strikes: Restraining Remote-Control Killing, Edinburgh: Edinburgh University Press, pp. 10-30.

    https://doi.org/10.1515/9781474483599-004

    - Strawser, Bradley (ed.), 2013, "Killing by Remote Control: The Ethics of an Unmanned Military", Oxford University Press.

    - TERMA, 2025, "Multi-Domain" (Accessed April 14, 2025)

    https://www.terma.com/products/multi-domain/


    Music: Sofus Forsberg


    55 m
No reviews yet