
New wars, new weapons and the Geneva Conventions

25:00
 
Content provided by SWI swissinfo.ch. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by SWI swissinfo.ch or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.

Send us a text

In the wars in Ukraine and the Middle East, new autonomous weapons are being used. Our Inside Geneva podcast asks whether we’re losing the race to control them – and the artificial intelligence systems that run them.

“Autonomous weapons systems raise significant moral, ethical, and legal problems challenging human control over the use of force and handing over life-and-death decision-making to machines,” says Sai Bourothu, specialist in automated decision research with the Campaign to Stop Killer Robots.

How can we be sure an autonomous weapon will do what we humans originally intended? Who’s in control?

Jean-Marc Rickli from the Geneva Centre for Security Policy adds: “AI and machine learning basically lead to a situation where the machine is able to learn. And so now, if you talk to specialists, to scientists, they will tell you that it's a black box, we don't understand, it's very difficult to backtrack.”

Our listeners asked whether an autonomous weapon could show empathy. Could it differentiate between a fighter and a child? Last year, an experiment asked patients to rate chatbot doctors versus human doctors.

“Medical chatbots ranked much better in the quality. But they also asked them to rank empathy. And on the empathy dimension they also ranked better. If that is the case, then you opened up a Pandora’s box that will be completely transformative for disinformation,” explains Rickli.

Are we going to lose our humanity because we think machines are not only more reliable, but also kinder?

“I think it's going to be an incredibly immense task to code something such as empathy. I think almost as close to the question of whether machines can love,” says Bourothu.

Join host Imogen Foulkes on the Inside Geneva podcast to learn more about this topic.

Get in touch!

Thank you for listening! If you like what we do, please leave a review or subscribe to our newsletter.
For more stories on International Geneva, please visit www.swissinfo.ch/
Host: Imogen Foulkes
Production assistant: Claire-Marie Germain
Distribution: Sara Pasino
Marketing: Xin Zhang


Chapters

1. New wars, new weapons and the Geneva Conventions (00:00:00)

2. The Ethics of Autonomous Weapons (00:00:07)

3. The Rise of Empathetic Machines (00:15:49)
