Michael Fisher, Professor of Computer Science and Director of the Centre for Autonomous Systems Technology, attended an international meeting of experts in Geneva to discuss the limits of, and issues around, autonomy in relation to autonomous weapon systems.
“Established in 1863, the International Committee of the Red Cross helps people affected by conflict and armed violence and promotes the laws that protect victims of war.
An independent and neutral organization, its mandate stems essentially from international humanitarian law, particularly the Geneva Conventions of 1949. Funded mainly by voluntary donations from governments and from National Red Cross and Red Crescent Societies, the ICRC employs over 16,000 people across 80 countries, and is based in Geneva.
In recent years, the ICRC has been engaged in debates about autonomous weapon systems, holding international expert meetings, and contributing to discussions at the United Nations.
To better understand the limits of, and issues around, autonomy, the ICRC recently held a meeting of international experts at its headquarters in Geneva on 7th and 8th June 2018.
The aim was to consider core questions about human control, with a particular focus on understanding human supervision, human-machine interaction, predictability and reliability, as well as the implications of associated AI, such as image recognition and machine learning.
I participated in the meeting and discussed human control, reliability and predictability of autonomous systems and the AI sub-systems they comprise. The ICRC’s position is that States must agree limits on autonomy in weapon systems to ensure compliance with international humanitarian law, and other applicable international law, and to satisfy ethical concerns.
The meeting brought together around 12 international experts and six experts in international law from the ICRC’s legal division.
The key issues considered included the unreliability and unpredictability of some of the AI techniques that might be used.
The ICRC intends to use the insights gained to inform its thinking about human control over weapons with autonomous functions, and will publish a report in due course based on discussions.”