While there is a lot of excitement about AI research, there are also concerns about the way it might be implemented, used and abused. In this episode, Hannah investigates the more human side of the technology, the ethical issues around how it is developed and used, and the efforts to create a future of AI that works for everyone.
If you have a question or feedback on the series, message us on Twitter (@DeepMind using the hashtag #DMpodcast) or email us at [email protected].
Further reading:
The Partnership on AI
ProPublica: investigation into machine bias in criminal sentencing
Science Museum – free exhibition: Driverless: who is in control (until Oct 2020)
Survival of the best fit: an interactive game that demonstrates some of the ways in which bias can be introduced into AI systems, in this case for hiring
Joy Buolamwini: AI, Ain’t I a Woman – a spoken word piece exploring AI bias, and systems not recognising prominent black women
Hannah Fry: Hello World – How to be Human in the Age of the Machine
DeepMind: Safety and Ethics
Future of Humanity Institute: AI Governance: A Research Agenda

Interviewees: Verity Harding, Co-Lead of DeepMind Ethics and Society; DeepMind’s COO Lila Ibrahim; and research scientists William Isaac and Silvia Chiappa.
Credits:
Presenter: Hannah Fry
Editor: David Prest
Senior Producer: Louisa Field
Producers: Amy Racs, Dan Hardoon
Binaural Sound: Lucinda Mason-Brown
Music composition: Eleni Shaw (with help from Sander Dieleman and WaveNet)
Commissioned by DeepMind
Please like and subscribe on your preferred podcast platform. Want to share feedback? Or have a suggestion for a guest that we should have on next? Leave us a comment on YouTube and stay tuned for future episodes.