Content provided by Marshall Poe. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by Marshall Poe or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://player.fm/legal.
Alan F. Blackwell, "Moral Codes: Designing Alternatives to AI" (MIT Press, 2024)

55:09
 

Manage episode 443756480 series 2508295

Why the world needs less AI and better programming languages. Decades ago, we believed that robots and computers would take over all the boring jobs and drudgery, leaving humans to a life of leisure. This hasn’t happened. Instead, humans are still doing boring jobs, and even worse, AI researchers have built technology that is creative, self-aware, and emotional—doing the tasks humans were supposed to enjoy. How did we get here?

In Moral Codes: Designing Alternatives to AI (MIT Press, 2024), Alan Blackwell argues that there is a fundamental flaw in the research agenda of AI. What humanity needs, he contends, is better ways to tell computers what we want them to do: new and better programming languages offering More Open Representations, Access to Learning, and Control Over Digital Expression; in other words, MORAL CODE. Blackwell draws on his deep experience as a programming language designer, work he has been doing since 1983, to unpack fundamental principles of interaction design and explain their technical relationship to ideas of creativity and fairness. Taking aim at software that constrains our conversations with strict word counts or infantilizes human interaction with likes and emojis, Blackwell shows how to design software that is better—not more efficient or more profitable, but better for society and better for all people. Covering recent research and the latest smart tools, Blackwell offers rich design principles for a better kind of software—and a better kind of world.

Alan F. Blackwell is Professor of Interdisciplinary Design in the Department of Computer Science and Technology at the University of Cambridge. He is a Fellow of Darwin College, Cambridge; cofounder, with David Good, of the Crucible Network for Research in Interdisciplinary Design; and cofounder, with David Good and Lara Allen, of the Global Challenges strategic research initiative of the University of Cambridge.

Dr. Michael LaMagna is the Information Literacy Program & Library Services Coordinator and Professor of Library Services at Delaware County Community College.

Learn more about your ad choices. Visit megaphone.fm/adchoices

Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/critical-theory


1767 episodes
