Race After Technology

35:01
 

Ruha Benjamin discusses the relationship between machine bias and systemic racism, analyzing specific cases of “discriminatory design” and offering tools for a socially-conscious approach to tech development. In "Race After Technology: Abolitionist Tools for the New Jim Code," Ruha Benjamin cuts through tech-industry hype, from everyday apps to complex algorithms, to understand how emerging technologies can reinforce White supremacy and deepen social inequity. Presenting the concept of “the new Jim Code,” she shows how a range of discriminatory designs encode inequity by explicitly amplifying racial hierarchies; by ignoring but thereby replicating social divisions; or by aiming to fix racial bias but ultimately doing quite the opposite.
This event is hosted by Data & Society’s Director of Research Sareeta Amrute.


Chapters

1. Introduction (00:00:00)

2. Ruha Benjamin is Associate Professor of African American Studies at Princeton University, founder of the Just Data Lab, author of “Race After Technology: Abolitionist Tools for the New Jim Code,” and an editor of “Captivating Technology: Reimagining Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life,” among many other publications. (00:00:11)

3. Ruha's work investigates the social dimensions of science, medicine, and technology, with a focus on the relationships between innovation and equality, health and justice, knowledge and power. She is the recipient of numerous awards and fellowships, including from the American Council of Learned Societies, the National Science Foundation, and the Institute for Advanced Study, and in 2017 she received the President's Award for Distinguished Teaching at Princeton. Please join me in welcoming Ruha. (00:00:36)

4. Thank you for that introduction. Good evening, everybody. Good to see you. Thank you so much to all the organizers, my wonderful colleagues Sareeta Amrute, CJ Landow, and Rigoberto Lara, and all my friends here at Data & Society. About a year ago, I had the opportunity to circulate the draft among Data & Society and, in particular, a few people who aren't here now but who were also part of that feedback session, Mutale Nkonde, Jessie Daniels, and Kadija Ferryman, among many others, so thank you all for having me back now that the book is complete. I'd also like to join in the land acknowledgment and think about this land, the traditional unceded territory of the Lenape. (00:01:09)

5. Let us acknowledge the intertwined legacies of settler colonialism and the transatlantic slave trade, which contribute to the creation and continued wealth of this city and of this nation. We acknowledge the reparations owed to black and indigenous communities and nations and the impossibilities of return for generations past. Let us also acknowledge the ancestors in the room tonight, as we fight together for better futures. We are alive in an era of necessary resistance to preserve this planet and all the beautiful creation that we have no doubt is worthy of the struggle. Ashe. With that, let me begin with three provocations. (00:02:00)

6. First, racism is productive, not in the sense of being good, but in the literal capacity of racism to produce things of value to some even as it wreaks havoc on others. We're taught to think of racism as an aberration, a glitch, an accident, an isolated incident, a bad apple, in the backwoods and outdated, rather than innovative, systemic, diffuse, an attached incident, the entire orchard, in the ivory tower, forward-looking, productive. In sociology, we like to say race is socially constructed, but we often fail to state the corollary that racism constructs. Second, I'd like us to think about the way that race and technology shape one another. More and more people are accustomed to thinking about the ethical and social impact of technology, but this is only half of the story. (00:02:43)

7. Social norms, values, and structures all exist prior to any given tech development, so it's not simply the impact of technology but the social inputs that make some inventions appear inevitable and desirable, which leads to a third provocation: That imagination is a contested field of action, not an ephemeral afterthought that we have the luxury to dismiss or romanticize, but a resource, a battleground, an input and output of technology and social order. In fact, we should acknowledge that many people are forced to live inside someone else's imagination and one of the things we have to come to grips with is how the nightmares that many people are forced to endure are the underside of elite fantasies about efficiency, profit, and social control. (00:03:52)

8. Racism, among other axes of domination, helps to produce this fragmented imagination: misery for some, monopolies for others. This means that for those of us who want to construct a different social reality, one grounded in justice and joy, we can't only critique the underside; we also have to wrestle with the deep investments, the desire even, that many people have for social domination. That's the trailer. Let's start with a relatively new app called Citizen, which will send you real-time crime alerts based on a curated selection of 911 calls. It also offers a way for users to report, live stream, and comment on purported crimes via the app. (00:04:53)

9. It also shows you incidents as red dots on a map so you can avoid supposedly dangerous neighborhoods. Now, many of you are probably thinking: what could possibly go wrong in the age of Barbecue Beckys calling the police on black people cooking, walking, breathing out of place? It turns out that even a Stanford-educated environmental scientist, living in the Bay Area no less, is an ambassador of the carceral state, calling the police on a cookout at Lake Merritt. It's worth noting, too, that the app Citizen was originally launched under the less chill name Vigilante, and in its rebranding it also moved away from encouraging people to stop crime, now simply urging them to avoid it. (00:05:41)

10. What's most important to our discussion, I think, is that Citizen and other tech fixes for social problems are not simply about technology's impact, but also about how social norms, racial norms, and structures shape what tools are imagined necessary in the first place. How should we understand the duplicity of tech fixes, purported solutions that nevertheless reinforce and even deepen existing hierarchies? One formulation that's hard to miss is the idea of racist robots. A first wave of stories a few years ago seemed shocked at the prospect that, in Langdon Winner's terms, artifacts have politics. A second wave seemed less surprised: well, of course technology inherits its creators' biases. (00:06:27)

11. Now, I think we've entered a phase of attempts to override or address the default settings of racist robots, for better or worse, and one of the challenges we face is how to meaningfully differentiate technologies that are used to differentiate us. This combination of coded bias and imagined objectivity is what I term the “new Jim code”: innovation that enables social containment while appearing fairer than the discriminatory practices of a previous era. This riff on Michelle Alexander's analysis in “The New Jim Crow” considers how the reproduction of racist forms of social control in successive institutional forms entails a crucial sociotechnical component that not only hides the nature of domination, but allows it to penetrate every facet of social life under the guise of progress. (00:07:17)

12. This formulation, as I highlight here, is directly related to a number of cousin concepts by Browne, Broussard, Daniels, Eubanks, O'Neil, Noble, and others. A quick example, hot off the presses, illustrating the new Jim code: “Racial bias in a medical algorithm favors white patients over sicker black patients,” reports a new study by Obermeyer and colleagues, in which the researchers were able to look inside the black box of algorithm design, which is typically not possible with proprietary systems. What's especially important to note is that the algorithm does not explicitly take note of race. That is to say, it is race neutral. (00:08:10)

13. By using cost to predict health care need, this digital triaging system unwittingly reproduces racial disparities because, on average, black people incur fewer costs for a variety of reasons, including systemic racism. In my review of their study (both of which you can download from the journal Science), I argue that indifference to social reality on the part of tech designers and adopters can be even more harmful than malicious intent. In the case of this widely used healthcare algorithm, affecting millions of people, more than double the number of black patients would have been enrolled in programs designed to help them stay out of the hospital if the predictions were actually based on need rather than cost. (00:08:56)
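To make the proxy mechanism concrete, here is a minimal, hypothetical sketch in Python (with invented numbers; it is not the algorithm from the study) of how ranking patients by a cost proxy can under-enroll a group that, for structural reasons, incurs lower costs at the same level of need:

import random

random.seed(0)

def make_patient(group):
    # Underlying health need is drawn from the same distribution for both groups.
    need = random.gauss(50, 15)
    # Assumed structural barrier: group B incurs lower costs at equal need.
    cost_factor = 0.7 if group == "B" else 1.0
    cost = need * cost_factor + random.gauss(0, 5)
    return {"group": group, "need": need, "cost": cost}

patients = [make_patient("A") for _ in range(1000)] + [make_patient("B") for _ in range(1000)]

def enroll_top(patients, key, slots=400):
    # Triage rule: enroll the patients ranked highest on the chosen signal.
    return sorted(patients, key=lambda p: p[key], reverse=True)[:slots]

for key in ("cost", "need"):
    enrolled = enroll_top(patients, key)
    share_b = sum(p["group"] == "B" for p in enrolled) / len(enrolled)
    print(f"ranking by {key}: share of group B among enrolled = {share_b:.2f}")

In this toy setup, ranking on recorded cost enrolls far fewer group-B patients than ranking on actual need, even though the two groups are equally sick, which is the kind of disparity described above.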

14. Race neutrality, it turns out, is a deadly force. Connecting this with a number of books, articles, and other works, the new Jim code is, as I see it, situated in a hybrid literature that we can think of as race-critical code studies. And again, this approach is concerned not only with the impacts of technology, but with its production, and particularly how race and racism enter the process. As we think about how anti-blackness gets encoded in and exercised through automated systems, I write about four conceptual offspring of the new Jim code that follow along a kind of spectrum. The book's chapters are organized around these. (00:09:46)

15. At this point in the talk, I would normally dive into each of these with examples and analysis, but for the sake of time, I'm going to shift gears now to focus on what many people and organizations are already doing about the problem. As with abolitionist practices of a previous era, not every manner of getting free should be exposed. Recall how Frederick Douglass reprimanded those who revealed the routes that fugitives took to escape slavery, declaring that those supposed white allies turned the Underground Railroad into the Upper Ground Railroad. Likewise, some efforts of those resisting the new Jim code necessitate strategic discretion, while others may be effectively tweeted around the world in an instant. (00:10:32)

16. Exhibit A: thirty minutes after proposing an idea for an app that converts your daily change into bail money to free black people, the Compton-born black trans tech developer Dr. Kortney Ziegler added, “and it could be called Appolition,” a riff on abolition and a reference to a growing movement toward divesting resources from policing and prisons and reinvesting in education, employment, mental health, and the broader support systems needed to cultivate safe and thriving communities. Calls for abolition are never simply about bringing harmful systems to an end, but also about envisioning new ones. After all, the etymology of the word includes root words for “to destroy” and “to grow.” (00:11:20)

17. To date, Appolition has raised more than $137,000, money that has been directed to local organizations that have posted bail, freeing at least 40 people. When Ziegler and I sat on a panel together at the Allied Media Conference, he addressed audience questions about whether the app diverts even more money into a bloated carceral system. But as Ziegler clarified, money is returned to the depositor after a case is complete, so donations are continuously recycled to help individuals, like an endowment. That said, the motivation behind ventures like Appolition can be mimicked by people who don't have an abolitionist commitment. (00:12:08)

18. Ziegler described a venture called Promise, in which Jay-Z is investing millions. Although Jay-Z and others call it a decarceration startup because it addresses the problem of pre-trial detention, Promise is in the business of tracking individuals via the app and GPS monitoring, creating a powerful mechanism that makes it easier to lock people back up. Following criticism from the organization BYP100, we should understand that Promise exemplifies the new Jim code: it's dangerous and insidious precisely because it's packaged as social betterment. The good news is that tech industry insiders themselves have increasingly been speaking out against the most egregious forms of corporate collusion with state-sanctioned racism and militarism. (00:12:28)

19. For example, thousands of Google employees condemned the company's collaboration on a Pentagon program that uses AI to make drone strikes more effective, and a growing number of Microsoft employees are opposed to the company's ICE contract saying that, "As people who build the technologies that Microsoft profits from, we refuse to be complicit." This kind of informed refusal is certainly necessary, as we build a movement to counter the new Jim code, but we can't wait for workers' sympathies to sway the industry. And as this article published by Science For The People reminds us, contrary to popular narratives, organizing among technical workers has a vibrant history, including engineers and technicians in the 60s and 70s who fought professionalism, individualism, and reformism to contribute to radical labor organizing. (00:13:36)

20. The current tech workers' movement, which includes students across our many institutions, can draw from past organizers' experiences and learn to navigate the contradictions and complexities of organizing in tech today, which includes building solidarity across race and class. For example, when the predominantly East African Amazon workers in the company's Minnesota warehouses organized a strike on Prime Day to demand better working conditions, some engineers from Seattle came to support them. In terms of civil society, initiatives like Data for Black Lives and the Detroit Community Technology Project offer an even more far-reaching approach. (00:14:29)

21. The former brings people working across a number of agencies and organizations together in a proactive approach to tech justice, especially at the policy level, and the latter develops and uses technology rooted in community needs, offering support to grassroots networks doing data justice research, including hosting what they call DiscoTechs, short for “discovering technology”: multimedia, mobile neighborhood workshop fairs that can be adapted in other locales. And I'll just quickly mention one of the concrete collaborations that has grown out of Data for Black Lives. A few years ago, several government agencies in St. Paul, Minnesota, including the police department and the public schools, formed a controversial joint powers agreement called “the Innovation Project,” giving these agencies broad discretion to collect and share data on young people, with the goal of developing predictive tools to identify at-risk youth in the city. There was immediate and broad-based backlash from the community, with the support of the Data for Black Lives network, and in 2017 a group of over 20 local organizations formed what they called the Stop the Cradle to Prison Algorithm Coalition. Eventually, the city of St. Paul dissolved the agreement in favor of what they call a more community-led approach, which was a huge victory for the activists and community members who had been fighting these policies for over a year. (00:15:08)

22. Another abolitionist approach to the new Jim code that I'd like to mention is Our Data Bodies' “Digital Defense Playbook,” which you can download for free online (and make a donation to the organization if you're so inclined). The playbook contains in-depth guidelines for facilitating workshops and group activities, plus tools, tip sheets, reflection pieces, and rich stories crafted from in-depth interviews in communities in Charlotte, Detroit, and Los Angeles that are dealing with pervasive and punitive data collection and data-driven systems, with the aim of developing “power, not paranoia,” according to Our Data Bodies. (00:16:29)

23. Although the Playbook presents some of the strategies people are using, in the spirit of Douglass's admonition about the Upper Ground Railroad, not everything that the team knows is exposed. Detroit-based digital justice activist Tawana Petty put it bluntly: "Let me be real, y’all are getting the Digital Defense Playbook but we didn't tell you all their strategies and we never will, because we want our communities to continue to survive and to thrive, and so the stuff that's keeping them alive we're keeping to ourselves." Finally, when it comes to rethinking STEM education as a ground zero for re-imagining the relationship between technology and society, there are a number of initiatives underway. (00:17:06)

24. I'll just mention one very concrete resource that you can also download, the “Advancing Racial Literacy in Tech” handbook, developed by our very own wonderful colleagues here at Data & Society. The aims of this intervention are threefold: first, an intellectual understanding of how structural racism operates in algorithms, social media platforms, and technologies not yet developed; second, an emotional intelligence concerning how to resolve racially stressful situations within organizations; and third, a commitment to take action to reduce harms to communities of color. The fact is, data disenfranchisement and domination have always been met with resistance and appropriation, in which activists, scholars, and artists have sharpened abolitionist tools that employ data for liberation. (00:17:49)

25. From Du Bois's modernist data visualizations to Ida B. Wells-Barnett's expert deployment of statistics in “The Red Record,” there's a long tradition of employing and challenging data for black lives. In that spirit, the late legal and critical race scholar Derrick A. Bell encouraged a radical assessment of reality through creative methods and racial reversals. He said that “to see things as they really are, you must imagine them for what they might be.” One of my favorite examples of what we might call a Bellian racial reversal is this parody project, which begins by subverting the anti-black logics embedded in high-tech approaches to crime prevention. (00:18:40)

26. Instead of using predictive policing techniques to forecast street crime, the “White Collar Early Warning System” flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur. The system not only brings the hidden, but no less deadly, crimes of capitalism into view; it also includes an app that alerts users when they've entered high-risk areas, to encourage citizen policing and awareness. Taking it one step further, the development team is working on a facial recognition program to flag individuals who are likely perpetrators, and the training set used to design the algorithm includes the profile photos of 7,000 corporate executives downloaded from LinkedIn. (00:19:22)

27. Not surprisingly, the average face of a criminal is white and male. To be sure, creative exercises like this are only comical when we ignore that all of their features are drawn directly from actually existing practices and proposals in the real world, including the use of facial images to predict criminality. By deliberately and inventively upsetting the status quo in this manner, analysts can better understand and expose the many forms of discrimination embedded in and enabled by technology. If, as I suggested at the start, the carceral imagination captures and contains, then an abolitionist imagination opens up possibilities and pathways. (00:20:05)

28. It creates new templates and builds on critical intellectual traditions that have continually developed insights and strategies grounded in justice. May we all find ways to build on this tradition. Thank you for your attention. (00:20:46)
