00:16 – Welcome to “The Tale of Space Cat Burritos” …we mean, “Greater Than Code!”
02:26 – Space Technology and the Cultural Portrayal of Science
08:24 – The Influence of Science Fiction on the Current Developments in Science
14:47 – What is sci-fi telling us about the world we live in now?
18:34 – “Hard” vs “Soft” Science Fiction; “Hard” Conference Talks vs “Soft” Talks
24:43 – Understanding How People Work to Build Better Technology; Fighting for Accessibility in Science
33:11 – Machine Learning
“Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” – Jeff Goldblum as Dr. Ian Malcolm in Jurassic Park
37:52 – Scarcity and Exploitation: Looking at Power Dynamics and Relationships Between Groups and People
41:34 – Reasons We Prefer to Focus on Technology; Siloing and Specialization
50:16 – Control: Who is the manager? Treating People Equally
52:46 – Congruency and Being Congruent: It’s a People Problem!
“Emotions are valid inputs to every thought process.” – Coraline Ada Ehmke
01:01:44 – How do we know we are right?
Rein: These issues go straight up to the top in terms of the philosophical ladder we’re trying to climb of what do we value? How do we get other people to share our values? It doesn’t get easier by ignoring that the problem is that difficult and pretending that it’s just technical.
Coraline: It’s the responsibility of technologists to think about the social impact of the technical solutions they are making, whether that means by being better informed and striving to be generalists, or by making sure we are being inclusive and giving voice to people with different perspectives and levels of expertise on our teams to make sure we are addressing problems deeply and not just from one particular silo.
Ashe: Understanding how we are looking at a problem ethically, how we're looking at it technically, and how we're looking at it from a human point of view. What are the potential effects?
Brad: The laws of nature still exist in the absence of humans. Humans are the reason things are messy and complicated.
Please leave us a review on iTunes!
CORALINE: Hello and welcome to Episode 29 in our ongoing series, ‘The Tale of Space Cat Burritos’. I’m Coraline Ada Ehmke and I am joined today by Astrid Countee.
ASTRID: Thank you, Coraline but I’m pretty sure that our show is called Greater Than Code.
CORALINE: You’re such a joy-kill.
ASTRID: [Laughs] I’m sorry. I’m also here today —
CORALINE: Was that killjoy? I was messing that up. Sorry.
ASTRID: Joy-kill sounds way cooler. I’m also here with Rein Henrichs.
REIN: Hi and I’m pretty sure that they are called ‘purr-itos’. I am here with our three guests today. I’m very excited about this. I have Ariel Waldman, Ashe Dryden and Brad Grzesiak.
Ariel sits on the council for NASA Innovative Advanced Concepts, a program that nurtures radical, science fiction-inspired ideas that could transform future space missions. She is the co-author of a congressionally requested National Academy of Sciences report on the future of human spaceflight and the author of the book, ‘What’s It Like in Space?: Stories from Astronauts Who’ve Been There.’ Ariel is the founder of SpaceHack.org, a directory of ways for anyone to participate in space exploration, and the Global Director of Science Hack Day, a grassroots endeavor to prototype things for science that is now in over 25 countries. In 2013, Ariel received an honor from the White House for being a Champion of Change in citizen science.
Ashe Dryden is a programmer of over 15 years, turned diversity advocate and consultant, White House fellow, a prolific writer and speaker. She is the founder of AlterConf and co-founder of Fund Club. Ashe is currently writing two books: The Diverse Team and The Inclusive Event. Her work has been featured in The New York Times, Scientific American, Wired, NPR and more.
Brad is CEO and co-founder of Bendyworks, an application development consultancy in Madison, Wisconsin. He started his career as a mechanical aerospace engineer and has at least one payload in space. He now seeks out better ways to write robust, yet flexible software for Bendyworks’ clients, from Fortune 100 enterprises to brand new startups.
Welcome, Ariel, Ashe and Brad.
ARIEL, ASHE, BRAD: Thank you.
CORALINE: This is such a cool lineup of people.
REIN: I know. It’s pretty crowded in here.
CORALINE: Yeah. Some of my favorite people are here today. This is really great. We’d like to start off by getting to know you. Everyone here has an amazing resume, an amazing list of accomplishments, and a really interesting background, but we’d like to get to know who you are behind the scenes, who you are behind your public profile, and what makes you tick. Brad, I know you got your start in space technology, in rockets, and I know you from outside of the podcast. I see the enthusiasm you get whenever there’s a SpaceX launch. What got you interested in space?
BRAD: I am actually in Florida right now, and the thing that got me excited about space was when I was in Florida at age… I don’t know, five or six? We had a family vacation to Disney World, and on the last day, we were sitting in the hotel watching TV and there was a shuttle launch going on right at that moment. I had the bright idea to go to the window and look out, and sure enough, in the distance I could see a tiny plume rising into the sky. That was the moment, I think, that I got really excited about the idea of space. Unlike a lot of the other kids who said, “I want to be an astronaut,” my initial thought was, “I want to be a rocket scientist,” so I kind of followed that throughout my schooling and ended up working at an aerospace firm after college.
CORALINE: So cool. Ariel, you’ve done quite a bit of scientific work yourself. What got you started on that path?
ARIEL: Actually, I don’t have a background in science whatsoever. I went to art school and got my degree in graphic design, but a few years ago, I was watching a documentary about NASA during the early days and how they were trying to figure out how to send people into space. I got so incredibly inspired by that documentary that I decided I wanted to send someone at NASA an email saying I was a huge fan of what they were doing, and if they ever needed a volunteer or someone like me, I was around. I, serendipitously and very unexpectedly, ended up getting a job at NASA from that email. It completely changed my life and really set me on a trajectory to try and give other people the same experience of making space exploration more accessible, whether it’s getting a job at NASA or contributing to discovering galaxies and the like.
CORALINE: Ashe, you’re best known for your diversity work. What’s your connection to space?
ASHE: Space is really cool. Growing up, I actually wanted to be a marine biologist, and since I was in about fourth grade, I was super interested in all different aspects of science, especially ones that we didn’t know everything about. The idea that there was more stuff out there to learn and to discover, something that we hadn’t quite placed on a map yet, was really fascinating to me. It all falls into the category of being completely fascinated by and in love with the idea of exploring and learning and discovering new things.
CORALINE: Awesome. We’re going to talk about more than space, but there’s this common thread that I think is worth exploring right now. I actually had a run-in with NASA in my formative years. When I was in high school, my computer literacy teacher told me about a program called the Explorers Program, which was a scouts program. I had the opportunity to go learn Fortran at NASA Langley. Twice a week, my dad had to drive me out to Langley Air Force Base to go to NASA and sit down in front of a dumb terminal and write Fortran programs.
I remember being really disillusioned because the computers that I was working on had no screens. All the output was to line [inaudible]. On the other side of this wall, there were tape drives and we could just see all these wheels turning. They looked like open audio recorders or something like that. I remember being really disillusioned because I was like, “This is just so low tech,” and I expected so much more from NASA. It was very strange, but it was a great experience and I really learned a lot from it. That was kind of cool, but I’m wondering if anyone else had that kind of impression, where you think, from the cultural portrayal of science and the role of science, that when you actually get into the heart of it, it’s a lot more low tech than you expected?
ARIEL: I certainly had that experience because I remember, when I ended up getting a job at NASA, originally I was expecting that all of NASA looked like those mission control rooms, where most rooms you would walk into would have a huge picture of the Earth and people would be typing away as they watched this big Earth projection, and it would look really cool. Pretty much none of NASA looks that way whatsoever. There are maybe one or two rooms across the whole ten centers that look like that.
Walking into NASA Ames, for me, meant a lot of dilapidated buildings, a lot of buildings scheduled for demolition, a lot of buildings with signs saying, “Just so you know, there’s asbestos in this building.” It was the antithesis of glamour. I remember being legitimately surprised that it looked that way.
BRAD: Yeah, I remember visiting Johnson Space Center in college and feeling the exact same way. There was the mission control room, but everything else, correct me if I’m wrong, I think all those buildings were basically built in the 60s during the moonshot program, Apollo. They haven’t really changed a whole lot of them because the budget has been dwindling ever since, so they have the buildings that they have.
CORALINE: But the current administration loves space so we’ll see a huge investment in NASA, I’m sure.
BRAD: I heard he wants to have us go to Mars. I’m sure that will change in a week.
CORALINE: Ah, only because it’s a red planet.
BRAD: Coraline, one. Everyone else, zero.
CORALINE: Ariel, something in your bio reading was kind of interesting to me: you mention the influence of science fiction on the current development of science. Most of the science fiction I read is not hard science. I’m a huge fan of Octavia Butler and dystopian cyberpunk futures. I’m wondering what’s more relevant in our current political climate: hard science fiction, or the science fiction that deals with socioeconomic and political scenarios where we’re not seeing the best case scenario play out? What should we be drawing on right now for inspiration?
ARIEL: Well, in terms of realism, the most relevant stuff to right now is of course the socioeconomic dystopias. As for something to pull on for inspiration in this climate you’re talking about, something to look forward to, I’m not quite sure. In terms of how science fiction is influencing science specifically, and getting away a little bit from the recent administration and everything, more often I’m seeing stuff in science that should be influencing science fiction rather than the other way around, which I think is incredibly encouraging.
With the NASA Innovative Advanced Concepts Program, which shortens to NIAC, they are the only program at NASA that funds the more futuristic, sci-fi, out-there concepts that could go on to transform future space missions, maybe 10, 20 or 30 years down the line: concepts that aren’t quite ready to be implemented yet but that you can do the initial research and development for. They’re looking into things, some of which are science fiction-inspired, like: can we actually hibernate humans on the way to Mars?
But there are other ones that I haven’t really seen in science fiction before. There’s a concept called the Comet Hitchhiker, which is a concept to send a spacecraft to a comet, have it harpoon into the comet, then reel out an incredibly long tether and harvest the kinetic energy from the comet to be able to explore the solar system twice as fast. There are a lot of things like using comets as propulsion systems, which I haven’t exactly seen in science fiction. What’s really exciting to me about right now in space exploration and science is that I think we have almost an equilibrium, where science fiction is still inspiring science, but there are new things being developed in science that should be in science fiction.
ASHE: I think that’s great. Science fiction is my favorite thing ever. I love reading science fiction. For people who are kind of on the cutting edge of using technology to change our culture and our societies, whether or not it’s in space, science fiction provides this futuristic fable of what we can learn without having to do the thing. It takes it to the farthest logical conclusion so we can plan for those things well in advance of actually getting there, which is really nice, because a lot of the time we’re kind of going into this without being able to see all of the different aspects of what problems it could create. Science fiction helps us plan for the things it ends up affecting, in a positive or negative way.
CORALINE: I remember reading H.P. Lovecraft’s essay on horror fiction. Lovecraft, of course, being a very problematic person; I have to mention that every time I reference him. But he talked about horror as being a way of psychologically preparing for worst case scenarios. I think sci-fi does an interesting job of preparing us both for best case scenarios and the unexpected consequences thereof, as well as worst case scenarios.
ASHE: I agree.
REIN: Yeah, I think a lot of people were kind of like, “I’ve read William Gibson. I know how this works now.”
ASTRID: I also think science fiction is a nice landscape to talk about contemporary moral issues because it can be painted in this world that you’re not actually in so that it can give you some distance. Then you can really start to examine things that would normally make you kind of uncomfortable but it’s a little bit easier if it’s not really you and it’s not really your community when it’s this other community that’s dealing with something that’s very similar to yours.
CORALINE: It’s easier to talk about race if you’re talking about aliens?
ASTRID: Yeah, if they’re blue people who have random weird skin and other stuff, it’s not the same but it actually is the same issue.
CORALINE: How much do you think science fiction’s influence is felt outside of the hard sciences? Are there sociological studies on the culture of Star Trek? Is it pervasive?
ASTRID: I know for sure that there’s an entire group of anthropologists who study science fiction and they’re all about it. I definitely think in some ways, it’s the thing that brings people to science because science fiction can be such a great story. A lot of people who feel really intimidated by science can at least relate to the stories.
REIN: Can I just mention, on the subject of Star Trek, because I watched this episode the other day: Star Trek is a socialist, post-hunger, post-poverty, etcetera utopia, but Deanna Troi still went to a science conference and got sexually harassed.
ASTRID: [Laughs] That’s so horrible. What I’m thinking of is when I first saw 2001: A Space Odyssey, which took me a really long time to watch. To me it was more of a comment on the 60s than it was about the future.
ASHE: Yeah, I think that science fiction allows us to reflect a lot on the cultural touch points and that kind of thing throughout time. If you look at the difference in science fiction and the things we’re focused on by decade, the things that people are afraid of and the things that people are interested in change pretty dramatically from decade to decade, from sub-genre to sub-genre even. That gives you a glimpse into what things were like at the time it was written as well. From our time, it’s like past science fiction looking at the future: it’s like looking behind us and looking forward at the same time, which is really neat.
ASTRID: Yes, totally.
CORALINE: I remember reading about the [inaudible] phenomenon, where the cultural climate of the day influences both sci-fi and horror, and looking at some of the cultural studies of the science fiction, fantasy and horror media that were being produced during the Cold War, and how you can trace the influence of the prevailing fears at that time and see how that’s reflected in the media. What is sci-fi telling us about the world we’re living in right now?
ASHE: I can only speak to what I read, because my viewpoint is going to be colored by that regardless, but as somebody who reads a lot of science fiction from a lot of different time periods, social movements and that kind of thing are definitely influencing the kind of science fiction that I’ve been reading, where those kinds of topics are much more at the forefront. I’m somebody who reads much more sociological science fiction, about the effects of having this technology, versus the super hard science fiction, which is mainstream: mostly just about the science and how it works and why it works, and less so about how it actually impacts people.
I’m seeing more sociological sci-fi that focuses on what it means to have a society where, for instance, racial justice exists, or justice for women exists, or the exact opposite, where things devolve from where they are right now: this is what we’re looking at. I’m seeing a lot of the cultural changes that have happened over the past five or so years, especially around Black Lives Matter and the kind of new awakening for a lot of people, this new generation of feminism.
CORALINE: Are we seeing middle-aged white guys writing about that, Ashe, or is it marginalized people who are finding a voice in the sci-fi community?
ASHE: Interesting fact about me, I do my best to only read books by and about marginalized people so I couldn’t tell you that. I try very hard not to read and consume media that’s created solely by very privileged people because that just kind of replicates the problems that we already have. In doing so, it kind of becomes like a PBS special for me like it’s entertainment and I learned something from it, which is really good.
BRAD: One thing that I’ve noticed with some of the more recent literature is this idea of living within the current time with the poor societal choices that happened in the past, and dealing with that. For example, I’m currently reading The Three-Body Problem. I might screw up his name, but I think it’s Cixin Liu. The first half of the book is all about the Cultural Revolution in China, and everything that’s still in place in China in this, of course, sci-fi world was completely influenced by the Cultural Revolution. You see that same idea in The Expanse series, which has obviously been turned into a pretty successful TV show, where you have this Earth contingent and this Mars contingent and they have these past societal problems with each other, as well as… What do they call the people who live out there, like on the rocks?
ARIEL: You mean, the Belters?
BRAD: The Belters. Thank you. Then you have the oppression of the Belters, and they’re still fighting with each other due to the societal problems they have, in the face of this great threat that’s coming from outside, and yet they still can’t resolve their differences. That’s one thing that I’ve been seeing: we’re reflecting on the poor decisions that society has made in the past, but we still can’t get past them, so it takes a new threat.
REIN: Earlier we talked about hard science fiction and soft science fiction, and I wonder, is this a cultural thing where if the author explains things related to physics or chemistry or maybe biology in great detail, we call it hard science fiction, but if it’s instead sociology or anthropology, we call it soft science fiction? Am I mischaracterizing that? There’s something, maybe at least for me, going on there.
BRAD: We see the same thing with conference talks. We talk about hard talks and soft talks, and there’s this [inaudible] that even though hard and soft are antonyms of each other, there’s this other idea that hard talks are called that because they’re difficult. Here’s the thing: for most very technical people, it’s the soft talks that are actually the difficult part. We’re not necessarily wired to have the empathy that we’re expected to have. That, to me, is the hard part, because it’s something that I’m constantly working on, and all the technical stuff comes fairly easy to me.
ARIEL: I do think that’s something The Expanse is attempting to blend. I don’t think they always had it, but I think it’s the first science fiction I’ve seen in a while that attempts to do that, because it’s a show that’s so steeped in politics and, because it’s steeped in politics, steeped in the socioeconomic realities of their world and everything. But what The Expanse usually gets a lot of acclaim for is how hard they work on trying to get the science right, or mostly right, on the TV series specifically.
I think it’s nice to see a science fiction series that really does care about building a world around politics and social issues but also really cares about how, if we spin Ceres a certain way, the gravity will work exactly this way, and if someone’s head gets blown off in zero G, this is exactly how it’s going to look. Some of it is definitely more fantastical than others, but the fact that they’re working both sides of those issues is unfortunately rare. I’m hoping they’ll set more of a trend to really dig into both sides, because too often it’s one or the other. A lot of science fiction is a little bit too void of reality because they’re picking their [inaudible] and not fully doing world building on issues that I think people would be genuinely interested in.
CORALINE: I’m really glad we mentioned soft talks and hard talks in terms of conferences. I typically give what is referred to as, and I hate the term, soft talks. But at the keynote I gave in South Africa this year, a talk called ‘Metaphors Are Similes. Similes Are Like Metaphors’ about the way we think, I went into graph theory, and the breakdown of people who talked to me after the talk was really interesting. It was mainly men who came up to thank me for my introduction to graph theory, people telling me, “It never made sense to me before you explained it,” and it was mainly women and people of color who came up to me to talk about the parts of the talk that were more about how brains function, how we interact with each other and how we model thought. I wonder if a show like The Expanse is using the hard sci-fi to draw in audiences that would otherwise not be interested in the sort of political drama or the sociological questions being asked by the show.
ASHE: I have a question for your question. Do you think that with the audience of your talk that the people who need that message are still missing it and looking instead at the science?
CORALINE: That’s my underlying concern.
ASHE: As somebody who gives a lot of those kinds of talks as well, I tell people that people are much more complicated, and there are so many more variables with humans than there are with computers. People are the really hard problem in computer science, in my opinion. As somebody who gets kind of boxed into this “gives soft talks” category, even though I’m an engineer and I’m talking about the way these kinds of things apply to the things that we create, the way that we interact with each other and the way that we are shaping the world, you still end up with the people that you expect to be in the room.
The people who need those kinds of messages most are somewhere else. Especially, that’s a multi-track conference, they’re sitting somewhere else where they’re picking out the things that seem most relevant to them in the understanding of the world that they already hold. Does that make sense?
CORALINE: Yeah, definitely. I have some frustration right now because I have this talk idea. I’m working on a book on empathy for software developers called ‘The Compassionate Coder’. I have a chapter devoted to modeling emotions as state machines, and that seemed to me like a really great way of getting people in the room who would not otherwise attend a talk about empathy, because I’m modeling it in terms that they would understand. Frustratingly, not a single conference has picked up that talk yet.
ASHE: Yeah, in case you don’t know, I run a conference called AlterConf that brings together marginalized people to talk about different aspects of sociology and the way they interact with technology. Whether you’re talking about racial justice, or having to go into work where you’re the only person of color and having a lot of rather ignorant white people ask you questions or treat you in a certain way while, in your out-of-work life, you’re fighting for your life and you’re fighting for justice. That’s a huge difference.
Talking about those kinds of things is still seen as less mainstream, as less important, because the mainstream concerns are the concerns that more privileged people have, and those tend to be the people that make up the vast majority of the industry, so it’s a self-perpetuating problem.
ASTRID: I guess one of the questions I would have about that is: even if you are privileged and you don’t understand other people who are not like you, they are still people, so why are you not interested in understanding how people work in general, so that you can actually build better technology? I’m still trying to understand why there is a break there, between people who only want to focus on the hard science and engineering and don’t really want to talk about the people parts because they feel like it’s extraneous soft stuff. It’s not always about the social justice part. Sometimes it’s just: do you understand how people think, how they act, how they use things?
ARIEL: Yeah, I think what’s unfortunate, and is just the reality, especially on the science side of things, is that science culture and the science industry really suffer from promoting the idea that science is this thing that exists without humans, as opposed to being a human pursuit. I think you see it everywhere, and it seeps in in really insidious ways. There’s this framing of science as the pursuit of an objective truth that exists out there: if humans weren’t here, science would still be there. I think people take that almost literally, so instead of science being humans trying to actually explore and understand and find the objective truths of the world we live in, it’s seen as this thing that is completely external to us.
I think this gets promoted in so many different ways and so many different places, by people who aren’t even aware that they’re promoting it. I think the science industry really suffers from unknowingly promoting things that are actually disrespectful or harmful to people. It’s not that I think they’re completely clueless; I think they’re just not using a lot of critical thinking about how things affect people. Because science is seen as this thing that’s devoid of humans, they end up thinking that you can just focus on the work in front of you and it doesn’t really have to affect humans.
But the reality is, if you’re someone fighting for science to be accessible, my argument to someone who doesn’t really care about the human side of things would be that it’s not just about caring about who is doing the work and making sure that marginalized communities are getting into science. Obviously, that’s something I care very deeply about. But I would tell them: don’t you agree that there is a lot of underfunded, overlooked, unappreciated science out there? I think a lot of people would agree with that statement, even if they’re the heads-down, “I don’t need to pay attention to humans” type. The reality is that there are humans behind that underfunded, overlooked science.
A lot of them are going to be people who are in marginalized communities. To me it’s like: if you care that there’s not a lot of research into how humans have sex, or not a lot of research into a specific type of snail — it could be a white guy who is doing that sort of work — there are so many people who are not getting funded, who are not being included in science. You start to think: if we’re not making sure to find every last human we can, and make science accessible to them and make it equitable, then how much science are we actually losing out on? Quite a bit.
That would be my argument to someone who is not very human-centric. Something that is just incredibly frustrating, I think, is promoting science as if it’s not a human endeavor, because when you do that, you forget about how much science we don’t actually have. A difficult concept for most people, whether it’s space exploration or tech or science, is trying to have them think about exactly what we haven’t discovered yet, what we don’t know, why we’re maybe not as far along as we should be in our knowledge of something.
ASHE: Also, the assumption that as scientists, as people who are lovers of science, what we bring to the table is 100% objective also influences the science that we create and the way that that science influences all of society. There have been lots of studies and research projects and that kind of thing that in their time were determined to be good science, that we know now were motivated by misogyny and racism and just a general lack of empathy and the assumptions that we make about other people.
The idea in all of these different fields that we’re coming into this 100% objective, that we don’t care about who the people are that are interacting with it or who are affected by it, is 100% false. We need to look at all of those different aspects. One part of being able to correct that is making sure that we do have all different kinds of people involved in science at all different levels.
CORALINE: I think we see the same sort of thing in the open source community, which is a little near and dear to my heart, or more familiar to me, where the default open source developer is maybe a 40-year-old cisgender, heterosexual, white male, employed by a large corporation to do open source, who simply doesn’t see what is not being created by people who are denied access to the resources, the exposure, and the communities that would support them in solving problems for people other than the 25-year-old Silicon Valley guy named Chad.
It’s disheartening to hear that science suffers from the same thing, where the problems that are being solved, or the solutions that are being presented, are presented as if they come from some objective reality, when in fact they’re serving the default human being, who most people think of as Chad.
BRAD: I think one of the issues of being in a society that is not post-scarcity is that we, by default, have this idea of “what’s in it for me.” Not only that, but “what’s in it for me right now.” If you’re not seeing the value of someone else’s work in your life, then you tend to dismiss it. That applies to so much science, because you don’t necessarily see the value in someone doing space research, anything in space, and then you look 20 years down the line and it’s going to revolutionize your life. But it’s difficult to understand how that’s going to affect you so far in the future, because we have to do it now in order for it to have an effect on you in the future.
ASHE: Yeah, oftentimes using marginalized people as resources, and non-consenting resources. There’s a movie coming out right now, and there was a recent book, about Henrietta Lacks, a black woman in, I want to say, the 50s or 60s, who went to the hospital with a type of cancer, and instead of properly treating her, the scientists took her cells and used them, and still use them today, for a lot of research. Her family has not benefited in any real way. That science in particular is built on a long history of racial injustice and gender injustice, and it continues to this day.
ASTRID: Do you think that’s because of what Ariel was saying about taking the human out of it and just seeing it as an objective pursuit?
ASHE: I think that the people doing the science see it as a worthy sacrifice on that person’s part, even though that person hasn’t consented to that sacrifice. In Henrietta Lacks’ case in particular, she was never even notified that this was happening to her. She didn’t have the opportunity to consent because she wasn’t aware that it was going to happen at all.
I definitely agree with Ariel, because they’re not seeing particular individuals as humans. It becomes something that they can kind of zoom out from. But also, it’s perpetuating the same problems that we see in society as a whole. The idea that science isn’t affected by those things, or that science doesn’t further that kind of oppression, is untrue.
CORALINE: I feel like we could talk a little bit about some of the stories we’ve seen over the past year or two about biases in machine learning as an example of what happens when you treat data and people as objective points in space, as opposed to complex beings who are intersecting with society along a lot of different axes of oppression.
ASTRID: That makes me think about Facebook and the whole fake news thing, and what appears to be them not really taking much responsibility for it. Apparently, they may not have known a whole lot about it, but a lot of what ended up getting through was partly because they removed the humans from the process of filtering the news in the first place, because they believed it was appropriate to just use machine learning. It had matured enough, I guess, from their perspective, to be able to filter the news properly for people.
Then obviously, it had not. It was putting out a lot of things that people were promoting that were not true. And now Facebook is not really doing much about that, which concerns me because it makes me question: when you have made these big decisions that affect so many people, since there are over a billion people who use Facebook, and then you decide that you’re not responsible for that, but it is affecting people, then who is really to blame?
I know that’s kind of a different question than the one you were suggesting, but it still gets back to what you do to people and how it affects everything else. If you’re not really considering how other people are going to be affected, or are being affected, as enough of a factor to make changes, then what does that really mean?
BRAD: I think it’s time to bring out the classic Jeff Goldblum quote, “Your scientists were so focused on whether they could, they didn’t question whether they should.”
ASTRID: Yeah, that’s fair.
BRAD: I think what really needs to happen is that there needs to be a whole lot more human learning in the field of machine learning.
CORALINE: It’s terrifying that the companies that are making the greatest progress in putting machine learning to some practical use are also not studying its societal and political and cultural effects. That’s absolutely horrifying.
ASTRID: When people like Elon Musk talk about how AI is scary because it’s going to kill us one day, I’m not worried about the AI. I’m more worried about something like this, because when you have masses and masses of people using your platform and you make even a small change, it’s going to have a huge effect. It seems as though a lot of these tech companies, at least, are not prepared to handle it when they do that, and that’s very concerning to me.
ASHE: Yeah, and that kind of technology, especially when it comes to things like machine learning and crunching big data, is being used in a lot of real-world, long-lasting scenarios. When people go before a judge to determine what the sentence will be for a given crime, all of their information is fed into an algorithm that spits out what their likely recidivism rate would be. They’re being judged based on all of the data of all of the people that have been collected before, not taking into account that some communities are far more policed than others, and that being poor makes you a target for the police: you’re more likely to be arrested. Being a person of color, being a trans woman, all of those things make it much more likely that you’re going to end up in the criminal justice system. Using all of those traits to determine how long you’re going to be in prison leads to this huge epidemic that we’ve had over the entire United States history: the vast number of people that we have in our prison systems, people that are put on death row, and all of those different kinds of things.

It’s not just how we’re using technology specifically in tech spaces, but also how we’re determining whether we should be hiring teachers or hiring more police officers, or over-policing different areas and doing things like stop-and-frisk. All of those things are motivated by the machine learning and the big data that we’re using and collecting every single day.
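The feedback loop Ashe describes, where predictions trained on biased records end up justifying more biased data collection, can be sketched in a few lines of Python. This is a purely hypothetical illustration with made-up numbers, not a model of any real system:

```python
# Two hypothetical neighborhoods with identical true offense rates,
# but neighborhood A is patrolled twice as heavily as neighborhood B.
true_offense_rate = {"A": 0.10, "B": 0.10}
patrol_intensity = {"A": 2.0, "B": 1.0}

# Recorded arrests reflect where the police look, not where crime happens:
# heavier patrolling produces more arrest records for the same behavior.
recorded_arrests = {
    n: true_offense_rate[n] * patrol_intensity[n]
    for n in true_offense_rate
}

# A naive "risk score" trained only on arrest records now rates A as
# twice as risky as B, which is then used to justify even heavier
# patrols in A, feeding the loop on the next iteration.
risk_score = dict(recorded_arrests)
print(risk_score)  # {'A': 0.2, 'B': 0.1} despite equal true offense rates
```

The model never sees the equal underlying rates, only the arrest records, so the disparity it reports is an artifact of policing intensity that the algorithm then launders into an apparently objective number.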
CORALINE: Ariel, I know your time with us today was very limited. I wanted to really thank you for being in this very important conversation and it was really great talking to you. Thank you so much.
ARIEL: Yeah, absolutely. Thank you so much for having me.
ALL: Bye, Ariel. Thanks a lot!
ASTRID: Let’s take this time to do a shout out to one of our patrons, Dave Tapley. Dave supports us on Patreon at the $10 level, and we want to say thank you so much to everybody who participates on Patreon.
REIN: I wanted to bring back up what Brad said about scarcity and also point out Conway’s Law, which is that organizations which design systems produce designs that are copies of the communication structures of those organizations. My point here is that if you look at the structure of our society, scarcity and exploitation are both really deeply baked into it. I don’t think it should surprise us when we find those qualities in the systems we produce.
ASHE: Yeah, and something else that’s really important to note is that scarcity isn’t equally distributed. There are definitely communities that experience far more scarcity in income and access to education and access to opportunities and material goods than others do.
REIN: Yeah. All of these things have to be understood through looking at the power dynamics and relationships between groups and people.
BRAD: I think that what it’s going to take is sufficiently large and/or sufficiently frequent external stimuli to bump the system into a different track, if you will. We’re running on a negative feedback loop, which means that any sort of deviation gets negatively fed back into the inputs to bring the system back to the status quo. What we need is a large enough stimulus to push us out of that rut, to get us out of that groove — I’m using too many metaphors here but —
REIN: To continue the physics metaphor, it’s both the size of the impulse and how long it’s applied for.
BRAD: Well, also, because we’re a human society, sometimes a frequency of small impulses is going to be enough.
REIN: There is another — I’m a big fan of these laws — Gerald Weinberg has a law. It’s an aphorism, I guess, that he calls the Law of Cucumbers in Brine, which is that cucumbers get pickled far more than brine gets cucumbered. The idea there is that if you have a small system interacting with a much larger system, the small system tends to change much more than the larger system. The small system tends to become like the large system far more than the large system tends to become like the small system, especially when there’s a power dynamic involved, as there always is. The way to combat that is through persistent action: find something that you can do and keep doing it until it works.
ASHE: Do you think our issue is we’re hoping that the impulse comes in the form of technology versus human intervention?
REIN: I think that we all tend to prefer technical solutions to problems even when they aren’t technical problems and the secret there is that there aren’t really any technical problems. They are all just people problems that we try to apply technology to with varying success. I’m not sure if that answered your question.
ASHE: No, it definitely does. Especially as technologists, I feel like we always want to extract what can be easily repeated by a machine so we don’t have to do the same thing over and over again. But I think that ends up perpetuating the problem. We’re feeding off of — I don’t want to say mixed signals — incomplete or incorrect data, and we’re building on that again and again and again. When you start with that shaky foundation, it’s not going to lead to the outcome that we hope for or want.
ASTRID: I think I get what you’re saying, Ashe. In the beginning, we don’t want to do messy stuff. We’d rather do the very clean, “I know how to build this stuff” part, and then later we just get lazy about it and rely on it, even though there’s no proof that what we started out with is even right.
REIN: I think it may be interesting to unpack some of the reasons that we prefer to focus on technology. Some of it might be that our experience is in technology: we have a better understanding of technology, and it’s easier to form mental models of technology that are useful to us. If you compare that to understanding people and their interactions, that’s much more difficult, and we have much less experience doing it. I think those are at least some of the reasons why people reach for technical solutions and look at problems as technical problems.
BRAD: If we look at the distribution of job roles at a typical technology company, ignoring the sales side of the business, what you’ll see is oftentimes many more engineers than designers and UX people. When you have that disparity in the number of people, the engineers tend to have more weight. That’s just the generally accepted way of doing business, even though it might not actually be the correct way of doing business.
CORALINE: I take exception to that, though, because one of the more interesting developments in software development, or in particular web development, over the past few years is the devops movement, where people have started to think and say and act as though the software we produce is inseparable from the way we deploy it and the way it operates in production. I hate the fact that we have a specialist in UX or UI and separate that from the engineering work, when really all of us should be generalists and all of us should be considering the human impact of what we’re building at every step in the process. That’s not something that should be delegated.
ASHE: Even stepping farther back from that: we’re so heavily siloed, especially when we’re talking about technology as an industry. We so heavily silo all of these different roles that we’re also not learning from each other. I work with a lot of startups and a lot of other technology companies, telling them the ways they need to start changing things so they can have a more inclusive culture, so they can actually support and sustain any kind of diversity internally.
One of those things is making sure that everybody is on a level playing field when it comes to different departments, especially because the vast majority of diversity, the vast majority of marginalized people, exist in roles outside of engineering explicitly. Those roles include things like what we would call customer service or dev relations: any kind of front-facing, consumer-facing role. These are people who tend to know our products and our services very well, who tend to know our users and our consumers very well, and they don’t have any say in the way that products are created. They might pass things on to an engineer if they can’t solve the problem, but that doesn’t then get turned into “this is a UX problem” or “this is not what our users want or need.” Given that failure, how are we able to create the best products and services, if internally we can’t even communicate about the things that we are building and the things that our consumers and our users actually want and need?
CORALINE: There’s an emphasis on speed and delivery that gets in the way of that process too. I know with my work at GitHub, I had a lot of ideas for things that I wanted to put in place right away, and I fell into that same trap you’re talking about, Ashe: I wanted to get a technology solution out the door to solve what I saw as some severe problems in open source and in the way that people come together to do open source. But luckily, we have a very talented UX specialist on our team, and we don’t release anything without in-depth collaboration with our UX person. That’s really opened my eyes to the fact that if you create a technology, even one that sets out to address inequality or an injustice, if people can’t use it, or if you don’t understand how people are going to use it, you might as well not have made the effort at all.
BRAD: To bring it back to space just a little bit, this reminds me a lot of my past experiences as an aerospace engineer. This idea of silos is not unique to the software tech industry; we have them in aerospace as well. But they weren’t necessarily institutionalized silos. Sometimes it was just an individual who wanted to live in a silo by themselves. On a payload team that I worked on, we had software engineers, we had electrical engineers, we had mechanical engineers, which is what I was, and we had scientists.
The best teammates that I had on that payload were the ones who would overlap what they did with one or more of the other sub-teams. As a mechanical engineer, I worked a lot with the electrical team in order to make sure that, for instance, the boards they were designing fit well within the mechanical idea of what we were building. Working with other teams like that is what made the best engineers on the team, and the people who decided to erect silos for themselves were the people that we just didn’t enjoy working with.
CORALINE: I fear that the way that we are producing software engineers is only emphasizing that siloing, because we’re training people for frontend jobs. Maybe you come out of a CS degree and you have no critical thinking background, no sociology background, no anthropology background. You’re solving problems algorithmically. It seems like the people who are generalists have either been in the field long enough that they got to start when those areas of specialization didn’t exist, like me, or are people who are self-taught.
ASHE: I don’t know if I agree with that. I work with a lot of companies where, in the beginning, it makes the most sense to have mostly generalists because everybody has to be able to work on everything at once. I think what we need to move toward is multi-disciplinary teams. You should have somebody from marketing and somebody from customer support and an engineer and somebody who works with devops, a product manager, all of these different roles, working directly together, because that specialization does bring about a lot of important things, where they’re seeing issues before they come up. They can create the best and most efficient solutions to problems where a generalist might not be able to.
Especially because so many people looking at getting into the industry are new graduates from university versus people from boot camps or who are self-taught, especially when we’re looking at marginalized people, that helps to even the playing field. Having your entire team made up of people who are, for instance, full-stack developers means that you’re looking for somebody who has a lot of experience in a lot of different areas, and that’s going to mean that you’re only looking at people who have been in the industry for a long time.
REIN: At the risk of playing my own little one-note samba here, I want to point out that it’s much easier for a manager to blame a line engineer than the reverse, and it’s much easier for an engineer to blame a customer service person than the reverse, and this isn’t a coincidence. These are sub-groups that have different power, different status within the organization, and you’ll find that blame and buck-passing always flow downhill along the power differential.
ASHE: Yes, absolutely.
ASTRID: I think that just points to the fact that although we recognize that multi-disciplinary teams are really important, the person who leads that team is probably the most important, because that person will have the power to say that the customer support person has a voice and should be heard, and that the engineers can’t just talk over what the UX designers say is the best thing to do.
REIN: Yeah, the correction here is to find a way to equalize those groups, to put them on an equal playing field. Diversity and equality are the solution.
CORALINE: Or we’re the people who were promoted to manager, so we have that responsibility.
ASTRID: I think that’s the biggest problem. Who the manager is, I think, is the biggest problem. I don’t even worry so much about the equality if you have managers who are brokering that conversation. But I think a lot of times that’s not what’s happening.
BRAD: Speaking as someone who went from a technical role to a managerial role, and I think I can speak for everyone who has made the same transition, that’s really hard. A lot of our companies do not provide the training necessary to get any good at it and —
REIN: The biggest understatement I’ve heard today.
BRAD: As one of the owners of my company, the only person to blame is myself. We have had business coaches come in and help us through that process of realizing that not everything is solvable with technology, that you actually have to listen to people, and all that stuff. These are things that I effectively knew innately but wasn’t willing to bet the farm on and actually execute on. But it’s tough.
REIN: What happens in most organizations is that someone says, “You’re the best technical person we have, so we’re going to give you this other job that is completely different, where almost none of your skills transfer.” Good luck with that.
ASHE: Yeah, and you’re no longer touching code. Whatever you feel like you have control over, you’re going to try to exert that control anyway, so promoting them almost makes the situation worse, because now they want to have that control. Backseat driving is basically what I’m trying to say.
REIN: The thing that I want to emphasize here quickly is that knowing how to treat people equally, and creating a culture where people are treated equally, is how you begin to fix these power differential problems that I think have caused a lot of the problems we’re describing. So equality isn’t just for marginalized people; it helps everyone else, the entire organization.
CORALINE: What can we do, as people participating in the tech industry to affect that kind of change?
ASHE: I honestly think that the most radical thing that we can do is to listen to people. I don’t know why it’s so difficult for very many people, but be able to listen and actually hear what somebody is saying and where they’re coming from. Oftentimes people will tell you exactly what they need to change, exactly what they need to be successful and to be an integral part of your team. Listen to what people need, listen to what people are saying, and actually take action on it.
REIN: I would like to propose something to you. I would like to propose that the most important quality here is congruence, and I’d like to talk about what that might mean. What do you hear when I say the word congruent?
REIN: Okay, that’s one. One thing that congruency, or being congruent, might mean is saying what you’re thinking and thinking what you say: there is a congruence between your thoughts and your actions. I think that’s the one a lot of us have heard. The one that I want to offer — this is again from Gerald Weinberg — is congruency as being able to pick the best action in the situation, which means being able to overcome your biases, being able to overcome your lack of information about the situation, to find the best action to take, taking everything into consideration, and then taking that action.
CORALINE: But Rein, how do we get there?
REIN: One of the best things that you can do is to listen until you know where everyone is coming from. Otherwise you can’t act congruently, because you won’t have all the possibilities available to you.
CORALINE: Now, I would say, I paid attention to what Ashe said as I was preparing my next point.
BRAD: In addition to listening, another key quality or key decision that you have to make for yourself is to recognize your own biases and try to set them aside when you’re listening.
REIN: Yeah, you can’t act congruently if you’re taking actions based on your preconceived notions, if your emotions are overriding your decision-making, things like that.
BRAD: And not to bring Jerry Weinberg in again, but he wrote a book called The Secrets of Consulting, and the first rule for a consultant coming into a large company is, “There’s always a problem; otherwise you wouldn’t be brought in.” The second rule is probably the more important one: “It’s always a people problem.”
REIN: Yeah, I snuck that one in earlier but I too, didn’t want to make the entire thing about Jerry all the time.
BRAD: Yeah. But as a very technical person, it is so hard to realize that it is always a people problem, whether it’s because they don’t know how to hire and train people, or they don’t know how to attract people to do the work (and that’s why you’re being brought in, to do technical stuff), or there’s just an internal issue on the team. That’s a people problem. That’s not necessarily a technical problem. Sure, technical problems come along with it, but it’s really a people problem, and it’s so hard to drop the bias that everything can be fixed with technology.
ASTRID: I can share —
CORALINE: I’m going to write a gem to remind people that it’s always a people problem. Would that be good?
ASTRID: I can share something I’ve been working on getting better at that I hope will make this easier, which is: when someone approaches me or talks to me about something that I don’t agree with, I try to consider their perspective, even if I don’t like their perspective, to actually try it on before I dismiss it. I think it’s easier to have a conversation with the person if you can find some sort of common thread, and it doesn’t mean that I have to accept everything that they’re suggesting. But it does mean that if I at least really think it through and see whether there is a point that I’m missing, I am not as angry about it, so I can actually hear what they’re saying, as opposed to being like, “Stop bringing this up. I don’t want to hear this. This is stupid,” which is normally how I feel when people do that.
CORALINE: [inaudible] that environment where discourse is mutually appreciated. There are some people that it’s pointless to do [inaudible] with because they will not engage and they will never change.
ASTRID: I don’t tend to engage with them. It’s more so for me so I don’t always respond to them. I just think it in my own head because —
REIN: I think this is all predicated on people acting in good faith at all and if that’s not the case, then none of this should apply.
ASTRID: No, I don’t expect people to act in good faith. It’s more like, if I’m having an argument with my sister, for instance, and she wants to have pasta for dinner and I want to have salad, it’s easier if I actually consider why she wants it, even if I don’t agree, even if it might be stupid. If I do actually think about it, it makes me less angry. That’s something I’ve just noticed, because I think a lot of times we respond from our ready-set place, where we already have a response for this and it’s already gearing up to go, and there are people who are just trying to rile you up. So I don’t expect that everybody is acting in good faith. But it doesn’t mean that I have to be angry. If I can find out how somebody else is thinking, even if I don’t agree with them, it still helps me understand how to deal with people better.
CORALINE: I would just caution that emotions are perfectly valid, and emotional responses are perfectly valid, and emotions can give us the energy to have conversations that we would not ordinarily have. I get angry a lot. I try, at least, to choose how I direct that anger, how I express that anger, and to turn it into something that’s constructive and positive. But one of the things I learned during the period of my transition is that emotions are valid inputs to every thought process.
I’m reminded, too, of feedback that my manager gave me. I tend to hold very strong opinions; I think everyone who knows me knows that. What she told me once, after observing me basically shoot down a conversation with people who didn’t have the same perspective that I did, was that it’s incredibly important to weigh the impact of your words and how you’re saying them, but even more so when you’re right.
ASTRID: More so when you’re right?
BRAD: Saw it all the time.
CORALINE: Yeah, because it’s not enough to be right if you can’t convince someone of what the outcome should be according to your perceptions or your experience, if you can’t communicate that back to them. If you just shut a conversation down, you don’t reach someone that you really need to reach in order to effect a change. And the stakes are so much higher when you are right, because of the perspective that you have or the experiences that you’re drawing on.
BRAD: And it’s so hard for the person who, in this hypothetical, is not right to admit that they’re not right, and not just to you but also to themselves. What you want to do is make it as easy as possible for them to make the transition to being right. This is all, of course, supposing that you are right in the situation. But in that situation, you do want to make it as easy as possible for that person to see the light for themselves and admit it to you.
CORALINE: And it’s really a matter of choosing how to approach the conversation, whether to use pathos or ethos, and that’s a very tricky subject. Ashe brought up in chat that the point I raised verges on tone policing, and it absolutely does verge on tone policing. It’s difficult and complicated and messy, and I don’t have an answer to that.
ASHE: Yeah, a lot of human problems absolutely are messy. I think that it’s also something that was hard-won over many years of me looking at these kinds of problems, to realize that we can’t change anybody’s mind. We can present them with all of the information that we have and the reasons why we believe the things that we do, but we can’t make anybody do anything, especially when it comes to any kind of issue around justice. It can feel like you have a conveyor belt of people that want you to educate them about something that is so personal that it affects your life every single day, and your survival every single day. It’s not on you to approach those conversations in a teacher-like way, because that is not your job. It is on us, as privileged people.
There was a really great article that was posted this morning, and I was tweeting about it. It’s on us, as privileged people, to look at those situations and to learn from them. We need to do our research; we need to understand how we are perceived and how we benefit from the mistreatment of other people. Whether that’s an identity-type power dynamic or a manager-to-employee-type dynamic, it doesn’t matter; it’s the same kind of thing. How can we look at the way that we communicate and the way that we come across, to make sure that we’re all hearing the same message?
REIN: Coraline, I wanted to look at the example you provided, of being right but then not doing the extra work to figure out how to communicate that effectively, through this framework for congruence that Jerry talks about, because it’s, I think, the most useful thing I’ve ever found for helping me understand a situation. For any interaction to be congruent, there are three parts that have to have an equal share: yourself, the other person, and the context. A failure of congruence is often leaving one or more of those out. The being-right-but-getting-it-wrong thing is not caring about the other person; it’s only caring about yourself and the facts of the situation.
ASHE: Can I ask a really simple question?
ASHE: How do we know that we’re right?
REIN: I don’t know if you’re asking me but we don’t.
CORALINE: I can make this very concrete. I can talk about the situation. We were talking about the ability of a project owner to edit comments in an issue. The case was being made for why a project maintainer needed to be able to edit comments in an issue, and I pointed out, through my own experience with harassment on GitHub (let’s take in particular the Opalgate incident), that if the maintainer of the Opal project had been able to manipulate the words that I was saying in that issue, it would have been disastrous.
It was already a disastrous, terrible situation, and nothing good came of it, but if I had been out of control of the way my words were being portrayed, it would have gone much, much worse. I shut down the conversation by citing my own experience and saying, “No, a maintainer should never be able to edit comments.” I think about that conversation a lot: what did I do, or what could I have done differently, to have made a case that was very personal, and did have a lot of personal impact on me, but that was also effective?
ASHE: Let me step back. I assume that there was some person who brought up the idea, or the reasons why they thought it was valid, for a maintainer to be able to edit a comment, so wouldn’t they also feel that they were right?
CORALINE: Yeah, exactly. It wasn’t a matter of a right or wrong opinion, but I think in this case there was a right and wrong decision that could be made. What I failed to do was bring to light some of the unintended consequences of what that particular feature would bring, in a way that put them on equal footing with the valid reasons for allowing someone to edit a comment.
ASHE: I guess my biggest concern here is how do we know that we’re right. How did you know that you were right versus the other individual? On the other side of this, you had an experience with those kinds of things. Often, people say, “The good outweighs the bad,” like, “I’ll sacrifice your experience for the greater good.” How do we determine that?
CORALINE: I wish I had an answer to that, Ashe.
BRAD: Well, we never know that, even in scientific things, which is traditionally an area where we can say, “Yes, that is proven to be right.” For example, with the discovery of relativity, a lot of things were kind of thrown out the door or modified to accommodate relativity. And not to turn this into the Jerry Weinberg show, but there’s this idea called the Orange Juice Test, and the short of it is: if someone just says yes or no to a request, then they’ve probably failed the Orange Juice Test. Instead, they should counter with the repercussions of the request.
One way we could hypothetically apply that in this GitHub comment-editing situation is to say, “We could create a thing where the project maintainer who wants to edit a comment has to pay $50 to buy a GitHub staffer’s time to review whether this is censorship, or changing someone’s words, or not.” It’s not necessarily a great solution, but it’s exploring the possibilities. I don’t know where I’m taking that, but it’s around the idea of, “Do you ever know that you’re right?” You really don’t. But if you make the assumption, then you can start moving forward.
REIN: I guess we could get very philosophical here and say that right or wrong depends on the context, on the situation in which you’re asking the question.
CORALINE: I just feel like my job is making sure that we put the needs of marginalized people first, that we consider the impact of what we do on the most vulnerable people first. That is very complicated, but that is the position of advocacy that I have to take, and that makes me feel justified in what I do. That makes me feel right, and that influences the way I interact with other people.
ASHE: I guess my only concern is when you meet somebody who feels that they are just as concerned about ‘the greater good’ as we are about what it means to have full participation from all different kinds of people. I’m absolutely on your side here, and I do not at all mean to be a devil’s advocate. It’s just that so often, we run into the situation where ethically I know I’m correct, but from a literal technical perspective, they may feel that they’re right. So who ends up having more weight in those conversations, and how do you come to any real conclusion, whether that’s some kind of compromise or an actual change that moves things toward something much more ethical?
REIN: I guess I could point out that this is a conflict between two incompatible sets of values. Historically, the way that’s resolved is the person with power makes the decision and then that’s how it goes.
ASHE: Yes, but I’m saying, how do we mitigate that, then?
REIN: Yeah. I got nothing for you.
ASHE: People are so hard.
REIN: If someone doesn’t value doing the right thing, how do you get them to value doing the right thing?
ASHE: Yeah, I don’t know. I feel like this is so much of the work toward actual, real justice in so many different areas. How do we get people to see what justice is, when right now we don’t have a shared vocabulary for that? And how do we get them to empathize? Even if they don’t fully understand it, they can take someone’s word for it that this is going to have negative consequences they may not fully understand, and therefore this isn’t the right decision.
CORALINE: I think maybe the most important first step that we can take is having those conversations.
BRAD: This reminds me of the book, ‘The Art of Negotiation’. When you hear that title, you think, “This is how you beat other people at negotiating,” and that’s not at all what the book is about. It’s about trying to forget the ‘what’ of the negotiation and trying to understand the ‘why’ of the negotiation. Why is this other person advocating for this? Why am I advocating for this? It’s so easy to focus on what you’re advocating for, but pulling back to think about why, figuring out why they are, and weighing those two things together often results in a win-win situation. Not always, but the majority of the book actually focuses on those win-win situations, because you’re focusing on why instead of what.
REIN: I’d just like to say a couple of things. One is that maybe Brad and I should go get a room and read Gerald Weinberg books to each other —
BRAD: That one was not written by Gerald.
REIN: I know.
CORALINE: This has been an amazing conversation, and I am so happy that we had the three of you on the show today. What we like to do at the end of an episode is reflect on the conversation we’ve had and see what sort of takeaways or calls to action may have come up, or things that we want to spend more time thinking about. Rein, do you want to go first and talk about what you got out of this conversation?
REIN: Yeah. I guess what I got out of it is that these issues go straight up to the top in terms of the philosophical ladder we’re trying to climb: what do we value, and how do we get other people to share our values? It doesn’t get easier by ignoring that the problem is that difficult and pretending that it’s just technical.
CORALINE: I think for me, I’ve talked about some of the challenges that I’ve faced in doing the work that I do, and I try to be informed by that perspective. There’s definitely a personal aspect in terms of thinking about how I can be a more effective change agent. Also, thinking back on the earlier part of our conversation, I want to reflect more and maybe do more to promote the idea that it’s the responsibility of technologists to think about the social impact of the technical solutions they’re making, whether that means being better informed and striving to be generalists, or making sure that we are inclusive and giving voice to people with different perspectives and different levels of expertise on our teams, to make sure that we’re addressing problems deeply and not just from one particular silo. Ashe, what are your thoughts?
ASHE: I have so many. I think that we all need to understand that approaching any kind of problem, whether it’s in technology or any other field, requires us to bring a multi-disciplinary approach: understanding how we are looking at a problem ethically, how we’re looking at it technically, and how we’re looking at it from a human point of view. Why do we actually need this, why is this the way we’re doing it, and what are the potential effects?
I think that often, we are only looking at what we have simplified as the problem, and therefore creating simplified solutions. A lot of good points were brought up about being much broader in where we pull information from and how we apply it.
CORALINE: How about you, Brad?
BRAD: I like what Ariel said earlier, that science and the laws of nature still exist in the absence of humans. If you look at the science discussion happening around the world today, it’s messy and complicated, and I guess the answer to why that is, is humans. We’re the reason it’s messy and complicated. If we want to advance the frontiers of our understanding of the laws of nature through science, we’re going to have to figure out how to work better together, have the empathy to understand other people’s points of view, and convince others and ourselves that we can’t always assume that we’re right, because the whole point of science is to learn and understand.
CORALINE: This has been an absolutely amazing episode. To give people a behind-the-scenes look, we talked at the very beginning, as we were organizing things, about what we were going to cover, and we came up with space cats and burritos. Sadly, we did not touch on burritos, but I think some other, perhaps more important topics did get covered. I hope everyone has enjoyed this conversation.
If you want to continue the conversation, go to Patreon.com/GreaterThanCode, pledge at any level and come and talk to Ariel, Ashe and Brad and the panelists about the topics that were brought up. We’d love to continue the conversation with you. Ashe, Brad, Ariel, thank you so much for being on the show today. It was really amazing. This is definitely one of my favorite episodes and I’m so pleased you joined us.
ASHE: Thanks for having me.
BRAD: Yeah, thanks. I had a great time.
CORALINE: That wraps up Episode 29. We will talk to you all next week.
This episode was brought to you by the panelists and Patrons of >Code. To pledge your support and to join our awesome Slack community, visit patreon.com/greaterthancode. Managed and produced by @therubyrep of DevReps, LLC.