The State of Blackness—with Andrea Fatona
“In a way, I've always been working on the edge of both a larger dominant society engagement and a deep engagement with my communities. My focus is really digging deep into blackness.”
Andrea Fatona, 2021
Toronto-based curator and scholar Andrea Fatona has been addressing institutionalized racism on her own terms since the 1990s. Our conversations across time reveal the depth of her commitment to making visible the full spectrum of Black culture in Canada. Engaging with Black communities to build an online repository while addressing algorithmic injustice, she and her collaborators are illuminating the work of Black Canadian cultural producers on the global stage.
Sound Design: Anamnesis Audio
Special Audio: Hogan’s Alley (1994), Andrea Fatona and Cornelia Wyngaarden, courtesy Vivo Media Arts; Whitewash (2016), Nadine Valcin, courtesy the artist
Related Episodes: The Awakening, New Point of View at the Venice Art Biennale
Related Links: The State of Blackness, Andrea Fatona/OCADU, Vivo Media Arts, Okwui Enwezor, All the World’s Futures/56th Venice Art Biennale, Cornelia Wyngaarden
What is The State of Blackness?
The State of Blackness website shares digital documentation of a 2014 conference that took place in Toronto, Canada. The State of Blackness: From Production to Presentation was a two-day, interdisciplinary event held at the Ontario College of Art and Design University and Harbourfront Centre for the Arts. Artists, curators, academics, students, and public participants gathered to engage in a dialogue that problematized the histories, current situation, and future state of Black diasporic artistic practice and representation in Canada. The site is now expanding to serve as a repository for information about ongoing research geared toward making visible the creative practice and dissemination of works by Black Canadian cultural producers from 1987 to the present.
What is Algorithmic Injustice?
Algorithms come into play whenever you search the internet: they take keywords as input, query related databases, and return ranked results. Bias can enter algorithmic systems through pre-existing cultural, social, or institutional expectations; through technical limitations of their design; or when systems are used in unanticipated contexts or by audiences not considered in the software's initial design.
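As a rough illustration, and not a description of The State of Blackness repository itself, the sketch below shows one way pre-existing institutional bias can shape search results: if a keyword search ranks matches by how heavily large institutions have already documented a work, under-documented artists sink in the results regardless of relevance. All records, field names, and weights here are hypothetical.

```python
# Hypothetical sketch: how ranking choices can reproduce institutional bias.
# The index, fields, and numbers below are invented for illustration only.

from dataclasses import dataclass


@dataclass
class Record:
    title: str
    keywords: set[str]
    institutional_citations: int  # how often large institutions cite the work


# A tiny, made-up index of artworks.
INDEX = [
    Record("Work A", {"video", "diaspora"}, institutional_citations=40),
    Record("Work B", {"video", "community"}, institutional_citations=2),
    Record("Work C", {"archive", "diaspora"}, institutional_citations=1),
]


def search(query_keywords: set[str]) -> list[Record]:
    """Return records matching any query keyword, ranked by a score."""
    matches = [r for r in INDEX if r.keywords & query_keywords]
    # Ranking solely by institutional citations bakes existing institutional
    # attention into the results: well-documented works rise to the top,
    # lesser-documented works sink, independent of relevance to the query.
    return sorted(matches, key=lambda r: r.institutional_citations, reverse=True)


if __name__ == "__main__":
    for record in search({"video"}):
        print(record.title, record.institutional_citations)
```

Running the sketch returns Work A before Work B even though both match "video" equally well, which is the kind of outcome a repository built with and for Black communities would need to design against.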