2024 Participants: Hannah Ackermans * Sara Alsherif * Leonardo Aranda * Brian Arechiga * Jonathan Armoza * Stephanie E. August * Martin Bartelmus * Patsy Baudoin * Liat Berdugo * David Berry * Jason Boyd * Kevin Brock * Evan Buswell * Claire Carroll * John Cayley * Slavica Ceperkovic * Edmond Chang * Sarah Ciston * Lyr Colin * Daniel Cox * Christina Cuneo * Orla Delaney * Pierre Depaz * Ranjodh Singh Dhaliwal * Koundinya Dhulipalla * Samuel DiBella * Craig Dietrich * Quinn Dombrowski * Kevin Driscoll * Lai-Tze Fan * Max Feinstein * Meredith Finkelstein * Leonardo Flores * Cyril Focht * Gwen Foo * Federica Frabetti * Jordan Freitas * Erika FülöP * Sam Goree * Gulsen Guler * Anthony Hay * SHAWNÉ MICHAELAIN HOLLOWAY * Brendan Howell * Minh Hua * Amira Jarmakani * Dennis Jerz * Joey Jones * Ted Kafala * Titaÿna Kauffmann-Will * Darius Kazemi * andrea kim * Joey King * Ryan Leach * cynthia li * Judy Malloy * Zachary Mann * Marian Mazzone * Chris McGuinness * Yasemin Melek * Pablo Miranda Carranza * Jarah Moesch * Matt Nish-Lapidus * Yoehan Oh * Steven Oscherwitz * Stefano Penge * Marta Pérez-Campos * Jan-Christian Petersen * gripp prime * Rita Raley * Nicholas Raphael * Arpita Rathod * Amit Ray * Thorsten Ries * Abby Rinaldi * Mark Sample * Valérie Schafer * Carly Schnitzler * Arthur Schwarz * Lyle Skains * Rory Solomon * Winnie Soon * Harlin/Hayley Steele * Marylyn Tan * Daniel Temkin * Murielle Sandra Tiako Djomatchoua * Anna Tito * Introna Tommie * Fereshteh Toosi * Paige Treebridge * Lee Tusman * Joris J.van Zundert * Annette Vee * Dan Verständig * Yohanna Waliya * Shu Wan * Peggy WEIL * Jacque Wernimont * Katherine Yang * Zach Whalen * Elea Zhong * TengChao Zhou
CCSWG 2024 is coordinated by Lyr Colin (USC), Andrea Kim (USC), Elea Zhong (USC), Zachary Mann (USC), Jeremy Douglass (UCSB), and Mark C. Marino (USC). Sponsored by the Humanities and Critical Code Studies Lab (USC) and the Digital Arts and Humanities Commons (UCSB).

Week 1: Roundtable: Anti-Racist Critical Code Studies Reading Group

A round table featuring participants in the Anti-Racist Critical Code Studies Working Group
Opening Post: Sarah Ciston, Zach Mann, Jeremy Douglass, Mark C. Marino

In his 2019 book How to Be an Antiracist, Ibram X. Kendi defines the anti-racist as “one who is supporting an antiracist policy through their actions or expressing an antiracist idea.” The anti-racist actively works against racism. They work to produce change. Throughout the book, he offers more specific definitions of anti-racism with regard to biology, gender, culture, and other areas. What would a definition of anti-racist critical code studies look like?

In recent years, scholars have been teaching us how to approach software through race-critical lenses. Jessica Marie Johnson, Safiya Noble, and Mark A. Neal ran a pivotal week of the 2018 Critical Code Studies Working Group, entitled “Race and Black Codes,” which examined the relationship between codes that regulate Black bodies and human and technological responses to them. That week built on a special issue of The Black Scholar entitled “Black Code.” Noble’s Algorithms of Oppression and Ruha Benjamin’s Race After Technology also provide direction for how to approach software and other technological objects through the lens of anti-racism.

Our reading group gathers at the intersection of race and software studies, following the lead of such luminaries as Lisa Nakamura and Alondra Nelson, who was recently appointed the Office of Science and Technology Policy’s inaugural Deputy Director for Science and Society. We look to Joy Buolamwini’s work with the Algorithmic Justice League, which she founded. We have arranged a screening of Coded Bias (2020), Shalini Kantayya’s documentary centered on her work, during the reading group.

We also learn from the example of senior research associate Tara McPherson, who offers a key methodological model in “U.S. Operating Systems at Mid-Century: The Intertwining of Race and Unix.” In this article, which we will discuss in the group, rather than removing technology from its historical contexts, McPherson traces the relationship between an operating system and the racist culture in which it emerges.

In this reading group, we hope to explore strategies for incorporating anti-racist thought and practice into Critical Code Studies. Using the readings, we will discuss broadly the many ways that computer programming inherits the racist histories of its authors, and how even seemingly neutral code can operate in, reproduce, and also resist the racialized systems in which it is embedded. We will look at concrete examples of source code that intersect with these concerns. These code snippets will be added to the code channel. We invite you to post new ones.

We invite participants of the Anti-Racist CCS Reading Group to bring insights from those discussions here, and we invite everyone in CCSWG22 to join in this topic.

Our initial conversations were about Tara McPherson's article and Beth Coleman's "Race as Technology."

But feel free to pick up any of the threads that came up in that Reading Group.

Comments

  • I want to highlight the relationship between code and race in the context of skin tone modifiers. Both emoji and AI research on race discrimination have taken up and given further legitimacy to the Fitzpatrick skin type categorization while also reworking it.
    More specifically, this new schema adapted the six number-driven skin pigmentation types articulated by Fitzpatrick in order to represent cultural racial categories such as White, Brown, and Black. Fitzpatrick Types 1-2 were consolidated to represent emoji modifier U+1F3FB, Type 3 became U+1F3FC, Type 4 U+1F3FD, Type 5 U+1F3FE, and Type 6 U+1F3FF, as demonstrated in the chart below (Figure 1); a small code sketch of this pairing appears at the end of this comment. No modifier codepoint was reserved for the “default” yellow-colored, aka racially white, emoji icon, which is simply the base character left unmodified. The Fitzpatrick scale became displaced from its original context of dermatology and inserted into the articulation of race and ethnicity in a digital context. The seemingly neutral skin type system, however, was harnessed in discourses about diversity, identity, race, and ethnicity as the emoji became informally labeled as “racemojis.”
    The racial anchoring of skin tone was further articulated by the coupling of skin to hair and eye color, parameters of the human body which have traditionally been grounded in 19th- and 20th-century anthropological studies of race. For example, Fitzpatrick types 1-2 (standing for pale, white skin, blond or red hair, and blue or green eyes, and tied to white Australian identity) became anchored primarily in icons featuring dark hair, an East Asian racial categorization. For one thing, using the lightest emoji feels incorrect because it "seems like the software platforms opted to include East Asians by assigning a 'neutral' dark/black hair to the palest skin tone," writes illustrator Jason Li. Emojipedia concurs, writing that the pale/black combination represents "those from [a] large slice of Asia."
    In Apple’s morphology, blond hair became associated with Fitzpatrick’s type III “cream-white” skin color, a quite common skin type, and hence became the default signifier for users who classified themselves as white. This coupling reflects the dermatological prevalence of skin type III in the United States: “The most common skin type in the United States is type III (48%), with types I and II comprising the second largest group (35% in total).” The ordering of the Fitzpatrick scale and its mapping onto the digital landscape has also drawn criticism. If nothing else, the scale’s systematic default assumptions—Type I for the “whitest” skin, Type VI for the “blackest”—are exemplary of the emoji character set’s indebtedness to established hierarchies of gendered and racialized authority and inequality. Moreover, even in systems that do not support the combined character of a face plus a skin tone modifier, a skin tone modifier itself might be supported on its own.
    The Fitzpatrick skin color schema has its own racial bias yet through code it has become anchored as the objective Shirley Card of an algorithmic culture.
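
    To make the codepoint pairing concrete, here is a minimal sketch (my illustration, not part of the original post) of how the five Fitzpatrick-based modifiers U+1F3FB–U+1F3FF combine with a base emoji; the waving-hand base and the labels are just example choices, and rendering depends on font support:

    ```python
    # Minimal sketch: pairing a base emoji with the Fitzpatrick-based
    # skin tone modifiers U+1F3FB..U+1F3FF at the codepoint level.
    BASE = "\U0001F44B"  # WAVING HAND SIGN; unmodified, it renders in the "default" yellow

    MODIFIERS = {
        "type-1-2": "\U0001F3FB",  # EMOJI MODIFIER FITZPATRICK TYPE-1-2
        "type-3": "\U0001F3FC",
        "type-4": "\U0001F3FD",
        "type-5": "\U0001F3FE",
        "type-6": "\U0001F3FF",
    }

    for label, modifier in MODIFIERS.items():
        combined = BASE + modifier  # the modifier immediately follows the base character
        print(label, combined, [hex(ord(ch)) for ch in combined])
    ```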

  • @Stefkax: This is fascinating. Even the fact that the skin tone emoji categories are based on sensitivity to ultraviolet light, not human perception, seems an interesting choice. I suppose UV light seems a safer beholder of skin color than flawed human judgment (as you explain), but this strikes me as... too linear? I wonder why AI legitimizes this scale... is this machine learning which relies on a similarly linear understanding of color?

    Even the way emojis came to adopt the scale by merging types I-II reminds me of the way U.S. ideology has assimilated multiple ethnic whites into a monolithic "white American" category over the years: those who burn fall under one category, and the groups which burn less become subjected to a classification system.

  • Hello
    I would like to point out a topic that I consider related, although if it is not we can move it to another thread.

    About a year ago, GitHub published this guide:
    https://github.com/github/renaming
    This guide specifies how to rename the "master" branch of a repository. The guide does not detail the reasons for the change; it only offers a vague statement: "Many communities, both on GitHub and in the wider Git community, are considering renaming the default branch name of their repository from master". (The basic rename steps themselves are sketched at the end of this comment.)

    A somewhat more detailed explanation can be found in other articles, for example:
    https://www.theserverside.com/feature/Why-GitHub-renamed-its-master-branch-to-main
    This name change comes in response to one of the most persistent naming patterns in the technology industry: "master and slave" terminology. The change has drawn both support and resistance from the various people involved in software development and the IT industry. So, I want to bring the topic to this group and hear your opinions. Are these kinds of decisions a step towards anti-racist code? How do you think we can interpret the resistance to these changes?

    Thank you!

    Translated with www.DeepL.com/Translator (free version)
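
    For reference, the mechanics of the rename described in that guide are brief. Below is a minimal sketch (my illustration, in Python via subprocess; the wrapper function is hypothetical, but the underlying git commands are standard) of renaming a local "master" branch to "main" and making "main" the default for new repositories:

    ```python
    # Illustrative sketch: run the standard git commands for renaming a local
    # "master" branch to "main" and setting "main" as the default for new repos.
    import subprocess

    def git(*args):
        print("$ git", " ".join(args))
        subprocess.run(["git", *args], check=True)

    git("branch", "-m", "master", "main")    # rename the local branch
    git("push", "-u", "origin", "main")      # publish "main" and set it as upstream
    git("config", "--global", "init.defaultBranch", "main")  # default for new repos
    ```

    Changing the default branch on the hosting side and retiring the old remote "master" are separate steps, typically done in the repository settings; and, as discussed later in this thread, any scripts or CI configuration that hardcode the old name also have to be updated.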

  • The renaming is fascinating. Although there is so much that is lousy right now, this gives me hope. Even at my work, I tried to push to master a month ago and there was no more master, only main.

  • Reading "Race as Technology," I would 100% say race is a technology. I would also say the ego is a technology; from a non-duality or psychoanalytic perspective, this would hold water.

  • I didn't manage to find out whether the 'master' name was really a big issue or not (although thinking about it, it's already starting to feel... antiquated). On the other hand, renaming a branch from master to main takes a couple of minutes; it's a simple action that is hardly worth talking about.
    As far as I know, GitHub only changed the default name for new repositories; there are still plenty of repos naming their main branch 'master'. GitHub also continues to count ICE among its customers.

  • @alvarotriana: From what I understand, the master-slave metaphor for machinery goes back at least as far as the 19th century, when slavery was still legal, with terms such as "servomotors" and "slave clocks." Thus, there is at least an indirect connection between these techno-social metaphors and slavery practices. Computer science inherited these ideas (just like biology inherited "slave ants"), and I think it is important to go back and reconsider why they were named as such in the first place. The fact that GitHub does not explain why in the renaming guide suggests this is not so much reconsidering as it is obscuring, but at least it is reducing the use of terminology that keeps alive, at least linguistically, these awful historical practices. (Source: Ron Eglash, “Broken Metaphor: The Master-Slave Analogy in Technical Literature,” in _Technology and Culture_, Vol. 48, No. 2, 2007.)

    @Stefkax: I am also reminded of Triton Mobley's work, visual art projects which use the digital codes for skin tone to explore anti-blackness.

  • Thanks @Stefkax @Zach_Mann for the proposal and the reference. When I first watched "Modifying the Universal" (Femke Snelting), I didn't quite get where the problem lay; the statistics and the power structure elucidated it.

  • @Zach_Mann: I think the master-slave metaphor was in circulation before the 19th century. Philosopher and mathematician Gottfried Leibniz (1646–1716) was also the designer of one of the first mechanical calculators and explained the division of labor like this: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if the machine were used.” Discussed in my book The Software Arts. Ron Eglash (whom you cite above) and I first discussed this metaphor years ago, in the late 1980s. It's just horrendously long-lived and tenacious in computing!

  • @warrensack: Oh, thanks for that quote, haven't come across it yet. Leibniz saying the quiet part out loud. Eglash cites Charles Babbage saying something similar, if I recall, which isn't surprising with the Leibniz influence.

  • @yaxu it might be a matter of moments to change a branch name, but in many cases there's a lot more to it than that. The name change has to be percolated across build and deployment infrastructure; the assumption might be baked into many scripts. If there's shared build tooling, then the change might have to be coordinated across dozens of repositories at once.

    Unlike a change of nomenclature in spoken communication, where it's simply a matter of relearning some habits, there's a great deal of inertia when things get enshrined in code. That's not through lack of desire: it's through actual technical impediment. I'm proud to say that I don't have a single colleague who's not sensitive to, and sympathetic towards, this matter. Nonetheless, the first (and thus far only) attempt to begin to introduce this change that we saw was kiboshed when the extent of build system implications became apparent. (A sketch of the kind of audit this implies follows below.)
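
    To make the "baked into many scripts" point concrete, here is a rough sketch (my illustration, not from the comment above) of the kind of pre-rename audit a team might run; the file patterns are assumptions and would vary per project:

    ```python
    # Rough sketch: list hardcoded "master" branch references in common
    # build/CI files so they can be reviewed before a branch rename.
    import pathlib
    import re

    PATTERNS = ["*.yml", "*.yaml", "*.sh", "Jenkinsfile", "Makefile", "Dockerfile"]
    BRANCH_REF = re.compile(r"\bmaster\b")

    def audit(repo_root="."):
        for pattern in PATTERNS:
            for path in pathlib.Path(repo_root).rglob(pattern):
                for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                    if BRANCH_REF.search(line):
                        print(f"{path}:{lineno}: {line.strip()}")

    if __name__ == "__main__":
        audit()
    ```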

  • changing naming conventions like this feels important as a first step (to me), but those conventions are also just racism's patina. irrespective of what we call it, the master-slave paradigm (as an example) grounds architectures in a worldview of hierarchy / domination. more interesting to me is how we teach coders to think about systems as emergent and relational so that these terms become not only offensive and socially disadvantageous but also inaccurate and unnecessary.

  • Thanks everyone! :smile:
    It's very helpful to learn from all your knowledge
