This week, Carly Schnitzler (@cschnitz) and I got together for a chat around the week's topic, New Work and New Directions.
Join us with your thoughts and reactions!
References and Links
EDIT: I had not added a transcript, which is now here! It is still very rough, but I tried to do the fastest of cleanups.
Comments
Such a brilliant start to this thread. Thank you both. Thoughtful and provocative, your discussion has sent my mind off in so many directions. I want to take up an issue you brought up and that I've been discussing with various participants in this WG...
In the age of agentic code writers or code generators or coding helpers (Claude Code, Cursor, Codex, etc.), are we moving away from human contact with code so much that CCS offers too narrow a lens, pointing at a relic of programming's earlier stage? Has the time come to expand to the larger umbrella of Critical Computation Studies, to emphasize the importance of applying critical hermeneutics to more than just code?
(By the way, Critical Computational Studies is tied to a center at Goethe University, which I'm just learning about. I think I'm referring to something more general.)
Now, "just code" is tricky for me because CCS has never been just about the code. The code is a material trace in human-readable signs that can serve as a starting point for the exploration, or perhaps more accurately an anchor. Also, based on my training as a literary scholar, it helps to begin with a text (cultural text or artifact) as the core of the analysis, a shared text to ground our claims, but also a semiotic text whose signs instigate, orient, and center the interpretive practice. I take Evan Buswell's reminder (in the second special issue of DHQ) that state is also something we can and should read as part of this process of reading these texts.
Over the past few decades, scholars have demonstrated how Critical Code Studies has called forth new ways of reading. Reading here is not limited to close reading; it also includes distant reading, the moving back and forth that @jeremydouglass calls "transversal reading," and reading by porting, recreating, and re-animating, reading with tools, reading with code....
So another possibility is to follow Carly and Lyr's suggestion that we expand the scope of what we're studying to include whatever human-readable content we can access, including the analog and other texts like prompts, change logs, or other pieces of a pipeline (here I'm thinking of the ICE-Air thread or some of the work that ProPublica has done on software we have no access to).
Of course, it does not have to be either/or. We can do critical code studies as part of critical computation studies. But this gets at the question of whether computer source code is still an important text, which the UNESCO exhibition makes a very strong case for.
I'd love to hear this discussion.
Also, @cschnitz and @Lyr , can you talk a little more about what you think is motivating your move to the material and maybe even handmade in this particular moment, and what that might say about the impulse toward post-digital Critical Code Studies and DH (I am drawing on Florian Cramer's framing of the post-digital, but you can take it as you like)?
I'm finally catching up, and before I answer, I just wanted to signal that I have added a transcript of the discussion as an edit to the main post. I had meant to include it with the original post and it completely slipped my mind.
To answer your question, the move to the material for me makes sense because I approach the digital from the user's perspective, so there is never not a fleshy element. And to an extent I think that is what CCS is also doing by putting code back into its broader cultural, human context. We try not to forget that for the digital to exist there is always a fleshy element. What you say about AI agents that write code puts that into question really nicely, because there is a new degree of removal from the code, like an interpreter.
My specific interest is to get as close as possible to the point at which the physical and the digital meld, sort of like the "surface of the mirror." That's what I talk about with VR hugs, where you can't really detangle the material from the digital because you need both together to give a VR hug. And of course I think the pandemic has made that all the more obvious. @markcmarino you mentioned to me your own experiment of using pillows to convey hugging through Zoom, and although it's a very different setup, I think it speaks to that same weird moment in our social lives where we realize that we have to be comfortable socializing not-in-the-flesh.
And as I write this I catch myself wondering if that is also a form of coding in a purely "felt" language? In my dissertation, besides hugging virtually, I also mention the kinds of embodied attitudes people adopt once given an avatar to control: pacing around, jumping to say hi or thanks, etc. All those actions feel like they add yet another layer to the code, the same way that comments, which are not accessible to the user/output, can offer a new layer of reading. Here, that layer is only observable by existing in that space and experiencing it yourself. In that layer, you find the more social interactions, the things where affect is poured into how we exist online.
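To make that comment analogy concrete, here is a tiny, hypothetical Python sketch of my own (not from the dissertation): the comment is legible only to someone reading the source, never to the user who receives the output.

```python
def greet(name):
    # This note lives only in the source: the user who receives the
    # greeting never sees it, just as the "felt" layer of an avatar's
    # pacing or jumping is only legible to someone present in the space.
    return f"Hi, {name}!"

print(greet("Lyr"))  # prints Hi, Lyr! -- none of the commentary above survives into the output
```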
I think one interesting point of tension is that this (para?)social layer by which we define how we are going to use a digital space comes, by necessity, at a very different moment from the "proper" code. It's a little bit like accumulating gunk, but useful gunk that oils the parts?
At this point I feel like I am spitting out a bunch of ideas that go in too many directions!
I want to build on a lot of what's gone on in these few weeks and amplify the direction that @Lyr, @markcmarino , @cschnitz and others are headed (whether they are explicit about it or not) by seconding @markcmarino's proposal that we embrace Critical Computing Studies as the scope our methods deserve: CCS has developed interpretive methods that work across the full spectrum of computational culture, and it is ready to claim that broader territory.
We've engaged in rich discussions of LLM interactions and vibe coding, analyzing prompts, model behaviors, and emergent properties. The Software Heritage conversation revealed how powerfully we can read computational systems when we have code alongside oral histories, documentation, and organizational context. @davidmberry identified intermediate objects (tables and JSON files and prompts, ...) as carrying crucial cultural meaning. @JoeyJones's fictional code examples showed us reading computational culture even where nothing executes!
Here's what this opens up:
Emergent systems as primary texts: The ML models, distributed infrastructures, and complex real-time systems shaping contemporary life are perfect candidates for our interpretive methods. We can read prompts, UIs, training data curation, architectural choices, specs, hardware constraints for embedded values, labor relations, and ideology -- exactly what CCS pioneered for source code.
Intermediate objects as cultural artifacts: @davidmberry's insight about prompts, JSON files, and tables points to where interpretation increasingly happens. These artifacts carry intention, novelty, and cultural significance. They're prime material for our critical methods. Critical Computing Studies explicitly welcomes them.
Leading critical interpretation: Humanistic interpretive methods that reveal what instrumental approaches to computational systems miss position the broader CCS to lead critical analysis of AI systems, infrastructure, and emerging computational forms.
The methodological insights of reading for rhetoric, embedded assumptions, material constraints, cultural values, etc. work beautifully across all computational substrates.
Calling it Critical Computing Studies claims the full reach our methods have already demonstrated. We have built the foundation. Let's claim the territory.
I fed Claude the above, and asked it to find existing academic programmes in this space. Here's what it said:
Based on my research, here's what I found about existing academic programs in this space:
Established Programs
Science, Technology, and Society (STS) Programs - The most mature field addressing sociotechnical systems:
- Graduate programs: MIT, Cornell, Stanford, UC Berkeley, Virginia Tech, Penn State, RPI, UCSD, University of Minnesota, among ~30 US institutions
- Focus: Historical and social analysis of science and technology, examining how technology shapes and is shaped by society
- Key centers: Cornell's Department of Science & Technology Studies (founded 1991), MIT STS, Stanford STS (founded 1971)
Software Studies & Platform Studies:
- MIT Press Software Studies book series (editors: Matthew Fuller, Noah Wardrip-Fruin, others)
- Platform Studies series focusing on hardware/software relationships
- These are primarily scholarly movements rather than degree programs
Media Archaeology:
- University at Buffalo has a strong Media Archaeology research area in their Department of Media Study
- Focuses on low-level operation of technical media, creating artworks from obsolete technologies, uncovering old software
Critical Computing (emerging):
- Yale Critical Computing Initiative - interdisciplinary initiative examining computing, culture, ethics, and societies
- University of Michigan - "Critical Studies of Design and Computing" research area in their School of Information
- Critical Computing Conference - New annual conference being established with ACM SIGCHI (first gatherings in 2025), founded by scholars from University of Washington, University of Edinburgh, USC, University of the Philippines, Universidad Nacional de Colombia
Human-AI Interaction:
- Stanford HAI (Human-Centered AI Institute)
- Multiple universities have HAI or human-computer interaction programs, though these tend toward instrumental rather than critical approaches
The Gap
What's notable: There's no established "Critical Computing Studies" program despite:
1. Active scholarly work (Software Studies, Platform Studies, Media Archaeology)
2. Institutional infrastructure (STS programs at major universities)
3. Growing need (Critical Computing Conference launching, Yale Critical Computing Initiative)
This gap represents an opportunity - CCS could position itself as the methodological leader for critical interpretation of computational systems, bridging between STS's broader sociotechnical focus and the code-level insights CCS has pioneered.
Thanks, Lyr and Carly, for the human-presence thinking that pervades your conversation.
At the School of the Art Institute of Chicago, where I teach, there is not an emphasis on choosing specific majors. Thus, in my Women Artists in Cyberspace course, in addition to students in Art and Tech, I also get painters, sculptors, ceramic artists, makers of artists books, and many other kinds of artists, and their work provides examples of the energizing combination of analog and digital that Carly brings up.
See https://www.narrabase.net/saic_ats/waic_intro20-21.html (click on each artist's name for details)
As regards the role of interface as agent that Lyr brings up, a classic reference -- that may now be less read than it should be -- is Brenda Laurel: The Art of Human-Computer Interface Design, Addison-Wesley, 1990. For an in-depth review, see
http://www.narrabase.net/elit_software_links.html#brenda
Re a different focus: Mark suggests Critical Computation Studies; @jshrager calls attention to Critical Computing Studies. Personally, I prefer retaining "code" because it encompasses the combinations with performance and the material that Carly and Lyr suggest. "Computing" does not suggest a welcome to artists and humanities scholars.