In this Discussion

Participants: Hannah Ackermans * Julianne Aguilar * Bo An * Katie Anagnostou * Joanne Armitage * Lucas Bang * Alanna Bartolini * David M. Berry * Lillian-Yvonne Bertram * Elisa Beshero-Bondar * Briana Bettin * Sayan Bhattacharyya * Avery Blankenship * Gregory Bringman * Tatiana Bryant * Zara Burton * Evan Buswell * Ashleigh Cassemere-Stanfield * Angela Chang * Prashant Chauhan * Lia Coleman * Chris Coleman * Bill Condee * Nicole Cote * Christina Cuneo * Pierre Depaz * Ranjodh Dhaliwal * Samuel DiBella * Quinn Dombrowski * Kevin Driscoll * Brandee Easter * Jeffrey Edgington * Zoelle Egner * Tristan Espinoza * Teodora Sinziana Fartan * Meredith finkelstein * luke fischbeck * Cyril Focht * Cassidy Fuller * Erika Fülöp * gripp gillson * Alice Goldfarb * Jan Grant * Sarah Groff Hennigh-Palermo * Saksham Gupta * MARIO GUZMAN * Gottfried Haider * Rob Hammond * Nabil Hassein * Diogo Henriques * Gui Heurich * Kate Hollenbach * Stefka Hristova * Bryce Jackson * Dennis Jerz * Joey Jones * Amy Kintner * Corinna Kirsch * Harris Kornstein * Julia Kott * Rishav Kundu * Karios Kurav * Cherrie Kwok * Sarah Laiola * RYAN LEACH * Rachael Lee * Kristen Lillvis * Elizabeth Losh * Jiaqi LU * Megan Ma * Emily Maemura * ASHIK MAHMUD * Felipe Mammoli * Mariana Marangoni * Terhi Marttila * Daniel McCafferty * Christopher McGuinness * Alex McLean * Chandler McWilliams * Todd Millstein * Achala Mishra * Mami Mizushina * Nick Montfort * Molly Morin * Gutierrez Nicholaus * Matt Nish-Lapidus * Michael Nixon * Mace Ojala * Steven Oscherwitz * Delfina Pandiani * Stefano Penge * Megan Perram * Gesina Phillips * Tanner Poling * Julia Polyck-O’Neill * Ben Potter * Amit Ray * Katrina Rbeiz * Jake Reber * Thorsten Ries * Giulia Carla Rossi * Barry Rountree * Warren Sack * samara sallam * Mark Sample * Perla Sasson-Henry * zehra sayed * Carly Schnitzler * Ushnish Sengupta * Lyle Skains * Andrew Smith * Rory Solomon * S. 
Hayley Steele * Samara Steele * Nikki Stevens * Daniel Temkin * Anna Tito * Lesia Tkacz * Fereshteh Toosi * Nicholas Travaglini * Paige Treebridge * Álvaro Triana Sánchez * Lee Tusman * Natalia + Meow Tyshkevich + Kilo * Annette Vee * Malena Velarde * Dan Verständig * Yohanna Waliya * Samantha Walkow * Josephine Walwema * Shu Wan * Biyi Wen * Zach Whalen * Mark Wolff * Christine Woody * kathy wu * Katherine Yang * Shuyi Yin * Nikoleta Zampaki * Hongwei Zhou
Coordinated by Mark Marino (USC), Jeremy Douglass (UCSB), Sarah Ciston (USC), and Zach Mann (USC). Sponsored by the Humanities and Critical Code Studies Lab (USC), and the Digital Arts and Humanities Commons (UCSB).

BrainSim: Neural Network on a Commodore 64 (2022 Code Critique)

Title: Neural Network on a Commodore 64
Author: John Walker
Language: C64 BASIC
Year of development: 1987
Software/hardware requirements: Commodore 64 and C64 BASIC

While searching for artificial intelligence that predated the 2000s, I stumbled upon John Walker's blog and found this blogpost on BrainSim. It is a program written in 1987, in fewer than 250 lines of BASIC, that implements a neural network. The program trains the computer to memorize the patterns of three alphabet letters; it is limited to three because of the machine's limited memory capacity. The method of using neuron cells as on/off switches was proposed in the 1950s, and this paper by Frank Rosenblatt is a reference for the method. I haven't read the paper in full. In an applied machine learning class I took here at CU, a picture of the "perceptron," referring to the neural switch, appeared in the slides. The class was more technical than it was an investigation of the algorithm as a historical artefact, so I looked up the paper outside of class. The extent of what I have digested of the mechanism is this: a neuron receives a stimulus, and when the stimulus increases past a threshold, the neuron reacts and transmits the stimulus on to other neurons. In the program we are looking at in this thread, there are two layers of neurons. The network is visualized in the blog post in detail. The author also explains how it was implemented: a character is an array of size 42, as the character field is 6 by 7.
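For readers who want to see the mechanism rather than just read about it, here is a minimal sketch in Python (not Walker's BASIC): an input layer of 42 pixel units, one per cell of the 6-by-7 character field, feeds a threshold unit per letter. The bitmaps, weights, and training loop are my own illustrative assumptions, not taken from BrainSim.

```python
# Sketch of the perceptron mechanism described above. Assumptions:
# bitmaps, learning loop, and thresholds are illustrative, not Walker's.

# 6-wide by 7-tall bitmaps, flattened to 42 on/off pixels.
LETTERS = {
    "A": "001100" "010010" "100001" "111111" "100001" "100001" "100001",
    "B": "111110" "100001" "100001" "111110" "100001" "100001" "111110",
    "C": "011110" "100001" "100000" "100000" "100000" "100001" "011110",
}

def to_vec(bitmap):
    """Flatten a bitmap string into a list of 0/1 pixel values."""
    return [int(c) for c in bitmap]

def fire(weights, bias, x):
    """Threshold unit: react (output 1) when the summed stimulus exceeds 0."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

def train(patterns, epochs=100):
    """Classic perceptron rule: nudge weights toward inputs the unit missed."""
    nets = {name: ([0.0] * 42, 0.0) for name in patterns}
    for _ in range(epochs):
        converged = True
        for name in nets:
            w, b = nets[name]
            for target_name, x in patterns.items():
                target = 1 if target_name == name else 0
                out = fire(w, b, x)
                if out != target:
                    converged = False
                    delta = target - out  # +1 or -1
                    w = [wi + delta * xi for wi, xi in zip(w, x)]
                    b += delta
            nets[name] = (w, b)
        if converged:  # a full pass with no corrections
            break
    return nets

patterns = {name: to_vec(bm) for name, bm in LETTERS.items()}
nets = train(patterns)

def recognize(x):
    """Return the letters whose output unit fires for input x."""
    return [name for name, (w, b) in nets.items() if fire(w, b, x)]
```

Because the three patterns are distinct, the perceptron rule is guaranteed to converge, which is part of why a machine as small as the C64 can memorize them at all.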

I was fascinated by this program. As said in the introduction, I am a grad assistant working in the Media Archaeology Lab. Our volunteer helped me "flash" the program onto a floppy disk, and we played with it on the lab's C64. What lies beyond this fascination, other than seeing Artificial Intelligence run on a C64? These are my questions, open for criticism:

1) Can critical code studies blend with media archaeology? Does the definition of media archaeology allow that? (I argue with a lab colleague about this.) While media archaeology emphasizes the experience of interacting with media, I contend that this is a piece of code running on a medium: the C64 is the computer through which the program is mediated. As we study the code, the code actualizes (is this the right word?) itself through the medium of the C64.

2) I also think about how @nickm described his practice of writing BASIC poems on the Apple II and C64 as platform specific. The specificity also has to do with cultural and historical sensitivity.

"I should point out, however, that many people—millions of us—first learned how to write programs like these using BASIC"

This sentiment is true for our lab community as well. Personally, I was born after that sweet spot; by the time I started to futz with computers, they ran on GUIs and had less programmable interfaces. Other than historical sensitivity, the program presented to me a sensitivity of limits: it could train on three letters and no more.

3) Lastly, I also want to bring in the concern of retro-fetishism. I see the risk of falling into retro-fetishism, in the context of the MAL, if the program is presented without critical strategies. How do you see the program being critically presented with thoughtful strategies: a good and accessible code review, demonstrations of the training process, comparison with artificial intelligence as implemented on modern platforms (i.e., a Jupyter notebook)?

I propose this thread both out of my own academic interest and in the context of the MAL.


  • There’s a great deal to say here and you have the discussion off to a great start!

    I consider media archaeology (like platform studies) to be interested in a multi-layered view of software and hardware systems in their cultural and historical contexts.

    In other words, not just the interface, not just “the algorithm,” but the particularities of how everything is implemented in code and on one or more layers of platform. Studying the C64 as a hardware artifact (which may be the archetypal media archaeology experience) brings certain things to your attention, for instance the length of pauses when loading from floppy drive and the presence of graphical PETSCII characters right on the keyboard. Reading code for the C64 (as part of CCSWG) highlights other things, and looking at a book that has BASIC programs for the C64 offers another perspective — on the same essential questions.

    People investigating AI and home computing will find that much of what "AI" was thought to be in 1987 was not about machine learning. Yes, it did include neural nets, but I agree with Walker that he was ahead of his time! The AI of the 1980s was often what is now called "GOFAI" (Good Old-Fashioned AI). Take a look at this wiki page with a bibliography of books on AI and the Commodore 64, for instance.
