
Code Critique: Ruben and Lullaby by Erik Loyer

Title: Ruben & Lullaby
Author: Erik Loyer
Link: https://github.com/eloyer/ruben-and-lullaby
Year: 2009

On the occasion of CCSWG20, artist Erik Loyer has shared the code for Ruben & Lullaby, a wonderful iOS work in which you play (as in playing a musical instrument) a conversation between two fighting lovers. You can aggravate the quarrel or try to smooth things over, calming the characters by rubbing the screen or irritating them by shaking it. The piece also features a "dynamic score" that responds to the same interactions.

Erik has pointed us toward some potential areas of interest:
Input is handled in EAGLView.m — both accelerometer (tilt to cut, shake to anger) and touch (tap to change eye position, stroke to calm).

The logic behind the story mechanics, shot transitions, and the dynamic score is all contained in StoryManager.m — I'm interested in exploring these as potential constraints on live performance by actors and musicians.

The XML file rl_data.xml contains data describing each character view, as well as a script that serves as a kind of superstructure for the experience.

In Localizable.strings you can find the vestiges of an abandoned attempt to make the story mechanics easier to grasp through text.
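
To make the input model concrete before we open the files, here is a minimal sketch, not Loyer's code, of how shake magnitude and stroke distance might be folded into a single 0-to-1 emotion value per character. The class name, thresholds, and scaling factors below are invented for illustration; the actual mapping lives in EAGLView.m.

    #import <Foundation/Foundation.h>

    // Hypothetical sketch, not Loyer's code: models how raw input events
    // (shake magnitude from the accelerometer, stroke distance from touch)
    // might be folded into a single 0..1 emotion value per character.
    @interface EmotionSketch : NSObject
    @property (nonatomic) double emotion;          // 0 = calm, 1 = furious (assumed scale)
    - (void)registerShakeWithMagnitude:(double)g;  // accelerometer spike, in g
    - (void)registerStrokeWithDistance:(double)px; // finger dragged across the screen
    @end

    @implementation EmotionSketch

    - (void)registerShakeWithMagnitude:(double)g {
        // A hard shake (well above 1g) nudges the character toward anger.
        if (g > 1.5) {
            self.emotion = MIN(1.0, self.emotion + 0.1 * (g - 1.5));
        }
    }

    - (void)registerStrokeWithDistance:(double)px {
        // Stroking the screen soothes the character in proportion to the stroke.
        self.emotion = MAX(0.0, self.emotion - 0.0005 * px);
    }

    @end

    int main(void) {
        @autoreleasepool {
            EmotionSketch *ruben = [EmotionSketch new];
            ruben.emotion = 0.7;                       // start mid-argument
            [ruben registerShakeWithMagnitude:2.5];    // player shakes the phone
            [ruben registerStrokeWithDistance:400.0];  // then strokes the screen
            NSLog(@"Ruben's emotion is now %.2f", ruben.emotion);
        }
        return 0;
    }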

Here's a demo of the app.

Let's explore.

Comments

  • Okay, let's get this started properly. First, Erik has pointed us toward some fairly easy-to-read code. Here's a passage from the StoryManager.m file.

    /**
     * Returns the current ending state of the story based on character emotions.
     */
    - (NSString *) getEndingState {
    
        NSString *state;
        RubenLullabyGLAppDelegate *delegate = [[UIApplication sharedApplication] delegate];
    
        if (ruben.hasExpressedFeelings > .666) {
            if (lullaby.hasExpressedFeelings > .666) {
                if ((ruben.emotion < .666) && (lullaby.emotion < .666)) {
                    if (delegate.glView.hasTouched) {
                        state = @"hug";
                    } else {
                        state = @"happy";
                    }
                } else {
                    state = @"sad";
                }
            } else if (lullaby.hasExpressedFeelings > .333) {
                if ((ruben.emotion < .666) && (lullaby.emotion < .666)) {
                    if (delegate.glView.hasTouched) {
                        state = @"hug";
                    } else {
                        state = @"happy";
                    }
                } else {
                    state = @"sad";
                }
            } else {
                state = @"lLeaving";
            }
        } else if (ruben.hasExpressedFeelings > .333) {
            if (lullaby.hasExpressedFeelings > .666) {
                if ((ruben.emotion < .666) && (lullaby.emotion < .666)) {
                    if (delegate.glView.hasTouched) {
                        state = @"hug";
                    } else {
                        state = @"happy";
                    }
                } else {
                    state = @"sad";
                }
            } else if (lullaby.hasExpressedFeelings > .333) {
                state = @"sad";
            } else {
                state = @"sad";
            }
        } else {
            if (lullaby.hasExpressedFeelings > .666) {
                state = @"rLeaving";
            } else if (lullaby.hasExpressedFeelings > .333) {
                state = @"sad";
            } else {
                state = @"sad";
            }
        }
    
        return state;
    }

    This code determines the ending conditions based on the characters' emotions. While it's tempting to make something out of the 666, I'll just read it as Loyer dividing emotional levels into thirds. Because this is such an intimate game, I find it touching to look at the code that determines its ending state by measuring a combination of expressed feelings, emotions, and touch.
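
    To make the thirds reading explicit, here is my own compact restatement of that branching, not Loyer's code; the enum and helper names are invented for illustration, and the original in StoryManager.m remains the authority.

        #import <Foundation/Foundation.h>

        // Sketch only: restates getEndingState's branching using named thirds.
        // The type and function names are mine, not Loyer's.
        typedef NS_ENUM(NSInteger, ExpressionBand) { BandLow, BandMid, BandHigh };

        static ExpressionBand bandFor(double value) {
            if (value > 0.666) return BandHigh;   // upper third
            if (value > 0.333) return BandMid;    // middle third
            return BandLow;                       // lower third
        }

        static NSString *endingFor(double rubenFeel, double lullabyFeel,
                                   double rubenEmotion, double lullabyEmotion,
                                   BOOL hasTouched) {
            ExpressionBand r = bandFor(rubenFeel);
            ExpressionBand l = bandFor(lullabyFeel);

            // One lover fully expressed while the other stayed silent: a walkout.
            if (r == BandHigh && l == BandLow) return @"lLeaving";
            if (r == BandLow && l == BandHigh) return @"rLeaving";

            // A good ending needs one character fully expressed, the other at
            // least partly expressed, and both emotions cooled below two-thirds.
            BOOL enoughExpression = (r == BandHigh && l != BandLow) ||
                                    (l == BandHigh && r != BandLow);
            BOOL bothCalm = (rubenEmotion < 0.666) && (lullabyEmotion < 0.666);
            if (enoughExpression && bothCalm) return hasTouched ? @"hug" : @"happy";

            return @"sad";
        }

        int main(void) {
            @autoreleasepool {
                // Both lovers have vented and cooled down, and the player has
                // touched the screen: the hug ending.
                NSLog(@"%@", endingFor(0.8, 0.7, 0.3, 0.2, YES));
            }
            return 0;
        }

    Laid out this way, the only paths out of "sad" require substantial mutual expression plus calmed emotions, or the stark imbalance that produces one of the leaving endings.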

    A subsequent portion of the code brings camera angles into the mix:

        [establishingView setCharacterPose:[NSArray arrayWithObjects:@"hug", nil]];
    

    Let's talk a little about this model for character relationships, particularly as it pertains to outcomes of the game. What does this code tell us about the view of relationships modeled here?
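
    For anyone who wants to tinker with that line, the call hands the view an array of pose names. A hypothetical receiver might look something like the sketch below; the class, property, and logging are my assumptions for illustration, not Loyer's implementation, which lives in the view classes of the repository.

        #import <Foundation/Foundation.h>

        // Hypothetical sketch of a view object that accepts pose names, in the
        // spirit of the setCharacterPose: call above. Names and behavior are
        // assumptions; see the repository for the real implementation.
        @interface CharacterViewSketch : NSObject
        @property (nonatomic, copy) NSArray *poses;
        - (void)setCharacterPose:(NSArray *)poseNames;
        @end

        @implementation CharacterViewSketch
        - (void)setCharacterPose:(NSArray *)poseNames {
            // Remember the requested poses; a real view would swap in the
            // matching artwork for the current shot.
            self.poses = poseNames;
            NSLog(@"Now posing: %@", [poseNames componentsJoinedByString:@", "]);
        }
        @end

        int main(void) {
            @autoreleasepool {
                CharacterViewSketch *establishingView = [CharacterViewSketch new];
                [establishingView setCharacterPose:
                    [NSArray arrayWithObjects:@"hug", nil]];
            }
            return 0;
        }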
