
Structure and Interpretation of Computer Programs (SICP), Ch. 1.1

This discussion was created from comments split from: Oberon: procedure interfaces, parameter modes, and strict typing (1992).

To address the questions, I would like to put another code snippet up for discussion:

Metadata

Context and Code

Structure and Interpretation of Computer Programs was the textbook used to teach introductory computer science and programming at MIT for many years. Other universities around the world followed suit and used SICP, also known as the Wizard Book, as an introduction to computer science. (I, too, had SICP as the textbook for my introduction to computer science at Kiel University, many thousands of kilometers away from MIT.) The first chapter begins with a quote from John Locke's An Essay Concerning Human Understanding, setting the tone for the following introduction to basic programming language concepts and constructs:

The acts of the mind, wherein it exerts its power over simple ideas, are chiefly these three: 1. Combining several simple ideas into one compound one, and thus all complex ideas are made. 2. The second is bringing two ideas, whether simple or complex, together, and setting them by one another so as to take a view of them at once, without uniting them into one, by which it gets all its ideas of relations. 3. The third is separating them from all other ideas that accompany them in their real existence: this is called abstraction, and thus all its general ideas are made.

In just under 40 pages, Abelson and Sussman introduce the basic concepts of programming in Lisp, or more precisely in MIT Scheme. The language and its advantages for teaching are discussed right at the beginning: its sparse syntax – essentially the correct placement of parentheses – is a major advantage over other languages. Lisp is also advantageous because programs themselves are data, which opens up numerous approaches to metaprogramming. And unlike Oberon, Lisp was not created on the drawing board, but developed organically and iteratively, following the needs of its users.
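For readers unfamiliar with this property, a minimal sketch of "programs as data" – my own illustration, not an example from SICP, and with the exact form of eval's environment argument varying between Scheme dialects:

(define expr '(+ 1 2 3))   ; quoting turns the expression into plain list data
(car expr)                 ; => +       (the operator, as a symbol)
(cdr expr)                 ; => (1 2 3) (the operands, as a list)
(eval expr (interaction-environment))
                           ; => 6       (the same list, handed back to the evaluator as code)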

Chapter 1.1 introduces the language itself, presenting the fundamental building blocks and the ways in which they can be combined (see the Locke quote!). These are not numerous, so the text immediately addresses some more complex issues, such as different evaluation strategies for expressions. Although a general construct for branching or conditionals, cond, is introduced explicitly, the (nowadays) more common if is treated “only” as a special case of it. It is also interesting to note how looping is introduced: more or less in passing, in the context of an example. This example illustrates Newton's method (1.1.7) for calculating square roots, which is systematically developed in the text:

(define (sqrt-iter guess x)
    (if (good-enough? guess x)
        guess
        (sqrt-iter (improve guess x) x)))

(define (improve guess x)
    (average guess (/ x guess)))

(define (average x y)
    (/ (+ x y) 2))

(define (good-enough? guess x)
    (< (abs (- (square guess) x)) 0.001))

The recursive call in sqrt-iter is only mentioned in a side note as a way of doing things repeatedly. Further discussion follows in the next chapter. With the ability to branch (cond) and loop (recursion), Turing completeness is achieved in just a few pages. With the means presented, all calculations that a modern computer can perform are possible. A complex programming system has been derived and presented from just a few principles.
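To make the relationship between cond and if concrete: the conditional in sqrt-iter could just as well be written with cond. This rewrite is my own illustration, not code from the book:

(define (sqrt-iter guess x)
    (cond ((good-enough? guess x) guess)
          (else (sqrt-iter (improve guess x) x))))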

One thing that is still of interest here is that the examples used to introduce the language and its features are usually abstract and mathematical, such as the aforementioned Newton's method. As with Oberon, this is an appeal to mathematical rigor.

Discussion

SICP positions MIT Scheme as a language well suited for teaching, citing the simplicity of the language, especially its syntax. Chapter 1.1 of SICP seems to provide a plausible and philosophically inspired argument for this. The authors are able to derive complex structures from a few principles. There seems to be an overlap between Scheme and Oberon in that both strive for a conscious reduction of the available resources in order to simplify teaching. Abelson and Sussman may go a little further by equating the functioning of the language with the functioning of the human mind itself, or at least drawing a parallel between the two. If programming with Scheme works like regular thought processes, then it can't be that difficult, right?

I remember having great difficulty getting used to the way of thinking that Scheme requires. In a recent discussion with students about Python vs. Scheme as a first programming language, the students' choice was clear: Python. Scheme was far too complicated. Over the years, I have come to appreciate Scheme and Lisps in general because they make simple things easy. I am not alone in this personal development, as Joel Spolsky discusses in an older blog post. However, as with Oberon, and based on the code example given above, I wonder whether a programming language for teaching should not simply be easy to learn. Simplicity is certainly a very sensible requirement, but it is one demanded above all by complex applications and professional systems. For beginners in programming, easy access is possibly more valuable than a programming system derived from (philosophical) principles.

SICP has not been used as the basis for introductory courses at MIT since 2008. Regarding the reasons for this, Wikipedia contains a noteworthy reference to a comment on an MIT blog post about this change. It states:

I talked to Professor Sussman on the phone, and he told me that he thought I was placing to high an emphasis on the specific language.
He said that he’d actually been trying to have 6.001 replaced for the last ten years (and I read somewhere that Professor Abelson was behind the move too). His point was that the way industries work has simply changed drastically. Understanding the principles is not essential for an introduction to the subject matter anymore, it matters more that you can develop a mental map of systems and make things work for you which is what dealing with the robots in 6.01 will make you do. He sees 6.001 as obsolete. Personally, I wish it was possible to understand all the principles, but I guess its simply a reality that I must deal with. I guess, if you think about it, the path to further progression does not come from re-learning what has already been known a long time, but from using and building on that basis, applying principles in a working fashion, and achieving new things.

Remarkably, this shifts the focus from simplicity to complexity, or rather, how to navigate complex systems. It is unclear why knowledge of the principles should not also be necessary for this, if one follows the general idea. This statement also expresses – again! – a strong idea of what is expected of future computer scientists.

tl;dr My impression is that Oberon and Scheme (or SICP) are examples of languages built around a certain idea of computer science and programming – mathematical, principle-driven, simple – which they attempt to carry into teaching. The goal of teaching is understood as something that benefits from these principles. In contrast, Vakil (2018) and Schulte and Budde (2018) have each argued that the goals of computer science education should be far less technical. Both see CS as a tool for self-empowerment, one that helps create a just society and responsible citizens in a digital society. This naturally raises the question: if Oberon and Scheme are not the right tools for this, what is?

Comments

  • I don't know how common this is, but my son goes to one of the top high schools in the nation (so I'm told all the time by them when they are asking for donations! :-) Anyway, he's in AP CS, and to my pleasant surprise, they spend the first half of the year learning Scheme! (From Simple Scheme -- I'm not a huge fan of Simple Scheme, I'd have gone with SICP, but whatever, it's better than starting with any non-Lisp language, IMHO!) For the second half, they unfortunately devolve to Java, because the AP test is still Java. They call the course "functional and object oriented programming", and Java aside, I think it's pretty great that they're starting with functional, and esp. Lisp ... well, Scheme, close enough.

  • BTW, funny/interesting to see Lisp (Scheme) being described as “mathematical”. The "mathematical" history of Lisp is quite deep, although not in the way we usually think of that term. Or perhaps we should make a distinction between "mathematical" and "arithmetic". Lisp was explicitly a mixture of IPL-V and the Lambda Calculus. The first (important) application of IPL, by Simon and Newell in the mid-1950s, was the Logic Theory Machine, which reproduced proofs from Principia Mathematica. The Lambda Calculus was, as you probably know, Church's entry in the Turing/Church/Gödel response to Hilbert's Entscheidungsproblem (to prove the completeness and consistency of a formal system, which Church and Turing, building on Gödel's work, proved impossible).

    So, yes, Lisp has deep roots in mathematics, but unlike most programming languages of that day, "simple math" (which I'm calling here "arithmetic") was an afterthought in Lisp. If one was going to do anything very arithmetic, one would reach for Fortran at that time. There weren't all that many choices, of course, but Fortran had "arithmetic" in its name, whereas Lisp had ... lists (more correctly, graphs) ... in its name. Now, of course, Lisp got bulked up to do perfectly fine arithmetic, and indeed became the basis for modern math systems like Macsyma and Mathematica. And a nice side effect of Lisp's facility in list (graph) processing is that it has almost always had rational and bignum arithmetic -- these being more correct than Fortran's (or anyone else's) ugly floating-point arithmetic! But still to this day, people are inventing neo-Lisps that are better at arithmetic, like Julia.

    [BTW, it may be useful to dissuade the reader of a common misconception about Lisp: that it is slow. Lisp is typically compiled to the metal and scores well against languages like C in raw speed, coming within about 2x of C in common tests. Lisp is rumored to be slow because it is common to implement Lisp in Lisp as an exercise (and because you can, which is wondrous!), and, yeah, if you do that, the resulting Lisp is slow, but real Lisps are perfectly fast, heavy-duty programming languages!]
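    For instance, in most Schemes the exact-arithmetic point looks roughly like this (a quick sketch; exact output formatting varies between implementations):

    (/ 1 3)              ; => 1/3  (an exact rational -- no rounding)
    (+ 1/3 1/3 1/3)      ; => 1    (exactly 1, not 0.9999...)
    (expt 2 100)         ; => 1267650600228229401496703205376  (an exact bignum)
    (exact->inexact 1/3) ; => .3333333333333333  (the floating-point approximation)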

  • As an aside, for people who don't immediately know how to run the Scheme code above in a web REPL:

    1. Copy the sqrt-iter code snippet above (all four define forms).
    2. Go to an online REPL, e.g. https://try.scheme.org/
    3. In the left-hand pane, paste the code. Make sure to include a newline after the last line.
    4. At the prompt (>), type the expression (sqrt-iter 1.0 9) (a.k.a. "find the square root of 9 through iterative guessing, starting at 1.0").
    5. It should return 3.00009155413138, a good-enough approximation of 3.
    6. Type other expressions of the form (sqrt-iter guess x) to play with sqrt-iter.
  • Gregor, thank you for this interesting example.

    Regarding education and SICP (Scheme), in addition to Oberon, there might also be a connection to this theme in Scribe -- in particular the drawing of code excerpts from manuals, and the question of how easy or difficult, intuitive or counterintuitive the paradigm and concepts are for learners.

    You point out how a recursive call appears in a very early code example but is footnoted almost in passing. Looking over the text I can see how this is framed philosophically throughout the preceding introduction: recursion itself is a kind of intellectual wellspring from which Lisp and Scheme arise, as we see in these quotes (all emphasis on recursive is in the original):

    Lisp was invented in the late 1950s as a formalism for reasoning about the use of certain kinds of logical expressions, called recursion equations, as a model for computation. The language was conceived by John McCarthy and is based on his paper “Recursive Functions of Symbolic Expressions and Their Computation by Machine” (McCarthy 1960). (3)

    One of our goals in this chapter is to isolate issues about thinking procedurally. As a case in point, let us consider that, in evaluating combinations, the interpreter is itself following a procedure. [...] Thus, the evaluation rule is recursive in nature; that is, it includes, as one of its steps, the need to invoke the rule itself. Notice how succinctly the idea of recursion can be used to express what, in the case of a deeply nested combination, would otherwise be viewed as a rather complicated process. (12)

    In general, we shall see that recursion is a very powerful technique for dealing with hierarchical, treelike objects. (13)

    By the time we arrive at our example of (sqrt-iter guess x), a learner might even expect that simple or minimal examples will still involve recursion -- even if it hasn't been explained yet! -- as it is apparently paradigmatically fundamental to Scheme (and anyway will be explained in more detail shortly).

    Perhaps introducing recursion before explanation in Chapter 1 also maintains a kind of idiomatic purity, similar to the concept of code being "Pythonic". While it might be trivial to implement an imperative let example that doesn't require a recursive call (thus saving recursion to be introduced in its own example), that just wouldn't be the "right way" to implement sqrt-iter (a sketch of such an imperative variant appears at the end of this comment). Recursion is soon explained in section 1.2.1, but I think the conscious decision not to bend on recursion and use it before it has been introduced becomes clear much later in the moral language of 3.1.3 "The Costs of Introducing Assignment", and in particular the section "Pitfalls of imperative programming." This might be the only time that we see "let + counter" style imperative looping in a code example, and it is presented as a warning about the wrong way to implement the section 1.2.1 example:

    Instead of passing arguments in the internal iterative loop, we could adopt a more imperative style[...]. This does not change the results produced by the program, but it does introduce a subtle trap. [...] In general, programming with assignment forces us to carefully consider the relative orders of the assignments to make sure that each statement is using the correct version of the variables that have been changed. This issue simply does not arise in functional programs. (318-19)

    Functional programming, recursion, and forbearance from inappropriate assignments and reckless imperative loops are not just being suggested as best practices, they are in some sense held up as virtuous -- like prudence, temperance, or perhaps rectitude.
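    To make that contrast concrete, here is a rough sketch of what a let + set! imperative variant might look like, written in the spirit of the 3.1.3 imperative factorial; the name sqrt-imperative is my own, not the book's, and it reuses good-enough? and improve from the snippet above:

    (define (sqrt-imperative x)
        (let ((guess 1.0))
            ; a zero-argument internal loop that updates guess by assignment
            (define (iter)
                (if (good-enough? guess x)
                    guess
                    (begin (set! guess (improve guess x))
                           (iter))))
            (iter)))

    Even in this tiny case, the state now lives in a variable that is mutated in place rather than passed along as an argument -- exactly the shift the quoted passage warns about.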

  • @jeremydouglass There is a very interesting connection worth emphasizing here, because in some sense it gets at the heart of the question of why code matters. Literally everything described by McCarthy in that paper (which, BTW, is here: https://www-formal.stanford.edu/jmc/recursive.pdf) was invented by Newell, Simon, and Shaw, and fully implemented in IPL-V … EXCEPT for the S-expression representation of lists. The invention of what some might call a “mere notational convenience” literally revolutionized computation, turning an essentially unusable and difficult-to-implement language (IPL) into an elegant and trivial-to-implement one. One might argue that eval, the Lisp evaluation function, was also a novel invention of McCarthy’s, but eval was possible in IPL because functions are represented by lists there as well; it was just too difficult. Other practical inventions that made Lisp popular followed, such as automatic garbage collection, but important as these were, the thing that killed IPL and led to its replacement by Lisp was convenience of notation, and nowhere in programming history is there a better example of the power and importance of notation!
