<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom">
	<channel>
      <title>2022 Code Critiques — CCS Working Group</title>
      <link>https://wg.criticalcodestudies.com/index.php?p=/</link>
      <pubDate>Wed, 22 Apr 2026 04:53:10 +0000</pubDate>
          <description>2022 Code Critiques — CCS Working Group</description>
    <language>en</language>
    <atom:link href="https://wg.criticalcodestudies.com/index.php?p=/categories/2022-code-critiques/feed.rss" rel="self" type="application/rss+xml"/>
    <item>
        <title>Decolonizing Climate Code / Decolonizing Climate (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/137/decolonizing-climate-code-decolonizing-climate-2022-code-critique</link>
        <pubDate>Mon, 21 Feb 2022 06:29:34 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>HarlinHayleySteele</dc:creator>
        <guid isPermaLink="false">137@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Any critique of the code that generates the predictive climate data used by the IPCC must be done with care.  </p>

<p>As Mark Marino has explored in chapter four of <a rel="nofollow" href="https://mitpress.mit.edu/books/critical-code-studies" title="Critical Code Studies"><em>Critical Code Studies</em></a> (MIT Press, 2020), there are some rather nasty media assemblages out there eager to take snippets of climate code out of context. This can make a person a bit nervous about any attempt to offer even constructive critique of this code. (This is why I'm posting this so late into the working group... I've spent the last four weeks fretting over this, worried that even the most minor poetic flourish on my part might be taken out of context...)</p>

<p>The "aura of anxiety" surrounding this code is very much the result of over four decades of disinformation campaigns directed against climate science (<a rel="nofollow" href="https://www.nature.com/articles/465686a" title="Oreskes and Conway 2010">Oreskes and Conway 2010</a>) by those who wish to reap temporary benefits from fossil fuel capitalism (<a rel="nofollow" href="https://onlinelibrary.wiley.com/doi/10.1111/j.1467-7660.2009.01610.x" title="Storm 2009">Storm 2009</a>). These "merchants of doubt" are quite ready to use any excuse they can find to claim the climate scientists are "overreacting."</p>

<p>That is why I feel it is important to say, right out of the gate, that from what I learned while exploring this code, the scientists are actually <em>underreacting.</em></p>

<p>If anything, the scientists have been, perhaps, too timid with their presentation of the severity of what these data indicate.</p>

<hr />

<p>Any discussion of climate struggle should also be prefaced with a critique of "crisis epistemology," as has been offered by Indigenous philosopher Kyle Powys Whyte (Potawatomi) in "<a rel="nofollow" href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3891125" title="Against Crisis Epistemology">Against Crisis Epistemology</a>" (2021). Whyte blends Indigenous histories with critical theory to argue against the use of "crisis" to frame the climate struggle.</p>

<p>This is not because the struggle to stabilize the climate isn't dire (It is.), but rather, because the rhetoric of "crisis" has been used for centuries to rob Indigenous groups of land rights. This issue should be of particular concern to environmentalists, since research continues to show that giving Indigenous people their land back may be one of our best tactics to mitigate climate change and other forms of ecological peril (<a rel="nofollow" href="https://www.nature.com/articles/palcomms201785" title="Etchart 2017">Etchart 2017</a>, <a rel="nofollow" href="https://report.territoriesoflife.org/wp-content/uploads/2021/09/ICCA-Territories-of-Life-2021-Report-FULL-150dpi-ENG.pdf" title="ICCA 2021">ICCA 2021</a>).</p>

<p>This critique of "climate crisis epistemology" was brought into sharp relief in settler-occupied Washington State in 2021, when colonial magistrate Jay Inslee invoked the rhetoric of "climate crisis" in <a rel="nofollow" href="https://washingtonstatewire.com/tribal-leaders-legislators-condemn-inslees-surprise-veto-of-tribal-human-rights-provisions-in-climate-commitment-act/" title="a broad move">a broad move</a> that robbed Indigenous groups of decision-making power over their ancestral lands. This move happened against the backdrop of the colonial government's practice of regularly allowing corporations to remove ecosystems on public lands, as exemplified by Washington State DNR's regular practice of allowing corporations to clearcut publicly-owned "legacy forests." These forests contain mature trees over 120 years old with diameters wider than four feet (<a rel="nofollow" href="https://www.seattletimes.com/seattle-news/environment/amid-climate-crisis-a-proposal-to-save-washington-state-forests-for-carbon-storage-not-logging/" title="Seattle Times 2021">Seattle Times 2021</a>, <a rel="nofollow" href="https://www.c4rf.org/" title="C4RF.org">C4RF.org</a>), making them invaluable carbon sinks (<a rel="nofollow" href="https://academic.oup.com/treephys/article/31/9/893/1676008?login=false" title="Whitehead 2011">Whitehead 2011</a>).</p>

<p>With one face, the colonial government invokes "climate crisis" as its reason to remove Indigenous decision-making powers over their ancestral lands, while, with its other face, it gives logging firms free rein to clearcut public forests. As Kyle Whyte has emphasized, the "climate crisis" rhetoric has simply become a new way to justify a "state of exception" (<a rel="nofollow" href="https://press.uchicago.edu/ucp/books/book/chicago/S/bo3534874.html" title="Agamben 2005">Agamben 2005</a>) in which Indigenous sovereignty is suspended, while those who actively harm the climate and ecology go unchecked.</p>

<p>Again, this shouldn't undermine the seriousness of climate struggle—it simply means lending more care to how this struggle is communicated, while working to better center Indigenous voices in climate discourse.</p>

<hr />

<p>Any conversation about climate in these times also must be prefaced with a critique of the concept of the "population bomb," as has been laid out by Emily Klancher Merchant in her book <em><a rel="nofollow" href="https://oxford.universitypressscholarship.com/view/10.1093/oso/9780197558942.001.0001/oso-9780197558942" title="Building the Population Bomb">Building the Population Bomb</a></em> (Oxford University Press, 2021).</p>

<p>The myth of “the population bomb,” or the belief that population in and of itself drives ecological destruction, remains pervasive among climate activists, scientists, and even some thinkers in the humanities. There is little evidence, however, that more people inherently generate more emissions, or that reducing the number of people on the planet would reduce emissions.</p>

<p>As Merchant shows, the concept of the "population bomb" was invented by eugenicists in the middle of the twentieth century, and then was promoted by American businessmen as a means of stalling environmental regulation. Reductive equations that link population and emissions distract us from publicly addressing the activities that directly fuel emissions, while foreclosing upon successful tactics for reducing emissions that simply aren't visible when taking such a hyperopic view.</p>

<hr />

<p><strong>File:</strong> zchunk_L251.en_ssp_nonco2.R<br />
<strong>Programming Language:</strong> R <br />
<strong>Developed:</strong> 2017<br />
<strong>Authors:</strong> <a rel="nofollow" href="https://github.com/JGCRI/gcam-core/commits/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R?author=pralitp" title="Pralit Patel">Pralit Patel</a>, <a rel="nofollow" href="https://github.com/JGCRI/gcam-core/commits/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R?author=pkyle" title="pkyle">pkyle</a>, <a rel="nofollow" href="https://github.com/kvcalvin" title="kvcalvin">kvcalvin</a>, <a rel="nofollow" href="https://github.com/mbins" title="mbins">mbins</a> <br />
<strong>Source File:</strong> <a href="https://github.com/JGCRI/gcam-core/blob/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R" rel="nofollow">https://github.com/JGCRI/gcam-core/blob/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R</a><br />
<strong>Interoperating Files:</strong> GCAM (<a href="https://github.com/JGCRI/gcam-core/releases" rel="nofollow">https://github.com/JGCRI/gcam-core/releases</a>), The SSP database V.2 (<a href="https://tntcat.iiasa.ac.at/SspDb" rel="nofollow">https://tntcat.iiasa.ac.at/SspDb</a>)<br />
<strong>Note:</strong> In the parlance of climate modellers, this is a run of the GCAM5 under SSP1/5, SSP2, and SSP3/4.</p>

<pre><code># Copyright 2019 Battelle Memorial Institute; see the LICENSE file.

#' module_emissions_L251.en_ssp_nonco2
#'
#' Produce regional non-CO2 emissions coefficient data for SSPs 1/5, 2, and 3/4 as well as a GDP control.
#'
#' @param command API command to execute
#' @param ... other optional parameters, depending on command
#' @return Depends on \code{command}: either a vector of required inputs,
#' a vector of output names, or (if \code{command} is &quot;MAKE&quot;) all
#' the generated outputs: \code{L251.ctrl.delete}, \code{L251.ssp15_ef}, \code{L251.ssp2_ef}, \code{L251.ssp34_ef}, \code{L251.ssp15_ef_vin}, \code{L251.ssp2_ef_vin}, \code{L251.ssp34_ef_vin}. The corresponding file in the
#' original data system was \code{L251.en_ssp_nonco2.R} (emissions level2).
#' @details This section takes in the non-CO2 emissions factors for SSP 1/5, 2, and 3/4 across sectors.
#' First, create data that spans the years 2010-2100 in five year increments by interpolation of input data.
#' Next, add emissions controls for future years of vintaged technologies for SSP emission factors.
#' Then, add columns that have regional SO2 emission species.
#' A GDP control of regional non-CO2 emissions in all regions is also created.
#' @importFrom assertthat assert_that
#' @importFrom dplyr filter group_by left_join mutate select semi_join
#' @author CDL May 2017
module_emissions_L251.en_ssp_nonco2 &lt;- function(command, ...) {
  UCD_tech_map_name &lt;- if_else(energy.TRAN_UCD_MODE == 'rev.mode', &quot;energy/mappings/UCD_techs_revised&quot;, &quot;energy/mappings/UCD_techs&quot;)
  if(command == driver.DECLARE_INPUTS) {
    return(c(FILE = &quot;emissions/A_regions&quot;,
             # the following files to be able to map in the input.name to
             # use for the input-driver
             FILE = &quot;energy/calibrated_techs&quot;,
             FILE = &quot;energy/calibrated_techs_bld_det&quot;,
             FILE = UCD_tech_map_name,
             &quot;L161.SSP2_EF&quot;,
             &quot;L161.SSP15_EF&quot;,
             &quot;L161.SSP34_EF&quot;,
             &quot;L201.nonghg_steepness&quot;,
             &quot;L223.GlobalTechEff_elec&quot;))

  } else if(command == driver.DECLARE_OUTPUTS) {
    return(c(&quot;L251.ctrl.delete&quot;,
             &quot;L251.ssp15_ef&quot;,
             &quot;L251.ssp2_ef&quot;,
             &quot;L251.ssp34_ef&quot;,
             &quot;L251.ssp15_ef_elec&quot;,
             &quot;L251.ssp2_ef_elec&quot;,
             &quot;L251.ssp34_ef_elec&quot;,
             &quot;L251.ssp15_ef_vin&quot;,
             &quot;L251.ssp2_ef_vin&quot;,
             &quot;L251.ssp34_ef_vin&quot;))
  } else if(command == driver.MAKE) {

    year &lt;- value &lt;- GCAM_region_ID &lt;- Non.CO2 &lt;- supplysector &lt;- subsector &lt;-
      stub.technology &lt;- agg_sector &lt;- MAC_region &lt;- bio_N2O_coef &lt;- future.emiss.coeff.year &lt;-
      SO2_name &lt;- GAINS_region &lt;- emiss.coeff &lt;- technology &lt;- minicam.energy.input &lt;-
      tranSubsector &lt;- tranTechnology &lt;- input.name &lt;- efficiency &lt;-
      future.emiss.coeff.year &lt;- NULL # silence package check.

    all_data &lt;- list(...)[[1]]

    # Load required inputs
    get_data(all_data, &quot;emissions/A_regions&quot;) -&gt;
      A_regions
    get_data(all_data, &quot;L161.SSP2_EF&quot;) -&gt;
      L161.SSP2_EF
    get_data(all_data, &quot;L161.SSP15_EF&quot;) -&gt;
      L161.SSP15_EF
    get_data(all_data, &quot;L161.SSP34_EF&quot;) -&gt;
      L161.SSP34_EF
    get_data(all_data, &quot;L201.nonghg_steepness&quot;) -&gt; L201.nonghg_steepness
    L223.GlobalTechEff_elec &lt;- get_data(all_data, &quot;L223.GlobalTechEff_elec&quot;)

    # make a complete mapping to be able to look up with sector + subsector + tech the
    # input name to use for an input-driver
    bind_rows(
      get_data(all_data, &quot;energy/calibrated_techs&quot;) %&gt;% select(supplysector, subsector, technology, minicam.energy.input),
      get_data(all_data, &quot;energy/calibrated_techs_bld_det&quot;) %&gt;% select(supplysector, subsector, technology, minicam.energy.input),
      get_data(all_data, UCD_tech_map_name) %&gt;% select(supplysector, subsector = tranSubsector, technology = tranTechnology, minicam.energy.input)
    ) %&gt;%
      rename(stub.technology = technology,
             input.name = minicam.energy.input) %&gt;%
      distinct() -&gt;
      EnTechInputNameMap

</code></pre>

<p>There are 309 more lines. If you want to see the whole thing, go to: <a rel="nofollow" href="https://github.com/JGCRI/gcam-core/blob/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R" title="https://github.com/JGCRI/gcam-core/blob/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R">https://github.com/JGCRI/gcam-core/blob/master/input/gcamdata/R/zchunk_L251.en_ssp_nonco2.R</a></p>
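<p>The roxygen <code>@details</code> in the snippet above describe the pipeline's first step as interpolating the input emissions-factor data onto five-year increments spanning 2010-2100. For readers who don't work in R, here is a rough sketch of what that interpolation step does, written in plain Python with made-up numbers (the actual gcamdata pipeline uses its own tidyverse-based helpers, so treat this as an illustration, not the implementation):</p>

```python
# Simplified sketch of the interpolation step described in the roxygen
# @details: fill in five-year steps from 2010 to 2100 by linear
# interpolation between the sparse years present in the input data.
# The emissions-factor values below are made up for illustration.

def interpolate_to_five_year_steps(known):
    """known: dict mapping year -> emissions factor at sparse years."""
    years = sorted(known)
    out = {}
    for target in range(2010, 2101, 5):
        if target in known:
            out[target] = known[target]
            continue
        # Find the bracketing known years and interpolate linearly.
        lo = max(y for y in years if y < target)
        hi = min(y for y in years if y > target)
        frac = (target - lo) / (hi - lo)
        out[target] = known[lo] + frac * (known[hi] - known[lo])
    return out

sparse = {2010: 1.00, 2030: 0.80, 2050: 0.55, 2100: 0.20}  # hypothetical
dense = interpolate_to_five_year_steps(sparse)
print(round(dense[2015], 2))  # a quarter of the way from 2010 toward 2030
```

<p>The point of dwelling on this step is that it is pure bookkeeping: the scenario's substantive assumptions live in the sparse input numbers, and everything between them is smoothed in.</p>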

<hr />

<p>The thing I find interesting about this code, along with the larger assemblages it is part of, isn't what's there, but what <em>isn't</em> there.</p>

<p>First, I should explain that when you run this code and plot the data, you're going to wind up with a graph that looks somewhat like this:</p>

<p><img src="https://wg.criticalcodestudies.com/uploads/editor/cr/0fv3l981fzq2.jpg" alt="" title="" /></p>

<p>You've probably seen this graph. It has made rounds in the media, and it has been part of the most recent meetings of the IPCC and COP. Indeed, we are looking at the cat's pajamas of climate code! This is the big stuff—the CMIP-approved stuff! This code, and its assemblages, generate the predictive climate data that inform leaders at the highest levels of government, along with the media and the public, as we attempt to collectively make decisions about what happens next.</p>

<p>The process of developing, reviewing, and running the scenario-models that produced this graph took roughly seven years to complete, and the parameters for the models and the data went through many rounds of professional examination and peer review before receiving their CMIP endorsement. CMIP stands for the Coupled Model Intercomparison Project, and the CMIP may be thought of as an ensemble of over 100 endorsed models from over 50 modeling institutions that work somewhat like an orchestra to model different aspects of the climate (<a rel="nofollow" href="https://climate.copernicus.eu/latest-projections-future-climate-now-available" title="ECMWF 2021">ECMWF 2021</a>, <a rel="nofollow" href="https://www.wcrp-climate.org/wgcm-cmip/wgcm-cmip6" title="WCRP">WCRP</a>).</p>

<p>Each major run or "phase" of the CMIP corresponds with a new IPCC climate assessment cycle. Each five-year cycle centers on the development and presentation of an Assessment Report (AR) that draws upon the predictive climate data generated by the CMIP. We are currently at the end of the 6th IPCC assessment cycle, with the CMIP6 having been run in roughly 2017-18 and the <a rel="nofollow" href="https://www.ipcc.ch/report/ar6/wg1/" title="AR6">AR6</a> having been offered to policymakers and the public in 2021. Presently, models and parameters are being prepared, adjusted (to fit the latest research), and reviewed for the 7th phase of the CMIP, which, once completed, will inform the AR7, whose cycle will run from 2023 to 2028.</p>

<hr />

<p>Here at the CCSWG, we are readers--readers of code, and sometimes other things.</p>

<p>Looking at the graph above, the reader might find themselves in a bit of a quandary, asking, "How should I interpret this graph? How should I read it?"</p>

<p>The way many of us have been taught to interpret the above graph (and even some scientists read it this way) is to point to the top two lines and say, “Those are the worst-case scenarios.” And then we are supposed to gesture towards the colorful lines at the bottom and say, “That’s the good outcome.” We then often look towards the middle-lower lines and assume that we can sort of average everything together and say, "Those lines down there, the middle ones, that must be the path we are on."</p>

<p>The trouble is... that is not how to read this graph.</p>

<p>To get a sense of how this graph should actually be read, you need to get a bit more familiar with the code and data that produced it.</p>

<hr />

<p>If you look up at the above code snippet, in Lines 30-32, we can see a spot in which the SSP scenario data is being plugged into the GCAM planetary model:</p>

<pre><code>30.              &quot;L161.SSP2_EF&quot;,
31.              &quot;L161.SSP15_EF&quot;,
32.              &quot;L161.SSP34_EF&quot;,
</code></pre>

<p>The GCAM, or Global Change Analysis Model, is an open source model of the planet, and it may be downloaded <a rel="nofollow" href="https://github.com/JGCRI/gcam-core" title="here on GitHub">here on GitHub</a>. The GCAM has been developed in a collaborative effort among multiple labs, and an article documenting the design of the latest version by Katherine Calvin and her co-authors may be found <a rel="nofollow" href="https://gmd.copernicus.org/articles/12/677/2019/" title="in issue 12">in issue 12</a> of the journal <em>Geoscientific Model Development</em>. (Note: the above code snippet was co-authored by one of the co-authors of this paper, so we can assume this code snippet was at least partially composed by someone involved with the creation of the model itself!)</p>

<p>The GCAM is one of six models of Earth that the SSP data is designed to run on (<a rel="nofollow" href="https://www.sciencedirect.com/science/article/pii/S0959378016300681" title="Riahi et al. 2017">Riahi et al. 2017</a>). The other five planetary models (in case you are curious) are the <a rel="nofollow" href="http://pure.iiasa.ac.at/id/eprint/14835/" title="AIM/CGE">AIM/CGE</a>, the <a rel="nofollow" href="https://models.pbl.nl/image/index.php/Welcome_to_IMAGE_3.2_Documentation" title="image">IMAGE</a>, the <a rel="nofollow" href="https://www.iamconsortium.org/resources/model-resources/message-globiom/" title="MESSAGE-GLOBIOM">MESSAGE-GLOBIOM</a>, the <a rel="nofollow" href="https://www.iamcdocumentation.eu/index.php/Model_Documentation_-_REMIND-MAgPIE" title="REMIND-MAgPIE">REMIND-MAgPIE</a>, and the <a rel="nofollow" href="https://www.witchmodel.org/" title="WITCH">WITCH</a>. (The WITCH pairs with a rather fun online tool, <a rel="nofollow" href="http://www.magicc.org/" title="MAGICC">MAGICC</a>, which includes an online modeling interface you can use to run the SSP data or input a social scenario of your own—this is a very good tool for students who might be new to coding, and you can even use it to run the SSPs without needing any knowledge of coding 🧙‍♀️).</p>

<hr />

<p>The SSPs are a set of ready-made data sets that represent different scenarios that are designed to be run through these six CMIP-approved models of Earth. There are five scenarios, the SSP 1-5, and the things that happen in them depend upon what the humans in the models do. Each "scenario" is treated as a possible path we might take in the time we have left before catastrophic levels of increased global temperature are locked in.</p>

<p>You can download the SSP scenario data straight from its source, the <a rel="nofollow" href="https://tntcat.iiasa.ac.at/SspDb/dsd" title="IIASA SSP Database">IIASA SSP Database</a> (you have to create an account to access the data, which takes a couple minutes). If you're in a hurry, you can glance over the SSP1-5 data <a rel="nofollow" href="https://github.com/JGCRI/ssp-data/tree/master/data" title="JGCRI/ssp-data on GitHub">on GitHub</a>. (Each of the five files contains all of the data for one of the SSP scenarios.) Scenario 1 (or SSP1) is the most optimistic, while the fifth scenario, the SSP5, is the least.</p>

<hr />

<p>The emergence of the SSP scenarios is briefly explored by <a rel="nofollow" href="https://gmd.copernicus.org/articles/9/3461/2016/" title="Brian C. O’Neill and his co-authors">Brian O’Neill and his co-authors</a> in issue 9 of <em>Geoscientific Model Development.</em> In this article, we learn that it was not until the most recent phase of the CMIP, the CMIP6, that the scenarios (which were previously limited to the RCPs) were removed from the core experiment and became their own MIP, the ScenarioMIP (and the scenarios themselves were renamed the SSPs). Parsing the scenarios from the core experiment in this way has opened up space for more time and attention to be spent refining the scenarios.</p>

<p>The scenarios, it should be emphasized, are the only part of the CMIP's predictive climate data that factors in the role of social activities upon future emissions.</p>

<p>Since the last CMIP, the modeling community that creates the Scenarios has been working to bring more voices into the process of developing the Scenarios. In 2019, these communities hosted the first <a rel="nofollow" href="http://pardeecenter.squarespace.com/" title="Scenarios Forum">Scenarios Forum</a>, an event bringing together "a diverse set of communities" to engage with the Scenario models to "exchange experiences, ideas...and identify knowledge gaps for future research." These scenario modelers have continued to acknowledge a need for feedback, including a "particular need for social sciences to inform scenarios on societal dynamics and tipping points" (<a rel="nofollow" href="https://www.nature.com/articles/s41558-020-00952-0" title="O’Neill et al. 2020">O’Neill et al. 2020</a>).</p>

<p>My hope, I suppose, with this critique, is to show how voices from the humanities may also be valuable to the efforts to develop, refine, and share these scenario-models. Likewise, I do think that the humanities would benefit from lending their artful tools to the scenario-models of CMIP and other efforts to model the planet--as Katherine Buse has emphasized <a rel="nofollow" href="https://history.uchicago.edu/directory/katherine-buse" title="in her compelling work">in her compelling work</a> that treats climate models as a medium in their own right.</p>

<hr />

<p>Returning to my reading of the above graph, alongside its attendant code and data assemblages, I perceive this graph to be telling me a type of story, or multiple stories, about the future; the actual future, or what Genette has called "nonfiction diegeses" (<a rel="nofollow" href="https://www.cornellpress.cornell.edu/book/9780801410994/narrative-discourse/" title="1980">1980</a>). A diegesis might be thought of as a storyworld, and we usually think of these in fiction settings, such as the diegesis of the MCU or of the Star Trek Universe. There are nonfiction diegeses as well, and sometimes it can be bumpy to bring everyone into the same nonfiction diegesis, but this is required for any form of collective action.</p>

<p>The active process of collectively working your way into a shared narrative about reality that facilitates action has been called "frame alignment" by cultural theorist Robert Carley (<a rel="nofollow" href="https://sunypress.edu/Books/C/Culture-and-Tactics" title="Culture and Tactics, SUNY Press, 2018">Culture and Tactics, SUNY Press, 2018</a>; cf. <a rel="nofollow" href="http://cup.columbia.edu/book/prison-notebooks/9780231060820" title="Gramsci 2011">Gramsci 2011</a>).</p>

<p>So, each line on the graph represents a <em>different</em> story about the future, a story that is also represented by the SSP scenario data set that was used to generate that line on the graph. You cannot average these lines together any more than you could average together the works of Shakespeare. Each of these sets of scenario data is its own independent story, its own nonfiction narrative about the fate of the world.</p>

<p>Perhaps we are now beginning to suspect that we should be suspicious of those who tell us to read the graph as if the lower-middle lines represent the path we are on...</p>

<hr />

<p>What exactly are these scenarios though? What are the stories that each set of SSP data purports to tell?</p>

<p>There are five "official" SSP Narratives that are used to explain why the numbers are set the way they are in each of the SSP data sets (<a rel="nofollow" href="https://www.sciencedirect.com/science/article/pii/S0959378016300681" title="Riahi et al. 2017">Riahi et al. 2017</a>).</p>

<p>Here are the first two narratives, which are used to explain the two most "optimistic" emissions scenarios, the SSP1 and the SSP2:</p>

<blockquote><div>
  <p>SSP1  Sustainability – Taking the Green Road (Low challenges to mitigation and adaptation)<br />
  The world shifts gradually, but pervasively, toward a more sustainable path, emphasizing more inclusive development that respects perceived environmental boundaries. Management of the global commons slowly improves, educational and health investments accelerate the demographic transition, and the emphasis on economic growth shifts toward a broader emphasis on human well-being. Driven by an increasing commitment to achieving development goals, inequality is reduced both across and within countries. Consumption is oriented toward low material growth and lower resource and energy intensity.</p>
  
  <p>SSP2  Middle of the Road (Medium challenges to mitigation and adaptation)<br />
  The world follows a path in which social, economic, and technological trends do not shift markedly from historical patterns. Development and income growth proceeds unevenly, with some countries making relatively good progress while others fall short of expectations. Global and national institutions work toward but make slow progress in achieving sustainable development goals. Environmental systems experience degradation, although there are some improvements and overall the intensity of resource and energy use declines. Global population growth is moderate and levels off in the second half of the century. Income inequality persists or improves only slowly and challenges to reducing vulnerability to societal and environmental changes remain.</p>
</div></blockquote>

<p>The other three narratives can be found in <a rel="nofollow" href="https://www.sciencedirect.com/science/article/pii/S0959378016300681" title="this paper by Keywan Riahi and his co-authors">this paper by Keywan Riahi and his co-authors</a> (2017).</p>

<p>It is worth noting that the present SSP data also include a rather problematic relationship between "population," "GDP," "investment," and "policy." I will save that critique for another day, as it falls more into the realm of social science.</p>

<p>Needless to say, though, as a reader, I am having quite a bit of trouble suspending my disbelief for the "optimistic" scenarios--both because I do not find the narratives compelling enough to convince me that they will actually happen, and because, as I pore over the data, I can find nothing to suggest that these scenarios include the evidence-based kinds of social-organizational change that would be needed for us to steer away from the course we are on.</p>

<p>This isn't to say that I don't have a level of optimism. I just want to be convinced.</p>

<p>I am a very picky reader, and it is going to take an extra amount of work to convince me to suspend my disbelief.</p>

<hr />

<p>Currently, according to <a rel="nofollow" href="https://www.scientificamerican.com/article/clean-energy-lags-put-world-on-pace-for-6-degrees-celsius-of-global-warming/" title="IEA estimate">IEA estimates</a>, we are on a path to create a 6°C rise in global temperature by 2100. Likewise, the tiny "dip" in emissions that occurred at the start of the pandemic quickly vanished as companies turned towards cheaper forms of fossil fuel, including coal, to deal with economic woes (<a rel="nofollow" href="https://www.nature.com/articles/d41586-020-01125-x" title="Tollefson 2020">Tollefson 2020</a>, <a rel="nofollow" href="https://www.iea.org/news/coal-power-s-sharp-rebound-is-taking-it-to-a-new-record-in-2021-threatening-net-zero-goals" title="IEA 2021">IEA 2021</a>).</p>

<p>That is the path we are currently on:</p>

<p>A 6°C rise locked in less than 80 years.</p>

<p>It is a path so bad, the scientists didn't think to include it in the SSP scenarios.</p>

<p>The Y-axis on the above graph doesn't even go that high.</p>

<hr />

<p>This continued acceleration in emissions comes despite survey data showing that 64% of people in 50 countries representing half the world's population believe climate change is a global emergency (<a rel="nofollow" href="https://www.undp.org/press-releases/worlds-largest-survey-public-opinion-climate-change-majority-people-call-wide" title="UNDP-Oxford 2021">UNDP-Oxford 2021</a>). It also comes despite research showing that 62% of Americans say that climate change has affected them personally (<a rel="nofollow" href="https://www.pewresearch.org/science/2019/11/25/u-s-public-views-on-climate-and-energy/" title="Pew 2019">Pew 2019</a>).</p>

<p>This is not a matter of political will: the majority of people already want to change the emissions course we are on. There is something else at play...</p>

<hr />

<p>This isn't to say that there is no data in the SSPs to back up the emissions drops.</p>

<p>There is, in fact, plenty of data in the scenarios about emissions going down--but it all centers specific, practical actions: more solar, less coal, etc.</p>

<p>There is nothing, however, in the data that explains the types of social intervention that would need to happen to allow these changes to occur.</p>

<p>I want there to be a compelling reason for any emissions dip in the stories these scenarios tell; otherwise, as an audience member, I am going to roll my eyes and yawn. I am going to feel like I’m stuck watching a B-rated movie in which the actions of the protagonists simply aren't even feasible, my engagement waning, as the theatre around me starts to burn.</p>

<hr />

<p>For any drop in emissions in these scenarios/narratives, I want that dip to be linked to an evidence-based change in the institutions and power structures that mediate our lives.</p>

<p>By "evidence-based," I mean I want evidence showing that these changes to the social structure would reduce emissions, or at least have a good chance of doing so.</p>

<hr />

<p>Here's a bit of evidence that gives me hope:</p>

<ul>
<li>One study looking at 72 countries over a 30-year period found that a one unit increase in a country’s score on the women’s political empowerment index was associated with an 11.51% decrease in the country’s carbon emissions (<a rel="nofollow" href="https://onlinelibrary.wiley.com/doi/10.1002/sd.1926" title="Lv and Deng 2019">Lv and Deng 2019</a>).</li>
<li>Reducing racism has been associated with reductions in emissions (<a rel="nofollow" href="https://genderpolicyreport.umn.edu/diversifying-leadership-climate-and-energy/" title="Stephens 2019">Stephens 2019</a>, <a rel="nofollow" href="https://www.washingtonpost.com/outlook/2020/06/03/im-black-climate-scientist-racism-derails-our-efforts-save-planet/" title="Johnson 2020">Johnson 2020</a>).</li>
<li>Restoring Indigenous peoples’ decision-making power over their ancestral lands is an emissions-reducing tactic—and this is backed up by geospatial satellite data (<a rel="nofollow" href="https://report.territoriesoflife.org/" title="ICCA 2021">ICCA 2021</a>).</li>
<li>Transitioning from investor-owned to cooperatively-owned business structures has likewise been explored as a promising tactic to reduce emissions (<a rel="nofollow" href="https://www.ica.coop/en/newsroom/news/cooperation-transition-green-economy-new-report-highlights-how-cooperative-model" title="ICA-EU 2021">ICA-EU 2021</a>).</li>
</ul>

<p>Why not put together some Scenarios that link emission drops to factors like these?</p>

<p>Beyond lending plausibility, adding these evidence-backed factors to the equation would give me, as a reader of the code, some protagonists to root for.</p>

<hr />

<p><strong>Correction</strong>: An earlier version of this piece contained the phrase "previously called the RCPs" rather than "previously limited to the RCPs."</p>
]]>
        </description>
    </item>
    <item>
        <title>128 Language Uroboros Quine (Code Critique 2022)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/131/128-language-uroboros-quine-code-critique-2022</link>
        <pubDate>Mon, 07 Feb 2022 13:20:49 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>code_anth</dc:creator>
        <guid isPermaLink="false">131@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><strong>Title:</strong> 128 Language Uroboros Quine<br />
<strong>Author/s:</strong> Yusuke Endoh (mame)<br />
<strong>Language/s:</strong> Ruby + 127 other languages<br />
<strong>Year/s of development:</strong> 2013-ongoing<br />
<strong>Software/hardware requirements:</strong> Check <a rel="nofollow" href="https://github.com/mame/quine-relay#ubuntu" title="https://github.com/mame/quine-relay#ubuntu">here</a> for the dependencies that must be installed prior to running it.</p>

<p>"A quine is a computer program which takes no input and produces a copy of its own source code as its only output" (Wikipedia). This particular quine is a "Ruby program that generates Rust program that generates Scala program that generates ...(through 128 languages in total)... REXX program that generates the original Ruby code again."</p>
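Before confronting 128 languages at once, it may help to see the mechanism at its smallest. The following is a common one-line Ruby quine (an illustrative textbook example, not Endoh's code): the `%p` format directive substitutes the string's own quoted representation, so the program prints itself.

```ruby
# A common one-line Ruby quine (illustrative; not Endoh's code).
# printf replaces %p with s.inspect, i.e. the string in its own
# quoted form, so the output reproduces the line, character for
# character. The final line, taken on its own, is the quine:
s="s=%p;printf s,s";printf s,s
```

Endoh's relay applies this self-reproduction trick not to one language but through a cycle of 128, each program emitting the next.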

<p>Yusuke Endoh is a member of the Ruby language's core team. Ruby is a language known for its beginner-friendly syntax and welcoming community. It is, supposedly, a language "optimized for programmer happiness," not for compiler happiness. In other words, Ruby leverages techniques such as syntactic sugar to create its approachable syntax.</p>

<p>Nothing in this program, however, seems approachable to me. How can I even begin analysing it?</p>

<pre><code>eval$s=%q(eval(%w(B=92.chr;g=32.chr;puts(eval(%q(N=10.chr;n=0;e=-&gt;s{Q[Q[s,B],?&quot;].K(N,B+?n)};E=-&gt;s{'(&quot;'+e[s]+'&quot;)'};d=-&gt;s,t=?&quot;{s.K(t){t+t}};def~f(s,n)s.K(/.{1,#{n*255}}/m){yield$S=E[$s=$&amp;]}end;Q=-&gt;s,t=?${s.K(t){B+$&amp;}};R=&quot;;return~0;&quot;;V=-&gt;s,a,z{s.K(/(
#{B*4})+/){a+&quot;#{$&amp;.size/2}&quot;+z}};C=%w(System.Console~Write);$C=C*?.;$D=&quot;program~QR&quot;;$G=&quot;~contents~of&quot;+$F=&quot;~the~mix!g~bowl&quot;;$L=&quot;public~static&quot;;rp=-&gt;s,r{v=&quot;&quot;;[r.!ject(s){|s,j|o={};m=n=0;s.size.times{|i|o[f=s[i,2]]||=0;c=o[f]+=1;m&lt;c&amp;&amp;(m=c;n=f)};v=n+v;
s.K(n,(j%256).chr)},v]};%(fn~mX{Z`x21(&quot;{}&quot;,#{E[&quot;object~QR~extends~App{#{f(%((display~&quot;#{e[%(Zf(&quot;1d;s/.//;s/1/~the~sum~of~a~son~and0/g;s/0/~twice/g;s/2/`x59ou~are~as~bad~as/g;s/3/~a~son`x21Speak~your~m!d`x21/g^n#The~Relay~of~Qu!e.^n#Ajax,~a~man.^n#
Ford,~a~man.^n#Act~i:~Qu!e.^n#Scene~i:~Relay.^n#[Enter~Ajax~and~Ford]^n#Ajax:^n#&quot;);function[]=f(s);for~i=1:2:length(s),Zf(&quot;2%s3&quot;,part(dec2b!(hex2dec(part(s,i:i+1))),$:-1:2)),end;endfunction`n#{s,v=rp[&quot;Transcript~show:~'#{d[&quot;Z&quot;+E[%(fun~p~n=Z(Int.to
SJ~n`x5e&quot;~&quot;);fun~mX=(p~0;p~0;p~130;List.tabulate(127,p);SJ.map(fn~c=&gt;(p(3+ord~c);Z&quot;-1~0~&quot;;c))#{E[~~~~~~%(object&quot;Application&quot;{state&quot;ma!&quot;{foreach(s~![#{f(%(puts~&quot;#{Q[e[%(echo~'a::=`x7e#{Q[Q[&quot;let~s=#{E[%(void~p(!t[]c){foreach(!t~v~!~c)stdout.Zf(&quot;%c%c
&quot;,v/256,v%256);}void~mX{!t[]a;p({19796,26724,0,6,0,1,480,19796,29291,#{s=%(module~QR;!itial~beg!~#{f(&quot;let~s=#{E[%(Module~QR:Sub~MX:Dim~s,n,i,c~As~Object:n=Chr(10):For~Each~c~!&quot;#{d[&quot;&lt;?xml#{O=&quot;~version='1.0'&quot;}?&gt;&lt;?xml-#{I=&quot;stylesheet&quot;}~type='text/xsl
'href='QR.xslt'?&gt;&lt;xsl:#{I+O}~xmlns:xsl='http://www.w3.org/1999/`x58SL/Transform'&gt;&lt;xsl:output~method='text'/&gt;&lt;#{U=&quot;xsl:template&quot;}~match='/'&gt;&lt;`x21[CDATA[#{%(sub~f(s$,n)Z(s$);:for~i=1to~n~Z(&quot;Y&quot;);:next:end~sub:f(&quot;#{V[e[%(H,format=&quot;#{y=&quot;&quot;;f(&quot;^H{-}{txt}
{#{Q[&quot;echo~-E~$'#{Q[Q[E[%(with~Ada.Text_Io;procedure~qr~is~beg!~Ada.Text_Io.Put(&quot;#{d[%(trans~B(Buffer)`ntrans~O(n){`nB:add(Byte(+~128~n))}`ntrans~f(v~n){`nO(+(/~n~64)107)`nO(n:mod~64)`nO~v}`ntrans~D(n){if(&lt;~n~4){f(+(*~6~n)9)48}{if(n:odd-p){D(-~n~3
)`nf~27~48`nf~36~11}{D(/~n~2)`nf~21~48`nf~48~20}}}`ntrans~S(Buffer&quot;#{e[%W[STRINGz:=~226+~153,a:=z+~166,b:=a+&quot;2&quot;+z+~160,c:=b+&quot;8&quot;+z+~165,t:=&quot;#{d[%(class~QR{#$L~void~ma!(SJ[]a){a=#{E[&quot;H('#{Q[e[&quot;implement~ma!0()=Z&quot;+E[&quot;BEGIN{Z#{E[%(echo~'#{%(f(s){Syste
m.out.Z(s);}s=&quot;389**6+44*6+00p45*,&quot;;for(c:#{E[(s=&quot;#!clude&lt;iost ream&gt;`n!t  ~mX{std::cout&lt;&lt;#{E[%(class~Program{#$L~void~MX{#$C(&quot;Qu!e~Relay~Coffee.^n^nIngredients.^n&quot;);for(!t~i=9;i++&lt;126;)#$C($&quot;{i}~g~caffe!e~{i}^n&quot;);#$C(&quot;^nMethod.^n&quot;);foreach(char~c~
!#{E[%((doseq[s(lazy-cat[&quot;IDENTIFICATION~DIVISION.&quot; &quot;PROGR     AM-I      D.~QR.&quot;&quot;    PROCEDURE~DIVISION.&quot;'DISPLA`x59](map~#(str&quot;~~~~^&quot;&quot;(.replace~%1&quot;^&quot;&quot;&quot;^&quot;^&quot;&quot;)&quot;^&quot;&quot;)(re-seq~#&quot;.{1,45}&quot;&quot;#{e[&quot;(f=(n)-&gt;Array(n+1).jo!~'Y');console.log('%s',#{V[E[%((H-l!e&quot;
#{e[&quot;import~std.stdio;void~mX{H(`x60#{%(method~MX   {Z(       @                     &quot;#{d[&quot;['']p[#{&quot;IO.puts&quot;+E[%((pr!c~&quot;#{e[&quot;`nma!(_)-&gt;`nio:fH#{d[E['Zfn(&quot;&quot;&quot;'+d[?&quot;+&quot;%opti                on~noyywrap`n%%`n%%`n!t~mX{puts#{E[&quot;echo~'#{Q[Q[%(~:~A~.&quot;#{g*9}
&quot;~;~:~B~A~.&quot;~WRITE(*,*)'&quot;~A~;~:~C~B~T`x59PE~.&quot;~    '&quot;                                          ~CR~;~:~D~S&quot;~#$D&quot;~C~S^&quot;~Z~^&quot;(&amp;&quot;~C~S^&quot;~#{e[%(Z&quot;#{e[&quot;s:=OutputTex                                  tUser();WriteAll(s,#{E[%(Zf&quot;#{e[d[f(&quot;.template~1`n#{d['
set~Z&quot;-&quot;;Z'+E[%(package~ma!;import&quot;fmt&quot;;func~                                                mX{fmt.Pr!t#{E[%(236:j;{119:i;{206i-:i;.48&lt;{71+}{[i]^48-*}if                                            }%}:t;&quot;algoritmo~QR;!&quot;[195][173]++'cio~imprima(&quot;'&quot;
013141&quot;t&quot;/12131&quot;t~6*&quot;/1:1918151:??6271413/4=                                                        3626612/2/353251215/`x5a0`x5a0R&quot;t&quot;#{e[%(z=new~jav                                                    a.util.zip.G`x5aIPOutputStream(System.out);z.H
('#{&quot;ma!=putStr&quot;+E[&quot;class~QR{#$L~function~m                                                             X{neko.Lib.Z#{E[%(procedure~mX;i:=c:=0;s:=                                                          #{E[%(.class~public~QR`n.super~#{$T=&quot;java/i
o/Pr!tStream&quot;}`n.method~#$L~ma!([L#{S=&quot;ja                                                             va/lang/S&quot;}J;)V~;]`n.limit~stack~2`ngetst                                                                atic~#{S}ystem/out~L#$T;`nldc~&quot;#{e[%(cla
ss~QR{#$L~void~ma!(SJ[]v){SJ~c[]=new~SJ     [999                                                           99],y=&quot;&quot;,z=y,s=&quot;#{z=t=(0..r=q=126)                          .map{|n|[n,[]]                             };a=&quot;&quot;;b=-&gt;n{a&lt;&lt;(n%78+55)%84+42};(%(P
={0:'[+[]]',m:'((+[])'+(C=&quot;['construct     or']&quot;                                                               )+&quot;+[])['11']&quot;};for(R~!~B=('                     `x21[]@`x21`x21[]@[][[]]@'+(                       A=&quot;[]['fill']&quot;)+&quot;@([]+[])['fontcolor
']([])@(+('11e20')+[])['split']([])@&quot;      +A+C                                                              +&quot;('return~escape')()(&quot;+A+'                   )').split('@'))for(E~!~D=eval(G='('+B[                    R]+'+[])'))P[T=D[E]]=P[T]||G+&quot;['&quot;+
E+&quot;']&quot;;for(G='[',B=0;++B&lt;36;)P[D=B.to                      SJ(36)]=B&lt;10?(G+='+`x2                                1+[]')+']':P[D]||&quot;(+('                 &quot;+B+&quot;'))['to'+([]+[])&quot;+C+&quot;['name']]('36')&quot;;A                   +=C+&quot;('console.log(unescape(^&quot;&quot;;
for(E~!~G=#{E[%(Z(&quot;&quot;&quot;#{Q[e[%(s=();a()  {~s+         =($(echo~-n~$1|od~-An~-tu1~-v)~$2)                              ;};a~&quot;Section`x48                eader+name:=QR;SectionPublic-ma!&lt;-(&quot;~10;t='#{&quot;cons                 ole.log&quot;+Q[E[%(@s=global[#{i=(s
=%(`x48AI~1.2`nVISIBLE~&quot;#{&quot;x=sJ.K(#{V[E  [&quot;changequote(&lt;@,@&gt;)`ndef!e(p,&lt;@#{&quot;all:`n`t@echo~                            '#{d[&quot;l!el:99                999;Z#{E[&quot;solve~satisfy;output~[#{E[%(.assembly~t{}.me                 thod~#$L~void~MX{.entrypo!t~l
dstr&quot;#{e[&quot;m{{`x21:~x`nqr:~|-`n~:db`x     60#{e[s=&quot;$Z#{E[&quot;Zf#{E[&quot;echo&quot;+E[&quot;#import&lt;stdio.h&gt;#{N                         }!t~mX{puts#                {E[&quot;Z_sJ&quot;+E[&quot;s=double#{E[&quot;Z#{E[&quot;$console:l!e[#{&quot;#$D(output                );beg!~H(#{f((p=&quot;eval&quot;;%($_=
&quot;#{s,v=rp[&quot;$_='#{Q[%(&lt;?php~$z=3+$      w=strlen($s=#{Q[E[&quot;!t~mX{H#{E[&quot;(#{?_*11})dup~=/s(|~~~~~.                         ~~~|)def               (#{Q[&quot;qr:-H('#{Q[e[&quot;!it{#{f(%(Z('cat(&quot;')`nfor~c~!&quot;&quot;.jo!([&quot;ech                o~'say~''%s'''^n&quot;%l~for~l~!
#{E[d[d[&quot;eval$s=%q(#$s)&quot;,?'],?']     ]}.split(&quot;^n&quot;)]):Z('r=fput(char(%d))'%ord(c))`nZ('end^n&quot;)')#                         ),6)               {&quot;Zf#{d[$S,?%]};&quot;}}}&quot;],?']}').&quot;,B]}){9~7{exch~dup~1~and~79~mul~32               ~add~exch~2~idiv~3~1~roll~
s~exch~2~!dex~exch~put~1~sub~d      up~6~eq{1~sub}if}repeat~s~=~pop~pop}forall~=~quit&quot;]+R}}&quot;]]})*3;                        ec   ho&quot;^x89PNG^r^n^x1a^n&quot;;$m=&quot;&quot;;$t=&quot;^xc0^0^xff&quot;;for($i=-1;++$i&lt;128*$z;$m.=$c--?($w-              $c||$i&gt;$z)&amp;&amp;$i/$z&lt;($c&lt;$w?o
rd($s[(!t)($c/3)]):$c--%3+2)      ?$t[2].$t[$c%3%2].$t[$c%3]:&quot;^0^0^0&quot;:&quot;^0&quot;)$c=$i%$z;foreach(array(&quot;I`                     x48DR&quot;                 .pack(&quot;NNCV&quot;,$w+2,128,8,2),&quot;IDAT&quot;.gzcompress($m),&quot;IEND&quot;)as$d)ec              ho~pack(&quot;NA*N&quot;,strlen($d)
-4,$d,crc32($d));).K(B,&quot;`x7      f&quot;),?']}';s:g/^x7f/Y/;Z~$_&quot;,128..287];s=&quot;$_='#{Q[s,c=/['Y]/]}';$n=32;$                                                      s='#{Q[v,c]}';$s=`x7es{..}{$a=$&amp;;$b=chr(--$n&amp;255);`               x7es/$b/$a/g;}eg;Z&quot;;(s+N
*(-s.size%6)).unpack(&quot;B*&quot;)      [0].K(/.{6}/){n=$&amp;.to_i~2;((n+14)/26*6+n+47).chr}}&quot;;s|.|$n=ord$&amp;;substr~                                                           unpack(B8,chr$n-!t($n/32)*6-41),2|eg;eval~pack              'B*',$_).scan(/[~,-:A-z]
+|(.)/){p=&quot;s++#{$1?&quot;chr~#      {$1.ord}+e&quot;:$&amp;+?+};&quot;+p};p),1){&quot;'#$s',&quot;}}'')end.&quot;.K(/[:;()]/){?`x5e+$&amp;}}]&quot;]}                                                              ;quit&quot;]};t=num2cell(b=11-ceil(s/13));for~              n=1:9m={};for~i=1:141f=@
(x,y,n)repmat(['Ook'~char      (x)~'~Ook'~char(y)~'~'],[1~abs(n)]);m(i)=[f(z=46,63,n)~f(q=z-(i&lt;13)*13,q,i-1                                                                 3)~f(33,z,1)~f(63,z,n)];end;t(x=b==n)              =m(diff([0~s(x)])+13);en
d;Zf('%%s',t{:})&quot;]]+R}}&quot;      ]]}`n&quot;]};&quot;]}`x60`n~global~_start`n~_start:mov~edx,#{s.size}`n~mov~ecx,m`n~mov~e                                                                   bx,1`n~mov~eax,4`n~!t~128`n~mov~eb              x,0`n~mov~eax,1`n~!t~12
8`nx:~|`n~}}{{{qr}}}&quot;]}&quot;      call~void~[mscorlib]#{C*&quot;::&quot;}(sJ)ret})]}];&quot;]};quit();&quot;,?$].K(?'){&quot;'^''&quot;}}'&quot;}@&gt;)`                                                                     np&quot;],?&amp;,?&amp;]},'&amp;(%d+)&amp;',function              (s)return~sJ.rep('Y',to
number(s))end);Z(x)&quot;.K(       /[:&quot;]/,&quot;:^0&quot;)}&quot;`n`x4bT`x48`x58B`x59E~B`x59E)).size+1}x~i8]c&quot;#{s.K(/[^&quot;`n`t]/){&quot;^%0                                                                      2`x58&quot;%$&amp;.ord}}^00&quot;declare~               i32@puts(i8*)def!e~i32@
mX{%1=call~i32@puts(i8*       getelementptr([#{i}x~i8],[#{i}x~i8]*@s,i32~0,i32~0))ret~i32~0})],?#].K(?',%('&quot;'&quot;'))                                                                        }';for((i=0;i&lt;${#t};i+=9              9));do;x=${t:$i:99};a~&quot;^
&quot;${x//[Y^`&quot;]/Y^0}^&quot;.Z;&quot;       ~10;done;a~&quot;);&quot;;p(){~echo~-n~$1;};f(){~for~x~!~${s[*]};do;p~$3;for((j=$2;j--;));do  ;                                                                         h~$1~$x~$j;done;done;              };p~k^`x60;h(){~p~^`x60$
{1:$(($2&gt;&gt;$3&amp;1)):2};};f       ~kki~7~'`x60`x60s`x60`x60s`x60`x60s`x60`x60s`x60`x60s`x60`x60s`x60`x60s`x60`x60s     i'                    ;s=();a~'A          G-`x                             48-`x48Fy.IlD==;=j               dlAy=;=jldltldltl{lAulAy
=jtlldlAyFy=?=jdlAyGFyF       yG2AFy&gt;zlAFFBCjldGyGFy&gt;GFy.AGy=G==n`x48==nlldC=j@=jtlldltldlAut11';h(){~p~${1:$        (                   ((($2%83-10)&gt;     &gt;((2-$3)*2)                          )%4)):1};};f~sk               i^`x60~3)]]}&quot;&quot;&quot;))]})A+=&quot;'
+`x21[]+'&quot;+G.charCodeAt(       E).toSJ(16);for(A+=&quot;^&quot;.replace(/'+`x21[]+'/g,^&quot;%^&quot;)))')()&quot;,R=0;R&lt;9;R++)A=A.re          p                    lace(/'.*?'/g,  function(B){T=          [];fo          r(E=1;B[E+1]                ;)T.push(P[B[E++]]);retur
n~T.jo!('+')});console.l        og('&quot;'+A+'&quot;'))).bytes{|n|r,z=z[n]||(b[r/78];b[r];q&lt;6083&amp;&amp;z[n]=[q+=1,[]];t[             n]                    )};b[r/78];b[r]}&quot;;!t~i=0,n=0,q=     0;for(;++n&lt;1       26;)c[n]=                &quot;&quot;+(char)n;for(;i&lt;#{a.size
};){q=q*78+(s.charAt(i)-1        3)%84;if(i++%2&gt;0){y=q&lt;n?c[q]:y;c[n++]=z+y.charAt(0);System.out.Z(z=c[q])             ;q=0                    ;}}}})]}&quot;`n!vokevirtual~#$T/Zln(   L#{S}J;)V`nretur     n`n.en                d~method)+N]};H(&quot;DO,1&lt;-#&quot;||
*s);s?while~t:=ord(move(1)        )do{i+:=1;u:=-i;every~0to~7do{u:=u*2+t%2;t/:=2};H(&quot;PLEASE&quot;)^(i%4/3);H             (&quot;DO,1SU                    B#&quot;||i||&quot;&lt;-#&quot;||((c-u)%256));c:=u ;};H(&quot;PLEASEREADOUT   ,1^                 nPLEASEGIVEUP&quot;);end)]};}}&quot;].
tr(?&quot;+B,&quot;`x21`x7e&quot;)}'.tr('         `x7e`x21','Y`u0022')as~byte[]);z.close())]}&quot;{&quot;W&quot;&quot;w&quot;@j~1+:j^-~118%1              +*}%&quot;/3551                     2416612G61913@921/17A331513&quot;t'&quot;);fim')]};})],B]}`n.e n                  dtemplate&quot;,61){&quot;Zn#$S`n&quot;},?%]
]}&quot;`nquit)]});CloseStream(s         );QUIT;&quot;]}&quot;)]}&quot;~DUP~A~.&quot;~DO~10~I=1,&quot;~.~CR~S&quot;~&amp;A,&amp;&quot;~C~.&quot;~10~~~~~              ~CONTINUE&quot;~CR~                      S^&quot;~&amp;A)^&quot;,&amp;&quot;~C~0~DO~B~.&quot;~&amp;char(&quot;~COUNT~.~.&quot;~),&amp;'&quot;                   ~CR~LOOP~S^&quot;~&amp;^&quot;^&quot;&quot;~C~S&quot;~end~#
$D&quot;~C~A~.&quot;~STOP&quot;~CR~A~.&quot;~END          &quot;~CR~B`x59E~;~D~),B],?`x21].K(?',%('&quot;'&quot;'))}'&quot;]};}&quot;.K(?&quot;){'&quot;3             4,&quot;'}.K(N){'&quot;10,&quot;'                      }+?&quot;,?%]+'~&quot;&quot;&quot;)'],?`x7e]}.&quot;]}&quot;))]}]p['']pq&quot;]}                   &quot;);})}`x60);}&quot;]}&quot;))].K(?`x60,&quot;Yx
60&quot;),'#{f(',')}']})&quot;]}&quot;))[&quot;~~~          ~^&quot;~^&quot;.&quot;&quot;STOP~RUN.&quot;])](Zln(str&quot;message(STATUS~^&quot;~~~~~&quot;(              .replace(.replace(str                        ~s)&quot;Y&quot;&quot;YY&quot;)&quot;^&quot;&quot;&quot;Y^&quot;&quot;)&quot;^&quot;)&quot;)))).reverse]                     })#$C($&quot;Put~caffe!e~{(!t)c}~!to#$
F.^n&quot;);#$C(&quot;Liquify#$G.^nPour#$           G~!to~the~bak!g~dish.^n^nServes~1.^n&quot;);}})]};}/****               //****/&quot;;t={};b=&quot;&quot;;L=&quot;&quot;;                          n=i=0;D=-&gt;n{L&lt;&lt;(n+62)%92+35;D};                       s.bytes{|c|n&gt;0?n-=1:(t[c]=(t[c]||[]
).reject{|j|j&lt;i-3560};x=[];t[c].m            ap{|j|k=(0..90).f!d{|k|not~s[i+1+k]==s[j+k]}|                |91;k&gt;4&amp;&amp;x&lt;&lt;[k,j]};x=x.max)?                              (n,j=x;x=b.size;(u                            =[x,3999].m!;D[u%87][u/87];L&lt;&lt;b[0,u];
b[0,u]=&quot;&quot;;x-=u)while~x&gt;0;x=4001+i-j             ;D[x%87][x/87][n-5]):b&lt;&lt;c;t[c]+=[i+=1]}                ;&quot;#!clude&lt;stdio.h&gt;`nchar*p=#{E[L]}                                                                       ,s[999999],*q=s;!t~mX{!t~n,m;for(;*p;){
n=(*p-5)%92+(p[1]-5)%92*87;p+=2;if(n&gt;                3999)for(m=(*p++-5)%92+6;m--;q                  ++)*q=q[4000-n];else~for(;n--;)*q++=*p                                                                   ++;}puts(s)#{R}}&quot;)]}){s+=&quot;00g,&quot;;for(m=1;m
&lt;256;m*=2)s+=&quot;00g,4,:&quot;+(c/m%2&gt;0?&quot;4+&quot;:&quot;&quot;                     )+&quot;,&quot;;f(s);s=&quot;4                       ,:,&quot;;}f(s+s);for(c:Base64.getDecoder().decod                                                             e(&quot;kaARERE`x58/I0ALn3n5ef6l/Pz8+fnz58/BOf5/7
/hE`x58/O`x5azM5mC`x58/Oczm`x5azBPn5+`x58/                                                     OczMznBL/nM5m`x5azBPu++fPPOc5zngnnO`x5azO`x5agnBMG                                                       AW7A==&quot;)){c=c&lt;0?256+c:c;for(i=0;i++&lt;3;c/=8)f(c%
8);f(&quot;8*+8*+,&quot;);}f(&quot;@&quot;);).K(?',%('&quot;'&quot;'))}'|se                                               d~-e's/Y/YY/g'~-e's/&quot;/Yq/g'~-e's/.*/Z~&quot;&amp;&quot;^nquit/')]}}&quot;]],                                               ?']}');&quot;.K(/^+/){&quot;`x5e#{$&amp;.size}`x5e&quot;}]}.split(&quot;Y`x
5e&quot;);for(!t~i=1;i&lt;a.length;a[0]+=a[i+1],i+=2){a[0]                                      +=&quot;Y&quot;.repeat(Integer.parseInt(a[i]));}System.out.Z(a[0]);}})]}&quot;;F                                       ORiTO`~UPBtDO`~INTn:=ABSt[i];Z(~(50+n%64)+c+~(50+n%8MOD
8)+c+~(50+nMOD8)+b+&quot;`x4a&quot;+a)OD]*&quot;REPR&quot;]}&quot;)`nwhile(`x21=(                         S:length)0){`ntrans~c(S:read)`nD(c:to-!teger)`nf~35~39}`nf~24~149`n!terp:librar                         y&quot;afnix-sio&quot;`ntrans~o(afnix:sio:OutputTerm)`no:H~B)].K(N,'&quot;&amp;Ch
aracter'+?'+'Val(10)&amp;&quot;')}&quot;);end;)]+&quot;`nsys.exit~0&quot;,B],?']}'&quot;,/[^{}]/]}}&quot;,35){y&lt;&lt;&quot;,`n&quot;+$S;&quot;%s&quot;}}&quot;)+y],'&quot;,','):f(&quot;']}&quot;,0))}]]&gt;&lt;/#{U}&gt;&lt;/xsl:#{I}&gt;&quot;].K~N,'&quot;&amp;~VbLf~&amp;&quot;'}&quot;:s=&quot;~~~&quot;:For~i=0To~7:s~&amp;=Chr(32-(Asc(c)&gt;&gt;7-i~And~1)*23):Next:#$C(s~&amp;n~&amp;Chr(9)&amp;n~&amp;&quot;~~&quot;
):Next:#$C(n~&amp;n~&amp;n):End~Sub:End~Module)]}`nput=s`nZ`nqa`x21&quot;,3){%($H(&quot;%s&quot;,#$S);)+N}}end~endmodule);W=s.size*72+4;&quot;%d,%d&quot;%[W/65536,W%65536]}});foreach(!t~c~!#{E[s]}.data)foreach(!t~v~!~a={0,9,7,4,5,c/100*7/6+1,c%100/10*7/6+1,c%10*7/6+1,7})p({144,v=
15450+v*256,384,v});p({255,12032});})]},i=0,t='k';while(s[i])t='^x60.'+s[i++]+t;console.log(t)&quot;,B],?`x21].K(?',%('&quot;'&quot;'))}'&quot;^n::=^na&quot;)],/[`[`]$]/]}&quot;),4){$S+?,}}])Console.H(s);Application.exit();}})]};Z&quot;0~0~-1&quot;);)],?']}';cr&quot;,127..255];f(%(variable~s
=`x60#{s.K(/.{1,234}/){$&amp;.K(&quot;`x60&quot;,%(`x60+&quot;`x60&quot;+`x60))+&quot;`x60+`n`x60&quot;}}`x60,i;for(i=0;i&lt;129;i++)s=strreplace(s,pack(&quot;C&quot;,255-i),substrbytes(`x60#{v[0,99]}`x60+`n`x60#{v[99..-1]}`x60,i*2+1,2));Zf(&quot;%s&quot;,s)),7){&quot;f('%s')`n&quot;%$s.unpack(&quot;`x48*&quot;)}}Zf(&quot;^n#[E
xeunt]&quot;);quit)]}&quot;)),196){%(Z#$S;)}}}&quot;]});})).gsub(/[!HJKXYZ^`~]/){[B*2,:write,B,:tring,:gsub,&quot;ain()&quot;,B*4,:print,g,:in][$&amp;.ord%47%12]})))*&quot;&quot;)#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_futu
#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffe
#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffe
#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffer_for_future_bug_fixes_#_buffe
####################################################################################  Quine Relay -- Copyright (c) 2013, 2014 Yusuke Endoh (@mametter), @hirekoke  ###################################################################################)
</code></pre>
]]>
        </description>
    </item>
    <item>
        <title>"hello, [other language(s) than English] world!" (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/122/hello-other-language-s-than-english-world-2022-code-critique</link>
        <pubDate>Mon, 31 Jan 2022 11:11:22 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>diogo_ph22</dc:creator>
        <guid isPermaLink="false">122@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><strong>Title:</strong> "hello, world!"<br />
<strong>Author/s:</strong> Brian Kernighan<br />
<strong>Language/s:</strong> several<br />
<strong>Years of development:</strong> 1978-<br />
<strong>Software/hardware requirements (if applicable):</strong>  multiple depending on the language/s</p>

<blockquote><div>
  <p>main( ) {<br />
          printf("hello, world");<br />
  }</p>
</div></blockquote>

<p>The use of "hello, world!" as a test message was first described as a code example in Brian Kernighan and Dennis Ritchie's 1978 book <a rel="nofollow" href="https://archive.org/details/TheCProgrammingLanguageFirstEdition/page/n1/mode/2up" title="_The C Programming Language_"><em>The C Programming Language</em></a>. This simple program followed an earlier internal Bell Laboratories memorandum written by Brian Kernighan in 1974, titled <a rel="nofollow" href="https://www.bell-labs.com/usr/dmr/www/ctut.pdf" title="_Programming in C: A Tutorial_"><em>Programming in C: A Tutorial</em></a> (see also <a rel="nofollow" href="https://en.m.wikipedia.org/wiki/%22Hello,_World!%22_program" title="Wikipedia, online">Wikipedia, online</a>).</p>

<p>The "hello, world!" program is probably one of the simplest programs to write, and it is often among the first programs students and coders learn in many programming languages (e.g. Ballerina, BASIC, Batch file, Bash, C, C++, C#, COBOL, Clojure, Dart, Elixir, Ezhil, Fortran, Go, Haskell, Java, JavaScript, Kotlin, Logo, Lua, mq, Objective-C, OCaml, Pascal, Perl, PHP, PowerShell, Prolog, Python, Ruby, Rust, Standard ML, Swift, TI-BASIC, Zig).</p>

<p>Since 1994, Wolfram Roesler and many people around the globe have compiled <a rel="nofollow" href="http://helloworldcollection.de" title="_The Hello World Collection_"><em>The Hello World Collection</em></a>, which includes 603 "hello, world!" programs in various programming languages and versions in 78 additional human languages. More recently, Geoff Cox and Duncan Shingleton created <a rel="nofollow" href="http://www.anti-thesis.net/hello-world-60/" title="'_hallo welt!_ (hello world!)'"><em>hallo welt!</em> (hello world!)</a>, shown as a projection at BV Gallery, Linz (Austria) in 2008; as part of AFTER THE NET at Peninsula Arts Gallery, Plymouth (UK) in 2009; and at Tecnológico de Monterrey, Toluca (México) in 2010. Cox and Shingleton's brilliant codework plays on a communicative act between programmer and computer, by 'looping more than 100 Hello World programs written in different programming languages, alongside a selection of human languages, [and] combining them into a real-time, multilingual, machine-driven confusion of tongues' (<a rel="nofollow" href="https://aesthetic-programming.net/pages/1-getting-started.html" title="Soon and Cox, 2020">Soon and Cox, 2020</a>).</p>
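As a toy illustration of that "confusion of tongues" (my own sketch, not Cox and Shingleton's code, and with an arbitrary handful of languages), one can loop greetings in several human languages:

```ruby
# A toy sketch of the "confusion of tongues" idea (mine, not Cox and
# Shingleton's code): print "hello, world!" in several human languages
# in turn, one greeting per line.
GREETINGS = {
  "English"    => "hello, world!",
  "German"     => "hallo, Welt!",
  "Portuguese" => "olá, mundo!",
  "Japanese"   => "こんにちは、世界！",
}

GREETINGS.each { |language, text| puts "#{language}: #{text}" }
```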

<p><strong>Discussion Questions</strong></p>

<ol>
<li><p>While most programming languages are based in English, could experiments to decolonize code involve using other (non-Western) languages?</p></li>
<li><p>What could these new code experiments with non-English languages look like?</p></li>
<li><p>Do you know of other examples of "hello, world!" programs in other languages than English?</p></li>
<li><p>How could the "hello, world!" bring new possibilities to teaching, learning and developing alternative modes of thinking and living that are non-Western?</p></li>
<li><p>How could we creatively re-think the "hello, world!" code as non-binary?</p></li>
</ol>
]]>
        </description>
    </item>
    <item>
        <title>Generating names for generated dinosaurs (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/124/generating-names-for-generated-dinosaurs-2022-code-critique</link>
        <pubDate>Tue, 01 Feb 2022 22:22:41 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>rcveeder</dc:creator>
        <guid isPermaLink="false">124@/index.php?p=/discussions</guid>
<description><![CDATA[<p><strong>Title:</strong> The Island of Doctor Wooby<br />
<strong>Author:</strong> Ryan Veeder<br />
<strong>Language:</strong> Inform 7<br />
<strong>Developed:</strong> 2015</p>

<p>Play link: <a href="https://rcveeder.net/wooby/" rel="nofollow">https://rcveeder.net/wooby/</a></p>

<p>In this text adventure, written for a virtual pet-themed game jam, the player visits an island populated by tiny felt dinosaurs. The dinosaur dolls are generated with different legs, necks, spikes or no spikes, teeth or no teeth...</p>

<p>This was all based on real dolls I had sewn, and given names like "Teggisaurus," "Dagadagadon," and "Mustadopagus." I wanted the generated dinosaurs to have generated names that were recognizable both as pseudo-baby talk and as dinosaur names.</p>

<p>(<a rel="nofollow" href="https://www.rcveeder.net/blog/2017/05/17/procedural-generation-of-dinosaurs-in-inform-7/" title="An extremely long commentary on the full dinosaur generation process">An extremely long commentary on the full dinosaur generation process</a> is available for anyone interested.)</p>

<p>It strikes me that, in order to come across as stereotypical baby talk, the "roots" here run fairly long ("boofa," "smoosha"). When more than half of generated names contain more than one of these roots, I think you run into a lot of names that are too long to be, strictly speaking, cute.</p>

<p>A possible angle for discussion: Where else have you seen nonsense words generated to fit in a specific aesthetic? Do strategies for creating different "looks" for words have anything in common?</p>

<pre><code>To name the current test subject:
    let tempname be text;
    let namecomponents be a random number between 1 and 2; [First randomize the number of &quot;roots&quot; that will be in the name. Right now it's 50% one root, 50% two roots.]
    if a random chance of 1 in 6 succeeds:
        increment namecomponents; [Now the distribution is 41.667% one root, 50% two roots, and 8.333% three roots. A dinosaur with three roots in its name turns out to be fairly inconvenient, so we will tone that third option down a little.]
    if namecomponents is 3:
        if a random chance of 2 in 3 succeeds:
            now namecomponents is 2; [So now we're looking at 41.6% one root; 55.6% two roots; 2.8% three roots.]
    if namecomponents is 1:
        choose a random row in Table of Long Name Suffixes;
        now tempname is &quot;[suffix entry]&quot;; [A name with only one root requires a long suffix, since something like &quot;Goomus&quot; doesn't look dinosaury enough.]
    otherwise:
        if a random chance of 1 in 2 succeeds:
            choose a random row in Table of Long Name Suffixes;
            now tempname is &quot;[suffix entry]&quot;;
        otherwise:
            choose a random row in Table of Short Name Suffixes;
            now tempname is &quot;[suffix entry]&quot;;
    choose a random row in table of name roots;
    now tempname is &quot;[root entry][tempname]&quot;;
    decrement namecomponents;
    if namecomponents is greater than 0:
        choose a random row in table of name roots;
        now tempname is &quot;[root entry][tempname]&quot;;
        decrement namecomponents;
    if namecomponents is greater than 0: [Instead of repeating a loop until namecomponents reaches 0 like a real programmer would, why not write out all possible steps]
        choose a random row in table of name roots;
        now tempname is &quot;[root entry][tempname]&quot;;
        decrement namecomponents;
    Now the species name of the current test subject is &quot;[tempname]&quot; in title case;
    now the printed name of the current test subject is &quot;[tempname]&quot; in title case. [The player can change the printed name of a dinosaur with a command like &gt;NAME BUGGOBOGADON ALFIE.]



Table of Name Roots
root
&quot;boofy&quot;
&quot;woofy&quot;
&quot;daga&quot;
&quot;boo&quot;
&quot;doo&quot;
&quot;goo&quot;
&quot;zoo&quot;
&quot;ruffy&quot;
&quot;smooshi&quot;
&quot;woobo&quot;
&quot;wooba&quot;
&quot;wooby&quot;
&quot;boofa&quot;
&quot;ragga&quot;
&quot;muffy&quot;
&quot;libby&quot;
&quot;zaka&quot;
&quot;bitty&quot;
&quot;itty&quot;
&quot;licky&quot;
&quot;grabby&quot;
&quot;buggy&quot;
&quot;buggo&quot;
&quot;dugga&quot;
&quot;duggo&quot;
&quot;grabbo&quot;
&quot;flippy&quot;
&quot;frumpa&quot;
&quot;smoosha&quot;
&quot;smooshy&quot;
&quot;darno&quot;
&quot;darmi&quot;
&quot;lecko&quot;
&quot;steggy&quot;
&quot;teggy&quot;
&quot;tyra&quot;
&quot;bumpo&quot;
&quot;humba&quot;
&quot;wumba&quot;
&quot;dunda&quot;
&quot;dooga&quot;
&quot;booga&quot;
&quot;boogy&quot;
&quot;oogy&quot;
&quot;wicky&quot;
&quot;slibba&quot;
&quot;normo&quot;
&quot;olly&quot;
&quot;goli&quot;
&quot;boli&quot;
&quot;doli&quot;
&quot;garda&quot;
&quot;cappy&quot;
&quot;pulla&quot;
&quot;wedder&quot;
&quot;veeda&quot;
&quot;josie&quot;
&quot;denny&quot;
&quot;dinky&quot;
&quot;rosy&quot;
&quot;ratty&quot;
&quot;hatty&quot;
&quot;chatty&quot;
&quot;darpo&quot;
&quot;poppy&quot;
&quot;gooby&quot;
&quot;linto&quot;
&quot;quibbo&quot;
&quot;bumpi&quot;
&quot;slibbi&quot;
&quot;gardi&quot;
&quot;veedi&quot;
&quot;denno&quot;
&quot;dinko&quot;
&quot;goobo&quot;
&quot;lizza&quot;
&quot;lizzo&quot;
&quot;harry&quot;
&quot;harpo&quot;
&quot;groucho&quot;
&quot;gummo&quot;
&quot;chico&quot;
&quot;zeppo&quot;
&quot;boga&quot;
&quot;goba&quot;
&quot;bago&quot;
&quot;gabo&quot;
&quot;dizzi&quot;
&quot;dizzo&quot;

Table of Short Name Suffixes
suffix
&quot;don&quot;
&quot;dax&quot;
&quot;nus&quot;
&quot;mus&quot;
&quot;rax&quot;
&quot;lus&quot;
&quot;lops&quot;

Table of Long Name Suffixes
suffix
&quot;saurus&quot;
&quot;dactyl&quot;
&quot;raptor&quot;
&quot;daptor&quot;
&quot;gaptor&quot;
&quot;zaptor&quot;
&quot;pitus&quot;
&quot;ceratops&quot;
&quot;cephalus&quot;
&quot;mimus&quot;
</code></pre>
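<p>To make the combinatorics of the tables above concrete outside Inform 7, here is a minimal JavaScript sketch (my own illustration, not the game's code; the function names are invented). The example name BUGGOBOGADON from the NAME command parses as root "buggo" + root "boga" + short suffix "don", so one plausible recipe is two roots plus a suffix, printed in title case:</p>

```javascript
// Illustrative sketch only -- the game itself is written in Inform 7.
// 'dinoName' and 'titleCase' are invented names for this illustration.

// Mirror the "[tempname] in title case" printing rule above.
function titleCase(word) {
  return word.charAt(0).toUpperCase() + word.slice(1);
}

// Assemble a name from two roots and a suffix drawn from the tables.
function dinoName(rootA, rootB, suffix) {
  return titleCase(rootA + rootB + suffix);
}

console.log(dinoName('buggo', 'boga', 'don')); // 'Buggobogadon'
```
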
]]>
        </description>
    </item>
    <item>
        <title>Wordle Accessibility (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/111/wordle-accessibility-2022-code-critique</link>
        <pubDate>Thu, 20 Jan 2022 16:46:39 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>hannahackermans</dc:creator>
        <guid isPermaLink="false">111@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Title Wa11y<br />
Author/s Cariad Eccleston <br />
Language/s JavaScript<br />
Year/s of development 2022</p>

<p>Many of us will be familiar with <a rel="nofollow" href="https://www.powerlanguage.co.uk/wordle/" title="Wordle">Wordle</a>, a popular game created by Josh Wardle in which the player gets six chances to guess a five-letter word. The player learns which letters of their guess appear in the solution word (marked by a yellow or blue square) and which of those letters are in the correct position in the word (marked by a green or orange square). <br />
Arguably the most interesting part of the game's popularity is its social element. When the game became popular in New Zealand, people started sharing how many guesses they had needed. One player then created a grid of emoji squares to convey her process without spoiling the solution for others. After a share button was added to the game, which copied your personal grid to your clipboard, sharing became easy and the game went viral. (The source for this background is this interview with Wardle in <a rel="nofollow" href="https://slate.com/culture/2022/01/wordle-game-creator-wardle-twitter-scores-strategy-stats.html" title="Slate">Slate</a>.)</p>

<p>In addition to making the game go viral, each grid gives a little narrative that other players can understand. C Thi Nguyen wrote a Twitter thread about this, which you can read <a rel="nofollow" href="https://twitter.com/add_hawk/status/1481386444627218433" title="here">here</a>. He says that people got to know the game through "incomprehensible little box-chart graphics," but after playing the game you realize that "Every game of Wordle is a particular little arc of decisions, attempts, and failures. But each little posted box is <em>a neat synopsis of somebody else's arc of action, failure, choice, and success</em>."</p>

<p>Now, in addition to the onslaught of Wordle grids on social media, my Twitter feed seems to have one tweet complaining about the inaccessibility of those grids for every three Wordle tweets. Here's the problem: to screen reader users (such as blind and low-vision people), the grid is read out linearly, square by square, which amounts to little more than noise. On Twitter, people beg users to change the way they share their Wordle results and drop the emojis. Several programs have been written to convert the squares into a meaningful description. <a rel="nofollow" href="https://wa11y.co/" title="Wa11y">Wa11y</a>, for example, lets players paste their Wordle result into the website, which converts it into a "sharable" written version.</p>

<p>Wa11y's code is available on <a rel="nofollow" href="https://github.com/cariad/wa11y.co" title="github">GitHub</a>; I am pasting a snippet of it below this post. The program labels yellow/blue squares as "hasMisplaced" and green/orange squares as "hasPerfect". Thus, if there are only black squares (indicating that every letter of the guess was incorrect), it does not say "5 black squares" (or "black square black square black square black square black square", depending on which screen reader you are using) but simply returns "Nothing.", which is a more meaningful description of not getting any letter right. If there are five green/orange squares, the person has all letters correct, so the program returns "Won!". If there is a mixture, it reports which positions in the line are misplaced and which are perfect.</p>

<p>Wa11y is not the only program available, so if someone wants to reply with a comparison, go for it!<br />
An example is Wordle Result Image Generator, which generates a shareable image and alt text for the image. <a rel="nofollow" href="https://github.com/andrew-mc/wordle-result-image-generator" title="Github link">Github link</a>.<br />
Another example is WordleBot, which adds explanations to the grid but is still not useful to screen readers because it includes all the square emojis in the post: <a rel="nofollow" href="https://www.imore.com/wordlebot-shortcut-brings-accessibility-your-wordle-results" title="announcement link">announcement link</a> (I'm not sure the source code is available for this one).</p>

<p>Now, the existence of Wa11y is very good, but sharing the grid of square emojis has already become a habit for Wordle players, and it is the simplest action. Telling every individual to change their behavior will take a long time and meets resistance from players. It therefore makes much more sense for the Wordle website itself to provide both a grid and a text-based version, sparing readers an extra step that is seen as a nuisance. I say both options because people also share their results privately, and if they know the person they are chatting with can and wants to see the grid, the written explanation is not necessary.</p>

<p>Going back to the narrative arc of the grid that C Thi Nguyen described, I am curious how different textual translations give a different narrative of the game. I was surprised, for example, when a blind digital accessibility expert suggested "Screenshot of Wordle score. Three rows of 5 squares. Bottom all green, middle alternate green &amp; grey, top 2 yellow, 2 grey 1 green", which is a literal description but arguably does not provide the same narrative. How do you think different programs create different narrative arcs for Wordle? How could this be improved?</p>

<p>There's much more to say, but I am interested to read what others have to comment, so I'll end here for now.</p>

<p>This is a snippet; the rest is available at <a href="https://github.com/cariad/wa11y.co" rel="nofollow">https://github.com/cariad/wa11y.co</a>:</p>

<div><pre><code>let explanation = '';

if (!hasPerfect &amp;&amp; !hasMisplaced)
  explanation = 'Nothing.';
else if (decoded.perfectIndexes.length === 5)
  explanation = 'Won!';
else if (decoded.misplacedIndexes.length === 5)
  explanation = chopAggression &gt;= 1 ? 'all in the wrong order.' : 'all the correct letters but in the wrong order.';
else if (hasPerfect &amp;&amp; hasMisplaced)
  explanation = `${perfect}${misplaced}.`;
else if (hasMisplaced)
  explanation = `${misplaced}.`;
else
  explanation = `${perfect}.`;

const prefix = chopAggression &gt;= 5 ? `${num}.` : `Line ${num}:`;

const result = `${prefix} ${explanation}\n`;

if (chopAggression &gt;= 4)
  return result.replaceAll(' and ', ' &amp; ');

return result;
}
</code></pre></div>
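<p>The <code>decoded</code> object the snippet relies on is produced elsewhere in Wa11y's source. As a rough sketch of what that decoding involves (my own illustration, not Wa11y's actual decoder; see the repository for the real one), one row of squares might be turned into the perfect/misplaced index lists like this, treating green/orange squares as perfect and yellow/blue as misplaced:</p>

```javascript
// Hypothetical decoder sketch -- Wa11y's real implementation lives at
// https://github.com/cariad/wa11y.co and may differ in detail.
const PERFECT = ['🟩', '🟧'];   // green / orange: right letter, right place
const MISPLACED = ['🟨', '🟦']; // yellow / blue: right letter, wrong place

function decodeLine(line) {
  const perfectIndexes = [];
  const misplacedIndexes = [];
  let i = 0;
  for (const square of line) { // for...of iterates whole emoji code points
    if (PERFECT.includes(square)) perfectIndexes.push(i);
    if (MISPLACED.includes(square)) misplacedIndexes.push(i);
    i = i + 1;
  }
  return { perfectIndexes, misplacedIndexes };
}

console.log(decodeLine('🟩🟨⬛⬛🟩'));
// { perfectIndexes: [ 0, 4 ], misplacedIndexes: [ 1 ] }
```

<p>Note that iterating the string with <code>for...of</code> rather than by numeric index matters here: the colored-square emojis are astral-plane characters, so indexing by UTF-16 position would split them in half.</p>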
]]>
        </description>
    </item>
    <item>
<title>The Original ELIZA in MAD-SLIP (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/108/the-original-eliza-in-mad-slip-2022-code-critique</link>
        <pubDate>Tue, 18 Jan 2022 00:18:24 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>jshrager</dc:creator>
        <guid isPermaLink="false">108@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>ELIZA Code Critique thread<br />
Thread Leaders: Jeff Shrager, David Berry, Mark Marino, Jeremy Douglass</p>

<p>Title: ELIZA<br />
Language: MAD-SLIP<br />
Year: 1965<br />
Author: Joseph Weizenbaum<br />
System: IBM 7094</p>

<p>Dearly Beloved,</p>

<p>We are gathered here to analyze a version of Joseph Weizenbaum's original source code for ELIZA, recently discovered in MIT’s archives. You can read more about it <a rel="nofollow" href="https://sites.google.com/view/elizagen-org/the-original-eliza" title="[here]">[here]</a>.</p>

<p>Code below in Spoilers. But also in a syntax highlighted version by <a rel="nofollow" href="https://wg.criticalcodestudies.com/index.php?p=/profile/DavidBerry">@DavidBerry</a></p>

<div><pre><code>$s changed to &quot;s
.EQ., etc. expanded to =, etc.
' abbreviations written out
continued lines (with a 1 in column 7?) wrapped together
line numbering removed
Added lisp-like comments (;;; ;; and ;)

            EXTERNAL FUNCTION (KEY,MYTRAN) 
            NORMAL MODE IS INTEGER 
            ENTRY TO CHANGE. 
            LIST.(INPUT) 
            Vector Values G(1)=&quot;TYPE&quot;,&quot;SUBST&quot;,&quot;APPEND&quot;,&quot;ADD&quot;,&quot;START&quot;,&quot;RANK&quot;,&quot;DISPLA&quot; 
            Vector Values &quot;NUMB = &quot; I3 *&quot; 
            FIT=0 
CHANGE      PRINT COMMENT &quot;PLEASE INSTRUCT ME&quot; 
            LISTRD.(MTLIST.(INPUT),0) 
            JOB=POPTOP.(INPUT) 
            Through IDENT, FOR J=1,1, J&gt; 7 
IDENT       Whenever G(J) = JOB, Transfer To THEMA 
            PRINT COMMENT &quot;CHANGE NOT RECOGNIZED&quot; 
            Transfer To CHANGE 
THEMA       Whenever J = 5, Function Return IRALST.(INPUT) 
            Whenever J = 7 
                Through DISPLA, FOR I=0,1, I  &gt; 32 
                Whenever LISTMT.(KEY(I)) = 0, Transfer To DISPLA 
                S=SEQRDR.(KEY(I)) 
READ(7)         NEXT=SEQLR.(S,F) 
                Whenever F &gt; 0, Transfer To DISPLA 
                PRINT COMMENT &quot;*&quot; 
                TPRINT.(NEXT,0) 
                PRINT FORMAT SNUMB,I 
                PRINT COMENT &quot; &quot; 
                Transfer To READ(7) 
DISPLA          CONTINUE 
                PRINT COMMENT &quot; &quot; 
                PRINT COMMENT &quot;MEMORY LIST FOLLOWS&quot; 
                PRINT COMMENT &quot; &quot; 
                Through MEMLIST, FOR I=1 , 1, I &gt; 4 
MEMLST          TXTPRT.(MYTRAN(I),0) 
                Transfer To CHANGE 
            End Conditional 
            THEME=POPTOP.(INPUT) 
            SUBJECT=KEY(HASH.(THEME,5)) 
            S=SEQRDR.(SUBJECT) 
LOOK        TERM=SEQLR.(S,F) 
            Whenever F &gt; 0, Transfer To FAIL 
            Whenever TOP.(TERM) = THEME, Transfer To FOUND 
            Transfer To LOOK 
FOUND       Transfer To DELTA(J) 
DELTA(1)    TPRINT.(TERM,0) 
            Transfer To CHANGE 
FAIL        PRINT COMMENT &quot;LIST NOT FOUND&quot; 
            Transfer To CHANGE 
DELTA(2)    S=SEQRDR.(TERM) 
            OLD=POPTOP.(INPUT) 
READ(1)     OBJCT=SEQLR.(S,F) 
            Whenever F &gt; 0, Transfer To FAIL 
            Whenever F &lt;&gt; 0, Transfer To READ(1) 
            INSIDE=SEQRDR.(OBJECT) 
READ(2)     IT=SEQLR.(INSIDE,F) 
            Whenever F &gt; 0, Transfer To READ(1) 
            SIT=SEQRDR.(IT) 
            SOLD=SEQRDR.(OLD) 
ITOLD       TOLD=SEQLR.(SOLD,FOLD) 
            DIT=SEQLR.(SIT,FIT) 
            Whenever TOLD = DIT AND FOLD &lt;= 0,Transfer To ITOLD 
            Whenever FOLD &gt; 0, Transfer To OK(J) 
            Transfer To READ(2) 
OK(2)       SUBST.(POPTOP.(INPUT),LSPNTR.(INSIDE)) 
            Transfer To CHANGE 
OK(3)       NEWBOT.(POPTOP.(INPUT),OBJCT) 
            Transfer To CHANGE 
DELTA(3)    Transfer To DELTA(2) 
DELTA(4)    Whenever NAMTST.(BOT.(TERM)) = 0 
                BOTTOM=POPBOT.(TERM) 
                NEWBOT.(POPTOP.(INPUT),TERM) 
                NEWBOT.(BOTTOM,TERM) 
            Otherwise 
                NEWBOT.(POPTOP.(INPUT),TERM) 
            End Conditional 
            Transfer To CHANGE 
DELTA(6)    S=SEQRDR.(TERM) 
READ(6)     OBJCT=SEQLR.(S,F) 
            Whenever F &gt; 0, Transfer To FAIL 
            Whenever F &lt;&gt; 0, Transfer To READ(6) 
            OBJCT=SEQLL.(S,F) 
            Whenever LNKLL.(OBJECT) = 0 
                SUBST.(POPTOP.(INPUT),LSPNTR.(S)) 
            Otherwise 
                NEWTOP.(POPTOP.(INPUT),LSPNTR.(S)) 
            End Conditional 
            Transfer To CHANGE 
            End Function 

           R* * * * * * * * * * END OF MODIFICATION ROUTINE 

        TPRIN 
            EXTERNAL FUNCTION (LST) 
            NORMAL MODE IS INTEGER 
            ENTRY TO TPRINT. 
            SA=SEQRDR.(LST) 
            LIST.(OUT) 
READ        NEXT=SEQLR.(SA,FA) 
            Whenever FA &gt; 0, Transfer To P 
            Whenever FA = 0, Transfer To B 
            POINT=NEWBOT.(NEXT,OUT) 
            Whenever SA &lt; 0, MRKNEG.(POINT) 
            Transfer To READ 
B           TXTPRT.(OUT,0) 
            SEQLL.(SA,FA) 
MORE        NEXT=SEQLR.(SA,FA) 
            Whenever TOP.(NEXT) = &quot;=&quot; 
                TXTPRT.(NEXT,0) 
                Transfer To MORE 
            End Conditional 
            Whenever FA &gt; 0, Transfer To DONE 
            PRINT COMMENT &quot; &quot; 
            SB=SEQRDR.(NEXT) 
MEHR        TERM=SEQLR.(SB,FB) 
            Whenever FB &lt;0 
                PRINT ON LINE FORMAT NUMBER, TERM 
                Vector Values NUMBER = &quot;I3 *&quot; 
                Transfer To MEHR 
            End Conditional 
            Whenever FB &gt; 0, Transfer To MORE 
            TXTPRT.(TERM,0) 
            Transfer To MEHR 
P           TXTPRT.(OUT,0) 
DONE        IRALST.(OUT) 
            Function Return 
            End Function 

        LPRIN 
            EXTERNAL FUNCTION (LST,TAPE) 
            NORMAL MODE IS INTEGER 
            ENTRY TO LPRINT. 
            BLANK = &quot;      &quot; 
            EXECUTE PLACE.(TAPE,0) 
            LEFTP = 606074606060K 
            RIGHTP= 606034606060K 
            BOTH  = 607460603460K 
            EXECUTE NEWTOP.(SEQRDR.(LST),LIST.(STACK)) 
            S=POPTOP.(STACK) 
BEGIN       EXECUTE PLACE.(LEFTP,1) 
NEXT        WORD=SEQLR.(S,FLAG) 
            Whenever FLAG &lt; 0 
            EXECUTE PLACE.(WORD,1) 
            Whenever S &gt; 0, PLACE.(BLANK,1) 
            Transfer To NEXT 
            OR Whenever FLAG &gt; 0 
            EXECUTE PLACE.(RIGHTP,1) 
            Whenever LISTMT.(STACK) = 0, Transfer To DONE 
            S=POPTOP.(STACK) 
            Transfer To NEXT 
            OTHERWISE 
            Whenever LISTMT.(WORD) = 0 
            EXECUTE PLACE.(BOTH,1) 
            Transfer To NEXT 
            OTHERWISE 
            EXECUTE NEWTOP.(S,STACK) 
            S=SEQRDR.(WORD) 
            Transfer To BEGIN 
            End Conditional 
            End Conditional 
DONE        EXECUTE PLACE.(0,-1) 
            EXECUTE IRALST.(STACK) 
            FUNCTION RETURN LST 
            END OF FUNCTION 

        TESTS 
            EXTERNAL FUNCTION(CAND,S) 
            NORMAL MODE IS INTEGER 
            DIMENSION FIRST(5),SECOND(5) 
            ENTRY TO TESTS. 
            STORE=S 
            READER=SEQRDR.(CAND) 
            Through ONE, FOR I=0,1, I &gt; 100 
            FIRST(I)=SEQLR.(READER,FR) 
ONE         Whenever READER &gt; 0, Transfer To ENDONE 
ENDONE      SEQLL.(S,F) 
            Through TWO, FOR J=0,1, J &gt; 100 
            SECOND(J)=SEQLR.(S,F) 
TWO         Whenever S &gt; 0, Transfer To ENDTWO 
ENDTWO      Whenever I &lt;&gt; J, Function Return 0 
            Through LOOK, FOR K=0,1, K&gt; J 
LOOK        Whenever FIRST(K) &lt;&gt; SECOND(K), Function Return 0 
            EQL=SEQLR.(READER,FR) 
            Whenever EQL &lt;&gt; &quot;=&quot; 
            SEQLL.(READER,FR) 
            Function Return READER 
            Otherwise 
            POINT=LNKL.(STORE) 
            Through DELETE , FOR K=0,1, K &gt; J 
            REMOVE.(LSPNTR.(STORE)) 
DELETE      SEQLR.(STORE,F) 
INSRT       NEW=SEQLR.(READER,FR) 
            POINT=NEWTOP.(NEW,POINT) 
            MRKNEG.(POINT) 
            Whenever READER &lt; 0, Transfer To INSRT 
            MRKPOS.(POINT) 
            Function Return READER 
            End Conditional 
            End Function 

        DOCBC 
            EXTERNAL FUNCTION (A,B) 
            NORMAL MODE IS INTEGER 
            ENTRY TO FRBCD. 
            Whenever LNKL.(A) = 0, Transfer To NUMBER 
            B=A 
            Function Return 0 
NUMBER      K=A*262144 
            B=BCDIT.(K) 
            Function Return 0 

            End Function 
        ELIZA 
            NORMAL MODE IS INTEGER 
            DIMENSION KEY(32),MYTRAN(4) 
            INITAS.(0) 
            PRINT COMMENT &quot;WHICH SCRIPT DO YOU WISH TO PLAY&quot; 
            READ FORMAT SNUMB,SCRIPT 
            LIST.(TEST) 
            LIST.(INPUT) 
            LIST.(OUTPUT) 
            LIST.(JUNK) 
            LIMIT=1 
            LSSCPY.(THREAD.(INPUT,SCRIPT),JUNK) 
            MTLIST.(INPUT) 
            Through MLST, FOR I=1,1, I &gt; 4 
MLST        LIST.(MYTRAN(I)) 
            MINE=0 
            LIST.(MYLIST) 
            Through KEYLST, FOR I=0,1, 1 &gt; 32 
KEYLST      LIST.(KEY(I)) 

           R* * * * * * * * * * READ NEW SCRIPT 

BEGIN       MTLIST.(INPUT) 
            NODLST.(INPUT) 
            LISTRD.(INPUT,SCRIPT) 
            Whenever LISTMT.(INPUT) = 0 
                TXTPRT.(JUNK,0) 
                MTLIST.(JUNK) 
                Transfer To START 
            End Conditional 
            Whenever TOP.(INPUT) = &quot;NONE&quot; 
                NEWTOP.(LSSCPY.(INPUT,LIST.(9)),KEY(32)) 
                Transfer To BEGIN 
               OR Whenever TOP.(INPUT) = &quot;MEMORY&quot; 
                POPTOP.(INPUT) 
                MEMORY=POPTOP.(INPUT) 
                Through MEM, FOR I=1,1, I &gt; 4 
MEM             LSSCPY.(POPTOP.(INPUT),MYTRAN(I)) 
                Transfer To BEGIN 
               Otherwise 
                NEWBOT.(LSSCPY.(INPUT,LIST.(9)),KEY(HASH.(TOP.(INPUT),5))) 
                Transfer To BEGIN 
            End Conditional 

           R* * * * * * * * * * BEGIN MAJOR LOOP 

START       TREAD.(MTLIST.(INPUT),0) 
            KEYWRD=0 
            PREDNC=0 
            LIMIT=LIMIT+1 
            Whenever LIMIT = 5, LIMIT=1 
            Whenever LISTMT.(INPUT) = 0, Transfer To ENDPLA 
            IT=0 
            Whenever TOP.(INPUT) = &quot;+&quot; 
                CHANGE.(KEY,MYTRAN) 
                Transfer To START 
            End Conditional 
            Whenever TOP.(INPUT) = &quot;*&quot;, Transfer To NEWLST 
            S=SEQRDR.(INPUT) 
NOTYET      Whenever S &lt; 0 
                SEQLR.(S,F) 
                Transfer To NOTYET 
               Otherwise 
                WORD=SEQLR.(S,F) 
                Whenever WORD = &quot;.&quot; OR WORD = &quot;,&quot; OR WORD = &quot;BUT&quot; 
                    Whenever IT = 0 
                        NULSTL.(INPUT,LSPNTR.(S),JUNK) 
                        MTLIST.(JUNK) 
                        Transfer To NOTYET 
                       Otherwise 
                        NULSTR.(INPUT,LSPNTR.(S),JUNK) 
                        MTLIST.(JUNK) 
                        Transfer To ENDTXT 
                       End Conditional 
                    End Conditional 
                End Conditional 
                Whenever F &gt; 0, Transfer To ENDTXT 
                I=HASH.(WORD,5) 
                SCANER=SEQRDR.(KEY(I)) 
                SF=0 
                Through SEARCH, FOR J=0,0, SF &gt; 0 
                CAND= SEQLR.(SCANRE,SF) 
                Whenever SF &gt; 0, Transfer To NOTYET 
SEARCH          Whenever TOP.(CAND) = WORD, Transfer To KEYFND 
KEYFND          READER=TESTS.(CAND,S) 
                Whenever READER = 0, Transfer To NOTYET 
                Whenever LSTNAM.(CAND) &lt;&gt; 0 
                    DL=LSTNAM.(CAND) 
SEQ                 Whenever S &lt; 0 
                        SEQLR.(S,F) 
                        Transfer To SEQ 
                       Otherwise 
                        NEWTOP.(DL,LSPNTR.(S)) 
                    End Conditional 
                   Otherwise 
                End Conditional 
                NEXT=SEQLR.(READER,FR) 
                Whenever FR &gt; 0, Transfer To NOTYET 
                Whenever IT = 0 AND FR = 0 
PLCKEY              IT=READER 
                    KEYWRD=WORD 
                   OR Whenever FR &lt; 0 AND NEXT &gt; PREDNC 
                    PREDNC=NEXT 
                    NEXT=SEQLR.(READER,FR) 
                    Transfer To PLCKEY 
                   Otherwise 
                    Transfer To NOTYET 
                End Conditional 
                Transfer To NOTYET 

               R* * * * * * * * * * END OF MAJOR LOOP 

ENDTXT          Whenever IT = 0 
                    Whenever LIMIT = 4 AND LISTMT.(MYLIST) &lt;&gt; 0 
                        OUT=POPTOP.(MYLIST) 
                        TXTPRT.(OUT,0) 
                        IRALST.(OUT) 
                        Transfer To START 
                       Otherwise 
                        ES=BOT.(TOP.(KEY(32))) 
                        Transfer To TRY 
                    End Conditional 
                   OR Whenever KEYWRD = MEMORY 
                    I=HASH.(BOT.(INPUT),2)+1 
                    NEWBOT.(REGEL.(MYTRAN(I),INPUT,LIST.(MINE)),MYLIST) 
                    SEQLL.(IT,FR) 
                    Transfer To MATCH 
                   Otherwise 
                    SEQLL.(IT,FR) 

               R* * * * * * * * * * MATCHING ROUTINE 

MATCH               ES=SEQLR.(IT,FR) 
                    Whenever TOP.(ES) = &quot;=&quot; 
                        S=SEQRDR.(ES) 
                        SEQLR.(S,F) 
                        WORD=SEQLR.(S,F) 
                        I=HASH.(WORD,5) 
                        SCANER=SEQRDR.(KEY(I)) 
SCAN                    ITS=SEQLR.(SCANER,F) 
                        Whenever F &gt; 0, Transfer To NOMATCH(LIMIT) 
                        Whenever WORD = TOP.(ITS) 
                            S=SEQRDR.(ITS) 
SCANI                       ES=SEQLR.(S,F) 
                            Whenever F &lt;&gt; 0, Transfer To SCANI 
                            IT=S 
                            Transfer To TRY 
                        Otherwise 
                            Transfer To SCAN 
                        End Conditional 
                    End Conditional 
                    Whenever FR &gt; 0, Transfer To NOMATCH(LIMIT) 

;;; A lot of the core work is done by the complex SLIP matching and
;;; rebuilding functions YMATCH and ASSMBL (see the latter at HIT)
;;; These are described on pages 62L28-29 of the SLIP manual:
;;;    https://drive.google.com/file/d/1XtF7EM1KhwMPKsp5t6F0gwN-8LsNDPOl

TRY                 Whenever YMATCH.(TOP.(ES),INPUT,MTLIST.(TEST)) = 0,Transfer To MATCH 
                    ESRDR=SEQRDR.(ES) 
                    SEQLR.(ESRDR,ESF) 
                    POINT=SEQLR.(ESRDR,ESF) 
                    POINTR=LSPNTR.(ESRDR) 
                    Whenever ESF = 0 
                        NEWBOT.(1,POINTR) 
                        TRANS=POINT 
                        Transfer To HIT 
                       Otherwise 
                        Through FNDHIT,FOR I=0,1, I &gt; POINT 
FNDHIT                  TRANS=SEQLR.(ESRDR,ESF) 
                        Whenever ESF &gt; 0 
                            SEQLR.(ESRDR,ESF) 
                            SEQLR.(ESRDR,ESF)                                   
                            TRANS=SEQLR.(ESRDR,ESF) 
                            SUBST.(1,POINTR) 
                            Transfer To HIT 
                           Otherwise 
                            SUBST.(POINT+1,POINTR) 
                            Transfer To HIT 
                        End Conditional 
                    End Conditional 
HIT                 TXTPRT.(ASSMBL.(TRANS,TEST,MTLIST.(OUTPUT)),0)  ;; See above, re SLIP functions YMATCH and ASSMBL
                    Transfer To START 
                End Conditional 

               R* * * * * * * * * * INSERT NEW KEYWORD LIST 

NEWLST          POPTOP.(INPUT) 
                NEWBOT.(LSSCPY.(INPUT,LIST.(9)),KEY(HASH.(TOP.(INPUT),5))) 
                Transfer To START 

               R* * * * * * * * * * DUMP REVISED SCRIPT 

ENDPLA          PRINT COMMENT &quot;WHAT IS TO BE THE NUMBER OF THE NEW SCRIPT&quot; 
                READ FORMAT SNUMB,SCRIPT 
                LPRINT.(INPUT,SCRIPT) 
                NEWTOP.(MEMORY,MTLIST.(OUTPUT)) 
                NEWTOP.(&quot;MEMORY&quot;,OUTPUT) 
                Through DUMP, FOR I=1,1 I &gt; 4 
DUMP            NEWBOT.(MYTRAN(I),OUTPUT) 
                LPRINT.(OUTPUT,SCRIPT) 
                MTLIST.(OUTPUT) 
                Through WRITE, FOR I=0,1, I &gt; 32 
POPMOR          Whenever LISTMT.(KEY(I)) = 0, Transfer To WRITE 
                LPRINT.(POPTOP.(KEY(I)),SCRIPT) 
                Transfer To POPMOR 
WRITE           CONTINUE 
                LPRINT.(MTLIST.(INPUT),SCRIPT) 
                EXIT. 

               R* * * * * * * * * * SCRIPT ERROR EXIT 

NOMATCH(1)      PRINT COMMENT &quot;PLEASE CONTINUE &quot; 
                Transfer To START 
NOMATCH(2)      PRINT COMMENT &quot;HMMM &quot; 
                Transfer To START 
NOMATCH(3)      PRINT COMMENT &quot;GO ON , PLEASE &quot; 
                Transfer To START 
NOMATCH(4)      PRINT COMMENT &quot;I SEE &quot; 
                Transfer To START 
                VECTOR VALUES SNUMB= &quot;I3 * &quot; 
                End of Program 
</code></pre></div>

<p>An excerpt from that site will serve to introduce this critique:</p>

<blockquote><div>
  <p>​Weizenbaum's ELIZA was intended as a general conversational agent. It interpreted a separate, domain-specific script that determined its style of conversation. The most well-known script is called "DOCTOR". This is the one that carries on the well-known "Rogerian therapist" conversations that appear in Weizenbaum's 1966 paper, and which are most commonly associated with the name "ELIZA." Indeed, the name "ELIZA" has basically come to mean “the ELIZA agent running the DOCTOR script.”  ELIZA can be seen as the precursor of many of the conversation interfaces and chatbots that we have become so familiar with today, such as Siri, but it worked over a much more clunky typewriter-based console.</p>
  
  <p>As Weizenbaum (1967: 475) explains, “From one point of view, an ELIZA script is a program and ELIZA itself an interpreter. From another perspective, ELIZA appears as an actor who must depend on a script for his [sic] lines. The script determines the contextual framework within which ELIZA may be expected to converse plausibly." In Contextual Understanding by Computers (CACM, 1967), he writes: "The first program to which I wish to call attention is a particular member of a family of programs which has come to be known as DOCTOR. The family name of these programs is ELIZA. This name was chosen because these programs, like the Eliza of Pygmalion fame, can be taught to speak increasingly well. DOCTOR causes ELIZA to respond roughly as would certain psychotherapists (Rogerians). ELIZA performs best when its human correspondent is initially instructed to "talk" to it, via the typewriter, of course, just as one would to a psychiatrist."</p>
  
  <p>However, only the DOCTOR script appears in the 1966 paper, not the ELIZA code that interprets that script, although the algorithm is described in great detail. The common misconception that ELIZA was originally written in Lisp arose because shortly after the publication of the 1966 paper, Bernie Cosell wrote a version in Lisp, based upon the description of the algorithm in the 1966 paper. Cosell's version used a version of the published script, but he never saw the original ELIZA.</p>
  
  <p>The language of the original ELIZA was MAD-SLIP. MAD (“Michigan Algorithm Decoder”) was an Algol-like programming language, available on many common large computers at that time. Weizenbaum added list-processing functionality to MAD, creating MAD-SLIP ("Symmetric LIst Processor," Weizenbaum, 1963), a Lisp-like language embedded in MAD. Indeed, adding to the Lisp/MAD-SLIP confusion, the published DOCTOR script, an appendix to the 1966 paper, is parenthesized exactly like a Lisp S-expression, the way one would if one were writing a Lisp program.** Here is a copy of the MAD-SLIP manual from a long, 3-volume 7090 user's guide.</p>
</div></blockquote>
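<p>For readers who have not yet braved the MAD-SLIP, the script-interpreter architecture described in the excerpt can be sketched in a few lines of modern JavaScript. This is emphatically NOT Weizenbaum's code, and the keywords, ranks, and rules below are invented for illustration; it only shows the shape of the idea from the 1966 paper: ranked keywords, decomposition patterns, and reassembly rules, with a content-free fallback when nothing matches.</p>

```javascript
// Illustrative modern sketch of a ranked-keyword script interpreter.
// Invented data; compare the KEYWRD/PREDNC bookkeeping in the major
// loop and the NOMATCH(1..4) fallbacks in the original listing.
const script = [
  { keyword: 'mother', rank: 2,
    rules: [{ match: /my mother (.*)/, reply: 'Tell me more about your family.' }] },
  { keyword: 'i am', rank: 1,
    rules: [{ match: /i am (.*)/, reply: 'How long have you been $1?' }] },
];

function respond(input) {
  const text = input.toLowerCase();
  // Collect every keyword present in the input, highest rank first.
  const hits = script
    .filter(function (entry) { return text.includes(entry.keyword); })
    .sort(function (a, b) { return b.rank - a.rank; });
  for (const entry of hits) {
    for (const rule of entry.rules) {
      const m = text.match(rule.match);
      // Reassembly: splice the matched fragment into the response.
      if (m) return rule.reply.replace('$1', m[1]);
    }
  }
  // No keyword matched: fall back to a content-free prompt.
  return 'Please go on.';
}

console.log(respond('I am sad')); // 'How long have you been sad?'
```

<p>Everything interesting lives in the script data, not the engine, which is precisely Weizenbaum's point about ELIZA being an interpreter and DOCTOR being one script among many.</p>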

<p>We can probably work almost entirely from the code transcript by Anthony Hay, which can be found <a rel="nofollow" href="https://github.com/jeffshrager/elizagen.org/blob/master/1965_Weizenbaum_MAD-SLIP/MAD-SLIP_transcription.txt" title="[in this GitHub repo]">[in this GitHub repo]</a>. It can be checked against <a rel="nofollow" href="https://drive.google.com/file/d/1DkdV2o-36mm3x2nURjhKiCaFcjZtMIoI/view" title="[the original printout]">[the original printout]</a> discovered in the MIT archives (also linked from <a rel="nofollow" href="https://sites.google.com/view/elizagen-org/the-original-eliza?authuser=0" title="[the elizagen.org page]">[the elizagen.org page]</a>). I can also provide a high-res copy upon request, although it's huge and page-by-page, and I don't think it adds much more than can be seen in the lo-res printout, unless you are looking for traces of clam in the paper.</p>

<p>Further helpful resources are linked from <a rel="nofollow" href="https://sites.google.com/view/elizagen-org/the-original-eliza?authuser=0" title="[the elizagen.org page]">[the elizagen.org page]</a>, including <br />
<a rel="nofollow" href="https://drive.google.com/file/d/1XtF7EM1KhwMPKsp5t6F0gwN-8LsNDPOl/view" title="[a SLIP manual]">[a SLIP manual]</a> and <a rel="nofollow" href="https://babel.hathitrust.org/cgi/pt?id=mdp.39015021689271&amp;view=1up&amp;seq=5&amp;skin=2021" title="[a primer on the MAD language]">[a primer on the MAD language]</a>. I am trying to invite experts on both MAD and SLIP into the conversation. Mark and Jeremy have partially OCR'd/transcribed the SLIP manual, but it's difficult. If folks decide it's necessary, we were thinking of crowdsourcing that work among those participating in this thread.</p>

<p>Here are some preliminary discussion questions:</p>

<ol>
<li>What does the SLIP code tell us about the functioning of ELIZA?</li>
<li>How does Weizenbaum take advantage of SLIP’s features? (Note that he wrote SLIP, originally as an add-on to Fortran, and then MAD.)</li>
<li>In what way does the DOCTOR script make use of the affordances of this code?</li>
<li>Does the DOCTOR script merely use or complete this system?</li>
<li>In what way does this code set up a system for a variety of scripts?</li>
<li>What does this code tell us about the affordances or limitations of SLIP?</li>
<li>How does MAD-SLIP facilitate or inhibit the creation of ELIZA and DOCTOR?</li>
</ol>

<p>And some more ELIZA-related questions:<br />
1. How (well) does/did ELIZA implement a "talking cure"?<br />
2. What signs of ELIZA’s heritage can be seen in contemporary conversational interfaces, such as Siri, Alexa, <a rel="nofollow" href="https://en.wikipedia.org/wiki/Artificial_Linguistic_Internet_Computer_Entity" title="ALICE">[ALICE]</a>, and perhaps especially <a rel="nofollow" href="https://woebothealth.com" title="woebot">[woebot]</a>? (One might recall <a rel="nofollow" href="https://www.theatlantic.com/technology/archive/2014/06/when-parry-met-eliza-a-ridiculous-chatbot-conversation-from-1972/372428/" title="the famous transcript of a discussion between ELIZA and PARRY">[the famous transcript of a discussion between ELIZA and PARRY]</a>, an ELIZA knock-off that simulated a paranoid patient.)<br />
3. The code appears to have a "teaching" mode that allows the interlocutor to teach ELIZA new responses. This has, so far as I know, never been significantly commented upon in discussions of ELIZA. Does this change the way we think of Weizenbaum's project?</p>

<p>In excited anticipation; Your humble co-discussion-leader,<br />
'Jeff</p>
]]>
        </description>
    </item>
    <item>
        <title>Arena, Daggerfall, and the Pitfalls of Procedurally Generated Names (Code Critique 2022)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/134/arena-daggerfall-and-the-pitfalls-of-procedurally-generated-names-code-critique-2022</link>
        <pubDate>Wed, 09 Feb 2022 14:43:44 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>JoeyJones</dc:creator>
        <guid isPermaLink="false">134@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Title: Arena / Daggerfall Unity<br />
Authors: Bethesda Softworks / Daggerfall Workshop<br />
Language: C#<br />
Years of development: 1994 / 2021</p>

<p>In 1994, Bethesda Softworks released <a rel="nofollow" href="https://elderscrolls.bethesda.net/en/arena" title="Arena">Arena</a>, a computer roleplaying game with a large world full of procedurally generated characters and locations. The content generation included creating culturally appropriate names for people that you meet. These names would be given when you talk to them, like so:</p>

<p><img src="https://wg.criticalcodestudies.com/uploads/editor/hi/2c4qfrad4pee.png" alt="" title="" /></p>

<p>Each name is generated according to a different set of rules. For the European-inspired fantasy races (Bretons, Nords, Imperials), this involves the simple combination of a prefix and a suffix. For example, male Breton names have 169 variations, each a combination of one of <a rel="nofollow" href="https://en.uesp.net/wiki/Lore:Breton_Names" title="13 suffixes and 13 prefixes">13 prefixes and 13 suffixes</a>:</p>

<blockquote><div>
  <p>The 13 prefixes for male Breton names are: Agr-, Alab-, And-, Bed-, Dun-, Edw-, Gond-, Mord-, Per-, Rod-, Theod-, Trist-, and Uth-.<br />
  The 13 suffixes for male Breton names are: -ane, -ard, -astyr, -istair, -istyr, -ore, -oryan, -yctor, -yn, -ynak, -yrick, -yval, and -ywyr.</p>
</div></blockquote>
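<p>The Breton scheme can be sketched in a few lines of Python (my own hypothetical reconstruction, not the game's actual code), which confirms the 13 &times; 13 = 169 combinations:</p>

```python
import itertools

# The 13 male Breton prefixes and 13 suffixes quoted above
prefixes = ["Agr", "Alab", "And", "Bed", "Dun", "Edw", "Gond",
            "Mord", "Per", "Rod", "Theod", "Trist", "Uth"]
suffixes = ["ane", "ard", "astyr", "istair", "istyr", "ore", "oryan",
            "yctor", "yn", "ynak", "yrick", "yval", "ywyr"]

# Every possible male Breton name is one prefix glued to one suffix
names = [p + s for p, s in itertools.product(prefixes, suffixes)]

print(len(names))  # 169 possible names
```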

<p>This is guaranteed to yield comprehensible fantasy names like "Alabard", "Theodyrick", or "Peryn". For the Redguard race, however, the name-generation algorithm is a bit more complicated. Here is the code for generating names as recreated for Daggerfall Unity, a project that recreates Daggerfall, the sequel to Arena; it faithfully uses the same set of rules and word lists as the original Arena.</p>

<pre><code> // Gets random Redguard name which follows 0+1+2+3(75%) (male), 0+1+2+4 (female) pattern
        string GetRandomRedguardName(NameBank nameBank, Genders gender)
        {
            // Get set parts
            string[] partsA, partsB, partsC, partsD;
            if (gender == Genders.Male)
            {
                partsA = nameBank.sets[0].parts;
                partsB = nameBank.sets[1].parts;
                partsC = nameBank.sets[2].parts;
                partsD = nameBank.sets[3].parts;
            }
            else
            {
                partsA = nameBank.sets[0].parts;
                partsB = nameBank.sets[1].parts;
                partsC = nameBank.sets[2].parts;
                partsD = nameBank.sets[4].parts;
            }

            // Generate strings
            uint index = DFRandom.rand() % (uint)partsA.Length;
            string stringA = partsA[index];

            index = DFRandom.rand() % (uint)partsB.Length;
            string stringB = partsB[index];

            index = DFRandom.rand() % (uint)partsC.Length;
            string stringC = partsC[index];

            string stringD = string.Empty;
            if (gender == Genders.Female || (DFRandom.rand() % 100 &lt; 75))
            {
                index = DFRandom.rand() % (uint)partsD.Length;
                stringD = partsD[index];
            }

            return stringA + stringB + stringC + stringD;
        }
</code></pre>

<p>The strings referred to here can be seen in this <a rel="nofollow" href="https://github.com/Interkarma/daggerfall-unity/blob/master/Assets/Resources/NameGen.txt" title="file">file</a>. The code works as described on the <a rel="nofollow" href="https://en.uesp.net/wiki/Lore:Redguard_Names" title="unofficial Elder Scrolls wiki">unofficial Elder Scrolls wiki</a>:</p>

<blockquote><div>
  <p>Names for male Redguards in Arena consist of a prefix followed by a vowel followed by a consonant, sometimes followed by a suffix. Redguards have no surnames.<br />
  The 43 prefixes for male Redguard names are: B, Ba, Bl, Br, C, Ca, Ch, Cr, D, Dh, F, Fh, Fl, Fr, G, Gh, Gl, Gr, K, Kh, Kl, Kr, L, Lh, M, Ma, Mh, N, Nh, R, Rh, Rl, S, Sa, Sh, Shr, Sl, St, T, Th, Tl, V, Vl<br />
  The 5 vowels for male Redguard names are: a, e, i, o, u<br />
  The 15 consonants for male Redguard names are: b, c, d, h, j, k, l, m, n, p, r, s, t, v, z<br />
  The 22 suffixes for male Redguard names are: am, an, ar, e, em, en, er, im, in, ir, ke, 'kern, om, on, rn, t, ta, te, ten, um, un, ur</p>
</div></blockquote>
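<p>For comparison, here is a Python sketch of the male pattern described above (mine, not Bethesda's or Daggerfall Workshop's code): prefix + vowel + consonant, then a suffix 75% of the time. Counting the with-suffix and without-suffix variants also reproduces the total of 74,175 possible male names:</p>

```python
import random

# Part lists quoted above from the unofficial Elder Scrolls wiki
prefixes = ["B","Ba","Bl","Br","C","Ca","Ch","Cr","D","Dh","F","Fh","Fl","Fr",
            "G","Gh","Gl","Gr","K","Kh","Kl","Kr","L","Lh","M","Ma","Mh","N",
            "Nh","R","Rh","Rl","S","Sa","Sh","Shr","Sl","St","T","Th","Tl",
            "V","Vl"]
vowels = list("aeiou")
consonants = list("bcdhjklmnprstvz")
suffixes = ["am","an","ar","e","em","en","er","im","in","ir","ke","'kern",
            "om","on","rn","t","ta","te","ten","um","un","ur"]

def random_male_redguard_name(rng=random):
    """Prefix + vowel + consonant, then a suffix 75% of the time."""
    name = rng.choice(prefixes) + rng.choice(vowels) + rng.choice(consonants)
    if rng.random() < 0.75:
        name += rng.choice(suffixes)
    return name

# Distinct male names: each prefix/vowel/consonant core, with 22 suffixes or none
total = len(prefixes) * len(vowels) * len(consonants) * (len(suffixes) + 1)
print(total)  # 74175
```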

<p>The Redguards, in their appearance, dress and architecture, are clearly inspired to an extent by Arabic and North African cultures. However, in attempting to make a generator that generates legible and 'Redguard-sounding' names, among the many possible names are several which have absurd or offensive meanings in English and other languages.</p>

<p>For example, this character's name is particularly apt in Portuguese:</p>

<p><img src="https://wg.criticalcodestudies.com/uploads/editor/17/ty69lumxo7tp.png" alt="" title="" /></p>

<p>Following the algorithm (picking from the lists in order: a prefix, a vowel, a consonant and then optionally a suffix), you can easily form a number of words that wouldn't be appropriate to label someone, including racial epithets or words close to them.</p>

<p>Given the huge range of possible names (74,175 male names), it's quite possible most players wouldn't see anything untoward, though some would. After all, the screenshot above is from my own playthrough: that character was the seventh person I spoke to.</p>

<p>What would have been a more anti-racist way of procedurally generating fantasy names?</p>

<p>In later instalments of the series, certain Redguard name prefixes and suffixes took on specific meanings, inspired by Arabic naming conventions. Instead of stringing together non-meaningful combinations of letters, the developers could alternatively have taken the approach of using larger name components, as seen with many of the other cultures' names in the game.</p>
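<p>One simple mitigation, sketched here in Python purely as an illustration (nothing like it appears in the actual games), is rejection sampling against a blocklist: keep regenerating until the name contains no listed substring. A blocklist can never anticipate every offensive string in every language, which is itself part of the critique:</p>

```python
import random

def generate_filtered_name(generate, blocklist, max_tries=100):
    """Call generate() until the result contains no blocklisted substring."""
    for _ in range(max_tries):
        name = generate()
        lowered = name.lower()
        if not any(bad in lowered for bad in blocklist):
            return name
    raise RuntimeError("could not generate an acceptable name")

# Toy example with a hypothetical generator and a tiny blocklist
blocklist = {"dumb", "crap"}
rng = random.Random(0)
name = generate_filtered_name(
    lambda: rng.choice(["Craper", "Khazan", "Dumbar", "Shiren"]),
    blocklist,
)
print(name)  # prints a name containing no blocklisted substring
```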

<p>And should projects that seek to recreate and restore old games, like Daggerfall Unity, reproduce code that can produce derogatory and racist names?</p>
]]>
        </description>
    </item>
    <item>
        <title>Cybernetics and “Command and Control” in kubectl (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/125/cybernetics-and-command-and-control-in-kubectl-2022-code-critique</link>
        <pubDate>Wed, 02 Feb 2022 01:19:18 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>gregorybringman</dc:creator>
        <guid isPermaLink="false">125@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><strong>Code</strong>: kubectl - Kubernetes command line interface<br />
<strong>Author</strong>: Google<br />
<strong>Date</strong>: Current open source framework.<br />
<strong>Github</strong>: <a href="https://github.com/kubernetes/kubernetes" rel="nofollow">https://github.com/kubernetes/kubernetes</a></p>

<p>“Kubectl” or “Kube Control” is the command-line interface for Kubernetes, or K8S, the server orchestration framework originally developed by Google. It is used to manage software principally in the cloud, where it can scale up software applications based upon a specification or manifest, as well as via monitoring the live expenditure of computing resources. The command-line interface works on top of these deployment definitions, much like the Unix command-line allows a user to interrogate the state of a runtime system.</p>

<p>K8S’ name and its history at Google are problematic, though: “Kubernetes” is a variant of the root word from which we get the "Cybernetics" of Norbert Wiener, and an early name of the platform was "Borg", after the Star Trek aliens who assimilate/colonize other alien civilizations and boldly announce to them that "resistance is futile”. In addition, the architecture of the platform demarcates "master" nodes that run within a "control plane" and "worker" nodes that host and run applications. The fundamental unit of the nodes is the “pod,” a collection of application container environments running code. Especially with the name of the command-line interface suffixed with -ctl, an abbreviation for “control”, the K8S platform seems to be a direct embrace of the discipline of cybernetics as “command and control”.</p>

<p>That said, K8S is powerful, and could be seen to facilitate exploratory programming, coding ateliers, and Marxian dreams of allowing proletarian developers to instantly instantiate a technological and productive base that determines how political and cultural bodies are represented. In particular, two powerful features of Kubernetes/kubectl are namespaces and labels, allowing a developer to organize/orchestrate server resources, as the so-called helmsperson.</p>

<p>Labels more specifically allow search-engine-like queries on these resources, so that a developer could summon all components running their app via just a few commands—despite the fact that this app would be built from nodes of worker pods in a vast homogeneous sea of other worker pods.</p>

<p>For example, labels can be used to group pods by application name or release type as demonstrated by Marko Lukša in his Manning guide to K8S, but the labels can be anything. With kubectl, I can create a YAML manifest:</p>

<pre><code>apiVersion: v1
kind: Pod
metadata:
  name: myApp-cloud-001
  labels:
    app: picturebook
    rel: beta
spec:
  containers:
  - name: myApp-Container
    image: myApp-w-nodeJS
    ports:
    - containerPort: 8080
...
</code></pre>

<p>And I then can create a cloud cluster, replicating nodes with pods by passing the manifest file above as an argument:</p>

<p><code>kubectl apply -f pod.myapp.yaml </code></p>

<p>Then, after K8S has spun up my app, potentially distributing it across multiple pods automatically, I can summon these pods with a command like the following, specifying which ones I want to fetch via the labels I've defined:</p>

<p><code>kubectl get po -L app,rel</code></p>

<p>Or:</p>

<p><code>kubectl get pods -l app=taxes,rel=beta</code></p>

<p>Or another query using kubectl:</p>

<p><code>kubectl get pods -l 'app in (picturebook, taxes)' -L app</code></p>

<p>These searchable labels exemplify inaugural cybernetics: "labelling" has been at the center of the development of programming since World War II, whenever we speak of variables, identifiers, and keywords in code. First-generation cybernetics formalized symbolic language within the same cultural and disciplinary space as Lacanian psychoanalysis; a kubectl label is likewise a use of language become absolute and formal, integrated into societal controls. K8S labels embody applied programming, but also command and control, since the entire lifecycle of an application is automated through selectors (as in the queries above) on these labels. Note too that, by scaling up, scaling down, and restarting pods and containers, a given application never terminates, in true Borg fashion.</p>

<p>If, at the birth of Unix, a command to list the file contents of a single computer made the human-computer interface operative, kubectl has multiplied this system interrogation across thousands of machines simultaneously, treated as one giant distributed machine. The “power” of the command-line is propagated to kubectl, which can also wrap the operating system command-line in serverless instructions that produce a command result while bypassing runtime systems altogether in some cases**. In effect, in the figure of the software platform K8S, the metaphor of command and control in its specific post-war flavor has been resurrected. But there are many hidden costs to this magic-seeming orchestration across thousands of machines, whether environmental or related to the denial of death and materiality.</p>

<p>As a result, the wealth of <em>critiques</em> of cybernetics, even those from its own age, must be applied to cloud computing. Deleuze and Guattari are helpful here in offering better metaphors for technology: the notion of the machinic critiques the Lacanian signifier-signified relation assumed by the Western labelling subject, without abandoning technological metaphors. Gilbert Simondon’s work is relevant too: in his notions of individuation and mechanology, the value of technological artifacts in their “aseity” is deferred until interested actors, readers, and cultural laborers productively articulate their interconnections.</p>

<p>As it turns out, one of the lesser known features of K8S is its affinity and anti-affinity definitions, which can be added to a deployment specification or manifest. An affinity definition is a set of numerically weighted rules specifying the hardware and system conditions under which containers and pods should group themselves together, and how pod instances should re-form when one or more of them is temporarily terminated and replaced by other pods. No doubt, this feature is a tool of automation still ultimately tied to command and control. But could the general notion of affinity, rather than a helmsperson, offer a way out of current K8S metaphors?</p>
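<p>As a sketch, an affinity rule in a pod manifest looks something like the following (the field names follow the current Kubernetes API; the label values are hypothetical). It asks the scheduler to prefer nodes already running pods labelled <code>app: picturebook</code>:</p>

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: affine-pod
spec:
  affinity:
    podAffinity:
      preferredDuringSchedulingIgnoredDuringExecution:
      - weight: 80          # numeric preference, not a hard requirement
        podAffinityTerm:
          labelSelector:
            matchLabels:
              app: picturebook
          topologyKey: kubernetes.io/hostname
  containers:
  - name: main
    image: myApp-w-nodeJS
```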

<p>Questions</p>

<ol>
<li>What should Kubernetes/kubectl be called instead? :-)</li>
<li>What would Kubernetes look like if it were built from the ground up on affinities, which would be its raison d’être? What if corporate technology were forced to build its systems on “affinity machines”?</li>
<li>What are some ways “always running” systems are problematic? Does Kubernetes enforce a status quo in its highly programmatically deterministic specifications and manifests?</li>
</ol>

<p>** For example, a command to look up a DNS SRV record for a service: <br />
        <code>kubectl run -it srvlookup --image=tutum/dnsutils --rm --restart=Never -- dig SRV taxes.default.svc.cluster.local</code>, after Lukša. <br />
        The <code>--rm</code> flag deletes the pod once the command finishes, but the command still produces its output.</p>
]]>
        </description>
    </item>
    <item>
        <title>Kittler's Raytracer -- Code Critique</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/126/kittlers-raytracer-code-critique</link>
        <pubDate>Sat, 05 Feb 2022 02:15:05 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>markcmarino</dc:creator>
        <guid isPermaLink="false">126@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><strong>Author: Friedrich Kittler<br />
Year: early 1980s-2000s <br />
Hardware: Pentium IV,<br />
x86 family of processors<br />
x87 family of floating point coprocessors<br />
Languages: C and Assembly</strong></p>

<pre><code>/*
v. 3.57
31.07.11
PTRACE.PAS (c't 1/93, 167ff.), extended
Super ellipsoid, rotational body and procedural textures from povray3
DOS version no longer supported

compile:
Normal: xgraf xsuptrace ray.s matrices.s
// Option -DNEWFRESNEL: simplified Fresnel light from povray3
Option -DJITTER: fuzzy (and time-consuming) shadows
Option -DGAMMA: gamma correction
SVGALIB or DGA: change &quot;.equ XWINDOW, 1&quot; in picture.i!

RUNTIME:
&lt;xsuptrace 1&gt;: reproducible noise for runtime tests

FEATURES:
All reflective surfaces can be switched to ReflectionMapping: what
then appears on their surface is a picture to be loaded. But users
must first request this feature via the standard interfaces.

Prompts (':') for scalars and vectors can either be acknowledged with
&lt;w [eiter]&gt; or &lt;n [on]&gt; or answered with a new input

3 constant objects: sky, hell, ground (which only allows plein-air images)
Any number of variable objects (limited only by memory)
By default, 2 spheres and 1 of each other object are predefined as
variable objects. But the standard objects can be deleted again.
If more than the standard number of an object is required, prompts
request the new coordinates.
All objects are editable (constant ones only by reassigning the color
or normal procedure, variable ones also by location and size).
Some exotic objects are scalable and rotatable. This will be expanded.
Any number of lamps, the first 2 predefined.
Surfaces 1. global, 2. individually assignable after assignment to objects.
LINUX: any size * .24f images can be loaded as 2D textures.

Objects as a ring list for faster intershade () (see Foley, p.
Lamps as a singly linked list.
Left-hand coordinate system: left &lt;x &lt;right, front &lt;y &lt;behind,
below &lt;z &lt;above.

NEW:
01.04.97: AttCol () - acceleration removed - transparencies would be worse
07.04.98: individual Fresnel coefficients with individual dullness for
opaque, but metallic surfaces (so these coefficients TransC and
to overwrite the dullness transparently). That's physically correct
and, mirabile dictu, better than POVRAY3. Color tables (ColTabT) globally
editable
Nov. 22, 1998: ReflV () after Glassner, Image Synthesis, p. 726, again in
Light () calculated per lamp
24.12.98: Object slice selectable; Unzipped DOS - *. 24f files are loadable
08.03.99: Experimental support for Ohta / Maekawa algorithm
01.09.01: DOS no longer supported
13.08.04: Init_Ohta () different, untested
24.10.08: Steel new to /usr/ich/laptop/xsuptrace.c

BUGS:
Bounding spheres of stei, sup, sor determined very empirically
2D mapping on Steiner, Agnesi, SuperQuadrik and Drehkoerper implemented
only as a reflection map: the transformation (x, y, z) -&gt; (u, v) would be hard
MapProc () and Init_Fresnel () are not prepared for multiple calls from xsuptrace
Editing boxes and pyramids still uneven
lambda and thin globally, not variable per surface
No transparency shading as in CALCNEW.C
For floating objects, NULL pointer errors are inevitable; change
a midpoint coordinate by small amounts
SOR orb is detected in Edit_Sor () instead of Gravity ()
Refractive indices do not depend on wavelengths of light
08.04.99: intershade () returns L-&gt; Shad correctly, but the object cohesion
breaks again and again, slowing things down further
30.12.00: xgraf (gcc with optimization) can falsify the SOR curve if
in SorInput the difference quotient Dy / Dx (i.e. the slope between
two x fixed points) becomes too large.
09.09.03: Changes to transformation matrices only take effect with gcc -g. Obscure.
31.07.11: QuColProc () dare

HINTS:
Between internal data structures and user displays, complex conversions
take place; so do not patch global data!
*/

#define  SUPTRACE
#define  COLTABSIZE   127       // unter DOS ggf. kleiner     
#define  PIII                   // bei schlechterer CPU dringend aendern!

// CompileTime: Globale Zaehler, bei neuen Objekten oder Oberflaechen erhoehen

#define SurfCnt       27        
#define FormCnt       16

#include &lt;time.h&gt;
#define  SPALTEN 640
#define  ZEILEN  480
#define  RAY
#include &quot;xdefines.h&quot;   // SVGALIB: defines.h 
#include &quot;ray.h&quot;        // SVGALIB: #include bild16m.c || lock16m.c 

extern float Infinit,halb,Epsilon;
VCT3  lambda = { 52.36, 56.6, 68.313 };
float duenne = 0.35, LampDiv = 1.;

// RunTime: Zaehler je Sitzung 
int ObjCnt  = 0;
int RingCnt = 0;
int LampCnt = 0;
int FormCnts[FormCnt] = {0};
int TestFormCnt = FormCnt-1;

#ifdef JITTER
float circle[19][2] = {
  {0.75, 0.433}, { 0.0,0.866},   {-0.75,0.433},{-0.75,-0.433 },{0.0,-0.866 },
  {0.75,-0.433}, { 0.0,0.0  },   { 0.75,0.0  },{0.375, 0.65  },{-0.375,0.65},
  {-0.75,0.0  }, {-0.375,-0.65}, {0.375,-0.65},{0.375,0.216  },{0.0,0.433  },
  {-0.375,0.217},{-0.375,-0.216},{0.0,-0.433 },{0.375,-0.217}};
float jitter_offset = 0.75/((float)RAND_MAX);
#endif

// global, um Debuggen und Fehler() zu erlauben
static uint x,y,RDepth = 0;
// global wegen ray.s
VCTd4  dquart;          
VCT4   quart;
VCT6   bicub;

#ifdef OHTA
typedef struct { VCT3 dist; float costheta; } dirT;
dirT    **direction;
int     lastring;
#endif

int   normal[FormCnt] = {0,1,4,7,2,11,10,7,6,11,3,14,12,5,2,23};
char  ObjStr[FormCnt][32] = {
  &quot;den Himmel          &quot;,
  &quot;die Hoelle          &quot;,
  &quot;den Boden           &quot;,
  &quot;eine Kugel          &quot;,
  &quot;ein Moebiusband     &quot;,
  &quot;eine Steinerflaeche &quot;,
  &quot;einen Torus         &quot;,
  &quot;ein Ellipsoid       &quot;,
  &quot;einen Kegel         &quot;,
  &quot;eine Pyramide       &quot;,
  &quot;einen Quader        &quot;,
  &quot;eine Agnesi         &quot;,             
  &quot;ein Superellipsoid  &quot;,
  &quot;Rotationskoerper    &quot;,
  &quot;eine Scheibe        &quot;,
  &quot;eine Lemniskate     &quot;
  };

FrameT Frame = {
  { 2.0, -8.0, 0.992},  // eye          
  { 0.1,  0.1, 0.1 },   // Ambient   
    0.96, 17,  5, 
    0.01, 0.0,          // AttenEps unbenutzt 
  { 0.50, 0.5, 0.6}, 0};// Fog_Color, Fog 

LampT *Lampen = NULL;
Prims *Boden,*Hoelle,*Himmel,*Ring,*Last;
int   arg = TRUE;

// VORDEFINIERTE OBJEKTE 

kugelT k1   =  {
  { 2.5, -2.7, 1.0 }, 1.0 };

kugelT k2   =  {
  {-0.2, -4.0, 1.3 }, 1.15 };

steiT  stei =  {
  {-2.0, -5.0, 1.0},
  { 1.0,  1.0, 1.0}};

moeT   moeb =  {
  {{ 2.4, 0.5, 4.0 }, 1.4 }, 
  0.25, 1 }; 

torT   tor  =  {
  { 2.9,-5.5, 0.6}, 0.3, 0.1 };

ellT   ell  =  {
  {3.8, -3.9, 1.4 },
  {0.3, -0.2, 0.6 },
  {0,0,0,0,0,0,0,0,0,0}};

coneT cone =  {
  {{ 1.4,-4.5,1.6 }, 0.29146 },
  0.3, 1.4, 0,1,20 }; 

pyrT pyr = {
  {{ 3,4,
  { 1.0, 1.0, 1.0 },    // LINKS - normal
    {{-0.9,-1.3, 2.2 }, // links 
    { 0.4,-2.5, 2.6 },  // Mitte 
    { 0.1,-0.5, 4.2 }}},// oben  
  { 3,4,
    { 1.0, 1.0, 1.0 },  // RECHTS 
    {{ 0.4,-2.5, 2.6 }, // Mitte  
     { 1.3, 0.2, 2.2 }, // rechts 
     { 0.1,-0.5, 4.2 }}},// oben   
  { 3,4,
    { 1.0, 1.0, 1.0 },   // UNTEN  
    {{ 1.3, 0.2, 2.2 },  // rechts 
    { 0.4,-2.5, 2.6 },   // Mitte  
    {-0.9,-1.3, 2.2 }}}, // links  
  { 3,4,
    { 1.0, 1.0, 1.0 },   // HINTEN 
    {{ 0.1,-0.5, 4.2 },  // oben   
    { 1.3, 0.2, 2.2 },   // rechts 
    {-0.9,-1.3, 2.2 }}}},// links  
  0, 0 };                // Polygon,dummy 

agnT agn = {{{-7.3,0.5,3.5},3.}, {-20, 0, 0}, 1.1 };

BoxInput B1 = { -1,1,3.0,1.5,-1,1 };

boxT box1; 

superT sup = {{ -1.5,-1.5,2.4 }, {2.0/0.7,0.7/0.6,2.0/0.6}, 
  {{-1,-1,-1},{1,1,1}},{1,1,1}};

sorInput sorI = {
  7,TRUE,
  {{ 1.80000,-0.100000},
    { 0.118143,0.020000},
  { 0.620253,0.540084},
    { 0.210970,0.827004},
    { 0.194093,0.962025},
  { 0.286920,1.035000},
    { 0.488354,1.150000}}};

discT scheibe = {{{ -1, 2, 2.7 }, 1.8 }, {0.4,-0.8,0.55}, 1, 0.7 };

lemniT lemnis = {{{ 4.1, -0.8, 1.85 }, 1.3 }, { 90., 0., 0. }};

// Vordefinierte Lampen 

LampT L1 = {
  {  4.0, -3.5, 4.5 },
  {  0.95, 0.8, 0.8 },
  {  1.0,  1.0, 1.0 },   
    0.6, NULL, NULL };
LampT L2 = {
  { -2.0, -3.5, 1.5 },  
  {  0.9,  0.9, 1.0 },
  {  1.0,  1.0, 1.0 },
    0.3, NULL, NULL };

// TEXTUREN 

uchar Lattice[LatCnt][LatCnt][LatCnt];

typedef struct { int Nr; VCT3 c; } ColTabT;
typedef VCT3 ColList[COLTABSIZE+1];

ColList HellTab   = {{ 0,0,0 }};
ColList FlameTab  = {{ 0,0,0 }};
ColList MarbleTab = {{ 0,0,0 }};

ColTabT MarbleCols[4]  = {
  {  0, { 0.41,0.25,0.15}},
  { 36, { 0.20,0.43,0.55}},
  { 66, { 0.91,0.93,0.88}},
  {127, { 0.64,0.66,0.55}}};

ColTabT HellCols[4]  = {
  {  0, { 0.7, 0.2, 0.1}},
  { 30, { 0.5, 0.8, 0.2}},
  { 60, { 0.9, 0.7, 1.0}},
  {127, { 0.7, 0.4, 0.0}}}; 

ColTabT FlameCols[6] = {
  {  0, { 1.00,1.00,1.00}},
  { 59, { 0.99,0.77,0.42}},
  { 63, { 0.75,0.80,0.51}},
  { 88, { 0.90,0.59,0.34}}, 
  {108, { 0.69,0.38,0.30}},
  {127, { 0.12,0.19,0.54}}}; 

#include &quot;surfaces.h&quot;

float  h,v;

/*
#ifdef DEBUG
VCT3   c;
ulong  nr_of_rays = 0; // 1199073 
ulong  nr_of_refl = 0; //  574234 
ulong  nr_of_refr = 0; //  438160 
ulong  nr_of_coh  = 0; //   66893
ulong  nr_of_rest = 0; //   42195
#endif
*/

void 
Translate_Object (Prims *Obj, VCT3 *t)
{
  TransformT trans;
  Compute_Translation_Transform (&amp;trans, t);
  Compose_Transforms (&amp;Obj-&gt;transform, &amp;trans);
}

void 
Scale_Object (Prims *Obj, VCT3 *s)
{
  TransformT trans;
  Compute_Scaling_Transform (&amp;trans, s);
  Compose_Transforms (&amp;Obj-&gt;transform, &amp;trans);
}

void 
Rotate_Object (Prims *Obj, VCT3 *r)
{
  TransformT trans;
  Compute_Rotation_Transform (&amp;trans, r);
  Compose_Transforms (&amp;Obj-&gt;transform, &amp;trans);
}

void 
Fehler(void)
{
  blau();
  printf(&quot;%c%s&quot;,7,&quot;Nullvektor!\n&quot;);
  printf(&quot;Spalte: %d, Zeile: %d, Tiefe: %d\n&quot;,x,y,RDepth);
  exit(1);
}

// RAYTRACER-ZUSATZ-FUNKTIONEN 

void 
FogProc (VCT3 *p,VCT3 *c)
{
  float factor;

  if ((factor = exp(-vdist(p,&amp;Frame.eye)/Frame.Fog_Distance)) &lt; 0.2)
    factor = 0.2;
  factor += 0.1 * turbulence(p);
  c-&gt;x = (c-&gt;x - Frame.Fog_Color.x) * factor + Frame.Fog_Color.x;
  c-&gt;y = (c-&gt;y - Frame.Fog_Color.y) * factor + Frame.Fog_Color.y;
  c-&gt;z = (c-&gt;z - Frame.Fog_Color.z) * factor + Frame.Fog_Color.z;
}

void    
Gravity(Prims *Obj) // ermittelt Schwerpunkt und von daher Umkugel 
{
  float  r,s,t;
  ellT   *ep;
  torT   *tp;
  moeT   *mp;  
  coneT  *cp;
  pyrT   *pp;
  steiT  *sp;
  agnT   *ap;
  boxT   *bp;
  superT *up;
  discT  *dp;
  lemniT *lp;
  int   i,j,ganz;             
  VCT3  d[2];

  switch(Obj-&gt;form)
  {
  case moebius:
    mp = Obj-&gt;UU.moeptr;
    r  = halb * mp-&gt;hoch * mp-&gt;hoch * (1 - M_SQRT_2);
    Obj-&gt;umkugel = mp-&gt;ax;
    Obj-&gt;umkugel.m.x += r;
    Obj-&gt;umkugel.m.y += r;
    Obj-&gt;umkugel.rad += mp-&gt;hoch * 2; 
    break;  
  case steiner:
    sp = Obj-&gt;UU.steiptr;
    Obj-&gt;umkugel.m = sp-&gt;m;
    r = fabs(sp-&gt;ax.x);
    s = fabs(sp-&gt;ax.y);
    Obj-&gt;umkugel.rad = t = fabs(sp-&gt;ax.z);
    if (r &gt; s)
      {
      if (r &gt; t)
        Obj-&gt;umkugel.rad = r;
      }
    else
      {
      if (s &gt; t)
        Obj-&gt;umkugel.rad = s;
      }
    break;
  case torus:
    tp = Obj-&gt;UU.torptr;
    Obj-&gt;umkugel.m = tp-&gt;m;
    Obj-&gt;umkugel.rad = tp-&gt;a + tp-&gt;b;
    tp-&gt;a *= tp-&gt;a;
    tp-&gt;b *= tp-&gt;b;
    break;
  case ellipsoid:
    ep = Obj-&gt;UU.ellptr;
    ep-&gt;mat.a = r = 1./(ep-&gt;ax.x * ep-&gt;ax.x);
    ep-&gt;mat.e = s = 1./(ep-&gt;ax.y * ep-&gt;ax.y);
    ep-&gt;mat.h = t = 1./(ep-&gt;ax.z * ep-&gt;ax.z); // Hyperboloid = -h
    ep-&gt;mat.d = -ep-&gt;m.x*r;
    ep-&gt;mat.g = -ep-&gt;m.y*s;
    ep-&gt;mat.i = -ep-&gt;m.z*t;     // Hyperboloid = -i 
    ep-&gt;mat.j = (ep-&gt;m.x*ep-&gt;m.x*r + ep-&gt;m.y*ep-&gt;m.y*s
      + ep-&gt;m.z*ep-&gt;m.z*t)-1;
    Obj-&gt;umkugel.m = ep-&gt;m;
    r = fabs(ep-&gt;ax.x);
    s = fabs(ep-&gt;ax.y);
    Obj-&gt;umkugel.rad = t = fabs(ep-&gt;ax.z);
    if (r &gt; s)
      {
      if (r &gt; t)
        Obj-&gt;umkugel.rad = r;
      }
    else
      {
      if (s &gt; t)
        Obj-&gt;umkugel.rad = s;
      }
    break;
  case kegel:
    cp = Obj-&gt;UU.coneptr;
    r  = cp-&gt;apex.rad * cp-&gt;apex.rad;
    Obj-&gt;umkugel.m = cp-&gt;apex.m;
    Obj-&gt;umkugel.m.z = halb*(r+1) * (cp-&gt;high+cp-&gt;low) - r * cp-&gt;apex.m.z;
    Obj-&gt;umkugel.rad = r * (cp-&gt;apex.m.z-cp-&gt;high) * (cp-&gt;apex.m.z-cp-&gt;high) 
      + (cp-&gt;high-Obj-&gt;umkugel.m.z) * (cp-&gt;high-Obj-&gt;umkugel.m.z);
    Obj-&gt;umkugel.rad = sqrt(Obj-&gt;umkugel.rad);
    break;
  case pyramide: 
    pp = Obj-&gt;UU.pyrptr;
    pp-&gt;planes = 4;
    for (i = 0; i &lt; 4; i++)
      {
      pp-&gt;flaechen[i].vertnum = 3;
      pp-&gt;flaechen[i].polynum = 4;
      }
    ganz = pp-&gt;flaechen[0].vertnum * pp-&gt;planes;
    Obj-&gt;umkugel.rad = 0;
    Obj-&gt;umkugel.m.x = Obj-&gt;umkugel.m.y = Obj-&gt;umkugel.m.z = 0;
    for (i = 0; i &lt; pp-&gt;planes; i++)
      for (j = 0; j &lt; pp-&gt;flaechen[i].vertnum; j++)
        vaddeq(&amp;Obj-&gt;umkugel.m,&amp;pp-&gt;flaechen[i].vtx[j]);
    vscaleeq(&amp;Obj-&gt;umkugel.m,1./(float)ganz);
    for (i = 0; i &lt; pp-&gt;planes; i++)
      for (j = 0; j &lt; pp-&gt;flaechen[i].vertnum; j++)
        if ((r = vdist(&amp;Obj-&gt;umkugel.m,&amp;pp-&gt;flaechen[i].vtx[j])) 
          &gt; Obj-&gt;umkugel.rad)
         Obj-&gt;umkugel.rad = r;
    for (i = 0; i &lt; pp-&gt;planes; i++)
      {
      vsub(d,&amp;pp-&gt;flaechen[i].vtx[1],&amp;pp-&gt;flaechen[i].vtx[0]);
      vsub(&amp;d[1],&amp;pp-&gt;flaechen[i].vtx[2],&amp;pp-&gt;flaechen[i].vtx[1]);
      vcross(&amp;pp-&gt;flaechen[i].normal,d,&amp;d[1]);
      normalize(&amp;pp-&gt;flaechen[i].normal);
      vsubnorm(d,&amp;pp-&gt;flaechen[i].vtx[0],&amp;Obj-&gt;umkugel.m);
      // gegen falsche Reihenfolge von Polygonecken-Eingaben
      negnorm(d,&amp;pp-&gt;flaechen[i].normal);
      }
    break;
  case box:
    bp = Obj-&gt;UU.boxptr;
    bp-&gt;planes = 6;
    for (i = 0; i &lt; 6; i++)
      {
      bp-&gt;flaechen[i].vertnum = 4;
      bp-&gt;flaechen[i].polynum = 6;    
      } 
    ganz = bp-&gt;flaechen[0].vertnum * bp-&gt;planes;
    Obj-&gt;umkugel.rad = 0;
    Obj-&gt;umkugel.m.x = Obj-&gt;umkugel.m.y = Obj-&gt;umkugel.m.z = 0;
    for (i = 0; i &lt; bp-&gt;planes; i++)
       for (j = 0; j &lt; bp-&gt;flaechen[i].vertnum; j++)
        vaddeq(&amp;Obj-&gt;umkugel.m,&amp;bp-&gt;flaechen[i].vtx[j]);
    vscaleeq(&amp;Obj-&gt;umkugel.m,1./(float)ganz);
    for (i = 0; i &lt; bp-&gt;planes; i++)
      for (j = 0; j &lt; bp-&gt;flaechen[i].vertnum; j++)
        if ((r = vdist(&amp;Obj-&gt;umkugel.m,&amp;bp-&gt;flaechen[i].vtx[j]))
          &gt; Obj-&gt;umkugel.rad)
          Obj-&gt;umkugel.rad = r;
    for (i = 0; i &lt; bp-&gt;planes; i++)
      {
      vsub(d,&amp;bp-&gt;flaechen[i].vtx[1],&amp;bp-&gt;flaechen[i].vtx[0]);
      vsub(&amp;d[1],&amp;bp-&gt;flaechen[i].vtx[2],&amp;bp-&gt;flaechen[i].vtx[1]);
      vcross(&amp;bp-&gt;flaechen[i].normal,d,&amp;d[1]);
      normalize(&amp;bp-&gt;flaechen[i].normal);
      vsubnorm(d,&amp;bp-&gt;flaechen[i].vtx[0],&amp;Obj-&gt;umkugel.m);
      // gegen falsche Reihenfolge von Polygonecken-Eingaben 
      negnorm(d,&amp;bp-&gt;flaechen[i].normal);
      }
    break;
  case agnesi:
    ap = Obj-&gt;UU.agnptr;
    Obj-&gt;umkugel.m = ap-&gt;ax.m;
    Obj-&gt;umkugel.rad = ap-&gt;scale*1.2;
    Create_Transform (&amp;Obj-&gt;transform);
    Rotate_Object (Obj,&amp;ap-&gt;rotate);
    d[0].x = d[0].y = d[0].z = ap-&gt;scale;
    Scale_Object(Obj,d);
    Translate_Object (Obj,&amp;ap-&gt;ax.m);
    break;
  case super:
    up = Obj-&gt;UU.supptr;
    Obj-&gt;umkugel.m = up-&gt;m;
    if (up-&gt;scal.x &gt; up-&gt;scal.y)
      {
      if (up-&gt;scal.x &gt; up-&gt;scal.z)
        Obj-&gt;umkugel.rad = up-&gt;scal.x;
      else
        Obj-&gt;umkugel.rad = up-&gt;scal.z;
      }  
    else
      {
      if (up-&gt;scal.y &gt; up-&gt;scal.z)
        Obj-&gt;umkugel.rad = up-&gt;scal.y;
      else
        Obj-&gt;umkugel.rad = up-&gt;scal.z;
      }
    Obj-&gt;umkugel.rad *= 1.7;   // groesser als theoretisches sqrt(2) 
    Create_Transform (&amp;Obj-&gt;transform);
    Scale_Object (Obj,&amp;up-&gt;scal);
    Translate_Object (Obj, &amp;up-&gt;m); 
    break;
  case disc:
    dp = Obj-&gt;UU.discptr;
    Obj-&gt;umkugel = dp-&gt;ax;
    normalize (&amp;dp-&gt;normal);
    dp-&gt;d = -vdot(&amp;dp-&gt;ax.m,&amp;dp-&gt;normal);   // v.Mangold, I,430  
    break;
  case lemni:
    lp = Obj-&gt;UU.lemniptr;
    Obj-&gt;umkugel = lp-&gt;ax;
    Create_Transform (&amp;Obj-&gt;transform);
    Rotate_Object(Obj,&amp;lp-&gt;rotate);       
    Translate_Object(Obj,&amp;lp-&gt;ax.m);
    break;   
  default: ;
  }
}

// Spezielle Schnittpunkt-Routinen 

#define DEPTH_TOLERANCE 1.0e-3  // double 1.0E-4  
#define ZERO_TOLERANCE  1.0e-4  // double 1.0E-10 
#define SGNX(x) (((x) &gt;= 0.) ? 1 : -1)
#define MAX_ITERATIONS  20
#define PLANECOUNT      9

static float planes[PLANECOUNT][4] =
   {{1, 1, 0, 0}, {1,-1, 0, 0},
    {1, 0, 1, 0}, {1, 0,-1, 0},
    {0, 1, 1, 0}, {0, 1,-1, 0},
    {1, 0, 0, 0}, {0, 1, 0, 0},
    {0, 0, 1, 0}};

char 
intersect_box (VCT3 *P, VCT3 *D, float *dmin, float *dmax)
{
  float tmin,tmax;

  // Left/right 
  if (fabs (D-&gt;x) &gt; Epsilon)
    {
    if (D-&gt;x &gt; Epsilon)
      {
      if ((*dmax = (1-P-&gt;x) / D-&gt;x) &lt; Epsilon)
        return FALSE;
      *dmin = (-1-P-&gt;x) / D-&gt;x;
      }
    else
      {
      if ((*dmax = (-1-P-&gt;x) / D-&gt;x) &lt; Epsilon)
        return FALSE;
      *dmin = (1-P-&gt;x) / D-&gt;x;
      }
    if (*dmin &gt; *dmax) 
      return FALSE;
    }
  else
    {
    if (fabs(P-&gt;x) &gt; 1)
      return FALSE;
    *dmin = -Infinit;
    *dmax =  Infinit;
    }
  tmin = tmax = 0.;
  // Top/bottom 
  if (fabs (D-&gt;y) &gt; Epsilon)
    {
    if (D-&gt;y &gt; Epsilon)
      {
      tmin = (-1-P-&gt;y) / D-&gt;y;
      tmax = (1-P-&gt;y) / D-&gt;y;
      }
    else
      {
      tmax = (-1-P-&gt;y) / D-&gt;y;
      tmin = (1-P-&gt;y) / D-&gt;y;
      }
    if (tmax &lt; *dmax)
      {
      if (tmax &lt; Epsilon) 
        return FALSE;
      if (tmin &gt; *dmin)
        {
        if (tmin &gt; tmax) 
          return FALSE;
        *dmin = tmin;
        }
      else
        {
        if (*dmin &gt; tmax)
          return FALSE;
        }
      *dmax = tmax;
      }
    else
      {
      if (tmin &gt; *dmin)
        {
        if (tmin &gt; *dmax) 
          return FALSE;
        *dmin = tmin;
        }
      }
    }
  else
    {
    if (fabs(P-&gt;y) &gt; 1)
      return FALSE;
    }
  // Front/back 
  if (fabs (D-&gt;z) &gt; Epsilon)
    {
    if (D-&gt;z &gt; Epsilon)
      {
      tmin = (-1-P-&gt;z) / D-&gt;z;
      tmax = (1-P-&gt;z) / D-&gt;z;
      }
    else
      {
      tmax = (-1-P-&gt;z) / D-&gt;z;
      tmin = (1-P-&gt;z) / D-&gt;z;
      }
    if (tmax &lt; *dmax)
      {
      if (tmax &lt; Epsilon) 
        return FALSE;
      if (tmin &gt; *dmin)
        {
        if (tmin &gt; tmax) 
          return FALSE;
        *dmin = tmin;
        }
      else
        {
        if (*dmin &gt; tmax) 
          return FALSE;
        }
      *dmax = tmax;
      }
    else
      {
      if (tmin &gt; *dmin)
        {
        if (tmin &gt; *dmax) 
          return FALSE;
        *dmin = tmin;
        }
      }
    }
  else
    {
    if (fabs(P-&gt;z) &gt; 1)
      return FALSE;
    }
  *dmin = tmin;
  *dmax = tmax;
  return TRUE;
}
</code></pre>

<p>So here's the code for the raytracer written by media philosopher Friedrich Kittler, discussed in Chapter 6 of <em>Critical Code Studies</em>. I'd like to invite you to explore this portion; I can also make the full file available, along with the original German note. This version has been translated by the Hammermann family.</p>
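<p>For anyone who wants a foothold before digging in: <code>intersect_box</code> above implements the classic "slab method," which treats the unit cube as the overlap of three pairs of parallel planes and narrows the ray's entry/exit interval one axis at a time. Here is a minimal Python paraphrase of that idea (my own sketch, not Kittler's code):</p>

```python
# Slab-method ray/box intersection, paraphrasing intersect_box above.
# The box is the unit cube [-1, 1]^3; P is the ray origin, D its direction.
EPSILON = 1e-10

def intersect_box(P, D):
    """Return (dmin, dmax), the entry/exit distances, or None on a miss."""
    dmin, dmax = float('-inf'), float('inf')
    for p, d in zip(P, D):
        if abs(d) > EPSILON:
            t1, t2 = (-1 - p) / d, (1 - p) / d
            near, far = min(t1, t2), max(t1, t2)
            dmin, dmax = max(dmin, near), min(dmax, far)
            if dmin > dmax or dmax < EPSILON:
                return None  # slab intervals no longer overlap, or box is behind us
        elif abs(p) > 1:
            return None      # ray parallel to this slab and outside it
    return dmin, dmax

print(intersect_box((0, 0, -5), (0, 0, 1)))  # (4.0, 6.0)
```

<p>One detail worth comparing against this sketch: the C version ends by unconditionally overwriting <code>*dmin</code> and <code>*dmax</code> with <code>tmin</code> and <code>tmax</code>, which appears to discard the interval computed for the x slab. Whether that is a bug or is compensated for elsewhere in the file seems like a good question for the critique.</p>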

<hr />

<p><strong>A few starter questions:</strong><br />
&#42; What parts seem to be written by Kittler? And what parts by others?<br />
&#42; How does this raytracer code complement his theoretical writing?</p>
]]>
        </description>
    </item>
    <item>
        <title>Native Land Digital and the rhetoric of maps</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/109/native-land-digital-and-the-rhetoric-of-maps</link>
        <pubDate>Tue, 18 Jan 2022 19:01:26 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>cfocht</dc:creator>
        <guid isPermaLink="false">109@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Title: Native Land Digital<br />
Authors: Native Land Digital<br />
Years: 2015-present<br />
Runs in a web browser or as a mobile app.<br />
<a href="https://native-land.ca/" rel="nofollow">https://native-land.ca/</a></p>

<p>I saw some discussion emerging on this in the main thread yesterday and figured I'd start a code critique thread for it.</p>

<p>I really like maps! As an avid hiker, I find myself looking at topographic maps often, and during a hike I'll often stop and just study the map—looking out at the geographic features and where they're represented on the map—because I like to have a sense of where I am in relation to the features of the land and where they are in relation to each other (in relation to their representations on this piece of paper). It helps me feel connected to the land that I'm on.</p>

<p>There's a lot of information that gets encoded into maps—intentionally or not—and like code, different modes of representation communicate different things and carry different assumptions. Maps are rhetorical, and affect our understanding of space as much as they are informed by it. Some things I'm thinking about here are a recent article from Real Life on <a rel="nofollow" href="https://reallifemag.com/you-are-here/" title="the rhetoric of digital maps">the rhetoric of digital maps</a>, how different projections impact the perceived importance of different locations <em>cough cough</em> Mercator projection <em>cough cough</em>, and as a John Green fan I'm often reminded of paper towns and how the history of paper towns <a rel="nofollow" href="https://www.youtube.com/watch?v=NgDGlcxYrhQ&amp;ab_channel=TED">influenced his novel</a> of the same name.</p>

<p>When I look at a new map, like this one or the one at <a href="https://thetruesize.com/" rel="nofollow">https://thetruesize.com/</a>, I often find myself reading it for a while like I do with trail maps, looking for whatever rhetorical connection is being made with the land, and I just don't see very much in this map. The org's stated rhetoric for this map is for the reader to be surprised by the scale of the space occupied by native peoples and to be able to see which groups occupied various regions (which facilitates learning about the pre-settler history of the land). It does those two things well, but that's about all I got out of looking at it.</p>

<p>Even though they are <a rel="nofollow" href="https://native-land.ca/what-is-territory/">critical of settler notions of territory</a>, it's still only readable in a settler's terms. Because the geographic data is limited, if I want to see which peoples previously occupied the lands that I currently live on, I have to use the "settler labels" overlay to be able to make sense of location (I say this as someone who's well-versed in using the geography). I think part of where this comes from is that the technology they're using for the map itself, and GIS more broadly, emerged from European cartography. I was immediately reminded of a blog post written by a colleague of mine in the history department here at TTU on <a rel="nofollow" href="https://westernfictioneers.blogspot.com/2013/11/writing-about-indians-when-youre-not_30.html">differences in spatial and temporal rhetoric</a> (which includes a map), which launched me on a whole ADHD rabbit hole yesterday looking at the history of mapmaking and some different styles of maps used in various cultures around the world. Some of the especially interesting ones to me were Inuit coastline maps made from <a rel="nofollow" href="https://decolonialatlas.wordpress.com/2016/04/12/inuit-cartography/">carved driftwood</a> and the Polynesian <a rel="nofollow" href="http://thenonist.com/index.php/thenonist/permalink/stick_charts/">stick charts</a> that <a rel="nofollow" href="https://wg.criticalcodestudies.com/index.php?p=/profile/yaxu">@yaxu</a> mentioned already in the weekly discussion thread.</p>

<p>My impression is that the methods of cartography that came to inform digital maps (mainly GIS) were mostly influenced by "enlightenment-era" values surrounding their idea of scientific advancement. In particular, I'm thinking about "precision", which of course has its benefits—the reason the French government officially adopted the metric system was that it was useful for holding tax officials accountable—but it raises a question about where we run into the limitations of precision and where it turns out to be not-so-useful. Computers and modern maps are both designed with the assumption that precision is useful, so computer maps carry that assumption as well. But this map of native territories and languages, it seems to me, is undermined by the rhetoric of precision. The intent is to push back against colonial notions of borders/boundaries, which it does by rejecting the kind of "jigsaw-puzzle" composition where boundaries don't overlap, but even in the vaguely outlined regions that it denotes, there's still a hard line that precisely outlines the ends of a region. Not only does the precision obscure some of what the map wants to convey, but it also still communicates some rhetoric of borders.</p>

<p>Anyway, I think that's enough for a starter. I'm really curious to hear from folks who have more knowledge of indigenous modes of thought.</p>
]]>
        </description>
    </item>
    <item>
        <title>Consio™ Initializer (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/112/consio-initializer-2022-code-critique</link>
        <pubDate>Fri, 21 Jan 2022 16:02:53 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>p_yes_t</dc:creator>
        <guid isPermaLink="false">112@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><strong>Title: consio initializer<br />
Author/s: Consio™<br />
Language/s: consio v0.97<br />
Years of development: multiple</strong><br />
Software/hardware requirements (if applicable): Consio biosource library, proprietary hardware (now illegal)<br />
full code: <a rel="nofollow" href="https://github.com/codeArchivist/consio_initial" title="here">github repository here</a></p>

<p>The language Consio is a minimal metaprogramming language consisting of code and sensate descriptions. It was developed for virtual worlds. This code, when optically loaded, produces a mesh that supports Consio programs. It is the basis for all future versions of cognitive integration code. Currently this code takes a Consio program and compiles it into a multi-threaded bitstream which runs on the optical neural accelerator (ONA).</p>

<p>This version of CI does not actually run; it produces the mesh which will eventually be loaded by later versions for support. Without a biosource, the code cannot be 'seen', and is thus legal.</p>

<p>[code snippet: <a rel="nofollow" href="https://github.com/codeArchivist/consio_initial/blob/main/primary.sxi" title="primary.sxi">primary.sxi</a>]</p>

<blockquote><div>
  <p>≤#êêêêê‹≈]<a rel="nofollow" href="https://wg.criticalcodestudies.com/index.php?p=/profile/%C2%A1%C3%B9%E2%80%B9%C3%AA">@¡ù‹ê</a> and   ‹2, ê to    %t@:<br />
  ê not 1,  ê ...not!   ‹ê not one, but many. You see, there's {one} me and then there are {many} me s .<br />
  ≤#    ê and   ê and ... so on for as long as necessary...<br />
  ê has 2:  ê one:</p>
  
  <p>continues: 0: 1</p>
  
  <p>≥#<br />
  ê ù<br />
  ê ù é é<br />
  »&gt; <br />
  ê ù <br />
  ê has 1:  ê one: <br />
  continues: 0: <br />
  2-n*<br />
  ≥# ê á ú<br />
  ‹2 = n &gt; 2, then the article is "ñíñ" . 20 &gt; 2, so the article is "ñíì" .<br />
  É á ú é É ù ê éó é è‹æ    1-n*<br />
  ≥# é é <br />
  »&gt; #é ê has   1:  ê one: <br />
  continues: 0:</p>
</div></blockquote>

<p>This code is a good example of the ethos that drove Consio: a project that originally started as a means to disseminate information quickly and evolved into a framework for quickly creating massive, real-time D3™ "seeing".</p>

<p>Questions:<br />
1. Knowing how and why this entire code library eventually became illegal to use or even hold outside of regulated research, can we see in this early version hints at what Consio would become?<br />
2. In the snippet above, the root concept of deep-dissociation-by-design D3™ is already evident in the Consio prompt. Is there space for this concept in contemporary neuromesh code projects?<br />
3. In regard to the project that brought Consio down, the rise of Saving Our Children D3 (the "deep conversion therapy plot"), are resulting restrictions too onerous on spiritual software?</p>
]]>
        </description>
    </item>
    <item>
        <title>Ringing the Changes (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/118/ringing-the-changes-2022-code-critique</link>
        <pubDate>Mon, 24 Jan 2022 21:41:14 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>cschnitz</dc:creator>
        <guid isPermaLink="false">118@/index.php?p=/discussions</guid>
        <description><![CDATA[<p><em><a rel="nofollow" href="https://github.com/juliannechat/ringing-the-changes" title="Ringing the Changes">Ringing the Changes</a></em><br />
Stephanie Strickland (director), Jules Chatelain (specifications), Anne Marie Merritt (Python code), Bryn Reinstadler (Scientific Triples code), Nick Montfort (series editor)<br />
Python (RTC code), R (Scientific Triples code)<br />
2020</p>

<p><em>Ringing the Changes</em> uses a Python-coded algorithm, layered atop one programmed in R, to perform all the permutations (though only 161 are printed in the physical book) of change-ringing, an homage to 17th-century English bell-ringing, where “ordinary folk…sought to ring all 7! (7 x 6 x 5 x 4 x 3 x 2 x 1 = 5040) permutations—all the different arrangements or ‘changes’ possible—with seven bells.” Six of Strickland’s technotext ‘bells’ are primarily based on one source apiece; the seventh is a medley of others. Bells are ‘rung’ algorithmically in different mathematical permutations, a cacophonous, strategic reminder of the ‘changes’ that need to be rung socially and societally—changes for the increasingly volatile climate, racial inequity and injustice, among others.</p>
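<p>The combinatorics here are simple enough to sketch. What follows is not the RTC team's code (Merritt's Python is in the repo linked above), just a toy illustration of two facts the passage leans on: seven bells admit 7! = 5040 changes, and in ringing, successive rows differ only by swaps of adjacent bells:</p>

```python
from itertools import permutations

# Seven bells admit 7! = 5040 possible rows ("changes").
bells = [1, 2, 3, 4, 5, 6, 7]
print(len(list(permutations(bells))))  # 5040

# "Plain hunting": each new row swaps adjacent pairs, alternating
# between swaps starting at the first and at the second position.
def plain_hunt(row):
    rows = [row]
    for i in range(2 * len(row)):
        row = row[:]
        for j in range(i % 2, len(row) - 1, 2):
            row[j], row[j + 1] = row[j + 1], row[j]
        rows.append(row)
    return rows

rows = plain_hunt(bells)
print(rows[1])            # [2, 1, 4, 3, 6, 5, 7]
print(rows[14] == bells)  # True: after 2n rows the bells return to "rounds"
```

<p>Plain hunt is only the simplest ringing method; the point of the sketch is that ringing the full 5040 requires a method that systematically visits every permutation by adjacent swaps, which is exactly the kind of patient, attention-indifferent procedure the poem automates.</p>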

<p>With code manipulating and separating the human body from what was once a very embodied endeavor (bell ringing), agency is distanced and complicated. We, with Strickland and her bells, are allowed to see each ‘bell’ on its own and in combination with others, repeating and juxtaposing in new configurations throughout each ‘change’ poem. In her <a rel="nofollow" href="https://electronicbookreview.com/essay/a-review-of-stephanie-stricklands-ringing-the-changes/" title="masterful review in ebr">masterful review in <em>ebr</em></a>, Sarah Whitcomb Laiola notes that this refusal of hierarchy within each of the ‘bells’’ voices demonstrates an “ambivalence to [human] attention.” Because the Ringing [of] the Changes is automated, permutable, and ongoing, with Strickland and her team of programmers setting it into motion and simply letting it run, it seems to act as a force for and representative of reality: a constant reminder that things (our society’s treatment of the environment, of structural inequity and violence…) need to change, regardless of whether we (any humans engaged with the work—producers or consumers…) are paying attention to them or not.</p>

<p>Looking at and playing with the underlying code, though, seems to complicate this initial read of the role of human attention in (and to) the piece, with divisions emerging (as they do) around form and content. At the top of the GitHub repository, the “<a rel="nofollow" href="https://github.com/juliannechat/ringing-the-changes" title="WHAT IS THIS">WHAT IS THIS</a>” section reads, in part, “This is the code used to create the book's structure. It allows you to use your own content to create a similar project.” The Python code used to generate the project, subsequently shared on the GitHub repo, is incredibly thoughtful and well-organized, with comments after every few lines or so to successfully guide even inexperienced programmers like myself through the construction of the program. The code seems, on its surface, to present a different attitude towards human (mostly readerly) attention than the text itself. The questions below follow directly from this apparent tension, though I’d love to hear folks’ thoughts on RTC in general, since it is such a generative text and program.</p>

<pre><code>    if __name__ == &quot;__main__&quot;:
        #This is where all the action is.  Main!

        # Read the command line args, if any.
        args = _parse_args()

        # Read the manifest to setup the group to filename mapping
        _init_manifest_dict(args.group_dir)

        # Read the group files to extract the text for each entry
        _load_groups(args.group_dir)
        with open(args.input_name, mode='r', encoding=_encoding_style) as ringfile, \
                open(args.output_name, mode='a', encoding=_encoding_style) as outfile:
        # Read the bells file to determine what group to read text from, and emit to the output file.
            _parse_ringtones(ringfile, outfile)

        print((&quot;Emited output to {}&quot;).format(os.path.abspath(args.output_name)))
        sys.exit(0)
</code></pre>

<p>Questions:</p>

<ul>
<li>How is a reader meant to see themselves in relationship to the code of RTC? To the printed text?</li>
<li>What tensions, if any, exist between code and printed text in RTC, with particular respect to readerly attention?</li>
<li>How does the collaborative construction of the RTC program (Merritt’s code built upon Reinstadler’s) impact our understanding of the text, if at all?</li>
</ul>
]]>
        </description>
    </item>
    <item>
        <title>Arf Magna (2022 Code Critique) by Nick Montfort</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/110/arf-magna-2022-code-critique-by-nick-montfort</link>
        <pubDate>Tue, 18 Jan 2022 22:40:47 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>nickm</dc:creator>
        <guid isPermaLink="false">110@/index.php?p=/discussions</guid>
        <description><![CDATA[<h1>Arf Magna</h1>

<h2>Nick Montfort</h2>

<h2>HTML with CSS and JavaScript</h2>

<h2>2021</h2>

<h2>Platform: Recent version of a major Web browser (Chrome, Chromium, Firefox, Safari, Edge)</h2>

<h2>Live Version: <a rel="nofollow" href="https://nickm.com/poems/arf_magna.html" title="https://nickm.com/poems/arf_magna.html">https://nickm.com/poems/arf_magna.html</a></h2>

<h3>Possible discussion questions</h3>

<ol>
<li>To what genre or form would you say “Arf Magna” belongs?</li>
<li>“Arf Magna” declares that it is “after” Ramón Llull; how exactly do you understand the relationship of “Arf Magna” to the work and ideas of Llull?</li>
<li>Do you see any importance to line 10 (the license) and the easily accessed source, and if so, what is it? How does it relate to the answer to (2)?</li>
</ol>

<h3>A Note from the Author/Programmer</h3>

<p>Obviously I have some thoughts about these questions myself — I’m glad to discuss them, too. However, by asking for your thoughts I don’t mean to quiz you and see if you get the right answers to “Arf Magna.” I consider this an open work, with your interpretation of it as valid as mine.</p>

<pre><code>&lt;!doctype html&gt;
&lt;html&gt;

&lt;head&gt;
  &lt;meta charset=&quot;utf-8&quot;&gt;
  &lt;title&gt;Arf Magna&lt;/title&gt;

  &lt;!-- © 2021 Nick Montfort

Copying and distribution of this file, with or without modification, are permitted in any medium without royalty provided the copyright notice and this notice are preserved. This file is offered as-is, without any warranty.
--&gt;

  &lt;style&gt;
    body {
      margin: 0;
      overflow: hidden
    }

    #outer {
      height: 100vh;
      width: 100vw;
      background: #000
    }

    header,
    a {
      color: #fff
    }

    p {
      margin: 10px;
      font-size: 16px;
      text-align: right;
      font-family: monospace
    }

    #proposition {
      text-align: center;
      margin-top: 45vh;
      font-size: 7vh;
      font-weight: bold;
      color: #fff
    }
  &lt;/style&gt;
&lt;/head&gt;

&lt;body&gt;
  &lt;div id=&quot;outer&quot;&gt;
    &lt;header&gt;
      &lt;p&gt;&lt;span id=&quot;label&quot;&gt;ARF MAGNA　
          after &lt;a href=&quot;https://plato.stanford.edu/entries/llull/#FiguQuatPhasTheiFunc&quot; target=&quot;_blank&quot;&gt;Ramón Llull&lt;/a&gt;　
          by &lt;a href=&quot;https://nickm.com&quot; target=&quot;_blank&quot;&gt;Nick Montfort&lt;/a&gt;　
          in memoriam &lt;a href=&quot;https://i.redd.it/oe6xt37xhvd01.jpg&quot; target=&quot;_blank&quot;&gt;Pepys&lt;/a&gt;&lt;/span&gt;　
        &lt;span onclick=&quot;toggleFull()&quot;&gt;⛶&lt;/span&gt;
      &lt;/p&gt;
    &lt;/header&gt;
    &lt;div id=&quot;proposition&quot;&gt;(click to begin)&lt;/div&gt;
  &lt;/div&gt;
  &lt;script&gt;
    const a = 'delightful,playful,close,kind,loyal,cute,good,willful,warm,joyful,inquisitive,eager,attentive,cheerful,soft,love'.split(',');
    const b = '_,_,_,_,ty,_,_,_,th,_,_,_,_,_,_,'.replace(/_/g, 'ness').split(',');
    const freqs = [261.6, 293.7, 329.6, 392, 440, 523.3, 587.3, 659.7];
    var ac, oscNodes = [], gainNodes = [];
    var bell = 0, i = 0, j = 0, last = j;
    function toggleFull() {
      if (1 &gt;= outerHeight - innerHeight) {
        let leave = document.exitFullscreen || document.webkitExitFullscreen || document.mozCancelFullscreen || document.msExitFullscreen;
        leave.call(document);
      } else {
        let enter = outer.requestFullscreen || outer.webkitRequestFullscreen || outer.mozRequestFullscreen || outer.msRequestFullscreen;
        enter.call(outer);
      }
    }
    function checkLabel() {
      if (1 &gt;= outerHeight - innerHeight) {
          label.style.visibility = &quot;hidden&quot;;
        } else {
          label.style.visibility = &quot;visible&quot;;
        }
    }
    function note(freq) {
      oscNodes[bell].frequency.value = freq;
      for ([volume, time] of [[.2, 1], [0, 8]]) {
        gainNodes[bell].gain.linearRampToValueAtTime(volume, ac.currentTime + time);
      }
      bell = (bell + 1) % 8;
    }
    function start() {
      outer.removeEventListener('click', start);
      ac = new (window.AudioContext || window.webkitAudioContext);
      for (let n = 0; n &lt; 8; n = n + 1) {
        oscNodes[n] = ac.createOscillator();
        oscNodes[n].frequency.value = 440;
        gainNodes[n] = ac.createGain();
        gainNodes[n].gain.value = 0;
        oscNodes[n].connect(gainNodes[n]);
        gainNodes[n].connect(ac.destination);
        oscNodes[n].start(0);
      }
      document.addEventListener('mousemove', e =&gt; {
        i = ~~(16 * e.pageX / window.innerWidth);
        j = ~~(16 * e.pageY / window.innerHeight);
        if (j !== last) {
          note(freqs[j % 8]);
        }
        proposition.innerText = 'dog’s ' + a[i] + b[i] + ' is ' + a[j];
        outer.style.background = &quot;#&quot; + (4194304 + ~~(16777216 / i + 1)).toString(16);
        previous = j;
      });
      document.dispatchEvent(new Event('mousemove'));
      setInterval(checkLabel, 500);
    }
    outer.addEventListener('click', start);
  &lt;/script&gt;
&lt;/body&gt;

&lt;/html&gt;
</code></pre>
]]>
        </description>
    </item>
    <item>
        <title>BrainSim: Neural Network on a Commodore 64 (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/117/brainsim-neural-network-on-a-commodore-64-2022-code-critique</link>
        <pubDate>Sun, 23 Jan 2022 19:04:07 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>onebigear</dc:creator>
        <guid isPermaLink="false">117@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Title: Neural Network on a Commodore 64<br />
Author: John Walker <br />
Language: C64 BASIC<br />
Year of development:1987<br />
Software/hardware requirements: Commodore 64 and C64 BASIC</p>

<p>While searching for artificial intelligence that predated the 2000s, I stumbled upon John Walker's blog and found this blogpost on <a rel="nofollow" href="https://fourmilab.ch/documents/commodore/BrainSim/" title="program">BrainSim</a>. It is a program written in 1987, in less than 250 lines of BASIC, that implements a neural network. The program can train the computer to memorize the patterns of three alphabet letters; it is limited to three because of limited memory capacity. The method of using neuron cells as on and off switches was proposed in the 50s. <a rel="nofollow" href="https://www.ling.upenn.edu/courses/cogs501/Rosenblatt1958.pdf" title="This paper">This paper</a> by Frank Rosenblatt is a reference for the method. I haven't read the paper in full. In an applied machine learning class I took here at CU, a picture of the "perceptron", referring to the neural switch, appeared in the slides. The class was more technical than concerned with the algorithm as a historical artefact, so I looked up the paper outside of class. What I digested of the mechanism is that our neurons receive a stimulus; as the stimulus rises above a threshold, the neuron reacts and transmits the stimulus on to other neurons. In the program we are looking at in this thread, there are two layers of neurons. The network is visualized in the blog post in detail. The author also explained how it was implemented (a character is an array of size 42, as the character field is 6 by 7).</p>
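<p>To make the threshold mechanism concrete, here is a minimal sketch in Python (my own toy paraphrase of Rosenblatt's rule, not Walker's BASIC; the two-input AND pattern stands in for BrainSim's 6-by-7 character fields):</p>

```python
# A neuron sums weighted stimuli and "fires" only above a threshold.
def perceptron(inputs, weights, threshold=1.0):
    stimulus = sum(x * w for x, w in zip(inputs, weights))
    return 1 if stimulus >= threshold else 0

# Rosenblatt's learning rule: nudge each weight toward the target output.
def train(samples, targets, weights, rate=0.5, epochs=20):
    for _ in range(epochs):
        for x, target in zip(samples, targets):
            error = target - perceptron(x, weights)
            weights = [w + rate * error * xi for w, xi in zip(weights, x)]
    return weights

# Teach a two-input neuron the AND pattern.
samples = [[0, 0], [0, 1], [1, 0], [1, 1]]
targets = [0, 0, 0, 1]
weights = train(samples, targets, [0.0, 0.0])
print([perceptron(x, weights) for x in samples])  # [0, 0, 0, 1]
```

<p>BrainSim's character recognition is the same mechanism scaled up: each of the 42 cells in a letter's 6-by-7 field is one input, and training adjusts 42 weights per output neuron.</p>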

<p>I was fascinated by this program. As said in the introduction I am a grad assistant working in the Media Archaeology Lab. Our volunteer helped me to "flash" the program onto a floppy disk and we were playing with it on the lab C64. What's beyond this fascination other than seeing Artificial Intelligence running on a C64? These are my questions open for criticism:</p>

<p>1) Can critical code studies blend in with media archaeology? Does the definition of media archaeology allow that? (I argue with a lab colleague about this.) While media archaeology emphasizes the experiences of interacting with media, I contend that this is a piece of code running on a medium: the C64 is the computer through which the program is mediated. As we study the code, the code actualizes (is this the right word?) itself through the medium of the C64.</p>

<p>2) I also think about how <a rel="nofollow" href="https://wg.criticalcodestudies.com/index.php?p=/profile/nickm">@nickm</a> defined his practice of writing BASIC poems on the Apple II and C64 as platform-specific. The specificity also has to do with cultural and historical sensitivity.</p>

<blockquote> "I should point out, however, that many people—millions of us—first learned how to write programs like these using BASIC" </blockquote>

<p>This sentiment is true for our lab community as well. Personally, I was born after the sweet spot; by the time I started to futz with computers, they were running on GUIs and had a less programmable interface. Other than historical sensitivity, the program presented to me a sensitivity to limits - it could train on three letters and no more.</p>

<p>3) Lastly, I also want to bring in the concern of retro-fetishism. I see the risk of falling into retro-fetishism, in the context of the MAL, if the program is presented without critical strategies. How do you see the program being critically presented with thoughtful strategies: a good and accessible code review, demonstrations of the training process, comparison with artificial intelligence as implemented on modern platforms (e.g., a Jupyter notebook)?</p>

<p>I propose this thread both out of my own academic interest and the context of the MAL.</p>
]]>
        </description>
    </item>
    <item>
        <title>emoji as asymmetries, playful data structure (2022 Code Critique)</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/106/emoji-as-asymmetries-playful-data-structure-2022-code-critique</link>
        <pubDate>Sun, 16 Jan 2022 19:58:14 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>onebigear</dc:creator>
        <guid isPermaLink="false">106@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>Title: Asymmetrical Emoticon<br />
Author: Biyi Wen<br />
Language: No language, concept only<br />
Year: 2021</p>

<p><img src="https://wg.criticalcodestudies.com/uploads/editor/td/jtrcy8tjjnq3.jpg" alt="" title="" /><br />
In the context of a data structures class, there is a method to tell whether the brackets in a string are balanced. Similarly, the method can be applied to tell whether a word is a palindrome.</p>

<p>I use the metaphor of a stack of plates to understand this structure. I have a handful of plates, each representing a character. I count the plates to see whether there is an even or odd number of them. In the example I've drawn, the emoji (0_0) is odd-numbered, consisting of five chars. Like stacking plates, I stack the first half of the plates one upon another, stopping at the one in the very middle. I set the middle one aside and start my comparing process, checking each remaining char against the plate on top of the stack. I check: the chars are matching! I remove the plate on the top; I continue to check, the chars are matching again, and I remove the next plate. If (0_0) is symmetrical, the plate stack will end up empty.</p>

<p><img src="https://wg.criticalcodestudies.com/uploads/editor/oj/cv91qoc2yn6r.jpg" alt="" title="" /><br />
<img src="https://wg.criticalcodestudies.com/uploads/editor/ti/e4u83iy5n21a.jpg" alt="" title="" /><br />
<img src="https://wg.criticalcodestudies.com/uploads/editor/0o/e1mv59ema79p.jpg" alt="" title="" /><br />
<img src="https://wg.criticalcodestudies.com/uploads/editor/lq/dj660djvh67h.jpg" alt="" title="" /></p>
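<p>One way the plate-stack method above could be sketched (a hypothetical implementation only; the <code>MIRROR</code> table, which lets an opening bracket match its closing twin the way balanced brackets do, is an added assumption):</p>

```python
# Stack-based symmetry check for an emoticon face.
# MIRROR maps each character to its mirrored twin; characters that are
# their own mirror image (like 0 and _) map to themselves.
MIRROR = {'(': ')', ')': '(', '0': '0', '_': '_', '=': '=', '-': '-'}

def is_symmetrical(face):
    stack = []
    half, odd = divmod(len(face), 2)
    for ch in face[:half]:          # stack the first half of the plates
        stack.append(ch)
    for ch in face[half + odd:]:    # skip the middle plate if count is odd
        if not stack or MIRROR.get(stack.pop()) != ch:
            return False
    return not stack                # symmetrical only if the stack empties

print(is_symmetrical('(0_0)'))  # True
print(is_symmetrical('(0_-)'))  # False
```

<p>Note that strict char-by-char palindrome testing would reject (0_0), since '(' is not ')'; the mirror table is what makes the check about visual symmetry rather than literal repetition, which already hints at the gap between computational and human judgments discussed below.</p>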

<p>A handful of questions arise when critically examining this concept. As a method, it is intended to yield binary results: a string of characters can either pass the test or not pass the test. The following are the emojis I gathered that are more nuanced than the one that appeared in the example, and I see them challenging the either/or narrative. Faces (let's limit this scope to mammals, because there are emojis of animal faces) are not symmetrical. The "eyebrows" of (=•́ܫ•̀=) tilt, expressing a nuanced expression. How do we see the possibility to queer? Do we see qualities that are specific to the culture of emoticons? Another question arose when I started to think of how faces are visually perceived and processed by humans. Having undergone learning experiences, humans are able to register whether a face is <em>roughly</em> symmetrical. The difference between how humans think and how the computer program makes decisions provides a space for queering.</p>

<p>I've been too lazy to implement the full program. It has some possibilities - I was thinking that machine learning could be used as a queering method for the concept. I'm posting here to initiate the discussion.</p>

<p>Emoticon examples:</p>

<p>┻━┻︵ &#40;°□°)/ ︵ ┻━┻</p>

<p>ฅ ̳͒•ˑ̫• ̳͒ฅ</p>

<p>(=•́ܫ•̀=)</p>
]]>
        </description>
    </item>
    <item>
        <title>How to Post a Code Critique Thread 2022</title>
        <link>https://wg.criticalcodestudies.com/index.php?p=/discussion/105/how-to-post-a-code-critique-thread-2022</link>
        <pubDate>Sat, 15 Jan 2022 18:27:39 +0000</pubDate>
        <category>2022 Code Critiques</category>
        <dc:creator>markcmarino</dc:creator>
        <guid isPermaLink="false">105@/index.php?p=/discussions</guid>
        <description><![CDATA[<p>We invite every member of the Working Group to start each one or more code critiques as their own threads. Categorize it as a Code Critique and use (2022 Code Critique) after the name of the code snippet so people can easily find it.</p>

<p>Be sure to include the following at the top:</p>

<p>Title<br />
Author/s<br />
Language/s<br />
Year/s of development<br />
Software/hardware requirements (if applicable)<br />
Then, place any context and questions you have about the code. It's helpful if you point the conversation in a direction.</p>

<p>Then include your code snippet. Use the code tags to access our syntax highlighting.</p>

<p>You can format code by</p>

<p>highlighting it in the forum editor<br />
clicking the paragraph or pilcrow (¶) button in the editor bar<br />
selecting "code"<br />
...OR by adding three backtick marks ``` on a line directly above your code and on the line directly below as well.</p>
]]>
        </description>
    </item>
   </channel>
</rss>
