Go Code, Young Man

by Sean Gilleran

An architectural plan for South Hall at UC Santa Barbara. North Hall, built soon thereafter, would house its state-of-the-art computer center.

We earnestly urge upon all… to turn their faces Westward and colonize the public lands… The poorest citizen can scarcely be so poor as to be unable to acquire a farm, which a few years of industry and frugality will enable him to cultivate and make his own… A man of energy and good sense will work his way clear.

Horace Greeley, 1895

On October 8, 2019, a young woman getting a haircut at the Imperial Kuttz barber shop in Des Moines, Iowa, suddenly found herself talking with then-presidential hopeful Kamala Harris. The woman told Harris that she was pursuing an undergraduate degree in political science. Harris, who holds a bachelor’s degree in political science herself, suggested that she should “learn to code” instead.1 The reporter who captured the moment noted that “earlier” that same morning, Harris, the former attorney general of California, had suggested “the same thing” to “a student interested in law.” In a news round-up piece that week, Slate placed Harris at the bottom of its candidate rankings for “sentencing Iowa’s children to a life of coding,” suggesting that “perhaps” it was “something for her to consider after she finishes fifth in Iowa.”2 Harris did not finish at all in Iowa, withdrawing from the race two months later, but she is not the only 2020 candidate to have suggested computer programming as a path to self-improvement. In May, current Democratic front-runner Joe Biden touted his involvement with a subsidy program “through community colleges” that “taught people how to code.” “My father used to have an expression,” he continued: “It’s about your dignity. It’s about your self-respect.”3

Explaining away un- and under-employment with the pop psychology of self-actualization is not a new trope in American politics, but perhaps what makes the “learn to code” cliché so peculiarly enduring is that it has become a core theme of programming pedagogy itself. In the preface to his best-selling textbook, Learn Python the Hard Way, programmer Zed Shaw encourages his reader to “remember that anything worth doing is difficult at first,” and that although they may be “afraid of failure” or “never [have] learned self-discipline,” or might think themselves “gifted,” they must nevertheless “keep at it.” The journey will be difficult, but the tools will be simple: all you need is “whatever computer you have right now that works.” “If you go through this whole book and you still do not understand how to code,” Shaw writes, “at least you gave it a shot. You can say you tried your best and a little more and it didn’t work out, but at least you tried. You can be proud of that.”4

There was nothing particularly inspirational about the first American coding textbooks. They were designed to eliminate jobs, not create them. COBOL, the most popular programming language of the 1960s by a wide margin, was supposed to be intelligible to managers so that they would not be entirely beholden to the expertise of the engineers working under them. The Department of Defense demanded that COBOL compilers be made available with the computers it purchased, which meant that compilers were available for almost every computer on the market. Engineers, unsurprisingly, found it cumbersome and dry. James Saxon’s “Self-Instructional Manual” for COBOL, published in 1963, begins by warning the reader that “computer programming is a difficult and exacting profession” and that “the study of this book will not develop expert COBOL programmers”; that it is merely a “simplified introduction to a difficult subject.” Saxon promises us only that he “will teach the basic rules of COBOL.”5 A similar introduction to COBOL’s more academic counterpart, FORTRAN, a language “intended to be capable of expressing any problem of numerical computation,” hopes merely to present its subject “in a form that most people can master in a few hours of careful reading and practice.” The author, Daniel McCracken, imagines four potential uses for his book: as the basis of “a one-semester hour course in engineering, science, or mathematics,” as a supplemental activity for “some other course,” as one part of “an industrial course” built around “realistic problems,” or, perhaps, “for individual study, either in schools or industry.”6

Yet so-called “automatic” languages like COBOL and FORTRAN did not succeed in replacing programmers. If anything, the transition from bare-metal assembly languages to higher-level abstractions encouraged deeper specialization and expertise. Over the course of the next two decades, in the context of the so-called “post-industrialization” of the American economy, introductory programming manuals became protective talismans against the tumult of the future; deeds for the would-be homesteaders of the digital frontier.7 Bob Albrecht introduces his 1972 manual, My Computer Likes Me When I Speak in BASIC, as a “book about people [and] computers.” The reader will learn how to “communicate with a computer,” he writes, “about population problems.” To succeed, they will have to “EXPERIMENT! GAMBLE! GUESS, … THEN TRY IT!”8 Even the somewhat less groovy The C Programming Language (1978) by Brian Kernighan and Dennis Ritchie describes C as “pleasant, expressive, and versatile” and, after opening with a friendly program to print “hello, world” to the terminal, asks the reader to “experiment with leaving out parts… to see what error messages you get.” “With these mechanical details mastered,” they write, “everything else is comparatively easy.”9

A cursory word analysis helps reveal this shifting attitude and suggests, at least, a deeper connection to the frontier myth. The word “data” appears nine times as often in the COBOL manual as it does in Kernighan and Ritchie, which, in turn, contains nine times as many instances of the word “variable.” COBOL’s “sentence” becomes C’s “expression.” “Lessons” and “problems” give way to “exercises.” By the 1978 book, even the “computer” itself has begun to disappear in favor of the imagined space within it—a particularly curious note given how much more direct the experience of the machine had by then become.
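Counts like these are straightforward to reproduce once the manuals are digitized. A minimal sketch in Python of the kind of normalized frequency comparison described above (the sample strings are illustrative stand-ins, not the actual texts of either manual):

```python
from collections import Counter
import re

def rate_per_10k(text, term):
    """Occurrences of `term` per 10,000 words of `text`, case-insensitive."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(words)[term] / len(words) * 10_000

# Illustrative stand-ins; a real comparison would load the full OCR'd manuals.
cobol_sample = "The data division describes each data record and data item."
c_sample = "Each variable in an expression has a type; declare every variable."

for term in ("data", "variable", "expression"):
    print(f"{term}: COBOL {rate_per_10k(cobol_sample, term):.0f}, "
          f"C {rate_per_10k(c_sample, term):.0f}")
```

Pointed at the complete texts, the same routine would let the nine-to-one “data”/“variable” ratio be checked directly and extended to other terms.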

More work of this kind is needed to fully understand the changing tone of these books and their relationship both to the larger history of coding as work and to the myth of the digital frontier. The 1950s to the 1970s were critical to both the history of the United States in general and the history of computing in particular—a gold rush of programming pedagogy (or, to borrow a metaphor from Nathan Ensmenger, a “Cambrian explosion” of code).10 Yet while the infectious positivity of these books remains, the open, egalitarian vision that helped drive their creation seems to have vanished. Frederick Jackson Turner’s so-called frontier thesis is today remembered mostly for its erasures—for ludicrously positioning the United States “at the hither edge of free land…, strong in selfishness and individualism,” where “each frontier” would “furnish a new field of opportunity, a gate of escape from the bondage of the past; and freshness, and confidence, and scorn of older society.”11 Escape, maybe, but for whom? At whose expense?

Shaw closes Learn Python the Hard Way with “Advice from an Old Programmer.” “Programming,” he intones, “as a profession is only moderately interesting… you could make about the same money and be happier running a fast food joint.” True growth can only come from exploration—from contact with the frontier. “Go out and explore,” he continues, cautioning only that “you may find that people treat you harshly because you can create software” and that “because you can dissect their logic, they [will] hate arguing with you.” “To this,” he writes, “I have just one piece of advice: they can go to hell. […] You can code. They cannot. That is pretty damn cool.”12

  1. Deepa Shivaram, Tweet, @deepa_shivaram (October 8, 2019), https://twitter.com/deepa_shivaram/status/1181617683458150405

  2. Jim Newell, “What Will Post-Heart Attack Bernie Be Like?,” Slate Magazine (October 11, 2019), https://slate.com/news-and-politics/2019/10/2020-bernie-sanders-elizabeth-warren-joe-biden-democratic-debate.html

  3. AP News, “Joe Biden Boasts of Going ‘into the Hood,’ Finding Woman Who Could Learn Coding Skills” (May 13, 2019), https://apnews.com/7b32078dfed9c4d18c98096aa106ae7f

  4. Zed Shaw, Learn Python the Hard Way: A Very Simple Introduction to the Terrifyingly Beautiful World of Computers and Code, 3rd ed. (Upper Saddle River, NJ: Addison-Wesley, 2014), 2–3. 

  5. James A. Saxon, COBOL: A Self-Instructional Manual (Englewood Cliffs, NJ: Prentice-Hall, 1963), iv. 

  6. Daniel D. McCracken, A Guide to FORTRAN Programming (New York: John Wiley & Sons, 1961), v. 

  7. On the paradoxes of the “post-industrial” economy, see especially Jefferson Cowie, Stayin’ Alive: The 1970s and the Last Days of the Working Class (New York: New Press, 2010) and Louis Hyman, Temp: How American Work, American Business, and the American Dream Became Temporary (New York: Viking, 2017). 

  8. Bob Albrecht, My Computer Likes Me When I Speak in BASIC (Menlo Park, CA: DYMAX, 1972), 1. 

  9. Brian W. Kernighan and Dennis M. Ritchie, The C Programming Language (Englewood Cliffs, NJ: Prentice-Hall, 1978), ix; ibid., 5–6. 

  10. Nathan Ensmenger, The Computer Boys Take Over: Computers, Programmers, and the Politics of Technical Expertise (Cambridge, MA: MIT Press, 2010), 105. 

  11. Frederick Jackson Turner, The Frontier in American History (New York: Henry Holt & Co., 1921), 3; ibid., 32–8. 

  12. Shaw, 241–2.