Paul Edwards, “The Closed World: Computers and the Politics of Discourse in Cold War America” Part II

Part II: Cyborg Subjectivity

(Part I: Military Development of the Computer)

In the second half of his book, Paul Edwards considers the new subject positions constructed by Cold War computer discourse. Where the first half focuses on the development of computer hardware, the second explores software, and artificial intelligence in particular. Once again, he shows that “closed world” discourse dictated the direction computer development took (often, again, as a result of enormous military funding through ARPA), while at the same time the metaphors activated by computer development were mobilized to change our own subject positions as humans.

Edwards spends some time on the rise of cognitive psychology, which grew out of the World War II war effort to articulate the soldier as part of a technological system, and thereby to remove the weak link in command and control by effecting a tighter integration between human and machine. The result was “human engineering,” which relied upon a new metaphor elaborated by Alan Turing: “The mind is a computer.” That is, the computer made possible a new subjectivity in which we conceive of ourselves as computers. Thus were born cognitive psychology, the figure of the cyborg (a combination of human and machine), and the possibility of artificial intelligence (AI).

AI was developed by ARPA’s Information Processing Techniques Office (IPTO), headed by J. C. R. Licklider, whose vision of computerized military command and control “helped to shape the AI research agenda for the next twenty-five years” (240). The goal was to generate computer languages that would mirror, and be compatible with, human languages; the result was LISP, an abstract, very high-level recursive computer language designed to be interactive with human operators—and to effect a “man-computer symbiosis” (269). Because AI had vague goals and no immediate commercial utility, its development relied heavily on ARPA, and was thus actively shaped by military goals. During the second half of the Carter administration, and then under Reagan, AI became the centerpiece of a renewed Cold War discourse of nuclear holocaust. Coupled with new missiles and space-based weaponry, AI was seen by the U.S. government as the linchpin of a futuristic technological system that would protect against Soviet ICBM attack by generating an impenetrable shield around the country. The military’s twin development programs of this era, the Strategic Defense Initiative and DARPA’s Strategic Computing Initiative, were meant to produce space-based and terrestrial weapon systems essentially controlled by computers, using advanced AI, to automate the process of warfare. This was, of course, a rehash of the 60s dream of the electronic battlefield, including a virtual clone of SAGE. The only difference was that while SAGE was Semi-Automated, the new systems would be fully automated, taking human error and lag time out of the equation. These programs were infeasible from the start, and never had a chance of becoming truly operational. They were, however, extremely effective politically, as they generated the illusion of defensive weaponry and protection from thermonuclear war, while promoting the closed-world mentality of rigid boundaries between “us” and “them.”

This discourse relied heavily on the figure of the computer’s “microworld,” a reduction of the complexity, unpredictability, and uncertainty of the real world into a simplified, programmable, miniature world inside the computer. The real world was transformed discursively, via the computer, into a rational, closed system. Popular films reflected the resulting “cyborg subjectivity.” In the 60s, computers were figured as evil, disembodied overlords wresting control from humans (Fail-Safe, Dr. Strangelove, 2001: A Space Odyssey, Colossus: The Forbin Project). By the mid 80s, at the height of Cold War II, cyborgs were the central figures of science fiction set in extreme closed worlds (the claustrophobic Terminator, Star Wars, various space-based films and shows). Edwards describes these cyborgs as “embodied second selves” who navigate the closed world by interfacing with both humans and underlying computer networks. By the late 80s and 90s, the cyborg had been rehabilitated (Darth Vader becomes a human father at the end of the Star Wars trilogy, the Terminator becomes a protector in Terminator 2). As computers became ubiquitous PCs, they ceased to be dangerous others, and we came to realize that in the closed world of computerized systems (from which there is now no escape), “the possibility that remains, the only possibility for genuine self-determination, is the political subject position of the cyborg” (350).

