Alan Moore and Dave Gibbons’ graphic novel Watchmen (1986–1987) is set in an alternative history in which the United States wins the Vietnam War. The HBO miniseries Watchmen (2019) updates the original graphic novel’s thought-experiment, asking its viewers to question how American imperial power yoked with white supremacist tendencies operationalizes itself in the contemporary world. In the text, upon President Richard Nixon’s request, the godlike superhero Dr. Manhattan intervenes in Vietnam on behalf of the US government. Through the militarization of Dr. Manhattan’s superpowers, the US defeats the Viet Cong and colonizes Vietnam, turning the nation into the 51st state of the United States. Dr. Manhattan’s powers enable the US to win the Vietnam War and, by extension, control the outcome of the Cold War in the Watchmen superhero universe. Dr. Manhattan is the personification of scientism and cybernetic thinking, symbolizing how technological innovation born out of Cold War science was always-already radicalized, politicized, and weaponized by the United States.
The Vietnam War refracted through the Watchmen series sheds light on the bruised ego of the American psyche and the political gut-punch that multiple US presidents received at the hands of the Viet Cong. The defining logic driving the American war effort in Vietnam was that, with the most advanced military in the world, the United States should be able to effortlessly bring down the Viet Cong with technological solutions to the problems of war. In this essay, I suggest that contemporary conversations about digital battlefields and autonomous weapons can be enriched by a better understanding of the history of media technologies whose development was supported by military concerns in the quarter-century after the Second World War. Specifically, I examine the media history of one such technological solution: an electronic sensor network along the Ho Chi Minh trail, described in the study “Air Supported Anti-Infiltration Barrier” by the Institute for Defense Analyses’ JASON division, a group of scientists that advised the US government (Deitchman et al.).
The JASON division study proposed that installing a network of electronic sensors along the Ho Chi Minh trail, the North Vietnamese supply and logistics route that wound its way through the jungles of Laos, would be the most efficient way to bring the conflict in Vietnam to a close. This network, called the McNamara Line after Secretary of Defense Robert McNamara, would provide feedback to the computer surveillance systems that were monitoring the troop movements of the North Vietnamese People’s Army of Vietnam (PAVN) and the Viet Cong in South Vietnam. The McNamara Line consisted of several surveillance sensors distributed throughout the Ho Chi Minh trail that surveilled opposition troop movements and instantaneously updated the air command, which used data links and computer-assisted intelligence-gathering systems to engage in targeted attacks. Acoustic, seismic, and thermal sensors sent data to reconnaissance airplanes, which then transmitted the battlefield surveillance data to two IBM 360-65 computers in Thailand. The computers processed the data and provided aircraft carriers with battlefield coordinates for engaging in targeted bombing missions (Nijholt 123). The US Senate Armed Services Committee greenlit the JASON division’s strategy with the Electronic Battlefield program. Senator Barry Goldwater referred to the program as the greatest step forward since gunpowder (Deitchman). The 553d Reconnaissance Squadron of the US Air Force Special Operations Command codenamed the mission Operation Igloo White, the first time the US military had used real-time computer-driven surveillance in war. Deploying electronic sensors on the battlefield gave these cybernetic systems the sensory feedback necessary to operate accurate and precise feedback control systems.
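The data flow described above (sensor activation, aerial relay, centralized computation, target coordinates) can be sketched schematically. The Python fragment below is a hypothetical illustration only: the sensor fields, the activation threshold, and the strength-weighted position estimate are invented stand-ins, not the logic of the historical IBM 360 installation at the Infiltration Surveillance Center.

```python
# A minimal, hypothetical sketch of the Igloo White data flow: sensor
# activations are relayed to a processing centre, which correlates them
# into a single position estimate for air command. All names, values,
# and thresholds are illustrative, not drawn from the historical system.

from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    kind: str        # "acoustic", "seismic", or "thermal"
    easting: float   # sensor's known drop position (metres)
    northing: float
    strength: float  # normalized signal strength, 0.0 to 1.0

def estimate_target(readings, threshold=0.5):
    """Average the positions of all sensors reporting above a threshold,
    weighted by signal strength: a crude stand-in for the correlation
    work the central computers performed."""
    active = [r for r in readings if r.strength >= threshold]
    if not active:
        return None  # no confirmed movement along the trail
    total = sum(r.strength for r in active)
    easting = sum(r.easting * r.strength for r in active) / total
    northing = sum(r.northing * r.strength for r in active) / total
    return (easting, northing)

readings = [
    SensorReading("S-101", "seismic", 48200.0, 1761000.0, 0.9),
    SensorReading("S-102", "acoustic", 48350.0, 1761200.0, 0.6),
    SensorReading("S-103", "thermal", 49900.0, 1765000.0, 0.1),  # below threshold
]
print(estimate_target(readings))
```

The weighted average is the simplest possible fusion rule; the point is only that heterogeneous sensor signals become a single machine-generated coordinate, with no human interpretation in the loop.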
As critical digital humanists, we need to pay close attention to how quantitative methods and positivist discourse were able to reprioritize academic commitments in the field of geography and fundamentally change the nature of geographic knowledge production. Can we bridge the critical–quantitative binary when a similar quantitative revolution is taking place in the humanities faculties of contemporary neoliberal universities in North America? As we engage in socially progressive and civic-minded scholarly practices, we must be mindful of the technological legacies of the tools we employ and the “military a priori” that they hold, i.e. the warfare use cases embedded in their design. By examining how this emphasis on the quantitative has affected other fields, we can more clearly assess the risks posed to digital humanists. The field of geography offers a powerful example. Mapping and geographic information system (GIS) technologies are power-laden forms of knowledge production that help to (re)create as much of the world as they represent through technoscientific, modernist, and colonial discourses.1 Maps both enable and suppress knowledge through the intentional and unintentional elements of the ideological practice of cartography. Bernhard Siegert writes that maps are not merely representations of territory; rather, these instruments are media technologies instilled with the capacity to co-constitute subjects (13). They were developed using mathematical techniques and have since become a part of cultural praxis. The signs on a map do not just convey geo-textual information: they represent epistemic orders and gesture toward changes in cartographic procedures that signal shifts in knowledge creation and perception.
Peter Galison (235) explains how Norbert Wiener’s World War II anti-aircraft predictor was able to conjoin soldier, calculator, and firepower into a seamless integrated system.2 For Wiener, an enemy pilot and the bomber they flew functioned as one entity that was visible to the anti-aircraft predictor programmed to eliminate them. The enemy pilot was disaggregated into data points such as speed, direction, altitude, and signature moves (Packer and Galison 3167). In this anti-aircraft predictor example, quantitative data made it possible for servomechanism feedback to supersede human intentionality. Within digital humanities (DH), the instruments, tools, and technologies that we learn and use are removed from the material conditions under which they are developed, used, or researched. It is important to understand these tools as informed by and infused with materialist genealogies of race, capitalism, and technology. By critically examining the discourse that interrogates the social roots, histories, and implications of GIS technology, we can see how the field of geography was operationalized by military and political interests, erasing its engagement with critical social discourse.
Following World War II, the US military began to formalize its application of scientific methods and approaches (Ryan). Under the Kennedy administration, staff from the RAND Corporation, which was established as a clearinghouse for devising and formalizing quantitatively informed decision-making strategies, were brought to the Office of the Secretary of Defense in 1961 (Gray 111). McNamara championed an economic approach to managing the Department of Defense by promoting operations research, systems analysis, and data-driven decision making. Under McNamara, defence strategy planning for the nuclear age was formalized using mathematical modelling techniques. Mathematicians and economists from RAND employed game theory, rational choice theory, forecasting, and future scenario planning to formulate defence strategy and policy. McNamara installed in the Department of Defense a planning system very similar to the one he had devised as the president of the Ford Motor Company (Murray). This Planning, Programming, and Budgeting System (PPBS) provided an economic framework driven by mission outputs rather than military spending requests. Systems analysis compared different units’ capacity to perform the same mission and allotted resources to the most efficient and cost-effective units. In 1966, the Office of Systems Analysis was put in charge of evaluating progress in the Vietnam War effort (Murdock). McNamara’s penchant for quantitative measures of efficiency, including accurate and precise statistical accounts, had become institutional practice by this time. Progress in the war was measured by performance indicators including enemy body counts, casualties, weapons captured, prisoners taken, sorties flown, and bombs dropped (Murray). The strategy and operations of the Vietnam War were thus envisioned and executed through a quantitative decision-making lens via technologies that addressed techno-scientific concerns arising from that instrumental reasoning.
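The systems-analysis logic described above, comparing units on a cost-per-output basis and funding the most cost-effective first, can be reduced to a toy sketch. Every unit name, cost, and output figure below is invented for illustration; the fragment shows only the shape of the reasoning, not any historical PPBS calculation.

```python
# A toy sketch of cost-effectiveness allocation: rank units by cost per
# unit of mission output and fund them greedily until the budget runs out.
# Units, costs, and outputs are entirely invented for illustration.

def allocate(units, budget):
    """Rank (name, cost, output) tuples by cost per unit of output and
    fund the cheapest-per-output units first, within the budget."""
    funded = []
    for name, cost, output in sorted(units, key=lambda u: u[1] / u[2]):
        if cost <= budget:
            funded.append(name)
            budget -= cost
    return funded

units = [
    ("Unit A", 100, 20),  # 5.0 cost per unit of output
    ("Unit B", 90, 30),   # 3.0
    ("Unit C", 120, 15),  # 8.0
]
print(allocate(units, budget=200))  # funds B, then A; C exceeds the remainder
```

The reduction is deliberate: once missions are expressed as comparable outputs, the allocation decision becomes a sorting problem, which is precisely the managerial flattening the essay describes.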
Neither history nor the ambiguities of the battlefield mattered to the managerial logic that operated on a set of game-theoretical assumptions and technological imperatives. The quantitative decision-making systems that McNamara installed as part of the US Army’s defence policymaking infrastructure paved the way for contemporary big data and machine learning systems and the onslaught of contemporary predictive analytics software systems.
Cybernetics, an interdisciplinary field that brings together science, engineering, and philosophy under a common discursive paradigm, allows for the cross-pollination of scientific ideas across the computer, biological, and social sciences as well as the development of common frameworks that can be ported across disciplinary boundaries. The enthusiasm that scientists had for cybernetics, and social scientists for game theory and modern technology, readily translated into the technological choices and weapons that were deployed during the Vietnam War. Cybernetics, operations research, and systems analysis were mobilized to impose order and discipline onto information collection, distribution, and processing: an example of what Antoine Bousquet calls cybernetic warfare (Bousquet). The triple helix of government, industry, and academia was actively being reconfigured around the belief that decision making under uncertainty is better performed by computer technologies that are accurate, precise, and, by extension, more reliable (Rosenzweig). The military organization was thus being reimagined as a vast socio-technical machine that could be optimized and directed based on quantitative analytics, mathematical modelling, and applied cybernetic technologies. The engine that drove cybernetic thinking was the anticipation of nuclear conflict with the Soviet Union, but conceiving warfare strategies based on those war simulations and cybernetic models proved to be disastrous in the Vietnam War. In particular, war planning was fundamentally changed with the introduction of information feedback systems and engineering control systems. Norbert Wiener’s concept of cybernetic systems is characterized by closed-loop feedback control: systems that use feedback from the system’s output to course correct the input signal and achieve the desired output.
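Wiener’s closed-loop feedback can be illustrated with a minimal sketch: the controller senses the gap between desired and actual output and feeds a proportional correction back into the system. The gain and step count below are arbitrary illustrative values, not parameters from any historical control system.

```python
# A minimal sketch of closed-loop (proportional) feedback control: each
# step measures the error between setpoint and output and feeds a
# correction back into the input, driving the output toward the setpoint.

def closed_loop(setpoint, initial_output, gain=0.5, steps=20):
    """Return the output trajectory of a simple proportional feedback
    loop. The gain scales how much of the sensed error is corrected
    per step; values between 0 and 1 converge monotonically."""
    output = initial_output
    history = []
    for _ in range(steps):
        error = setpoint - output   # sensed feedback from the output
        output += gain * error      # corrected input acts on the system
        history.append(output)
    return history

trajectory = closed_loop(setpoint=100.0, initial_output=0.0)
print(round(trajectory[-1], 3))  # converges toward the setpoint of 100.0
```

The loop halves the remaining error on every pass, which is the basic course-correcting behaviour that cybernetic war planning generalized from servomechanisms to entire organizations.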
Command and control systems, operational research, and systems analysis have functionally reduced military conflict to an exercise in cost-benefit analysis, using mathematical functions that can be optimized, modelled, and simulated.
Paul Edwards argues that, in the Cold War era, the rapid computerization and networking of the military created a closed-world system engineered to defend against the possibility of total nuclear annihilation (The Closed World). The emergence of a science- and technology-based military power gave rise to the idea of the world as a closed system, one that was malleable and controllable through American technological prowess. This perception conceived of the world as a system that was protected and manipulated by the United States government (Edwards, The Closed World). Computer systems provided technical tools for rational discourse, and the computer science and systems engineering theories developed in this era articulate the world through data points that are calculable and manageable. Simulations and models could predict the likely outcomes of conflicts, thereby industrializing and informationalizing warfare. In cybernetic warfare, the ability of a war-machine–system to summon the necessary resources and manage complex logistical systems determines military and strategic victory (Bousquet).
The evolution of command centres into command-and-control centres signifies the pivotal role played by modern information and communication technologies in the informationalization of warfare. Where command centres served as the centralized nerve centres of the military, i.e. the war room, command-and-control centres allowed for signals warfare, i.e. warfare using information communication technologies. The capacity to recalibrate commands based on continuous and instantaneous feedback from the ground enabled command-and-control warfare (C2W) to adopt precision-guided munitions. The threat of nuclear war was exacerbated by the emergence of jet planes and intercontinental ballistic missiles. Computers offered a technological breakthrough that enabled the instantaneous transmission and reception of geographic coordinate information. The Vietnam War was the first time satellite communications were used in real-world combat operations (Spires and Sturdevant). The Vietnam War took place during the early years of the Communications Satellite Corporation (COMSAT). American Telephone and Telegraph (AT&T) built COMSAT’s international satellite network and was its largest shareholder. In January 1966, the Defense Communications Agency (DCA), in its attempt to secure inexpensive satellite communication, informed COMSAT that it would need 30 circuits over the Pacific; this was in direct violation of the FCC’s rules (Spires and Sturdevant). The DCA planned to integrate the regional communication networks into a single network that would make commercial satellite communication systems available on the battlefield. As of July 1967, US forces had installed Initial Defense Communications Satellite Program (IDCSP) ground terminals that transmitted high-resolution surveillance images between Vietnam and the United States.
Satellite communication provided systems analysts with real-time battlefield information, keeping them connected with command-and-control centres in a cybernetic loop. Due to a lack of military satellite resources during the Vietnam War, though, the military established the practice of outsourcing satellite communication for administrative and logistics traffic to commercial vendors and reserving military satellites for sensitive command-and-control communications (Spires and Sturdevant). Merging commercial and military satellite operations was meant to be a cost-saving technique. Operation Igloo White serves as the media genealogical precursor to environmental sensing technology initiatives like Hewlett-Packard’s Central Nervous System for the Earth (CeNSE) that established sensor network systems for ecological observation (Gabrys).
The quantitative turn in the machinery of the Vietnam War was marked by the launch of the Hamlet Evaluation System (HES). The HES was developed by the CIA after McNamara tasked it with devising a data-collection and statistical system for measuring the status of the US pacification campaign in Vietnam (Enthoven 302). In 1966, McNamara implemented a Hamlet Evaluation Worksheet with eighteen indicators, nine measuring security conditions and nine covering socio-economic development projects, each measured on a five-point scale. HES provided decision makers with quantified data about how rural pacification was progressing across 12,500 hamlets spread over forty-four provinces in southern Vietnam (Belcher, “Data”). Since US and Vietnamese officials lacked reliable census data, HES data became the most comprehensive dataset about the Vietnamese populace. The information gathered from the HES surveys was transferred to punchcards and analyzed by systems analysts on an IBM 360 mainframe computer (Barnes). The Synagraphic Mapping System (SYMAP), a computerized cartography program developed by Howard Fisher’s Harvard Laboratory for Computer Graphics and Spatial Analysis, integrated data from the HES survey to provide coordinates for targeted bombing missions (Barnes). The computerized cartography program allowed geographic information to be layered on top of the HES data, providing monthly data visualizations of the progress made by the US pacification program (Belcher, “Sensing”). The data visualizations added quantitative information to map coordinates, making them quantitatively richer texts for military personnel. These data visualizations removed the subjectivity of map reading and outsourced it to the computer, which provided accurate, precise, computationally based readings.
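The worksheet’s scoring scheme, eighteen indicators on a five-point scale aggregated into an overall hamlet rating, can be sketched as follows. The letter-grade cut-offs and sample scores below are illustrative assumptions; the historical HES used its own weighting and lettering conventions.

```python
# A schematic sketch of HES-style scoring: eighteen indicators, nine for
# security and nine for development, each on a five-point scale, averaged
# into a single letter rating. Cut-offs and scores are illustrative only.

def rate_hamlet(security_scores, development_scores):
    """Average eighteen five-point indicator scores (1 = worst, 5 = best)
    into an overall letter rating, A (best) through E (worst)."""
    assert len(security_scores) == 9 and len(development_scores) == 9
    scores = security_scores + development_scores
    assert all(1 <= s <= 5 for s in scores)
    mean = sum(scores) / len(scores)
    # Map the mean back onto a five-letter scale (illustrative cut-offs).
    for grade, floor in [("A", 4.5), ("B", 3.5), ("C", 2.5), ("D", 1.5)]:
        if mean >= floor:
            return grade, mean
    return "E", mean

grade, mean = rate_hamlet([4, 4, 5, 3, 4, 4, 5, 4, 4],
                          [3, 3, 4, 2, 3, 3, 4, 3, 3])
print(grade, round(mean, 2))
```

What the sketch makes visible is the essay’s point: a hamlet’s political condition is compressed into a single computable figure that can be punched, mapped, and compared month over month.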
This military intelligence paired with cartography produced a particular type of location-based information that could display local social geographic information at the micro level as well as track enemy troop movement across active battle zones on the macro level. The installation of the Universal Transverse Mercator (UTM) system for assigning map coordinates as the standard geographic reference for all military computers also streamlined the flow of geographic information through these computerized systems.
The practice of analytical cartography untethered the field of cartography from the social and cultural constraints that geography departments typically placed upon the field (Tobler, “Analytical”). Analytical cartography is the academic sub-discipline that powers much of the development in geographic information science (Clarke and Cloud). It helps us understand the epistemic shift from reading maps as representations of territory to maps as instruments. It reveals the mathematical operations that go on behind the scenes to co-create the cultural practice that is computerized map reading, i.e. how the operative conditions of map making, the cartographic processes and procedures, are quantified, in turn introducing changes into the orders of representation (Siegert). In the process, analytical cartography promoted a quantitatively informed discipline dominated by questions of techniques and technologies (Tobler, “Analytical”). It privileged statistical techniques over the communicative paradigm of cartography, expanding the theoretical core of the discipline. Emerging technologies in computerized cartography in the late 1950s and 1960s foregrounded photogrammetry and geodesy as central tenets of the field, eventually giving way to geographic information systems (GIS) and shifting the field toward the spatial data sciences over time (Clarke and Cloud).
The focus of analytical cartography was not limited to an academic discipline; rather, it focused on solving geographically defined military problems (Barnes, “Geographical”). GIS emerged from the Laboratory for Computer Graphics at Harvard University’s Graduate School of Design (GSD). The Lab was established in 1965 with a grant from the Ford Foundation’s Department of Public Affairs, the Office of Naval Research, other military sources, and various smaller contributions from and to the GSD (Light 136). Under Howard Fisher’s direction, the lab assembled a multidisciplinary team, including urban planner Allan Schmidt, water engineer and economist Peter Rogers, architect Allen Bernholtz, and programmer Betty Benson, to develop SYMAP (Chrisman; Steinitz, “Beginnings of Geodesign: A Personal Historical Perspective”). In 1967, Fisher recruited landscape architecture student Jack Dangermond to the team. After graduating with a Masters in Landscape Architecture in 1969, Dangermond went on to form the Environmental Systems Research Institute (ESRI) with his wife Laura Dangermond in Redlands, California. In the early days of ESRI, Dangermond continued to work on and develop SYMAP, at times poaching workers from the lab for his startup (Wilson 58).
Analytical cartography wrenched theory out of the field, stunting its development in favour of the infusion of information technologies into the mapping process via GIS and turning cartography into a subservient subject that provides data for the needs of other disciplines and fields (Tobler, “Analytical”). The computer-generated map compiled by GIS is laden with the particularities of the power-knowledge emanating from that system, one that signals how the discipline of cartography had to make room for the technique of GIS to take centre stage (Barnes, “Geographical”). GIS is the model cybernetic postwar science, one that blurs the boundaries between theory and praxis, science and engineering, civilian and military uses, and classified and unclassified systems, contributing to both economic prosperity and national security. GIS’s ability to cross over into other disciplines and to bear fruit in the marketscape meant the newly defined field was able to achieve interest and uptake from the commercial sector (Farish). GIS’s capacity to generate new and fertile streams of geo-information, affording the military establishment more accurate and precise contouring, also had a use-value in commercial terrain. Analytical cartography thus exemplifies how academic departments and research agendas were being reconfigured after World War II in the United States.
The epistemic infrastructure upon which the subject area of GIS forms itself is deeply connected to the US military investments and innovations happening during the Vietnam War. As the US Civil Rights Movement was gathering steam in the late 1960s, when fervent critiques of American imperialism, capitalism, and the military–industrial complex were pulsing through campuses across America, geography departments undergoing a quantitative revolution were quite often arming the war machine (Billinge et al. 129). For instance, Howard Fisher and Thomas Thayer, Deputy Director of Intelligence and Force Effectiveness in the Systems Analysis and Research Division and point man on Vietnam assessment for the Office of the Secretary of Defense, were corresponding in March 1967, hammering out the details of a graphic display system to analyze kill ratios (Wilson 65). Thayer had enlisted Ithiel de Sola Pool’s Simulmatics Corporation to conduct a field study to evaluate the Hamlet Evaluation System (HES) (Light 43). Simulmatics was a pioneer in big data and predictive analytics even before those terms had been popularized.
The Second World War marked a paradigm shift in the relationship that geographers had with the military. After the war, the US military was actively engaged in formalizing an instrumental model of geography education that folded in the tools and practices deployed in the field of combat. This period can best be understood as a melting pot in which the military, industry, and academia were enmeshed in a triple helix where ideas, technologies, and techniques freely circulated between the three domains. The key figures involved in the establishment of new governmental bodies maintained their academic affiliations and managed the growth of these domains as interlocutors and power brokers. With the creation of governmental bodies such as the Office of Scientific Research and Development (OSRD) and the Office of Strategic Services (OSS), research in the sciences, social sciences, and military was directed toward a single outcome. Waldo Tobler, one of the pioneers of analytical cartography and GIS, also worked at RAND, developing the nuclear attack early warning system (Barnes, “Geographical”). Tobler was one of the pioneering figures in using high-speed computing to automate cartographic data processing (Tobler, “Automation”). The work performed by analytical cartography was deemed to be of strategic importance and kept classified and highly confidential. As a field, it existed in the space between government cartography and academic geographical cartography (Cloud, “American”). Military cartography brings to light the relationship between geographic knowledge and geographic intelligence. It is only as long as geographic information systems can render actionable intelligence that these systems have value for the military establishment; therefore, secrecy and silence have always been part of the cartographic epistemology (Cloud, “American”).
As the geo-spatial sciences were coalescing in the shadow of the Cold War, the knowledge-power being developed out of that field was primed to serve the military cause. The techniques and technologies developed in the Cold War period were funded by the Department of Defense in concert with various industry partners working alongside academia (Clarke and Cloud).
Geographic information systems emerged from the technoscientific interdisciplinary cybernetic assemblage made possible by the political climate of the Cold War (Haraway, Modest_Witness). In fact, Cold War-era cartographic systems were possible only as a result of a stolen German archive of maps from World War II. The US Army geodesist Major Floyd W. Hough and his unit, the HOUGHTEAM, acquired the archive of the Reichsamt für Landesaufnahme (RfL), the German national survey agency, which included geodetic maps from various German technical universities, government institutes, libraries, and other institutions (Miller). The collection from Saalfeld, for example, included aerial photo archives of the Luftwaffe and over 90 tonnes of maps covering all of Europe, Asiatic Russia, parts of North Africa, and various other parts of the world. The HOUGHTEAM also acquired high-quality German stereoplanigraphs, cutting-edge technologies used by the Third Reich to create topographic maps from aerial photographs, as well as other German geodetic engineering and map-making paraphernalia (Miller). The HOUGHTEAM also found among the RfL collection Russian geodetic survey data and maps of the USSR that the Nazi army had captured from the Russians. The HOUGHTEAM’s haul of RfL maps was mostly classified, and these geodetic datasets served as the bedrock of most cartographic systems that were used to engage in the surveillance of the USSR during the Cold War. The HOUGHTEAM also expatriated Nazi officers of the RfL, computational staff, geodetic scientists, and engineers who had worked on the German maps to the United States, where they continued their work for the US Army Map Service (Hough; Miller). The HOUGHTEAM’s work on the European Datum ED50, the common geodetic network linking all of Europe, was built on, built with, and built by the Nazis.
The ED50 technology infrastructure undergirds the global coordinate system known as the Universal Transverse Mercator (UTM), the foundation upon which the analytical cartographic system built a numerically precise and accurate coordinate system (Rankin; Warner).
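The UTM convention mentioned above divides the globe into sixty longitudinal zones of six degrees each, every zone carrying its own transverse Mercator projection. A minimal sketch of the zone arithmetic follows; it computes only the zone number and its central meridian, since a full easting/northing conversion requires a complete geodetic library and a datum such as ED50.

```python
# The UTM grid assigns zone 1 to longitudes 180°W–174°W and counts
# eastward in 6° strips. This sketch covers only the zone arithmetic,
# not the transverse Mercator projection itself.

def utm_zone(longitude):
    """Return the UTM zone number (1–60) for a longitude in degrees,
    with west longitudes negative."""
    assert -180.0 <= longitude < 180.0
    return int((longitude + 180.0) // 6) + 1

def central_meridian(zone):
    """Longitude of the zone's central meridian, in degrees."""
    return -183.0 + 6.0 * zone

# The Ho Chi Minh trail ran roughly along 106°E, which falls in zone 48.
print(utm_zone(106.0), central_meridian(utm_zone(106.0)))  # → 48 105.0
```

The appeal of the scheme for military computers is exactly this regularity: any point on Earth maps deterministically to a zone and a metric grid, making coordinates machine-exchangeable across services.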
As geospatial information traverses the automated cartographic circuitry, it transforms from geo-data into geo-intelligence. As Safiya Noble elucidates, GIS must be understood as tools that hasten data extraction, processing, and analysis and that are marketed as services for providing actionable business intelligence (Noble 88). The circuitry that allows for this movement of geo-information spans academic, military, and commercial systems. The contours of the political economy of these GIS reveal how intertwined these information systems are and how they have been cybernetically designed: these are feedback control systems with input signals, actuating apparatuses, and sensing elements. The contractors and subcontractors providing software, hardware, and wetware come from academic institutions and commercial and non-profit organizations (Winthrop-Young 191). These wares are enmeshed with the intelligence establishment and the Department of Defense, providing geographic intelligence for the surveillance apparatus of the state (Vernon). The militarization of geographic knowledge allowed scholars, soldiers, and spies to draw from and add to the same knowledge base of datafied geographic information: the technologies and techniques developed by one axis of this triple helix were effortlessly transferred over and adopted by another. Geographical praxis in service of the state, developed in concert with military systems analysts, espionage agencies, and commercial contractors, meant that the scholarly work produced was geared primarily towards those militaristic goals. The materials sciences needed geographic data inputs to engineer military equipment and furnish military personnel with gear that could withstand harsh terrains and climatic conditions.
Moreover, the development of GIS meant that, with rational authority and technical apparatuses, wars could be waged over hazardous terrain and victory could be attained bloodlessly: McNamara’s fallacy was iterated upon, tempering the use of military force, to become the Weinberger Doctrine and, later, the Powell Doctrine. As the personnel rosters of academia, the military, the public sector, and commercial enterprise merged, the user base for technologies developed in one domain found itself in another (Krishnan). Linking automated cartography to military hardware was a way to implement cybernetic thinking in the war zone (Minnich).
GIS involved technological and scientific practices that privileged metrics and measurement by machines in service of the military (Perkins and Dodge). Geographical knowledge systems created in the post-Cold War era were made possible by major defence spending initiatives and contracts that incentivized spatial analysis for strategic and economic benefit (Vernon). If mapping technologies are the militaristic gaze par excellence, satellites upgrade that convention with the “view from nowhere” (Haraway, “Situated”). By splicing together satellite imagery with scientific approaches and managerial world-building, geographic information systems foster an abstract aesthetic, one that champions the image as part of a truth-making regime dispassionately created by a remote surveillance system (Perkins and Dodge).
The JASON division was founded in 1958 in direct response to the Soviet launch of the Sputnik satellite and was organized under the Institute for Defense Analyses (IDA). The Jasons performed analytical work, providing actionable intelligence and advice for the Department of Defense on how to tackle thorny scientific and technological problems. It was the Jasons’ advice to throw an electronic barrier across the Ho Chi Minh trail in order to deter Vietnamese troop and supply movement across the porous border, but it was not until the Pentagon Papers came out in 1971 that the Jasons’ involvement in the Vietnam War was made known to the public (Finkbeiner). As mentioned above, the original concept for the Jasons’ electronic barrier, the McNamara Line, was to create an acoustic curtain along a hundred-kilometre strip of the Ho Chi Minh trail and place cherry bombs that would be triggered by the movement of foot soldiers and trucks. The truck and personnel road networks were both unobservable by aerial reconnaissance aircraft because of the thick underbrush of the rainforest and the camouflage used by the Vietnamese soldiers. The air-supported anti-infiltration barrier system that was deployed consisted of anti-truck and anti-personnel sections. The target acquisition part of the air-supported barriers was made up of aerially distributed sensors that were continuously monitored by the Canadian-designed and -produced specialized cargo aircraft, the de Havilland Canada DHC-4 Caribou, designated by the US military as the C-7 Caribou (Tambini 123). The sensors would send a signal to the central computer in the Infiltration Surveillance Center at the Nakhon Phanom Air Base in Thailand. IBM 360 computers at this base would pinpoint the location of the sensor and provide geographic coordinates for airstrikes to cut off the troop movement in real time.
In order to get the sensor network up and running, several divisions within the US armed forces had to come together in agile groups, initiating a new paradigm for planning and operating in the war known as “network-centric warfare” (Deitchman).
In 1966 McNamara sent the JASON study to the Joint Chiefs of Staff for evaluation but set the project in motion before receiving their response (Correll). McNamara appointed Army Lt. General Alfred D. Starbird as the head of the Defense Communication Planning Group, the task force in charge of implementing Operation Igloo White. The Vietnam portion of the project was known as Operation Dye Marker, and the Laos portion was known as Operation Muscle Shoals. The Chief of Staff of the Army, William Westmoreland, was opposed to the operation, and the Marines, Air Force, and Navy were saddled with operationalizing a hi-tech project without adequate labour and logistical support. The sensors were developed by companies including Texas Instruments, Magnavox, General Electric, Western Electric, and Hazeltine Corp (Novak), and were designed to look like twigs, jungle plants, and dog excrement to blend into the surroundings. The acoustic sensors picked up noises from trucks, heat sensors picked up body heat, and chemical sensors picked up the smell of human urine. The seismic sensors deployed to detect troop and vehicle movement were developed by Sandia National Laboratories, a nuclear weapons laboratory; the US Department of Defense’s Advanced Research Projects Agency (ARPA) funded two Sandia researchers to work on the seismic sensor research project (Ullrich 6). When the sensors went off, they would send signals to aircraft patrolling the skies, which relayed them to the Infiltration Surveillance Center computers. The sensors, which ran on nickel–cadmium batteries, were found to be functional for only a few weeks. To conserve power, the next iteration of sensors was designed to be turned on remotely, and it was not until lithium batteries were employed that the sensor lifespan increased to up to two months.
Evaluated on their own merits, the sensor technologies employed by the Jasons were conceptually sound: the idea of receiving real-time information about enemy troop movement in order to enable surgical strikes was grounded in cybernetic principles. But none of these sensors had been built or tested for less-than-optimal battlefield conditions. Commercial, off-the-shelf sensors were not designed to withstand being dropped from aircraft, and several were damaged and rendered non-functional upon deployment.
Operation Igloo White ran from 1967 to 1972 and cost the United States nearly $1 billion a year (Edwards, “Cyberpunks”). The Infiltration Surveillance Center was an exemplar of the command-and-control centre, with its automated, computerized, mathematically modelled systems rigorously employed. It was built with absolute faith in the functional capabilities of electronic communication peripherals, data-processing computers, and various other process-automation paraphernalia. As Packer and Reeves explain, the first computerized command-and-control system, the Semi-Automatic Ground Environment (SAGE), enabled the automation and computerization of surveillance, replacing the sensory and cognitive labour performed by wetware with computer hardware and software that acquire, process, and store data. Computer research and development was guided by the policies and agendas of the American military and political establishment, and Cold War ideology drove the development of command-and-control systems, non-human surveillance systems, artificial satellite systems, and advanced missile systems. Operation Igloo White is a microcosm of these ideological aspirations: technologies that had not yet been vetted and iterated were deployed in a war zone.
During the Vietnam War, the US dropped more than two million tonnes of bombs on the Ho Chi Minh trail. The electronic barrier that Operation Igloo White erected was supposed to make US bombing missions more accurate and precise, but in reality the operation achieved neither objective. The acoustic sensors could not distinguish between civilian and military trucks, leading to numerous civilian casualties. The cluster bombs and flechettes dropped along with the sensors were also responsible for enormous catastrophe, in some cases precisely by not going off: undetonated bombs and mines pepper the Ho Chi Minh trail even today and continue to maim and injure civilians. The electronic battlefield as conceived by the military systems analysts and operations researchers lacked contingencies for the vagaries of real-world operations. The service and maintenance of sensors was a point of contention, and the unreliability of the electronic equipment meant that these modern electronic weapons were recreating a version of the Byzantine Generals Problem: the communication links between the sensors, aircraft, and computers left the system prone to multiple points of failure. Nonetheless, these technologies have been iterated upon in the half-century since, and the cybernetic thinking and warfare strategies developed by a secretive group of academics working for the Department of Defense are still in circulation. The virtual border technologies conceived for use in Vietnam are being retooled, retrofitted, and refurbished for use once again as the political winds blow in an authoritarian direction. Moreover, the electronic battlefield technologies developed in Vietnam have become domestic consumer technologies, surveillance systems, smart sensors, and drones.
For example, ESRI, the maker of ArcGIS, has developed a partnership with the Radio Frequency (RF) data analytics company HawkEye 360 to develop a software suite to enter the Radio Frequency Mapping (RFM) marketplace. The marriage of analytics capabilities to the technologies described above turns them into means of prediction, a capacity heretofore not possible. The ability to accurately and precisely geolocate RF transmissions is one that several players in the public and private sectors would like to develop rapidly. Using a constellation of formation-flying satellites, HawkEye 360 can generate saleable RF geolocation data from Very High Frequency (VHF) marine radio, Ultra High Frequency (UHF) push-to-talk radio, maritime radar systems, Automatic Identification System (AIS) beacons, and L-band satellite devices (Jewett). RF signal mapping can provide much deeper situational awareness by pairing RF data with data from various other sources, including Synthetic-Aperture Radar (SAR), optical media, and social media, allowing users to track, pinpoint, and analyze signal behaviour over time. The use cases for geospatial data analytics using RF emissions in tandem with more traditional forms of geo-intelligence include maritime tracking and the detection of illegal fishing and wildlife poaching (Vinsoki). As an ArcGIS add-in, the HawkEye RF Data Explorer can pipe data from the satellite grid into ESRI’s cloud datastores, seamlessly and frictionlessly moving data between national security infrastructures, commercial vendors, and public partners in the software and hardware stack.
Regularizing military technologies and suturing them to everyday practices have made them part of domestic life. In the process, the militarized and politically charged histories of these technologies get wiped clean, leaving behind a techno-optimistic narrative of progress for civil society to latch onto.
Barnes, Trevor J. “Geographical Intelligence: American Geographers and Research and Analysis in the Office of Strategic Services 1941–1945.” Journal of Historical Geography, vol. 32, no. 1, Jan. 2006, pp. 149–68.
Barnes, Trevor J. “War by Numbers: Another Quantitative Revolution.” Geopolitics, vol. 20, no. 4, Oct. 2015, pp. 736–40. Taylor & Francis Online.
Belcher, Oliver. “Data Anxieties: Objectivity and Difference in Early Vietnam War Computing.” Algorithmic Life: Calculative Devices in the Age of Big Data, edited by Louise Amoore and Volha Piotukh, Routledge, 2015, pp. 141–56.
Belcher, Oliver. “Sensing, Territory, Population: Computation, Embodied Sensors, and Hamlet Control in the Vietnam War.” Security Dialogue, vol. 50, no. 5, Oct. 2019, pp. 416–36.
Bousquet, Antoine. “Cyberneticizing the American War Machine: Science and Computers in the Cold War.” Cold War History, vol. 8, no. 1, Feb. 2008, pp. 77–102. Taylor & Francis Online.
Chrisman, Nick. Charting the Unknown: How Computer Mapping at Harvard Became GIS. ESRI Press, 2006.
Clarke, Keith C., and John G. Cloud. “On the Origins of Analytical Cartography.” Cartography and Geographic Information Science, vol. 27, no. 3, Jan. 2000, pp. 195–204. Taylor & Francis Online, doi.org/10.1559/152304000783547821.
Cloud, John. “American Cartographic Transformations during the Cold War.” Cartography and Geographic Information Science, vol. 29, no. 3, Jan. 2002, pp. 261–82. Taylor & Francis Online.
Correll, John T. “Igloo White.” Air Force Magazine, Nov. 2004, www.airforcemag.com/PDF/MagazineArchive/Documents/2004/November%202004/1104igloo.pdf.
Deitchman, Seymour J. “The ‘Electronic Battlefield’ in the Vietnam War.” The Journal of Military History, vol. 72, no. 3, 2008, pp. 869–87. Project MUSE.
Deitchman, Seymour, V. Fitch, M. Gell-Mann, H. Kendall, L. Lederman, H. Mayer, W. Nierenberg, F. Zachariasen, and G. Zweig. Air-Supported Anti-Infiltration Barrier. Study S-255, Institute for Defense Analyses, JASON Division, August 1966. fas.org/irp/agency/dod/jason/barrier.pdf.
Edwards, Paul N. “Cyberpunks in Cyberspace: The Politics of Subjectivity in the Computer Age.” The Sociological Review, vol. 42, no. S1, May 1994, pp. 69–84. Wiley Online Library.
Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. MIT Press, 1997.
Enthoven, Alain C. How Much Is Enough? Shaping the Defense Program, 1961–1969. CB-403, RAND Corporation, 2005, www.rand.org/pubs/commercial_books/CB403.html.
Farish, Matthew. “Canons and Wars: American Military Geography and the Limits of Disciplines.” Journal of Historical Geography, vol. 49, July 2015, pp. 39–48. ScienceDirect.
Finkbeiner, Ann. “Jason: Can a Cold Warrior Find Work?” Science, vol. 254, no. 5036, Nov. 1991, pp. 1284–86.
Gabrys, Jennifer. Program Earth: Environmental Sensing Technology and the Making of a Computational Planet. University of Minnesota Press, 2016.
Galison, Peter. “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision.” Critical Inquiry, vol. 21, no. 1, 1994, pp. 228–66. JSTOR, www.jstor.org/stable/1343893.
Gray, Colin S. “What RAND Hath Wrought.” Foreign Policy, no. 4, 1971, pp. 111–29. JSTOR.
Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies, vol. 14, no. 3, 1988, pp. 575–99. JSTOR.
Haraway, Donna J. Modest_Witness@Second_Millennium.FemaleMan©_Meets_OncoMouse™: Feminism and Technoscience. Routledge, 1997.
Hough, Floyd W. “International Cooperation on a Geodetic Project.” Transactions, American Geophysical Union, vol. 32, no. 1, Feb. 1951, p. 106. AGU.
Jewett, Rachel. “HawkEye 360 Prepares to Expand RF Tracking Capabilities With Second Cluster.” Via Satellite, 28 Jan. 2021. www.satellitetoday.com/imagery-and-sensing/2021/01/28/hawkeye-360-prepares-to-expand-rf-tracking-capabilities-with-second-cluster/.
Krishnan, Armin. War as Business: Technological Change and Military Service Contracting. Ashgate, 2008.
Light, Jennifer S. From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. Johns Hopkins University Press, 2005.
Miller, Greg. “The Untold Story of the Secret Mission to Seize Nazi Map Data.” Smithsonian Magazine, Nov. 2019. www.smithsonianmag.com/history/untold-story-secret-mission-seize-nazi-map-data-180973317/.
Minnich, Richard T. “U.S. Defense Industry in Transition: Can the Leopard Change Its Spots?” Business Forum, vol. 18, 1993, pp. 13–17. Gale Academic OneFile, link.gale.com/apps/doc/A14500067/AONE?u=anon~2d5ac39c&sid=googleScholar&xid=70d8b750.
Murdock, Clark A. “McNamara, Systems Analysis and the Systems Analysis Office.” Journal of Political & Military Sociology, vol. 2, no. 1, 1974, pp. 89–104, www.jstor.org/stable/45292889.
Murray, Williamson. “Clausewitz Out, Computer In: Military Culture and Technological Hubris.” The National Interest, no. 48, 1997, pp. 57–64. JSTOR, www.jstor.org/stable/42897124.
Nijholt, Anton. Computers and Languages: Theory and Practice. Elsevier North-Holland, 1988.
Noble, Safiya Umoja. “Geographic Information Systems: A Critical Look at the Commercialization of Public Information.” Human Geography, vol. 4, no. 3, Nov. 2011, pp. 88–105. SAGE Journals.
Novak, Matt. “How the Vietnam War Brought High-Tech Border Surveillance to America.” Gizmodo, 24 Sept. 2015. paleofuture.gizmodo.com/how-the-vietnam-war-brought-high-tech-border-surveillan-1694647526.
Packer, Jeremy, and Josh Reeves. “Romancing the Drone: Military Desire and Anthropophobia from SAGE to Swarm.” Canadian Journal of Communication, vol. 38, no. 3, 2013, pp. 309–31.
Packer, Jeremy, and Peter Galison. “Abstract Materialism: Peter Galison Discusses Foucault, Kittler, and the History of Science and Technology.” International Journal of Communication, vol. 10, 2016, pp. 3160–73.
Perkins, Chris, and Martin Dodge. “Satellite Imagery and the Spectacle of Secret Spaces.” Geoforum: Journal of Physical, Human, and Regional Geosciences, vol. 40, no. 4, July 2009, pp. 546–60. ScienceDirect.
Rankin, William. After the Map: Cartography, Navigation, and the Transformation of Territory in the Twentieth Century. University of Chicago Press, 2016.
Rosenzweig, Phil. “Robert S. McNamara and the Evolution of Modern Management.” Harvard Business Review, Dec. 2010, pp. 86–93.
Ryan, Alex J. “Military Applications of Complex Systems.” Philosophy of Complex Systems, edited by Cliff Hooker, vol. 10, North-Holland, 2011, pp. 723–80. ScienceDirect.
Siegert, Bernhard. “The Map Is the Territory.” Radical Philosophy, vol. 169, September/October 2011, pp. 13–16, www.radicalphilosophy.com/article/the-map-is-the-territory.
Spires, David N., and Rick W. Sturdevant. “From Advent to Milstar: The U.S. Air Force and the Challenges of Military Satellite Communications.” Beyond the Ionosphere: Fifty Years of Satellite Communication, edited by Andrew J. Butrica, 1997, pp. 65–79, ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19970026049.pdf.
Steinitz, Carl. “Beginnings of Geodesign: A Personal Historical Perspective.” Esri, 2013, www.esri.com/about/newsroom/arcnews/beginnings-of-geodesign-a-personal-historical-perspective/.
Tambini, Anthony J. Wiring Vietnam: The Electronic Wall. Scarecrow Press, 2007.
Tobler, W. R. “Analytical Cartography.” The American Cartographer, vol. 3, no. 1, 1976, pp. 21–31. Taylor & Francis Online, doi.org/10.1559/152304076784080230.
Tobler, Waldo R. “Automation and Cartography.” Geographical Review, vol. 49, no. 4, Oct. 1959, pp. 526–34.
Ullrich, Rebecca. “Building On and Spinning Off: Sandia National Labs’ Creation of Sensors for Vietnam.” History of Science Society Meeting, 8 Nov. 1996, Atlanta, GA. Conference paper. University of North Texas Digital Library, digital.library.unt.edu/ark:/67531/metadc677915/.
United States Congress. Senate Committee on Armed Services. Investigation Into Electronic Battlefield Program: Hearings .... US Government Printing Office, 1971. HathiTrust, hdl.handle.net/2027/uc1.$b643957.
Vernon, Rebecca Rafferty. “Battlefield Contractors: Facing the Tough Issues.” Public Contract Law Journal, vol. 33, no. 2, Winter 2004, pp. 369–422. HeinOnline, www.jstor.org/stable/25755275.
Warner, Deborah Jean. “Political Geodesy: The Army, the Air Force, and the World Geodetic System of 1960.” Annals of Science, vol. 59, no. 4, 2002, pp. 363–89. Taylor & Francis Online.
Wilson, Matthew W. New Lines: Critical GIS and the Trouble of the Map. University of Minnesota Press, 2017.
Winthrop-Young, Geoffrey. “Hardware/Software/Wetware.” Critical Terms for Media Studies, edited by W. J. T. Mitchell and Mark B. N. Hansen, University of Chicago Press, 2010, pp. 186–214.