Session Details for 2014 SIGCIS Workshop
"Computing the Big Picture: Situating Information Technology in Broader Historical Narratives"
9:00–10:30 am: Opening Plenary. Room A.
Introduction to Workshop and of Keynote Speaker by Andrew Russell (Stevens Institute of Technology) and Jason Gallo (Science and Technology Policy Institute)
Plenary Lecture by Prof. Jennifer Light (MIT)
10:30–11:00 am: Coffee Break and Presentation of Computer History Museum Prize
11:00 am–12:30 pm
Information Technology and the Automated Society (Traditional Papers I): Room A
Chair and Commentator: Rebecca Slayton (Cornell University)
- Paul Ceruzzi (Smithsonian National Air and Space Museum), “The SHOT/AHA Series on Historical Perspectives on Technology, Culture, and Society: What Should a Booklet on Computing and Information Technologies Contain?”
PAPER ABSTRACT: I have been asked to prepare a booklet on the place of Computing and Information Technologies in American History for the joint SHOT/AHA series “Historical Perspectives on Technology, Culture, and Society.” A number of excellent volumes have already appeared, and they have received an enthusiastic response from members of SHOT and the AHA. The prospect of covering this topic in a work of around 20,000 words (60-80 double-spaced, typed pages) is daunting. What are the “Big Questions,” and what can be left out? I borrow the term “Big Questions” from a 1970 essay by George Daniels on the History of Technology (Technology & Culture vol. 11, pp. 1-21); it complements the other logical starting place, Michael Mahoney’s 1988 essay “The History of Computing in the History of Technology” (Annals 10/2, pp. 113-126). But a lot has happened in computing since Mahoney’s essay, especially the development of the Internet, the smartphone, and social media. To the layperson (and to members of the AHA who are not familiar with this history) these are synonymous with computing. How often does the popular press mention IBM—or even Microsoft? My paper will present some preliminary ideas about how to structure such a work, and I hope also to engage members of the SHOT SIG in this endeavor.
- Arvid Nelsen (University of Minnesota), “Debates on Automation in the 20th Century: Interpreting New Sources at CBI” nels0307@umn.edu
PAPER ABSTRACT: Automation and its results sparked debate in the twentieth century (Bix, Shaiken, Noble). Little about this debate is settled, including at times even its existence. Historians of computing, and historians of technology more broadly, drawing upon available archival sources, are likely to discover little evidence of the multitude of opinions, hopes, and concerns about automation and its effects on society. Archives that collect in computing have traditionally focused their efforts on the personal papers of individuals and the corporate records of organizations that, most often, stood to benefit from the successful implementation of automation. Archives have thus typically, if unintentionally, omitted dissenting perspectives, including those of marginalized groups. An intentionally more inclusive collection policy that seeks primary-source materials created by labor unions, left-wing political organizations, and even informal social movements offers a broader representation of ideas and opinions about automation. The consideration of marginalized groups and unexamined source materials indicates that opinions about automation were far more nuanced than simply "for" or "against."
This paper reports preliminary findings from a new collection at the Charles Babbage Institute designed to add diversity to the perspectives found within its holdings: perspectives on automation and on other cultural and social issues that are usually missing from traditional corporate archival records. Recognizing that the debate over automation itself arose from longer-standing debates over industrialization, the paper begins its examination with materials published in the 1930s on Technocracy. It then examines pamphlets and tracts published in the 1950s and 1960s that articulate the varied perspectives of labor and management. Moving into the 1970s and 1980s, the paper examines the effects of automation on white-collar workers, with special attention to women. This array of materials reveals the existence not of a singular debate on automation but rather of many debates with varying concerns about the number, rate, and identities of displaced workers, the transformation in the quality of work and life, the effects of increased leisure time, and the broader environment of capitalism and the wage system.
By reporting on new and heretofore overlooked research materials, and suggesting preliminary interpretations of them, this paper points the way toward new perspectives in the cultural and social history of computing. It offers new understandings of a major social and cultural transformation that is vital to the history of computing.
- Andrew Gansky (University of Texas at Austin), “The Meaning of Life in the Automated Office” agansky@utexas.edu
PAPER ABSTRACT: This paper looks askance at technologically driven narratives of computing. Drawing on corporate records and sociological studies of office automation from the 1960s to 1980s, I illustrate that computer systems’ proven technological capabilities were relatively tangential to their careers in office environments. I instead emphasize the many social actors beyond the computing profession who mobilized the computer as a compelling metaphoric and technological instrument to describe and address various societal concerns. In particular, worker reactions to seemingly quotidian features of computer workstations, such as input devices, displays, noise and heat production, and unit size, became sites for scrutinizing gendered labor divisions, human needs for interpersonal communication, employee surveillance, and the job satisfaction of workers whose tasks were easily mechanized.
As a primary case study, I take the consulting reports of the Diebold Group, Inc., a firm that provided consultation services to large enterprises such as banks, insurance firms, and school systems contemplating computerized office automation. Although these reports do remark upon how data processing equipment made some human work redundant and fundamentally altered job tasks in other positions, they also indicate the relative failure of many computer systems to deliver on basic promises of increased efficiency and productivity or the rationalization of management practices. Instead of assigning fundamental historical agency to computers and their designers, the Diebold papers define computers as only one element of upheaval in work environments facing numerous internal and external pressures. Nevertheless, the computer frequently became a key discursive object through which work relationships were discussed and contested. Such rhetorical maneuvers show how workers repurposed computing’s ostensibly technological aspects to symbolize their perceived value within corporate organizations. These valuations could be bound up with particular identities, such as race and gender, as well as with more sweeping questions about the status of the human and the ultimate ends of the large business enterprises that increasingly delimited working people’s life chances.
I do not therefore eschew attentiveness to technological functions, but I do articulate the need to recognize that users encountered computers as devices employed for particular social ends, making the computer far more than the sum of its hardware and software. Historians attentive to these dimensions can offer new and productive perspectives on how workers used computers to make sense of changing educational requirements, skill obsolescence, gender and racial dynamics, cycles of professional and personal instability, and the greater purpose of their working lives in large organizations.
- Ekaterina Babintseva (University of Pennsylvania), “Between Life and Mechanism: The Notion of Information in Warren McCulloch's Theory” ekatb@sas.upenn.edu
PAPER ABSTRACT: This paper examines the theoretical work of neuropsychiatrist Warren McCulloch on information and discusses the significance and implications of his views for historians of computing, technology, and biology. Interested in higher nervous activity, McCulloch organized and chaired the Josiah Macy conferences, a set of academic meetings that took place in the 1940s-1950s and brought together scholars from different disciplines with one shared interest, namely, the work of the human mind. Historians often refer to this community of scientists as the “cybernetics group” to emphasize their shared interest in the analogies between the brain and computing machines and the influence of Norbert Wiener's 'Cybernetics' on their research interests.
Studying the work of the “cybernetics group” as a whole, scholars tend to generalize about the research of the Macy participants and to argue that the scientists belonging to that group mechanized thinking. Instead of approaching the “cybernetics group” as an academically homogeneous community that reduced thinking to a mechanical process and the brain to a computational machine, this paper focuses exclusively on McCulloch's theory of information, which argues against the dualism of living and non-living. According to McCulloch, information runs through the genes and neurons of living organisms and through the wires and cables of non-living objects, thus building a holistic picture of the world in which the living and the mechanical are indistinguishable. McCulloch's theory of information, with its denial of the duality of living and non-living, is reminiscent of that of the French philosopher of biology Georges Canguilhem. Canguilhem, a contemporary of McCulloch, argued against the binary opposition of mechanism and vitalism and suggested that both technology and living organisms perpetually oscillate between life and mechanism.
This paper argues that McCulloch's theory of information suggests that certain technologies, specifically computers, can be interpreted as forms of life that fall between the conventional categories of mechanism and life. Full understanding of such technologies requires the perspectives of both historians of technology and historians of biology. Drawing upon archival materials from the American Philosophical Society (Philadelphia), the published works of Warren McCulloch, and published materials from the Josiah Macy conferences, this project contributes to the existing scholarship on the history of cybernetics, the history of computing, and the history of mathematical and mechanical models in biology.
- William Vogel (University of Minnesota), “Shifting Attitudes: Women in Computing, 1965-1985” williamfvogel@gmail.com
PAPER FULL TEXT: Available for discussion during session. Images (~20 MB) are in a separate file.
PAPER ABSTRACT: This paper examines the place of women in the computing industry from 1965 to 1985, using advertisements, feature and other articles, and letters from the trade journal Datamation, along with recruitment materials from the Control Data and Burroughs Corporations. It is an early work in progress supported by a Sloan Foundation grant intended to examine the experience of women in the computing industry in these crucial decades. Though the underrepresentation of women in computing today is a subject of contemporary discourse, the historical perspective of this period of study, in which the percentage of women in the computing industry tripled, can illuminate discussions which often seem to treat the gender dimensions of contemporary computing as ahistorically timeless. This work can also help redress the relative neglect of gender within the historiography of computing.
Datamation is significant as a representative of computing industry perspectives. Detailed examination of its advertisements, letters, and articles shows that its approach to gender can be divided into four distinct periods between the 1960s and the 1980s: (a) open sexism and outright hostility toward women throughout the 1960s, (b) conflicting attitudes between 1969 and 1974, (c) conscious positive editorial attention to women within the industry between 1975 and 1981, and (d) an apparent decline in this interest by the mid-1980s.
The significance of this last period, in light of the peak and subsequent decline of the percentage of women in computing in the mid-to-late 1980s, is a noteworthy question. This industry-wide evidence from Datamation is complemented by recruiting materials from the Control Data and Burroughs Corporation archival holdings at the Charles Babbage Institute, including some aimed specifically at women, highlighting the shifting place of gender within industry attitudes in this period.
- Steve Anderson (University of California, Riverside), “The Digital Imaginary: Mainframe Computers from the Corporate Basement to the Silver Screen, 1946-1968” sande010@ucr.edu
PAPER ABSTRACT: Although rarely discussed as a crucial component of postwar consumption, mainframe computers provided the foundations of modern consumer culture. By calculating, processing, and storing information, mainframes amplified the rapid pace of consumer growth during the postwar era. My dissertation will examine the adoption and cultural impact of the digital mainframe computer from its invention in 1946 until 1968 and the early demonstrations of personal computing technology. Although largely hidden from view in corporate basements, mainframe computers, through their processing of paperwork, were intimately tied to networks of public and private infrastructure, finance, and distribution. Their social and cultural implications, moreover, were a frequent subject of mass-media commentary during the years of my study. My project thus examines a range of dimensions within the history of technology and consumer culture, showing that changing patterns of mass consumption in the postwar era were magnified by the processing power of digital mainframe computers.
- Margarita Boenig-Liptsin (Harvard University), “Making the Citizen of the Information Age: A comparative study of computer literacy programs for children, 1960s-1990s” mliptsin@fas.harvard.edu
PAPER FULL TEXT: Available for discussion during session.
PAPER ABSTRACT: My dissertation is a comparative history of the first computer literacy programs for children. The project examines how programs to introduce children to computers in the United States, France, and the Soviet Union from the 1960s to 1990s embodied political, epistemic, and moral debates about the kind of citizen required for life in the 21st century. I analyze historic archival material and personal interviews using the Science and Technology Studies (STS) method of cross-national comparison and the framework of coproduction in order to show computer literacy programs to be sites in which key epistemic and normative debates of the second half of the 20th century play out.
The designers of computer literacy programs identified computer literacy both as a set of practical skills necessary for life in the information age and as a formative practice for disciplining a citizen's mind. Debates about what it means to be, and how to become, computer literate entailed commitments about what it meant to think well and be a full human being, and they were framed in the rhetoric of citizenship. Such debates included, for instance, whether to focus on teaching typing and programming skills or algorithmic ways of thinking, and whether to use ready-made software to deliver content or let students play with an open-ended software environment. Despite wide agreement that computer literacy could be a solution to economic and educational problems facing the three countries, there was no one such thing as "computer literacy." The actors differed in the methods they would use and in their opinions about what kinds of knowledge and skills were important to foster. To ground the study I focus on a few representative pioneers from the three countries and their respective programs: Seymour Papert and Patrick Suppes in the US, Jacques Perriault and Jean-Jacques Servan-Schreiber in France, and Andrei Ershov in the Soviet Union. These individuals and their colleagues were the most influential and distinct set of actors developing computers in education in the three countries. I show the complexity and nuances of disagreement among these pioneers, especially in a time of Cold War, economic troubles, and educational upheavals. At the same time, I demonstrate how they struggled with the same fundamental question about the ideal form of the future citizen in relation to the growing role of the computer in public life.
12:30–2:00 pm: Lunch with IEEE History Committee and related SHOT SIGs
Chair: Christopher Leslie (New York University)
Commentator: Cyrus Mody (Rice University)
- Nicholas Lewis (University of Minnesota), “Computing Behind the Red Line: The HPC History Project at Los Alamos” lewi0740@umn.edu
PAPER ABSTRACT: Los Alamos National Laboratory, heir to the famed Manhattan Project lab that inaugurated the atomic era, has long been an enigma to computing historians. We know that many well-known high-performance computers—IBM’s Stretch, two dozen Control Data and Cray supercomputers, early parallel and cluster computers, and the first ‘petaflop’ supercomputer, among many others—found use at Los Alamos in the past seven decades, but very little is known about how these computers were used and what computing innovations took place at the Lab (see Annals articles by Metropolis 1982 on Los Alamos; Sweeney 1983 and Voorhees 1983 on LANL’s IBM 701; and MacKenzie 1991 on nuclear weapons labs). In the spring of 2014, LANL and CBI initiated a long-term cooperative research project to better understand LANL’s place in the world of high-performance computing and in computing writ large. This paper reports on early results from on-site research at Los Alamos this summer, including surveys of archival resources, preliminary oral history interviews with Lab staff, and other efforts to identify and gain access to historically valuable artifacts, documents, projects, and people.
This project seeks to analyze more than famous supercomputers. During the Cold War, the Lab’s pervasive culture of secrecy made it difficult for outsiders to learn of the Laboratory’s significant work in operating systems, mass storage, computer security, programming languages, networking, graphics, and other areas of computing. CBI is working closely with computer scientists in LANL’s High-Performance Computing division. Research this summer appraised the Lab’s DEMOS operating system, written for the Cray-1 supercomputer and seemingly abandoned when its development ended. Yet, far from a “dead end,” DEMOS provided a model for the Fast File System of BSD Unix and became a platform for research in distributed computing. Research this summer into networking at LANL yielded two unexpected insights: first, that security and performance concerns made the Lab wary of adopting the inherently insecure and, compared to Lab-developed protocols, less efficient networking model of TCP/IP; second, that LANL played a key behind-the-scenes role in standardizing the super-fast interconnects needed for high-performance computing, thus influencing the wider computing industry. We aim to uncover the complexities and unique features of LANL’s computing while fashioning historical narratives that engage the interest of Lab staff and advance arguments of scholarly interest.
- Chuck House (InnovScapes Institute), “The Cisco Heritage Project” housec1839@gmail.com
PAPER ABSTRACT: Ten books about Cisco exist, stimulated by its meteoric stock valuation during the dot-com boom. Most of their authors, journalists, focused on the CEO and Cisco’s M&A strategy. During the ‘networking decade’ (1980-90), forty other companies succeeded in attracting venture capital before Cisco did. No book notes that Cisco needed 18 months to sell its first router to a commercial company, and another 28 months to find the second buyer. Two founders with no business experience, fired by Stanford University for ‘borrowing’ parts, labor, and intellectual property, must have seemed a poor bet in mid-1986.
Computer historians have focused on CPU hardware manufacturers rather than companies building peripherals, software, networking, databases, or services—largely covering key people (founders, inventors, CEOs) and products (e.g. the IBM 360, the Intel microprocessor, the Apple Macintosh or iPhone). ‘Popular’ computer company histories have been written by journalists or business historians, rather than by computer historians. We are developing materials for composing a true Cisco Systems history.
Cisco matters to the information age, having produced 80%+ of the world’s enterprise routers and switches. Yet no corporate history exists of Cisco (or of the collective set of companies) describing and analyzing the rapid deployment of Internet infrastructure, clearly an epochal societal turning point. Working with the Computer History Museum and the Cisco Foundation, we’ve conducted 120 in-depth interviews of early employees and alumni, correlating them with forty books and eighty key Internet Society interviews. The interviewees, a broad and inclusive group, were selected using an InnovaScapes model for Corporate Culture—Executive, Investor, Customer, Competitor, Employee, and Community perspectives. We conclude that Cisco prevailed due to four differentiators: (1) both founders brought an Information Technology (IT) rather than a Computer Science perspective, (2) co-founder Sandy Lerner commanded a ‘customer advocacy’ perspective, (3) the assembled team embraced ‘crowd-sourcing’ development and support techniques fifteen years early, and (4) a belief in, and unique adaptation of, Corporate Social Responsibility.
- James R. Lehning (University of Utah), “Technological Innovation and Commercialization: the University of Utah Computer Science Department, 1965–1975” jim.lehning@utah.edu
PAPER ABSTRACT: Histories of computing, and especially of computer graphics, often cite the important contributions made by the Computer Science program/department at the University of Utah in the period between 1965 and 1975, the so-called “golden age” of computer graphics. These accounts are both academic and popular, and comment on “the prolific number of path-breaking discoveries made by Department of Computer Science faculty and graduate students during this period” and “the enormous, ongoing impact of Ph.D. alumni … on the field of computer graphics and computing in general.” These accounts focus on the accomplishments of individuals, such as the founder of the Department, David Evans, his collaborator Ivan Sutherland, and the students who went through the Department’s Ph.D. program, including John Warnock, Ed Catmull, Jim Clark, and others.
While recognizing the important contributions of these individuals, my paper focuses on the larger contexts within which their discoveries and inventions took place, and is part of a larger research project on technological innovation at the University of Utah in the late 20th century. (A paper I will give at another conference in September focuses on Thomas Stockham’s work on digital audio recording from 1968 to 1980.) Using materials in the University of Utah Archives, including the papers of David Evans and of University administrators, the paper proposed here will examine the development of the Computer Science program from the appointment of Evans as Director in 1965 through 1975, when a decline in federal funding ended this initial era of the Department’s growth. The paper will also examine the attempt by Evans and Sutherland to transfer the results of their research into the marketplace through the Evans & Sutherland Computer Corporation, founded in 1968. Several factors were important in this process: the ongoing and uneven transition of the University of Utah from a primarily undergraduate university into a major research university, as a relative latecomer to this transition in American higher education; the ability of E&S to find markets for its products and venture capital to support the corporation during fluctuations in the U.S. economy; and the changing political context, especially the closer scrutiny of defense expenditures that resulted from the Vietnam War, federal budget concerns, and the subsequent decline in defense funding for research in computer science.
- Michael Castelle (University of Chicago), “Making Markets Durable: Transaction Processing in Finance and Commerce” mcc@uchicago.edu
PAPER ABSTRACT: This paper makes the case for the technological formalization of the transaction—a set of explicit techniques emerging from research on shared databases in the 1970s and early 1980s—as a fundamental prerequisite and facilitator for significant transformations in the scale of finance, commerce, telecommunications, and organizations in general. While the term 'transaction' is frequently used in many discussions of economic activity, it is rarely the direct subject of examination, despite its apparent centrality; this is true whether its role be ascribed to the material mediation of money, of financial capital more generally, or of the rationalized technical activity at the core of the varied large formal organizations through which transactions most commonly pass. The modern technical representation of transaction, however, began as a practical solution to a very specific technical question: how can a computer system successfully handle simultaneous requests contending to rapidly read and write from the same large data resource?
While the 1960s and 1970s saw the emergence and adoption of specialized applications for transactions (the airline booking system SABRE, the mainframe-based transaction monitor CICS, and the hierarchical mainframe database IMS), these systems were developed in a comparatively ad hoc manner; it was only in the late 1970s that research by Jim Gray and others in IBM San Jose’s System R group helped form the set of requisite techniques—for concurrency control, and for recovery from potential failure at multiple system layers—which, by the early 1980s (and to the present day), were encompassed by the acronym 'ACID': atomicity, consistency, isolation, and durability. With the success of Tandem Computers—a Silicon Valley firm specializing in so-called “NonStop” systems (and which employed Gray post-IBM)—it became possible for banks, financial exchanges, and large commercial firms to begin to depend on the concurrency and reliability of databases with support for "on-line" transaction processing (OLTP). As Gray put it as early as 1977, "The life of many institutions is critically dependent on such systems ... when the system is down the corporation has amnesia." The 21st-century environment of large-scale, global enterprises and high-frequency, automated financial markets must then be characterized—for those scholars of organization who care to admit it—as one that is completely reliant on the high availability of distributed systems, which the formalization and implementation of transaction processing helped make possible.
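The atomicity guarantee at the core of ACID can be made concrete with a minimal sketch, which assumes nothing about the historical systems above: Python's built-in sqlite3 module with an illustrative two-account schema (all names here are hypothetical). A failed transfer rolls back completely, leaving no partial update behind, which is precisely the property Gray's "amnesia" remark presupposes.

```python
import sqlite3

# Illustrative schema: two accounts whose combined balance must stay consistent.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: either both updates apply, or neither does."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            cur = conn.execute("SELECT balance FROM accounts WHERE name = ?", (src,))
            if cur.fetchone()[0] < 0:
                raise ValueError("insufficient funds")  # triggers the rollback
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
    except ValueError:
        pass  # the failed transfer left no partial debit behind

transfer(conn, "alice", "bob", 30)   # commits
transfer(conn, "alice", "bob", 999)  # rolls back; balances unchanged
print(dict(conn.execute("SELECT name, balance FROM accounts")))
```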
- Kimon Keramidas (Bard Graduate Center), “The Interface Experience” keramidas@bgc.bard.edu
PAPER ABSTRACT: The last forty years of computing have been defined by the ascendance of the personal computer, a device that finally brought the power of computation out of laboratories and corporate technology centers and into the purview of the individual user. Those forty years have seen a blur of technological advances in both hardware and software, so quick and so dramatic in their effect on everyday life that we often forget to think about just how differently we have interacted with these machines over time. This presentation describes two aspects of an upcoming exhibition that aims to defamiliarize some of the most ubiquitous objects in the history of personal computing. The first aspect of the exhibition is the display of five core objects that represent landmarks in the material and experiential development of personal computing: the Commodore 64, Apple Macintosh Plus, Palm Pilot, Apple iPad, and Microsoft Kinect. While museum exhibitions tend to convey history through the combination of the static presentation of objects with didactic texts, computers as interfaces are defined not by their objectness but by the way in which they are used and the experience the user has with both the hardware and software of the device. As such, the five devices above will be functional and loaded with custom software designed for use by visitors to the exhibition. This setup will provide an experience of the machine through the use of native hardware (no emulation), while the custom software will play the didactic role that traditional exhibition text panels provide.
Along with the functioning digital experiences on the five core objects, a web platform will provide historical and contextual information on objects in the exhibition as well as information that supplements that which is present in the space. This information will be traversable through a constellation of nodes, with each node representing a device that is connected to other things, places, and people through temporal, thematic, and relationship connections. A final feature of the web platform will be the ability for people to contribute personal recollections about computing experiences to the constellation. These recollections, which will either be connected to specific objects or exist on their own, will add a layer of individual historical expression.
- Katherine McFadden (University of South Carolina), “Hand Sewn Computing: Women’s Hobbies, Needlework and Computer Electronics” mcfaddke@email.sc.edu
PAPER ABSTRACT: Computing electronics is a world of wire and solder, a world that society has traditionally viewed as masculine. Microcontrollers like the Lilypad Arduino, however, appeal to one of the oldest of feminine hobbies: needlework. The introduction of the original Arduino microcontroller to the market in 2005 produced a small shift in gendered participation in microcontroller hobbyism, with female-headed companies such as Adafruit encouraging women to take up computer electronics. Drawing on published interviews, sociological and material culture research, and forum posts (the twenty-first century’s version of the penny post), I will argue that historical trends in hobby adoption and the wide range of uses of the Lilypad Arduino made it a more attractive option to women than its counterparts, because of its utility in female-favored projects and the familiarity of its application. The Lilypad thus sits at an unexamined intersection of needlework and electrical engineering that is significant to the histories of both gender and computing in the United States. It also played a critical role in the entry of women into computing electronics, hobbyism, and the Maker movement more broadly.
Arduino boards did not initially do much to alter gender dynamics in computing electronics. The Arduino’s best-known contribution was that it provided open-source hardware within the reach of the computing hobbyist community. Both cheaper and easier to use than earlier microcontroller kits, the first Arduino board was created as a tool for design students and became one of the most important forces behind Maker creation due to its accessibility. Despite this broader demographic appeal, its users were predominantly male.
In contrast to this initial Arduino board, there have been significantly more female users of another Arduino, the Lilypad. While many women currently in STEM fields reject the idea of “feminizing” (which often amounts to little more than making the object in question pink instead of beige), Leah Buechley, then at MIT, took the idea further. Capitalizing on the Creative Commons license, Buechley created the Lilypad Arduino, which could be sewn into a circuit. Conductive thread, used in place of solder and wire, connects the microcontroller to LEDs and other gadgets in a wide range of projects. These projects, often based on conventionally female activities like sewing, fit within the longer historical trend of women applying female-coded skills in computing, from the basic math of WWII’s female “computers” to secretarial typing.
- Jonathan Scott Clemens (University of Minnesota), “‘The Most Blatant Testimony We Have to American Waste’: Moral Panic and Video Arcade Games, 1978–1983” cleme263@umn.edu
PAPER ABSTRACT: Video arcade games (computer games housed in coin-operated cabinets and intended for placement in public areas) were the subject of a moral panic in the United States during the late 1970s and early 1980s. As mainstream American society tried to make sense of these unfamiliar computer technologies, some asserted that they were corrupting the nation’s youth and threatening the moral fabric of society. Both arcade spaces and video arcade games were claimed to promote violence, drug use, gambling, loitering, stealing, truancy, and laziness.
Gamers, the video game industry, and the coin-op amusement industry countered that these machines were benign or even beneficial. Proponents pointed to their educational aspects, particularly in familiarizing users with computer technologies. They asserted the positive benefits of friendly competition and community building, and they drew historical parallels between the public response to video arcade games and previous moral panics over technologies such as pinball machines, movie theaters, and jukeboxes. Despite this resistance, public morality groups successfully organized campaigns that resulted in media attention and numerous cases of anti-video game legislation. Governmental bodies placed strong restrictions on video arcade games in areas such as Baltimore, Boston, and Detroit, and some smaller municipalities enacted outright bans. These actions helped put many arcade operators out of business in 1981 and 1982, diminishing the primary market for video arcade game sales and contributing to an industry crash in 1983. This paper explores both sides of the debate over the social implications of video arcade games in the United States from 1978 to 1983.
Drawing on media reports, trade journals, policy documents, and archival sources, it presents a side of the decline of video arcade games that has gone largely unexplored. It asserts that pro- and anti-video arcade game efforts both reflected and created ideas about how these technologies operated, whom they were for, and for what purposes they were used. It concludes that this milieu of ideas was a contributing factor in determining technological outcomes. These findings suggest that public responses to computer technologies can have a powerful effect on their social, economic, and developmental paths, and that the history of computing may benefit from increased attention to these dynamics.
- Michael McGovern (University of Cambridge), “Re-framing Power Relations in the Historiography of Computing: Examples from Early Medical Genetics and Calculator User Groups” mcgovern.mikey@gmail.com
PAPER ABSTRACT: In this talk, I want to discuss a couple of cases I have worked on recently in order to suggest some further ways for the historiography of computing to frame changing power relations. The first is the introduction of mainframe computing to medical genetics in the 1950s, when, in the absence of genetic codes, analyses of Mendelian traits in pedigrees were used to ‘map’ the human genome. Using computing resources to pool and calculate data from researchers around the globe strained the existing moral economy of resource sharing, as the one investigator with control over the mainframe established hegemony over fellow investigators. The other example is a group of personal programmable calculator enthusiasts called PPC, who shared programs for HP calculators through a newsletter and organized programming efforts more autonomously than HP itself could manage. The discovery of a software bug that allowed the instruction set to be expanded led to an explosion of activity, but the group’s incorporation, meant to support its activities, led to its dissolution.
In both of these cases, a technological niche involving programming expertise led to existing relations becoming strained and exhausted, with power articulated in the brokering between different sets of values. In medical genetics, it was a problem of isolated expertise; in calculator programming, a bureaucratic strategy for technological growth. The analysis of power is an area of great interest in the social sciences more generally, and I think it is important to show why the perspective offered by the history of technology and its cross-sections is unique and essential. Presently, historians of computing and information technologies are uniquely situated to help fellow scholars move beyond conventional wisdom rooted in ideologies associated with the rise of computing, and beyond accounts in which technology is treated as a stable determinant. Just as much, we are equipped to give deep technical descriptions that probe how visions of social order were grounded in emergent or existing technological configurations. We need to take seriously the capricious and mutable nature of programming power, which is both a social and a technical reality, as Chris Kelty has suggested through his notion of “recursive publics.” How are power relations affected by the possibility of readily producing actually existing technological alternatives, and is computing really different from other fields in this respect?
- Beatrice Choi (Northwestern University), “Ser Técnico: Localized Technology Transfer, Emerging Technical Actors, and the Brazilian Computer Industry” beatrice.choi@u.northwestern.edu
PAPER FULL TEXT: Available for discussion during session.
PAPER ABSTRACT: In considering technology transfer in the Global South, Brazil's recent upsurge in open-source software development raises the historical question of how disparate ideological conceptions of nationalism, market censorship, and innovation have played a role in the dissemination and adoption of what we now consider universally accepted technology: computers. Through a case study of Brazil's "indigenous" or "hybrid" computers, I build a media-historical analysis that starts with the Brazilian military's protectionist policy encouraging the local development of computers during the 1970s and 1980s. I then lead up to the current cultural, political, and technological climate of global-minded free/libre open-source software (FLOSS) to consider the various "local" valences of technology transfer. Ultimately, I contest the reductive idea of a "trickle-down" model of technological adaptation by introducing the various technical "actors," or seres técnicos, who emerged in Brazil to address needs arising from specific technological moments, and I explore larger rhetorical ideations of labor, free speech, and knowledge production.
- William Aspray (University of Texas at Austin), “How to Frame a Study of the History of IT Education and its Relation to Broadening Participation in the IT Workforce in the United States”
PAPER ABSTRACT: With support from the Sloan Foundation, I have recently embarked on a project to write a short book on the history of IT education and its relation to broadening participation in the IT workforce in the United States. The study will centrally cover the following topics: (1) the early years of computer education in America (1940s to the early 1960s); (2) the development of undergraduate programs in computer science (1960s); (3) the transition in emphasis in computer science from a mathematically oriented to an engineering-oriented discipline; (4) the role of the national funding agencies in the shaping of computer education and worker production; (5) responses to both shortages of information workers and a more interdisciplinary notion of information research; and (6) the history of interventions to broaden participation in computing. The study will include: (a) other forms of education such as associate degrees, for-profit universities, and non-degree educational offerings by universities and industry; (b) both graduate degrees, which principally feed the research community, and undergraduate degrees, which supply the large number of programmers and other IT workers employed in many different industries; and, to some degree, (c) educational history (beyond computer science and iSchools) in other information fields such as computer engineering, operations research, management information science, and library and information science.
The purpose of the SIGCIS talk is to suggest the ways in which I plan to frame the project, leaving lots of time for discussion with the audience about topics and approaches. In particular, and in keeping with the announced theme for the 2014 workshop, I will raise questions about the relations of the history of IT to two other well-established disciplines: the social, political, and economic study of the IT workforce; and the social study of gender and ethnic diversity in science, technology, engineering, and mathematics.
- Alex Campolo (New York University), “White-Collar Foragers: Ecology, Economics, and Logics of Information Visualization” amc989@nyu.edu
PAPER FULL TEXT: Available for discussion during session.
PAPER ABSTRACT: In the 1980s and 1990s, computer scientists working in the nascent field of information visualization posed a question that would define their field: how do users draw meaningful insights from abstract data? This paper describes an influential response to this question in the form of information foraging, a human-computer interaction framework, and the information visualizer, a software interface. This framework and software environment defined logics of information visualization that continue to structure representations of data, guiding diverse forms of intervention and policy. However, information foraging was neither created ex nihilo nor wholly determined by the new hardware capabilities of the period. Instead, the framework borrows concepts, including cost structures, scarcity, and rationality, from the fields of psychology, ecology, and economics. This interdisciplinary constellation of ideas formalized theories of user perception, behavior, and action within a visual economy of information.
Connecting computer history to economic history, I consider the stakes of economic and ecological models of human-computer interaction. This work draws on contemporary papers published by foundational visualization researchers, including Stuart Card, George Robertson, Jock Mackinlay, Ben Bederson, and Peter Pirolli, who, in turn, incorporated theories of perception and behavior from other disciplines. I ask how and by what criteria the framework of information foraging allowed computer scientists to formalize principles for effective visualization. I argue that information foraging and the information visualizer defined economistic user subjectivities and visual modes of knowledge and production. This research historicizes contemporary debates on the relationship of computing to economic and political arrangements, often described in abstract terms such as the knowledge economy or post-Fordism.
I plan to expand the scope of this project to include interviews with researchers who developed these models and archival research at key institutions and enterprises, including the Xerox Palo Alto Research Center, the Human-Computer Interaction Lab at the University of Maryland, and Silicon Graphics. As an early-stage doctoral student, I would especially appreciate feedback that contextualizes this project within specialized secondary literatures in the history of computing. The larger research project will address historical questions regarding the development of visual and computational models of knowledge and communication, cognitive theories of visualization, knowledge work, and materialist or hardware-oriented innovations in displays and graphics processing. Together, these frameworks, machines, and systems inaugurated a contemporary mode of knowledge and production based on visual, computational interaction with data.
- William McMillan (Concordia University), “Technical Trends in the History of Operating Systems” william.mcmillan@cuaa.edu
PAPER ABSTRACT: To complement existing scholarship in the history of computer operating systems, this paper organizes developments from the 1950s to the present day by technical trends or themes, presenting them so that students and professionals in computer science can deepen their understanding of how operating systems work and why they have the characteristics they have. The histories of particular operating systems are sampled selectively in order to illustrate the evolution of thinking in this field, featuring early—but not necessarily the earliest—appearances of features and techniques. Trends in the development of operating systems include 1) automation of, e.g., computer access, process management, and management of secondary storage; 2) memory protection to prevent processes from interfering with one another via unauthorized access to storage; 3) coordination, or mutual exclusion, to allow processes to share resources effectively (see the sketch below); 4) parallelization of process and thread execution and of input/output operations via software and hardware; 5) fortification against intentional intrusion into, and accidental access of, computer systems and data; 6) simplification of state-of-the-art techniques in order to serve lower-cost machines in wider markets; 7) miniaturization by the elimination of major components in order to deliver functionality in real-time and mobile environments; and 8) personalization and direct manipulation of the elements of user interfaces and other aspects of operating systems. Starting from the completely manual operation of the earliest computers, systems considered here include those for the first Univac computers, System/360/370, SAGE, SABRE, Multics, Unix, VMS, TOPS-20, Linux, UCSD Pascal, MS-DOS, Windows, the Macintosh OS, and operating systems for mobile platforms.
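A minimal sketch of trend 3, coordination via mutual exclusion, written in Python rather than in any of the historical systems the paper samples; the shared counter and thread counts are illustrative assumptions, not drawn from the paper.

```python
import threading

counter = 0              # shared resource that concurrent threads contend for
lock = threading.Lock()  # the mutual-exclusion primitive

def increment(n):
    global counter
    for _ in range(n):
        with lock:        # only one thread may hold the lock at a time
            counter += 1  # the read-modify-write can no longer interleave

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # always 400000 with the lock; without it, updates can be lost
```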
- Lav Varshney (University of Illinois at Urbana-Champaign), “Block Diagrams in Information Theory: Drawing Things Closed” varshney@illinois.edu
PAPER ABSTRACT: Some have argued that “what every engineer needs is a good set of limit theorems,” that is, limits to how much system design can be improved if all the ingenuity of the engineer were brought to bear on the problem. Limit theorems, however, are deduced within mathematical theories. They are properties of what Galileo called “machines in the abstract,” rather than “machines in the concrete,” characterized in structural and functional terms rather than material ones. Fields like mathematical ecology, information theory, and thermodynamics claim to provide deductive systems in which concepts such as carrying capacity, channel capacity, and Carnot efficiency yield fundamental limits to agriculture, communications, and engines. This work considers fields that seemingly give limits on engineering practice, with a focus on Shannon information theory.
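One canonical instance of such a limit theorem, stated here for concreteness (the formula is textbook Shannon theory, not quoted from the paper): for a band-limited channel with additive white Gaussian noise,

$$ C = B \log_2\!\left(1 + \frac{S}{N}\right) $$

where $C$ is the channel capacity in bits per second, $B$ the bandwidth in hertz, and $S/N$ the signal-to-noise ratio. No amount of engineering ingenuity can push reliable communication beyond rate $C$, just as no engine can exceed Carnot efficiency.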
We discuss the cognitive, social, and performative roles that block diagrams have played in the historical development of this engineering systems theory, using a methodology that takes scientific literature as a primary informant, but also uses secondary literature and archival material. The study considers both the content of scientific ideas and the social norms and structures of scientists.
Block diagrams mediate between the physical, practical world of information-system engineering and the mathematical world of information theory; they are cognitive tools that create a closed universe of discourse for deduction. Further, block diagrams serve as metonyms for problems in information theory, similar to the cognitive role played by geometry diagrams in Greek mathematics. The diagrams capture what is essential about communication.
Diagrammatic closure also informs the expansion of information theory to disciplines beyond communication engineering. Gieryn (1999) argued that social groups work to define boundaries so as to carve out monopolistic fields, expand the boundaries of disciplines, expel transgressors, and protect fields from outside control. Elements of all of these forms of boundary work are evident in information theory. Moreover, it is argued that block diagrams play a role in this cultural cartography. Information theory and information theorists are characterized by work that is within the bounds of the diagrams.
Finally, the performative role played by information-theoretic diagrams in the practice of communications engineering is discussed. The source-channel separation theorem, embodied in diagram form, shaped the architecture of communication systems in its image. The separation-based architecture also created the notion of information rate, measured in bits, which has been adopted by nearly all practitioners of communication engineering and information system design.
- Barbara Walker (University of Nevada, Reno), “Gossip, Storytelling, and the Spread of Innovation: The von Neumann and Lebedev Computer Projects in Comparison” bbwalker@unr.edu
PAPER ABSTRACT: I propose to present a section of my book in progress, “A War of Experts: Soviet and American Knowledge Networks in Cold War Competition and Collaboration,” describing one competition in the area of computing history. My presentation would compare the John von Neumann stored program computer project at the Institute for Advanced Study in the US with the somewhat similar Sergei Lebedev project near Kiev in the Soviet Union. I argue that while there was a rough parity between the two countries at this early stage of the competition, the United States already had the advantage, due to the greater density and freedom of its post-war mathematical and engineering networks, as well as to their integration at key points with commercial networks (an integration unavailable to the Soviets). While this advantage might not have been obvious at the time, it would soon manifest itself when the Soviet Union chose to reverse-engineer the IBM 360/70 rather than build on the accomplishments of its own computing community when expanding COMECON computing capabilities. My comparison focuses to no small extent on how effectively the relevant networks and their institutional formations facilitated informal conversation, in forms ranging from storytelling (such as the mathematical anecdote) to professional consulting. Informal face-to-face interaction, and the social formations that foster it, are, I argue, a critical factor in the rise of innovation and the expansion of knowledge.
Trained in Soviet and Modern European history, with a strong background in anthropology, I have worked until now primarily on Russian and Soviet 19th- and 20th-century intelligentsia circles (kruzhki) and networks. In reading the works of such scholars of Russian/Soviet history of science and technology as Loren Graham, Kendall Bailes, Slava Gerovitch, Michael Gordin, and Ksenia Tatarchenko, and through personal engagement with each of these individuals, I have begun to ask some of the same questions about mathematical and computing communities that I once did about arts, humanities, and dissident circles, as well as some new questions. Particularly influential on my current project have been such scholars of computing history as William Aspray, Atsushi Akera, Seymour Goodman, Anne Fitzpatrick, and James Cortada. Primary sources, especially as I look at myth-making on both sides of the Cold War computer competition, include the works of Boris Malinovsky and George Dyson. I have also done archival work at the Institute for Advanced Study (where I was a member in 2013-2014), the Library of Congress, and the Babbage Institute archives, and have used online oral history sources extensively.
- Gerardo Con Diaz (Yale University), “Embodied Software: Patents and Software Development, 1946-1970” gerardo.condiaz@yale.edu
PAPER ABSTRACT: In the late 1960s, several prominent attorneys and programmers started using the phrase “embodying software” in reference to a patent-drafting strategy for software inventions. This strategy consisted of claiming a computer in which a program served as the control system instead of claiming the program itself. The computer described in the patent served as an embodiment of the software, and it was the one that received patent protection in lieu of the program. Embodying software was a valuable legal maneuver; patent attorneys at hardware firms and industrial research laboratories feared that programmers who did not embody their software were unlikely to secure patent protection for their programs.
This talk studies the history of embodying software as a patent-drafting technique. Although the term “embodying software” originated in the late 1960s, the practice itself originated at Bell Laboratories during the late 1940s. In 1946, a mathematician named Richard Hamming started developing an error-correcting code—a program that would enable the computer to correct certain processing errors in order to avoid a complete stop—for a computer called the Mark V. In 1948, Bell’s patent attorneys encouraged Hamming to submit a patent application for his program, but they insisted that Bernard Holbrook, one of the laboratory’s electrical engineers, should first produce descriptions and diagrams to disclose the circuitry manipulations that Hamming’s program produced in a computer. Hamming and Holbrook submitted their patent application in 1950, and merely one year later the Patent Office issued their patent—the first one for an embodied program.
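For concreteness, a minimal sketch of the kind of program at issue: the (7,4) code Hamming published in 1950 is the canonical example of his error-correcting codes, and the Python below is plainly an anachronistic illustration rather than the relay-machine embodiment the patent disclosed. It shows how parity bits locate and correct a single flipped bit.

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits at positions 1, 2, 4.
# A parity bit at position p covers every position whose binary index has
# bit p set, so the failed parity checks sum to the position of a single error.

def encode(data):  # data: four bits, e.g. [1, 0, 1, 1]
    code = [0, 0, data[0], 0, data[1], data[2], data[3]]  # positions 1..7
    for p in (1, 2, 4):  # set each parity bit so its covered positions sum even
        code[p - 1] = sum(code[i - 1] for i in range(1, 8) if i & p) % 2
    return code

def decode(code):  # corrects up to one flipped bit, returns the four data bits
    syndrome = 0
    for p in (1, 2, 4):
        if sum(code[i - 1] for i in range(1, 8) if i & p) % 2:
            syndrome += p          # failed checks add up to the error position
    if syndrome:
        code[syndrome - 1] ^= 1    # flip the erroneous bit back
    return [code[2], code[4], code[5], code[6]]

word = encode([1, 0, 1, 1])
word[5] ^= 1                          # inject a single-bit error
assert decode(word) == [1, 0, 1, 1]   # corrected, rather than halting the machine
```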
I will argue that the histories of software patenting and embodied software are constitutive of, and inseparable from, one another. Their shared history invites us to reconsider how and why programmers and their attorneys struggled to obtain software patents during the formative years of the software industry. I propose that the events that historians and historical actors identify as the birth of software patenting in the late 1960s instead mark the software industry’s embrace of embodying software, and that hardware firms and industrial research laboratories had been employing this technique for over a decade. Drawing on patents, archival material, legal scholarship, and trade publications for the computing industry, this talk therefore showcases how the history of software-making carries with it a history of patent-drafting.