News Release

Computing change: Researcher traces history of the personal computer

Haigh explores how and why people have used computers

Peer-Reviewed Publication

University of Wisconsin-Milwaukee

Image: Thomas Haigh owns a suitcase-sized "portable" computer from the 1980s. His small handheld PDA (shown on screen) has 2,400 times more processing power and 12,500 times more storage than the 1980s machine.

Credit: Peter Jakubowski, UWM Photo Services

Carbon paper? Punch cards? What are those?

The Internet, personal computers, word processing and spreadsheets are so embedded in today’s society that it’s hard to remember that just 35 years ago they didn’t exist.

Thomas Haigh, assistant professor of information studies at the University of Wisconsin-Milwaukee (UWM), is among a very small number of computer experts in the world who are also historians, studying the role of technology in broader social change. These new experts are tracing how computers have changed business and society.

Researching late 20th century technology has given Haigh the opportunity to talk to many pioneers who developed both computers and the software that powers them. He conducted a series of oral history interviews for the Society for Industrial and Applied Mathematics, and has written about the history of word processing and the development of databases.

One constant Haigh has found in the “froth of change” in technology is that businesses and employees are constantly trying to figure out how to make the new gadgets and processes work for them.

“There’s this feeling that anything more than five years old is irrelevant, but one of the things I’ve found is that people are facing the same types of problems now as they did in the mid-1950s – projects using new technology are usually late and filled with bugs, the return on investment is hard to measure and computer specialists are expensive and speak an alien language.”

Specializing in the history of computers, Haigh is one of a growing number of historians tackling the story of 20th century computer technology.

“We’re a small, chummy group,” he says. His special interest group on computers, information and society within the Society for the History of Technology has only 150 members, and that includes graduate students, interested non-academics and computer history teachers as well as researchers.

After earning undergraduate and graduate degrees in computer science, the British-born Haigh came to America on a Fulbright fellowship and earned his doctorate in the history and sociology of science from the University of Pennsylvania.

While most computer histories focus on the hardware – UNIVAC and inventors tinkering in garages – Haigh also looks at the software – from word processing to spreadsheets to databases – that has changed the modern world.

Among other research projects, Haigh is currently working on a social history of the personal computer.

“Despite the shelves of books on the history of the personal computer, there has been no serious historical study of how people used their computers or why they bought them,” he says.

Early books made false promises

The first computer books appeared soon after the development of programmable computers in the late 1940s and early 1950s.

“The authors of these pieces wasted few superlatives in celebrating the unprecedented speed and power of these machines,” Haigh wrote in an article for the Business History Review, published by the Harvard Business School. “Indeed, the earliest and most influential of the books was entitled ‘Giant Brains, or Machines That Think,’ a title that more sober computer experts spent decades trying to dispel from the public imagination.”

Those science fiction-like promises didn’t reflect reality, leading to inevitable disappointment, Haigh adds. “People would ask why they spent three years building it and writing computer code and it still made mistakes.”

Haigh’s research has shown that while new computer technology has sometimes been oversold as a complete solution for all business problems, it has also often been brushed aside as too newfangled or expensive for practical use.

“Firms tried to build enormous computer systems, into which they would place information on every aspect of their operations, and from which would flow exactly the information (including models and simulations) required by each manager,” he wrote in the Business History Review.

At the same time, managers were often reluctant to invest in new technology. Charles W. Bachman, creator of IDS, the first database management system, and winner of the ACM Turing Award, the top prize in computer science, told Haigh one such story in an oral history interview.

Bachman commented on an early data management project that a department manager discontinued: “Maybe (it was) because he thought it was too risky and was going to cost him too much money.”

In the 1960s and 1970s, says Haigh, people viewed the computer as just another business machine, like an adding machine. Computers were also seen as big central processors; nobody foresaw today’s iPods and MP3 players.

Haigh says he studies the history of technology and computers for the same reason any historian researches the past. “It’s a platitude, but if we don’t understand who we are and where we’re coming from, how can we understand where we’re going? That’s true of religion, culture and Iraq, and it’s equally true of science and technology.”

###
