“Big data” is one of the buzzwords of 21st-century research. In the sciences, it has been the subject of a special issue of Nature; in the social sciences and humanities, the National Endowment for the Humanities has sponsored a “Digging into Data Challenge” to encourage “big data” research in these fields. Reports on the impact of big data on research have been written by everyone from the Council on Library and Information Resources to Microsoft Research. Many of these publications emphasize the unprecedented ability that ever-more-powerful computers have given us to collect and analyze massive quantities of data.
But tools for working with “big data” long predate the invention of the modern, integrated-circuit-based computer*. The Hollerith Machine, a “computer” that could rapidly tabulate information recorded on punched cards, was invented in 1888 to solve a pressing big data problem of the day: how to tabulate the decennial census data gathered from the U.S.’s rapidly growing population. This sort of punched card technology was used to process Census data for fifty years.
You can read more about how the Hollerith Machine worked in the “History” section of the U.S. Census Bureau’s website.
Not long after the Hollerith Machine was invented, academic researchers were considering how to apply this new tool and its successors to their own “big data” research problems. In our next Throwback Thursday post, we’ll look at some research from 1935 that used punched cards to analyze “big data.”
*Invented by Grinnell alumnus Robert Noyce.