Big Data, Little Data, No Data
Scholarship in the Networked World
MIT Press 2015
New Books Network, April 20, 2015 — Jasmine McNealy
Social media and digital technology now allow researchers to collect vast amounts of a variety of data quickly. This so-called "big data," and the practices surrounding its collection, is all the rage in both the media and research circles. What makes data "big" is commonly described by four v's: volume, velocity, variety, and veracity. Volume refers to the massive scale of the data that can be collected; velocity, the speed of streaming analysis. Variety refers to the different forms of data available, while veracity considers the bias and noise in the data. Although many would like to focus on these four, two other v's, validity and volatility, hold significance for big data. Validity considers the level of uncertainty in the data, asking whether it is accurate for the intended use. Volatility refers to how long the data can be stored and remain valid.
In her new book, Big Data, Little Data, No Data: Scholarship in the Networked World (MIT Press, 2015), Professor Christine L. Borgman, Presidential Chair in Information Studies at the University of California, Los Angeles, examines the infatuation with big data and its implications for scholarship. Borgman asserts that although the collection of massive amounts of data is alluring, it is best to have the correct data for the kind of research being conducted. Further, scholars must now consider the economic, technical, and policy issues related to data collection, storage, and sharing. In examining these issues, Borgman details data collection, use, storage, and sharing practices across disciplines, and analyzes what data means for different scholarly traditions.