The National Data Center proposals between macro modelling and micro targeting

Bibliographic Details
Main Author: Benedikt Josef Neuroth
Format: Article
Language: English
Published: Taylor & Francis Group 2023-12-01
Series: Cogent Arts & Humanities
Online Access: https://www.tandfonline.com/doi/10.1080/23311983.2023.2286077
Description
Summary: As computer technology advanced during the 1960s, scientists used it to build economic models, run simulations and make predictions. The basis of any such method, however, was a solid database. A committee of the Social Science Research Council therefore developed a proposal for a Federal Data Center within the United States government. The Bureau of the Budget developed the idea further into a National Data Center proposal, provoking a well-known debate on privacy. Less is known about how the proposals originally emerged. One goal was access to microdata on individual units, which was necessary for statistical operations such as correlation or the matching of data sets. The article argues that the data center demonstrates the shift towards statistics based on micro units as the foundation for macro models. Scientists, however, faced obstacles such as rules concerning confidentiality and disclosure of data, as well as the decentralized structure of federal statistics, and their interests differed from those of politics.
ISSN: 2331-1983