Jaime López
Data Product Developer
Centereach, NY

Epistemic vigilance against the new empiricism of Big Data is necessary

The Big Data trend represents the emergence of a new empiricism. It is advocated as the computational means to know everything about anything, free of bias, and with a high degree of certainty. It is believed that having enough data renders interpretations or prior theories unnecessary.

Big Data is imagined as a direct, uncertainty-free source of knowledge that makes traditional forms of scientific inquiry obsolete. It shifts the focus from causal explanation to correlational analysis and diminishes the relevance of context. Furthermore, this supposedly non-mediated knowledge is expected to be objective and fair.

Rieder and Simon (2017) state that this new empiricism is creating an increasingly opaque algorithmic environment, which they refer to as a "black box society." They note that Big Data collections are always small or partial, algorithms may perpetuate the prejudices of their creators, and forecasts are never certain (except in narrow and controlled environments).

They write that this aura of truth and objectivity contributes to an algorithmic culture in which data-driven practices come to seem normal and socially acceptable. In a black box society, the authors say, systems operate in opaque ways, the distinction between state and market fades, and people submit to the rule of measurable data.

"Government and corporate secrecy paired with technical inscrutability; obsolescent legal safeguards that are no match for new forms of digital feudalism; algorithmic scapegoating to avoid responsibility and curtail agency – these are the main ingredients of a thoroughly black-boxed data economy, in which opaque technologies are spreading, unmonitored and unregulated" (p. 95).

To deal with the negative effects of the Big Data transformation, Rieder and Simon call for epistemic vigilance. It is necessary to be aware of the potential dangers and pitfalls of an increasingly data-driven society. They suggest that people should have access to both the data used and the algorithms applied in Big Data systems, and develop competencies to understand Big Data analytical processes as a specific way of producing knowledge that is "neither inherently objective nor unquestionably fair" (p. 96).

All this implies legal reforms, educational measures, and technological interventions. Regarding the latter, technology should make the invisible visible and implement ethical solutions as a form of governance by design.

References:

Rieder, G. & Simon, J. (2017). Big Data: A New Empiricism and its Epistemic and Socio-Political Consequences. In W. Pietsch, J. Wernecke, & M. Ott (Eds.), Berechenbarkeit der Welt? Philosophie und Wissenschaft im Zeitalter von Big Data (pp. 85–105). Springer VS. https://doi.org/10.1007/978-3-658-12153-2_4

Jones, M. L. (2018). How We Became Instrumentalists (Again): Data Positivism since World War II. Historical Studies in the Natural Sciences, 48(5), 673–684. https://doi.org/10.1525/hsns.2018.48.5.673