Algorithmic Cultures of Knowledge
How did the computer revolutionise science? The “algorithmisation” of science was not simply a consequence of the use of computers in various disciplines, but a complex process during which new social networks between machines, humans and work routines developed.
Algorithmic Cultures of Knowledge – The Influence of Computers on the Development of Sciences
- Digital cultures of technology and knowledge
- Mathematics and Computer Science
Although it has long been a commonplace that the advent of the computer revolutionised the sciences, this process, fundamental to the history of science and technology, has so far barely been researched by historians of science. The project analyses the difference the computer made to scientific practice and to the agenda-setting of the various disciplines, and how the use of computer-based algorithmic methods changed scientific, technical and medical action.
The principal hypothesis is that (social) networks linking scientists and their objects of study with computers, software and the “computer personnel”, together with the development of formalised working routines, were of outstanding importance for the “algorithmisation” of the sciences. Initial field studies show that there was by no means a uniform “computer revolution” across all sciences and countries: contrary to the image of ubiquitous computer simulation in all fields of science established in the history and theory of science so far, there were distinct processes of change with different consequences for the individual disciplines. In addition, the “insights revolution” brought about by computer-based algorithmic methods in certain areas of science and technology seems to have led to a “qualitative reduction” in the falsifiability of scientific models and theories, opening a kind of “digital Pandora’s box”.
Mathematical Statistics versus Machine Learning – The Influence of the Computer on the Development of Two Cultures of Data Analysis
In the last decades of the 20th century, two different approaches to the analysis of complex processes established themselves in the scientific system. From the field of statistics developed so-called data analysis: statisticians assumed that their data were generated by a stochastic data model whose probability distribution had to be estimated. Research into artificial intelligence, by contrast, established the field of machine learning, whose adherents assumed that data were generated by a complex but completely unknown “mechanism”; here, future results are predicted by learning algorithms without a structured understanding of the underlying processes. This project is a study of the history of the disciplines of “statistics” and “machine learning” in the 20th century. It traces the historical development of both “cultures” of statistical modelling in their respective scientific disciplines and examines their epistemological differences.
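The contrast between the two cultures can be illustrated with a toy sketch (hypothetical data and method choices, not drawn from the project itself): ordinary least squares stands in for the stochastic-data-model culture, which posits a parametric form and estimates its parameters, while a k-nearest-neighbour predictor stands in for the algorithmic culture, which predicts directly from the data without positing a generative model.

```python
def fit_linear(xs, ys):
    """Data-modelling culture: assume y = a*x + b + noise and
    estimate the parameters a, b by ordinary least squares."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def knn_predict(xs, ys, x, k=3):
    """Algorithmic culture: predict at x from the k nearest training
    points, with no assumption about the generating mechanism."""
    nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x))[:k]
    return sum(ys[i] for i in nearest) / k

# Illustrative (made-up) observations, roughly y = 2x:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

a, b = fit_linear(xs, ys)
print(round(a, 2), round(b, 2))            # → 1.97 0.11
print(round(knn_predict(xs, ys, 3.5), 2))  # → 6.03
```

The first approach yields interpretable parameters that can be tested against the assumed model; the second yields only predictions, which is precisely the epistemological difference the project investigates.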
Calculating as Art and Science: “Scientific Computing” in German Astronomy 1870–1960
The research project examines the development of calculating in German astronomy between 1870 and 1960. “Scientific calculating” is analysed as astronomy’s third basic research method (alongside observation and theory), one that had been indispensable for the interpretation of observations and the development of theories even before the invention of modern electronic computers. In addition, the project comparatively analyses the development of “calculating” in other scientific systems and disciplines. A comprehensive monograph is scheduled to be published in 2022/23.
- Seising, Rudolf: Warren Weaver’s “Science and Complexity” revisited. In: Seising, Rudolf; Sanz, Veronica (eds.): Soft Computing in Humanities and Social Sciences (Studies in Fuzziness and Soft Computing, Vol. 273). Berlin, Heidelberg [et al.]: Springer 2012, pp. 55–87.
- Hashagen, Ulf: The Computation of Nature, Or: Does the Computer Drive Science and Technology? In: Bonizzoni, Paola; Brattka, Vasco; Löwe, Benedikt (eds.): The Nature of Computation: Logic, Algorithms, Applications. Berlin, Heidelberg: Springer 2013, pp. 263–270.
- Inthorn, Julia; Tabacchi, Marco Elio; Seising, Rudolf: Having the Final Say: Machine Support of Ethical Decisions of Doctors. In: Pontier, Matthijs; van Rysewyk, Simon (eds.): Machine Medical Ethics. Basel: Springer 2015 (Intelligent Systems, Control and Automation: Science and Engineering Vol. 74), pp. 181–206.
- Inthorn, Julia; Seising, Rudolf (eds.): Digitale Patientenversorgung. Zur Computerisierung von Diagnostik, Therapie und Pflege. Bielefeld: transcript (Medical Humanities 3) [to be published in 2021].
- Hashagen, Ulf; Seising, Rudolf (eds.): Algorithmische Wissenskulturen? Der Einfluss des Computers auf die Wissenschaftsentwicklung. Cham: Springer [to be published in 2021].
- Modellieren, Simulieren, Muster finden: Historische, anthropologische und philosophische Reflexionen; Section at the annual conference of the DGGMNT, Lübeck, 18 September 2016
Organisation: PD Dr. Ulf Hashagen, PD Dr. Rudolf Seising
- Algorithmische Wissenskulturen? Der Einfluss des Computers auf die Wissenschaftsentwicklung; Workshop 12–14 October 2017, Deutsches Museum Munich
Organisation: PD Dr. Ulf Hashagen, PD Dr. Rudolf Seising
- Wer (oder was) versorgt uns(ere) Patienten? Computerisierung von Diagnostik, Therapie und Pflege. Interdisciplinary workshop on the historical, philosophical, sociological, legal and ethical reflection on the computerisation of medicine and care, 8–10 November 2018
Organisation: Rudolf Seising, Julia Inthorn (JGU Mainz)