Abstract
Taking the visual appeal of the ‘bell curve’ as an example, this paper discusses to what extent the availability of quantitative approaches (here: statistics) that comes with representational standards directly affects qualitative concepts of scientific reasoning (here: normality). Within the scope of this paper, I shall focus on the relationship between normality, as defined by the scientific enterprise, and normativity, which results from the very processes of standardisation themselves. Two hypotheses guide this analysis: (1) normality, as defined by the natural and the life sciences, must be regarded as an ontological fiction that is nonetheless epistemologically important, and (2) standardised, canonical visualisations (such as the ‘bell curve’) shape scientific thinking and reasoning to a significant degree. I restrict my analysis to the epistemological function of scientific representations of data: this means identifying key strategies for producing graphs and images in scientific practice. As a starting point, it is crucial to evaluate to what degree graphs and images can be seen as guiding scientific reasoning itself, for instance by attributing to them a certain epistemological function within a given field of research.