Normal Accidents of Expertise

Minerva 48 (3):239-258 (2010)

Abstract

Charles Perrow used the term normal accidents to characterize a type of catastrophic failure that resulted when complex, tightly coupled production systems encountered a certain kind of anomalous event. These were events in which system failures interacted with one another in a way that could not be anticipated, and could not be easily understood and corrected. Systems for the production of expert knowledge are increasingly becoming tightly coupled. Unlike classical science, which operated with a long time horizon, many current forms of expert knowledge are directed at immediate solutions to complex problems. These are prone to breakdowns of the kind discussed by Perrow. The example of the Homestake mine experiment shows that even in modern physics complex systems can produce knowledge failures that last for decades. The concept of knowledge risk is introduced and used to characterize the risk of failure in such systems of knowledge production.

Similar books and articles

Introduction to "The Politics of Expertise".Stephen Turner - 2013 - In Stephen P. Turner (ed.), The Politics of Expertise. New York, USA: Routledge.
We should redefine scientific expertise: an extended virtue account.Duygu Uygun Tunç - 2022 - European Journal for Philosophy of Science 12 (4):1-30.
Verification of Dynamic Properties in Production Systems.前川 貴宏 & 伊之井 清孝 - 2001 - Transactions of the Japanese Society for Artificial Intelligence 16:1-10.
Banana KBS Diagnosis and Treatment.Rafiq Madhoun - 2015 - International Journal of Academic Pedagogical Research (IJAPR) 2 (7):1-11.
