Measuring the Agreement of Mathematical Peer Reviewers

Axiomathes 1-15 (forthcoming)

Abstract

We investigate the possibility of arguing for or against the philosophical position that mathematics is an _epistemic exception_ on the basis of agreement data from the mathematical peer review process, and argue that Cohen's \(\kappa\), the standard measure of inter-rater agreement, is unable to detect epistemic exceptionality from peer review data.
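
The abstract's central technical notion is Cohen's \(\kappa = (p_o - p_e)/(1 - p_e)\), where \(p_o\) is the observed proportion of items on which two raters agree and \(p_e\) is the agreement expected by chance given each rater's marginal label frequencies. As a purely illustrative sketch (the function and the referee verdicts below are invented for this purpose and do not come from the paper), here is how \(\kappa\) could be computed for two referees' accept/reject decisions in Python:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters
    labelling the same items, where p_o is the observed agreement
    and p_e the agreement expected by chance from each rater's
    marginal label frequencies."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must label the same non-empty item set")
    n = len(rater_a)

    # Observed agreement: fraction of items with identical labels.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Chance agreement: for each label, the probability that both
    # raters assign it independently, summed over all labels.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(freq_a) | set(freq_b))

    # Degenerate case: both raters always use the same single label.
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Invented referee verdicts on ten submissions, purely for illustration.
referee_1 = ["accept", "accept", "reject", "accept", "reject",
             "accept", "reject", "accept", "accept", "reject"]
referee_2 = ["accept", "reject", "reject", "accept", "reject",
             "accept", "accept", "accept", "accept", "reject"]

print(f"kappa = {cohens_kappa(referee_1, referee_2):.3f}")  # kappa = 0.565
```

A value of \(\kappa\) near 1 signals agreement well above chance and a value near 0 chance-level agreement; the paper's claim is that such values, on their own, cannot detect epistemic exceptionality from peer review data.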


Links

PhilArchive

Similar books and articles

Refinement: Measuring informativeness of ratings in the absence of a gold standard.Sheridan Grant, Marina Meilă, Elena Erosheva & Carole Lee - 2022 - British Journal of Mathematical and Statistical Psychology 75 (3):593-615.
Author's Response-Peer review: Agreement and disagreement.D. V. Cicchetti - 1996 - Behavioral and Brain Sciences 19 (3):536-536.
The ethics of peer review in bioethics.David Wendler & Franklin Miller - 2014 - Journal of Medical Ethics 40 (10):697-701.
The problem of humiliation in peer review.Debra R. Comer & Michael Schwartz - 2014 - Ethics and Education 9 (2):141-156.

Analytics

Added to PP: 2023-01-05
Downloads: 47 (#467,133)
Downloads, last 6 months: 11 (#337,502)


Author's Profile

Benedikt Löwe
University of Amsterdam
