Thales Sehn Körting
Is there an "Almost Perfect" agreement in a classification?
- Author: Various
- Narrator: Various
- Publisher: Podcast
- Duration: 0:07:52
Synopsis
I discuss the extensive use of the "Strength of Agreement" table for different Kappa values, provided by: Landis, J.R. and Koch, G.G., 1977. The measurement of observer agreement for categorical data. Biometrics, pp.159-174. According to Google Scholar, this paper has more than 53,000 citations (as of October 2019). In my opinion, this table has sometimes been used for a purpose different from that of the original paper, in which the categories "have been illustrated with an example involving only two observers" and in which, as the authors note, "these divisions are clearly arbitrary". The original paper is available at https://www.jstor.org/stable/pdf/2529310.pdf

Follow my podcast: http://anchor.fm/tkorting
Subscribe to my YouTube channel: http://youtube.com/tkorting

The intro and final sounds were recorded at my home, using an old clock that belonged to my grandmother. Thanks for listening!
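The table discussed in the episode can be sketched in code. Below is a minimal example that computes Cohen's Kappa from a two-rater confusion matrix and maps it onto the Landis & Koch (1977) labels. The band thresholds are the ones from the original paper; the function names and the example matrix are illustrative assumptions, not from the episode:

```python
def cohen_kappa(matrix):
    """Cohen's kappa from a square confusion matrix
    (rows: rater A's labels, columns: rater B's labels)."""
    n = sum(sum(row) for row in matrix)
    # observed agreement: fraction of items on the diagonal
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / n
    # chance agreement: product of the raters' marginal proportions
    p_e = sum(
        (sum(matrix[i]) / n) * (sum(row[i] for row in matrix) / n)
        for i in range(len(matrix))
    )
    return (p_o - p_e) / (1 - p_e)

def landis_koch_band(kappa):
    """Map a kappa value to the (admittedly arbitrary) Landis & Koch labels."""
    if kappa < 0.00:
        return "Poor"
    if kappa <= 0.20:
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost Perfect"

# Hypothetical two-rater, two-class example: 45 + 30 agreements out of 100.
m = [[45, 15], [10, 30]]
k = cohen_kappa(m)
print(round(k, 3), landis_koch_band(k))  # kappa ≈ 0.490, "Moderate"
```

Note that the same Kappa value can arise from very different marginal distributions, which is one reason a fixed verbal scale such as this one should be applied with caution.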