Abstract
This paper builds on an agreement coefficient proposed by Krippendorff (Content analysis: an introduction to its methodology, 2013) for measuring the reliability of unitizing and coding continuous phenomena, for example, texts, videos, or sound recordings. It serves four purposes: it modifies Krippendorff’s definition, which turned out not to behave as expected when applied to more than two observers, coders, or annotators; it extends this reliability measure to a family of four coefficients able to assess the reliabilities of diverse properties of unitized continua; it adds a way to obtain the confidence intervals of these coefficients as well as the probability of failing to reach targeted reliability levels; and it describes and provides access to free software that calculates all values of this family of reliability coefficients.
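The unitizing coefficients themselves operate on continua and are too involved for a short illustration, but every member of the family instantiates the same general form α = 1 − D_o/D_e (observed over expected disagreement). As a minimal sketch of that form, here is the basic Krippendorff alpha for nominal coding data, not the unitizing variants; the function name and the list-of-units data layout are assumptions for illustration, not the paper’s software.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of units; each unit is the list of values the
    observers assigned to it (missing values are simply omitted).
    """
    # Only units with at least two values contribute pairable information.
    pairable = [u for u in units if len(u) >= 2]
    n = sum(len(u) for u in pairable)  # total number of pairable values

    # Observed disagreement D_o: mismatched ordered pairs within each unit,
    # each unit weighted by 1 / (m_u - 1).
    d_o = sum(
        sum(1 for a, b in permutations(u, 2) if a != b) / (len(u) - 1)
        for u in pairable
    ) / n

    # Expected disagreement D_e: chance of a mismatch when all n values
    # are pooled and paired at random (without replacement).
    counts = Counter(v for u in pairable for v in u)
    d_e = sum(
        nc * nk for c, nc in counts.items() for k, nk in counts.items() if c != k
    ) / (n * (n - 1))

    return 1.0 if d_e == 0 else 1 - d_o / d_e
```

For example, `krippendorff_alpha_nominal([["a", "a"], ["b", "b"]])` returns 1.0 (perfect agreement), while `krippendorff_alpha_nominal([[1, 1], [1, 2]])` returns 0.0, the value alpha assigns to chance-level agreement.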
References
Cappella, J.N., Turow, J., Jamieson, K.: Call-in political talk radio: background, content, audiences, portrayal in mainstream media (Report series no. 5). University of Pennsylvania, Annenberg Public Policy Center, Philadelphia (1996)
Cohen, J.: A coefficient of agreement for nominal scales. Educ. Psychol. Meas. 20, 37–46 (1960)
Cohen, J.: Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit. Psychol. Bull. 70(4), 213–220 (1968)
von Eye, A., Mun, E.Y.: Analyzing Rater Agreement: Manifest Variable Methods. Lawrence Erlbaum Associates, Mahwah (2006)
Fleiss, J.L.: Measuring nominal scale agreement among many raters. Psychol. Bull. 76(5), 378–382 (1971)
Guetzkow, H.: Unitizing and categorizing problems in coding qualitative data. J. Clin. Psychol. 6, 47–58 (1950)
Krippendorff, K.: Content Analysis: An Introduction to Its Methodology, 3rd edn. Sage, Thousand Oaks (2013). Replacement of Section 12.4 to be introduced into its 2nd printing. http://www.asc.upenn.edu/usr/krippendorff/U-alpha.pdf. Accessed 14 May 2015
Krippendorff, K.: Agreement and information in the reliability of coding. Commun. Methods Meas. 5(2), 93–112 (2011). http://repository.upenn.edu/asc_papers/278. Accessed 14 May 2015
Krippendorff, K.: On the reliability of unitizing continuous data. In: Marsden, P.V. (ed.) Sociological Methodology, vol. 25, pp. 47–76. Blackwell, Cambridge (1995)
Krippendorff, K.: Content Analysis: An Introduction to its Methodology. Sage, Beverly Hills (1980)
Krippendorff, K.: Bivariate agreement coefficients for reliability of data, chapter 8. In: Borgatta, E.R., Bohrnstedt, G.W. (eds.) Sociological Methodology, vol. 2, pp. 139–150. Jossey-Bass, Inc., San Francisco (1970)
Mathet, Y., Widlöcher, A., Fort, K., François, C., Galibert, O., Grouin, C., Kahn, J., Rosset, S., Zweigenbaum, P.: Manual corpus annotation: giving meaning to the evaluation metrics. In: Proceedings of COLING 2012. ACL, Mumbai (2012). https://aclweb.org/anthology/C/C12/C12-2079.pdf. Accessed 10 May 2015
Scott, W.A.: Reliability of content analysis: the case of nominal scale coding. Public Opin. Quart. 19, 321–325 (1955)
Widlöcher, A., Mathet, Y.: The Glozz platform: a corpus annotation and mining tool. In: Concolato, C., Schmitz, P. (eds.) DocEng ’12: ACM Symposium on Document Engineering, Paris, Sept. 4–7, 2012, pp. 171–180. ACM (2012). http://dl.acm.org/citation.cfm?doid=2361354.2361394. Accessed 10 May 2015
Additional information
An erratum to this article can be found at http://dx.doi.org/10.1007/s11135-015-0289-7.
Cite this article
Krippendorff, K., Mathet, Y., Bouvry, S. et al. On the reliability of unitizing textual continua: Further developments. Qual Quant 50, 2347–2364 (2016). https://doi.org/10.1007/s11135-015-0266-1