SciELO - Scientific Electronic Library Online

 
Revista Colombiana de Psiquiatría

Print version ISSN 0034-7450

Abstract

CAMPO-ARIAS, Adalberto  and  HERAZO, Edwin. Intra- and Inter-Rater Concordance. rev.colomb.psiquiatr. [online]. 2010, vol.39, n.2, pp.424-432. ISSN 0034-7450.

Introduction: Intra- and inter-rater concordance studies are important for measuring the reliability or reproducibility of evaluations (interviews or scales applied by a rater) in psychiatry. Objective: To present some principles regarding the validation process of diagnostic interviews or rater-administered scales, and regarding the handling and interpretation of the most useful statistical tests. Method: Literature review. Results: Concordance is understood as the degree of agreement or disagreement among evaluations made of the same subject, either successively by one evaluator or by two or more interviewers. This process is part of instrument validation (scale reliability), aimed at identifying possible cases or confirming the presence of a mental disorder. Inter-rater concordance refers to the case in which two or more psychiatrists interview a person independently and almost simultaneously; this makes it possible to estimate the degree of agreement, convergence, or concordance (and disagreement, divergence, or discordance) among the evaluations and the resulting diagnoses. Intra-rater concordance is the degree of agreement between diagnoses made by the same rater at different times. Cohen's kappa is used to estimate concordance, and values higher than 0.50 are generally expected. To estimate Cohen's kappa reliably, it is necessary to know in advance the expected prevalence of the mental disorder, the number of evaluations or raters, and the number of possible diagnostic categories.
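The kappa statistic mentioned above compares observed agreement between two raters with the agreement expected by chance from each rater's marginal frequencies. A minimal sketch in Python (the raters, diagnoses, and data below are hypothetical, for illustration only):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters' categorical ratings of the same subjects."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of subjects on which both raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical psychiatrists diagnosing 10 patients
# ("D" = disorder present, "N" = no disorder).
rater1 = ["D", "D", "N", "N", "D", "N", "D", "N", "N", "D"]
rater2 = ["D", "N", "N", "N", "D", "N", "D", "N", "D", "D"]
print(round(cohens_kappa(rater1, rater2), 2))  # → 0.6
```

Here the raters agree on 8 of 10 patients (observed agreement 0.8), while chance agreement from the marginals is 0.5, giving kappa = (0.8 − 0.5) / (1 − 0.5) = 0.6 — above the 0.50 threshold the abstract cites as generally expected.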

Keywords: Psychometrics; scales; reproducibility of results; validation studies; review.


 

Creative Commons License: All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.