Universitas Psychologica

Print version ISSN 1657-9267

Univ. Psychol. vol. 12 no. 4, Bogotá, Oct./Dec. 2013

 

On Incentives, Political and Economic Externalities, and Research Output Assessment Procedures

It has long been shown by Merton and other sociologists, philosophers, and historians of science that scientific communities can be subjected to the same kinds of analyses applied to other communities. Such analyses must attend to both internal factors (conceptual and methodological) and external ones (pressures and influences from other academic communities). The externalities created by the latter have become standalone products of the dynamics of interaction amongst communities.

Several editorials and papers in top-ranking journals have noted that research output and its communication are shaped by external funding, social responsibility, impact factors, conflicts of interest, and even social and academic networks (Editorial, 2013, 2014; Piwowar, 2013; Thelwall, Haustein, Larivière, & Sugimoto, 2013; Van Noorden, 2013). This concerns not only publication, but the generation of research as well.

A close and tangible example is the incentive systems put in place by the communities that assess research and control the allocation of its resources; these systems end up gearing researchers' behaviour towards particular quantities, qualities, locations, and types of publication, and they raise strong conflicts of interest (Editorial, 2014). Another example is how accreditation systems promoted a large increase in the number of publications, both to accommodate more output and to demonstrate an institutional commitment to research, which in turn pressured academic communities to publish. This has also produced divergent measurement systems: today there is no agreement amongst the metrics offered by Google Scholar, Scopus, and Thomson Reuters' Impact Factor (Bornmann et al., 2009; Silva, 2012).

In order to preserve some minimal output quality, reviewing procedures must be exhaustive, exogamic, and intersubjectively contrasted through blind peer review; these requirements aim to reduce the effect of certain externalities, for instance those stemming from the interests of competing communities, or from ideological, political, or even personal animosities. This is why peer reviewers should not know the authors, their institutions, or their countries of origin. The task is complicated by the fact that reviewers must have experience in the area and solid methodological training, which is not easy to find, especially in less developed communities or in groups that have created their own language for communication.

If we add economic variables, the problem gets worse. Researchers receive incentives for publishing, while reviewers, in most cases, receive nothing for reviewing. This is another example of incentives acting as externalities, akin to political, ideological, or personal interests, which editorial teams must bear in mind as part of the review process.

Academic communities still in consolidation are, unfortunately, more prone to adjusting their practices as a function of incentives or externalities, especially when they are rooted in fragile institutions. This is another reason for exogamic, double-blind reviewing systems. This ideal process will need to be supplemented by other transparency measures, but neither reviewers nor authors seem prepared for complete transparency in publication. Hopefully, this will be achieved through the self-regulation of scientific processes.

I want to point out that no one is naïve these days. Researchers know the value of incentives, but we cannot let them draw us into doing worse work instead of producing pertinent, relevant, quality work, or, even worse, into placing all responsibility on the system we so easily criticise. We alone are responsible for the consequences of the decisions we make as researchers, reviewers, and editors. Our ethical and social duty is to denounce the implications of these externalities and of researchers' behaviour.

WILSON LÓPEZ-LÓPEZ
Editor


References

Bornmann, L., Marx, W., Schier, H., Rahm, E., Thor, A., & Daniel, H.-D. (2009). Convergent validity of bibliometric Google Scholar data in the field of chemistry: Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published elsewhere, using Google Scholar, Science Citation Index, Scopus, and Chemical Abstracts. Journal of Informetrics, 3(1), 27-35. doi:10.1016/j.joi.2008.11.001

Editorial. (2013). Enemy of the good. Nature, 503, 438.

Editorial. (2014). Conflict of interest. Nature, 505, 132.

Piwowar, H. (2013). Value all research products. Nature, 493, 159.

Silva, A. L. C. (2012). El índice-H y Google Académico: una simbiosis cienciométrica inclusiva. Acimed, 23(2), 308-322.

Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PloS One, 8(5), e64841. doi:10.1371/journal.pone.0064841

Van Noorden, R. (2013). PLOS profits prompt revamp. Budget crunch hits Keeling's curves. Nature, 503, 320-321.
