Universitas Psychologica

Print version ISSN 1657-9267

Univ. Psychol. vol.15 no.2 Bogotá Apr./June 2016

 

Measuring knowledge production: limitations and challenges

Agencies in charge of funding science and technology, be they international, national, or institutional, always face the challenge of reporting on their research investments. This leads to the creation of indicators that can be used to characterize that research, which is why scientometrics and bibliometrics have taken on such an important role (Bookstein, 1977; Vera-Villarroel, Lillo, López-López, & Silva, 2011). It cannot be denied that it is very important to have production indicators both for projects (that is, the number of submitted, reviewed, rejected, approved, and funded projects) and for finished products, as well as for participants, amounts of funding, and derived products such as papers, book chapters, books and other communicational artifacts (leaflets, videos and the like), patents, or other technological innovation products. Sometimes, even student-training elements can be included.
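For reference, the Bookstein (1977) paper cited above deals with Lotka's law of scientific productivity, one of the oldest of these indicators. In its generalized form, the law states that the number of authors f(n) who contribute exactly n papers to a field falls off as an inverse power of n:

f(n) = C / n^a, with a ≈ 2 in Lotka's original formulation,

where C is a normalizing constant. Bookstein's contribution was to show that this family of functions is essentially the only one that remains stable under certain forms of social change, which is part of why the law has been so durable as a descriptive tool.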

Evidently, these descriptive measures do not tell the story of what happens with that production, which is why we need to become acquainted with the types of academic and social appropriation it generates. When discussing this, certain confusions appear, which I will turn to next. Academic appropriation is mainly measured through citation indicators (citations per document, citation sources, citations by knowledge area, citations over time), traditionally expressed as the Journal Impact Factor (JIF) or the SCImago Journal Rank (SJR). Additional indicators have been developed, such as the h-index or SNIP (Source Normalized Impact per Paper) (Bar-Ilan, 2008), and others are bound to be created as well. The main issue with these indicators is that they are used as marketing tools for journals, institutions, and even countries. However, we now have so many measures and information systems that we should account for all of them and assess them according to their features and limitations. For instance, the immediacy index, which measures how quickly a paper is cited, is problematic in our context because information systems delay uploads from journals not belonging to the largest publishers. It is also evident that the JIF or SJR should consider the number of papers a journal publishes every year, because a high citation count means something different for a journal publishing few articles than for one publishing many: the editorial effort of the latter is higher, and its citations must grow in proportion to its output just to hold the same indicator value. Some editors have become experts in "citation engineering", often manipulating citations and articles, and this task becomes easier with fewer articles.
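To make the mechanics of these indicators concrete, here is a minimal sketch in Python of two of the calculations discussed above: the two-year JIF and the h-index. All figures are hypothetical; real values depend on the coverage and citation windows of the underlying database.

```python
def impact_factor(citations_in_year, citable_items_prev_two_years):
    """Two-year JIF: citations received in year Y to items published in
    years Y-1 and Y-2, divided by the citable items published in those years."""
    return citations_in_year / citable_items_prev_two_years

def h_index(citation_counts):
    """h-index: the largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A journal cited 150 times for the 60 citable items it published in the
# two preceding years gets the same JIF as one cited 15 times for 6 items.
print(impact_factor(150, 60))        # 2.5
print(impact_factor(15, 6))          # 2.5
print(h_index([25, 8, 5, 4, 3, 1]))  # 4
```

The symmetry between the first two calls illustrates the point above: a small journal needs far fewer citations to reach the same indicator value, which is also what makes "citation engineering" cheaper for journals that publish little.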

On the other hand, if the scope of a journal is broader and more general, and if it serves diverse communities, increasing citations and improving its position in the indexes becomes more difficult. Even if the journal fulfills one of its missions, that of being a well-regarded channel of communication for a discipline, this diversity is reflected not only in subject matter but also in the communication practices of research subcommunities within the same discipline, which can create fluctuations in the citation dynamics of the journal.

If the communication strategies of journals change and their emphasis shifts from a high number of papers and citations per year to the individual paper as the unit of analysis, the way journal metrics are expressed should change as well, along with the relationship between authors and their audiences, since authors would take on an additional role in communicating their products. If that happens, the most consolidated research groups and the institutions with the most resources could gain an advantage in implementing these communication strategies.

Another limitation has to do with coverage. If a system includes a larger number of journals or documents in its analysis, it can better account for the influence of those contents on a community. This is the case of Google Scholar citations (Silva, 2012), since this system tracks all citations found in documents, even those in non-indexed journals (Romero-Torres, Acosta-Moreno, & Tejada-Gómez, 2013). This is especially relevant for the social sciences and humanities, where production other than journal articles is commonplace. However, Google Scholar citations are generally not normalized, and the system does not provide information on self-citations (a very questionable practice that has recently begun to be penalized), which opens up the possibility of manipulation. We should single out the work being done in this area by the EC3 research group, which has been normalizing this information in order to provide a more trustworthy picture. Scopus evidently indexes a larger number of journals than the Web of Science (WoS), but it is easy to predict that WoS will expand its databases to close this gap. The same limitations apply to measuring researchers and institutions.
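The kind of normalization this implies can be sketched as follows, assuming a citation record that lists the authors of both the citing and the cited documents. The field names here are hypothetical, since Google Scholar does not expose this breakdown directly.

```python
def split_self_citations(cited_authors, citing_records):
    """Partition citations into self-citations (at least one shared author
    between the citing and cited documents) and external citations."""
    cited = {a.lower() for a in cited_authors}
    self_cites, external = [], []
    for record in citing_records:
        citing = {a.lower() for a in record["authors"]}
        (self_cites if cited & citing else external).append(record)
    return self_cites, external

# Hypothetical citing records for one cited paper.
citing = [
    {"authors": ["Garcia, M.", "Lopez, W."]},  # shares an author: self-citation
    {"authors": ["Smith, J."]},                # external citation
]
self_cites, external = split_self_citations(["Lopez, W."], citing)
print(len(self_cites), len(external))  # 1 1
```

Reporting the two counts separately, rather than a single raw total, is the minimal step that would make such citation counts harder to manipulate.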

Evidently, these indicators say nothing about the quality of the contents of a journal or of an author's production. They only describe communication practices, and they certainly say nothing about the social appropriation of knowledge. Given the growing complexity of these dynamics, it is discouraging that so much of the criticism leveled at them is made out of ignorance or rests on incoherent ideological arguments.

Social appropriation of knowledge needs to be measured in a way that matches its complexity. SCImago, for instance, has tracked knowledge flowing into patents and patents feeding back into new knowledge. Other options are measuring the wealth generated by new knowledge or the number of jobs it creates. Some impacts are, however, more difficult to measure: impact on cultural practices, public policy, laws, or social innovation, for example. These measures depend not only on knowledge management systems but also on the availability of data. As an example, a sizable share of laws, decrees, and the like are not based on knowledge created by research systems; it is intersecting financial and political interests that shape them. Measuring influence on cultural practices is even harder, because we not only lack systematic data, but this kind of data is also difficult to generate. It is possible that studies on quality of life or subjective well-being, together with objective indicators of morbidity, life expectancy, citizen participation, or freedom of speech, might give us a clearer picture of the impact of new knowledge on the cultural practices of a community or society. Nevertheless, we are far from being able to develop indicators that can tell the story of both the academic and the social appropriation of research products. This is the direction part of the academic world should take: the social sciences, agencies, and institutions with a role in knowledge management should open up spaces for debate, proposal, and development of more complex indicators that fix many of the limitations and problems exhibited by the current ones.
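Purely as an illustration of how such subjective and objective measures could be combined (no particular method is prescribed above, and both the indicators and the values here are hypothetical), the simplest starting point is a weighted average of min-max-normalized indicators:

```python
def composite_index(indicators, weights):
    """Weighted average of min-max normalized indicators.
    `indicators` maps name -> (value, min, max); `weights` maps name -> weight."""
    total = sum(weights.values())
    score = 0.0
    for name, (value, lo, hi) in indicators.items():
        normalized = (value - lo) / (hi - lo)  # rescale to [0, 1]
        score += weights[name] * normalized
    return score / total

# Hypothetical region-level values with plausible ranges.
indicators = {
    "life_expectancy": (74.0, 50.0, 85.0),
    "subjective_wellbeing": (6.3, 0.0, 10.0),
    "citizen_participation": (0.42, 0.0, 1.0),
}
weights = {k: 1.0 for k in indicators}
print(round(composite_index(indicators, weights), 3))  # 0.579
```

Even this toy version makes the data problem visible: every indicator needs a defensible range and weight, and those choices are themselves matters for the debate called for above.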

Wilson López López
Editor


References

Bar-Ilan, J. (2008). Informetrics at the beginning of the 21st century-A review. Journal of Informetrics, 2(1), 1-52. doi:10.1016/j.joi.2007.11.001

Bookstein, A. (1977). Patterns of scientific productivity and social change: A discussion of Lotka's law and bibliometric symmetry. Journal of the American Society for Information Science (Pre-1986), 28(4), 206-210.

Clasificación integrada de Revistas Científicas [EC3]. (n.d.). Retrieved from https://ec3metrics.com/circ/

Romero-Torres, M., Acosta-Moreno, L. A., & Tejada-Gómez, M.-A. (2013). Ranking de revistas científicas en Latinoamérica mediante el índice h: estudio de caso Colombia. Revista Española de Documentación Científica, 36(1), e003. doi:10.3989/redc.2013.1.876

Scimago. (n.d.). Retrieved from http://www.scimagojr.com/

Silva, A. L. C. (2012). El índice-H y Google Académico: una simbiosis cienciométrica inclusiva. Acimed, 23(2), 308-322.

Vera-Villarroel, P., Lillo, S., López-López, W., & Silva, L. M. (2011). La producción científica en psicología latinoamericana: Un análisis de la investigación por países. Revista Latinoamericana de Psicología, 43(1), 95-104.

