Revista Latinoamericana de Psicología

Print version ISSN 0120-0534

rev.latinoam.psicol. vol.50 no.3 Bogotá July/Dec. 2018

https://doi.org/10.14349/rlp.2018.v50.n3.2 

Articles

Análisis de la productividad e impacto de las revistas de psicología colombianas entre 2000 y 2016

Productivity Analysis and Impact of Colombian Psychology Journals between 2000 and 2016

César Acevedo-Triana a

Michelle Torres a

Maria Constanza Aguilar-Bustamante b

Camilo Hurtado-Parrado c, e

Luis Manuel Silva d

Wilson López-López d *

a School of Psychology, Universidad Pedagógica y Tecnológica de Colombia, Tunja, Colombia.

b Faculty of Psychology, Universidad Santo Tomás de Aquino, Bogotá, Colombia.

c Faculty of Psychology, Fundación Universitaria Konrad Lorenz, Bogotá, Colombia.

d Faculty of Psychology, Pontificia Universidad Javeriana, Bogotá, Colombia.

e Department of Psychology, Troy University, Alabama, United States.


Resumen

La creciente producción de las revistas de psicología colombianas en las últimas décadas lleva a preguntas sobre la relevancia e impacto regional e internacional de su productividad. Considerando la falta de esfuerzos recientes para evaluar el resultado de estas revistas en diferentes fuentes de información, el presente estudio analizó la producción, colaboración e impacto de las revistas de psicología colombianas entre 2000 y 2016 utilizando información de las siguientes bases de datos: Scopus, Scielo, Redalyc y Journal Scholar Metrics. Analizamos 3915 artículos publicados en 13 revistas. Se incluyeron las revistas que estaban indexadas simultáneamente en varias de estas bases de datos durante el periodo de observación, lo que permitió realizar comparaciones. Observamos la diversificación y el crecimiento de las revistas en todos los indicadores propuestos, con diferentes grados de visibilidad y calidad según los criterios utilizados. Aunque todas las revistas en general mostraron resultados regionales e internacionales similares, algunas se destacaron en todos los indicadores, lo que a su vez invita a otras revistas a mejorar sus indicadores. Concluimos que la producción colombiana en psicología es visible a nivel regional. Además, las revistas deben aumentar algunos de los indicadores para poder compararlos con otros puntos de referencia de acceso abierto regionales e internacionales, ya que este es el modelo bajo el cual se concibieron originalmente.

Palabras Clave: Análisis cienciométricos; Revistas de Psicología; Citación e impacto

Abstract

The growing output of Colombian psychology journals over past decades leads to questions regarding the regional and international relevance and impact of their productivity. Considering the lack of recent efforts to assess the output of these journals across different information sources, the present study analyzed the production, collaboration, and impact of Colombian psychology journals between 2000 and 2016 using information from the following databases: Scopus, Scielo, Redalyc, and Journal Scholar Metrics. We analyzed 3915 articles published across 13 journals. A journal was included in the analysis if it was indexed in several of these databases, which allowed for multiple comparisons. We observed journals' diversification and growth across all the proposed indicators, with different degrees of visibility and quality depending on the criteria used. Although all the journals generally showed similar regional and international results, some excelled across indicators, which in turn challenges other journals to improve their scores. We conclude that Colombian production in psychology is visible on a regional level. Moreover, journals need to increase some of the indicators so they can be compared with other regional and international open-access benchmarks, which is the model in which they were originally conceived.

Keywords: Scientometric analysis; Journals of Psychology; Citation and impact

High-quality academic activity requires impact channels of a similarly high level. The productivity and impact of scientific production are monitored globally, as this information is a necessary input for education, funding, and public policy (Heberger, Christie, & Alkin, 2010; Vargas-Quesada & Moya-Anegón, 2007). One approach to assessing the impact and dissemination of academic output is the use of bibliometric indicators, which allow the course of a discipline or area to be analyzed across periods of time using publication patterns. These methods have been accepted in multiple areas of knowledge as valid approaches to evaluating research (Garfield, Malin, & Small, 1978; Krampen, 2008; Long, Plucker, Yu, Ding, & Kaufman, 2014; Zurita, Merigó, & Lobos-Ossandón, 2016).

Several aspects beyond simply the number of articles published can be analyzed using this bibliometric approach. For example, knowledge accumulated in a given area tends to have a close relationship with knowledge produced in other disciplines. In turn, technological developments require the integration of multiple sources. Accordingly, productivity analyses using a bibliometric approach allow areas of greater or lesser proximity to be identified, and, thus, the interdisciplinarity of knowledge (Heberger et al., 2010); furthermore, similar analyses have been conducted on cooperation between authors or types of research (Garcia, López-López, Acevedo-Triana, & Nogueira Pereira, 2017; Robayo-Castro, Rico, Hurtado-Parrado, & Ortega, 2016). This cooperation could be understood as joint efforts towards the common goal of scientific productivity (Garcia, López-López, Acevedo-Triana, & Bucher-Maluschke, 2016). One form of cooperation is publication co-authorship, which has been used to assess collaboration between researchers and contrasts with other forms of cooperation, such as joint research projects, development of regional associations, creation of academic events, and scholar exchanges (Garcia, Acevedo-Triana, & López-López, 2014). Overall, bibliometric indicators are the input for productivity analyses across multiple levels (collaboration, productivity, or internationalization), which are all of relevance for educational institutions, governments, collegiate units, and researchers.

Despite agreement on the necessity to conduct periodic productivity assessments, there is an ongoing debate about their outcomes and purpose (Butler, 2008; Hicks, 1999; Moed, 2008). The discussion goes beyond limitations related to the scope, data, or methodology implemented, and includes the possibility of comparing measurements across areas, the influence of these assessments on institutions or faculty, and whether bibliometric information indeed reflects how research is used and has an impact, without considering publications of a different nature that typically are not indexed in traditional repositories (Bar-Ilan, 2008; Davidson et al., 2014; Thelwall, Haustein, Larivière, & Sugimoto, 2013). Part of the problem is that analyses are implemented using information from heterogeneous sources and, therefore, are inherently limited in scope. The impact of Latin American publications has frequently been underestimated for multiple reasons (Alperin et al., 2015), but especially because traditional analyses only use journals indexed in large databases, and the main indicators considered are solely based on citation information. Fortunately, a promising field in information studies now assesses dissemination of knowledge not only focusing on citations across indexed journals, but also including academic social networks (e.g., altmetric.com). Results from these analyses have effectively shown an increased circulation of knowledge and, more generally, have supported the notion that this comprehensive approach is relevant, and thus should be considered in the assessment of academic productivity (Alperin, 2015).

In Psychology, the study of several fields of application has been supplemented with bibliometric information as an input for the reformulation of historical contents, the determination of optimal communication channels, and analyses of production trends and research. The outcomes of this approach have been used to guide policy on research and productivity (Allik, 2013; Mori & Nakayama, 2013; Navarrete-Cortés et al., 2010; Nederhof, Zwaan, De Bruin, & Dekker, 1989; Nederhof, 2006; Schui & Krampen, 2010; Yeung, Goto, & Leung, 2017).

Scientific productivity in Colombia has increased during the past few decades, and there is now a context in which the assessment of scientific output across different areas is increasingly needed (Alperin et al., 2015; López-López, Silva, García-Cepero, Aguilar-Bustamante & Aguado, 2010; Salazar-Acosta, Lucio-Arias, López-López, & Aguado-López, 2013). Considering its rapid growth in productivity over recent decades, psychology is an area of special relevance (López-López, García-Cepero, et al., 2010; López-López, Silva, et al., 2010). Part of this effect seems to be explained by growing interest in, and improvement of, editorial processes. The emergence of academic and editorial networks also seems to have contributed by increasing the exchange of experiences and the possibility of overcoming common difficulties (López-López et al., 2010).

Colombian psychology publications have emerged under the Open Access (OA) model and have been mostly supported by educational institutions rather than academic organizations (Alperin et al., 2015; López-López, Silva, et al., 2010; Van Noorden, 2012a). These journals were created without explicitly engaging in the debate about OA, even though this model has become the standard for Colombian psychology publications. They have been developed under the assumption that the OA model improves access to content without the intervention of commercial publishers and addresses the need to disseminate local research without imposing payment barriers (Suber, 2015). It is worth noting that these journals have very good content, which results from their double-blind peer review processes, international editorial committees, and compliance with the high standards that international databases impose for inclusion in their systems (Alperin et al., 2015). It is also evident that this model challenges the traditional paradigm of knowledge access and the role of publishers, not only in terms of indexing, but also regarding visibility. The "mega-journals" have instigated a push towards the OA movement (Aguado-López, Becerril-García, & Aguilar Bustamante, 2016; Björk, 2015) because they have developed a high-quality OA model that aims to counter the control that academic elites exert over some top-tier journals following the paid-access model. Several high-quality journals with correspondingly high citation and visibility have emerged under the OA model (e.g., PLoS One, BioMed Central, Frontiers in), inspired by the debate about increasing access to information (Laakso et al., 2011; Piwowar et al., 2018). Although OA has been useful in promoting global coverage and scope, it has recently been associated with journals of dubious quality; however, these isolated cases are not representative of the general status of the OA model, which is currently estimated to cover 30% of world production (Laakso et al., 2011; Piwowar, 2013; Piwowar et al., 2018). In Latin American countries with high productivity, such as Brazil, the journals with the best reputation and quality follow the OA model, which suggests that strengthening this type of journal is a positive effort (Neto, Willinsky, & Alperin, 2016). This has been the model followed by several journals in Colombia.

There have been previous efforts to analyze the increasing productivity of Colombian psychology journals (Guerrero & Jaraba, 2009; Morales, Jaraba-Barrios, Guerrero-Castro, & López-López, 2012; Quevedo-Blasco & López-López, 2011). These studies identified different transformation periods in recent decades. Only some journals have survived these changes to date because of their efforts to adjust their editorial, communication, collaboration and impact practices to world-level standards (Chi & Young, 2013; Koch, 1992; Leydesdorff, 2004). The observed increments in productivity, coupled with the disappearance of several journals, suggest that there has not only been an increase in the number of articles submitted and published, but also a push towards improvement in the quality of the publications. An additional aspect worth noting from these analyses is the increase in the reported collaboration indicators.

Considering several years have elapsed since the last relevant analyses were conducted (i.e., 2008), this paper provides an updated assessment for the period between 2000 and 2016. We analyzed productivity and impact of Colombian psychology journals using multiple systems of international and regional indexing (Scopus and Journal Scholar Metrics, and Redalyc and Scielo, respectively) and compared them across the period of observation. Our effort to show the evolution and success of the journals that are indexed in systems that report citation information is expected to guide the decisions of other journals that are currently working towards inclusion in these systems.

Method

Materials and Procedure

We selected 13 Colombian psychology journals (Table 2) for analyses using the following criteria: (a) the journal had been active between 2000 and 2016; (b) the contents of the journal in that timeframe were accessible; (c) the journal was covered by at least two of the following databases: Scopus, Scielo, Journal Scholar Metrics, or Redalyc; (d) a regional (Scielo) or international (Scopus or Journal Scholar Metrics) citation indicator was available for the journal. All the Colombian journals analyzed in this paper use a double-blind peer-review model. Full-text articles published during the selected period were downloaded from journals' webpages or repositories. Each paper was independently classified by one of the authors as either theoretical, empirical, or bibliometric. Book reviews, editorials and conferences were excluded from the analyses. A quality-control test was conducted by a different author, and 20% of all the papers were randomly chosen for reassessment. We obtained a 95% inter-rater agreement. The definitions of the indicators obtained and analyzed in the present study are presented in Table 1 (including equations when applicable). Both indicators and analyses followed those described by Salas et al. (2017). In terms of productivity, the unit of analysis was the paper (article). The number of authors and their associations were used to calculate cooperation and collaboration indexes. Papers with authors from different countries were counted towards each country's production. Thematic contents of the articles were not considered due to the diversity of the fields covered.
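As a minimal illustration of the quality-control step just described (a toy sketch with hypothetical paper identifiers and categories, not the study's actual data), the reported figure can be computed as the proportion of reassessed papers on which both raters assigned the same category:

    def percent_agreement(first_pass, second_pass):
        """Proportion of reassessed papers on which both raters assign the same category."""
        shared = set(first_pass) & set(second_pass)
        agreed = sum(first_pass[p] == second_pass[p] for p in shared)
        return agreed / len(shared)


    # Hypothetical ratings for the subsample chosen for reassessment
    first_pass = {"p1": "empirical", "p2": "theoretical", "p3": "bibliometric", "p4": "empirical"}
    second_pass = {"p1": "empirical", "p2": "theoretical", "p3": "empirical", "p4": "empirical"}

    print(percent_agreement(first_pass, second_pass))  # 0.75 -> 75% agreement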

Table 1 Indicators and definitions used in the study. 

The number of citations was the standard measure of a publication's impact (Buela-Casal, Medina, Viedma, Godoy, Lozano, & Torres, 2004; Buela-Casal & López, 2005). However, due to the volatile and heterogeneous nature of this measure, citation information was examined across the sources available (Lluch, 2005); i.e., citations reported in Scielo, Scopus, and Journal Scholar Metrics (http://www.journal-scholar-metrics.infoec3.es) were considered. Citation has multiple uses in addition to impact assessment; it also serves as an indicator of cohesion in academic communities via the so-called citation networks (Chi & Young, 2013). In the present study, citation was analyzed via information provided by the different databases.

Indicators solely provided by Scopus, such as the SCImago Journal Rank (SJR) and the Source Normalized Impact per Paper (SNIP), were used to compare the relevant journals.

We used total and normalized H-impact indicators (excluding self-citations) provided by Journal Scholar Metrics for journals indexed only in the regional databases Scielo or Redalyc, and not in Scopus. These indicators can also reveal details about the course of these publications and allow for comparisons with journals included in Scopus. This approach has not been implemented in previous studies, but it has the potential to provide a more standard picture of the journals reviewed here.

Results

Productivity

Table 2 presents journals' productivity across the considered timeframe (2000-2016). An increase in the number of papers published in recent years and a strengthening of their overall impact are evident in the data. A total of 3,915 papers written by 10,687 authors were examined. These articles represent 19% of all psychology papers indexed in Redalyc (20,587) and 11.5% of the Colombian papers referenced in the same database, which indicates the productivity of Colombian journals as channels for publishing psychology.

Table 2 Results of productivity of the analyzed journals. 

Note: Rev. Lat. Psic. (Revista Latinoamericana de Psicología); Avan. Psic. Lat. (Avances en Psicología Latinoamericana); Acta Col. Psic (Acta Colombiana de Psicología); Univ. Psych. (Universitas Psychologica); Suma Psic. (Suma Psicológica); Int. Jour. Psych. Res. (International Journal of Psychological Research); Rev. Col. Psic. (Revista Colombiana de Psicología); Psic. desde Car. (Psicología desde el Caribe); Diversitas (Diversitas perspectivas en Psicología); Psych: Avan de la Disc. (Psychologia: Avances de la disciplina); Rev. CES Psic. (Revista CES de Psicología); Tesis Psicol. (Tesis Psicológica); Pens. Psicol. (Pensamiento Psicológico). ICyEE (Índice de Contribución y Esfuerzo Editorial).

It is worth noting that, in general, the journals include mostly empirical papers (i.e., research reporting direct or indirect observations or experience to spread knowledge) rather than theoretical papers (i.e., non-experimental work that combines and incorporates existing theories), suggesting a preference for using journals as a channel to disseminate empirical research outcomes in psychology. Bibliometric publications (i.e., manuscripts based on empirical research that analyzes publications, research outputs, and/or researchers) also had an important presence in the journals reviewed. They generally intend to provide assessments of the production and impact of the discipline. Figure 1 shows the number of papers published per year across the period of observation (2000-2016) and per journal. These data indicate a dramatic rise in the number of publications, increasing from 60 papers in 2000 to more than 370 in 2016, more than a six-fold increase. Even though many journals were founded after 2000, or their contents were unavailable for the entire period of observation, the data indicate an increase in articles published for all journals, independently of their trajectory.

Figure 1 Number of papers published in journals in Colombia between 2000 and 2016. Panel (a) number of documents per year; panel (b) number of papers per journal. 

Figure 2 shows the geographical distribution of the reviewed papers. Eleven countries (Colombia, Spain, Mexico, Brazil, Argentina, Chile, USA, Peru, Portugal, Venezuela, and France) concentrate 95% of the production of psychology publications. To a lesser extent, some non-Hispanic countries contribute mainly in English.

Figure 2 Distribution of authors' countries of origin for the papers published in the Colombian journals during 2008-2016. Note: the red arrow indicates 95% of all items. The data per journal can be found in the supplementary material (Table S1). 

Regarding the number of papers, some journals publish special issues on distinct subjects, with specialized academic units directing and editing their contents. As we will discuss later, these issues could have a particular impact on citations by focusing on a single topic.

Finally, Redalyc offers an Index of Contribution and Editorial Effort (ICyEE), which relates the number of papers a journal publishes to the average number of articles published by journals of the same discipline. As shown in Table 2, only two journals have scores above 1.0, the average for psychology journals: Revista Latinoamericana de Psicología (1.24) and Universitas Psychologica (2.41), the latter accounting for more than twice the average number of papers and, therefore, more than twice the editorial effort. Some journals with close-to-average ICyEE scores (1.0) are Avances en Psicología Latinoamericana, Diversitas, and Acta Colombiana de Psicología. The journal with the lowest indicator is Tesis Psicológica, whose index is lower than half the average (.442). It is noteworthy that this increase and contribution to the discipline has been achieved without changes in the journals' economic models, but instead by optimizing editorial processes and economic resources; the aim has been growth in dissemination, collaboration, and impact.
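Read literally, the description above amounts to a simple ratio; the following expression is a sketch of that reading (Redalyc's exact computation may include additional weighting not reported here):

\[ \mathrm{ICyEE}_j = \frac{N_j}{\tfrac{1}{k}\sum_{i=1}^{k} N_i} \]

where \(N_j\) is the number of papers published by journal \(j\) and the denominator is the mean output of the \(k\) psychology journals covered by the database. Under this reading, Universitas Psychologica's score of 2.41 means it published roughly 2.4 times the discipline average.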

Collaboration

Data on collaboration are presented in Table 3, which indicates an overall growing trend in the related indicators across time (average number of authors, Lawani Index, and Subramanyam Index). The average number of authors per paper grew from 2.5 in 2006-2010 to 2.8 in 2011-2016, and some journals had averages above 3 authors.

Table 3 Indicators of collaboration and internationalization, segmented by periods. 

Although the average number of authors and the Lawani (LI) and Subramanyam (SI) indicators all show the evolution of collaboration, the LI calculates the weighted average of authors per article in each period, which makes it a more accurate index than the simple average of authors per article. The SI determines the proportion of papers that have two or more authors; that is, it indicates the percentage of papers written in collaboration. Accordingly, in the first years of the analyzed period (2000-2005), about 58% of the articles were written by two or more authors; between 2006 and 2010 this indicator increased to 67%, and it continued to grow between 2011 and 2016, which suggests a change in the research process. The fact that this change seems to be occurring not only in Colombia, but across the region, is a signal of a change in the culture of research and publication.
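As a minimal sketch of these two collaboration indicators, assuming the LI is taken here as the weighted average number of authors per paper and the SI as the proportion of multi-authored papers (the exact equations used in the study are those listed in Table 1), the computation from a list of author counts could look like this:

    def lawani_index(author_counts):
        """Weighted average number of authors per paper in a period.

        author_counts: number of authors of each paper published in the period
        (a hypothetical input format used only for illustration).
        """
        return sum(author_counts) / len(author_counts)


    def subramanyam_index(author_counts):
        """Proportion of papers written in collaboration (two or more authors)."""
        return sum(1 for n in author_counts if n >= 2) / len(author_counts)


    # Toy example: five papers with 1, 2, 3, 3, and 4 authors
    papers = [1, 2, 3, 3, 4]
    print(lawani_index(papers))       # 2.6 authors per paper
    print(subramanyam_index(papers))  # 0.8 -> 80% of papers are collaborative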

Finally, Redalyc's internationalization index (Table 3) shows that journals with the highest ranking in Colombia, that is, with more citations and coverage, have a higher level of internationalization. However, journals not covered by Scopus - Revista CES de Psicología and Pensamiento Psicológico - showed higher internationalization scores than other indexed journals such as Suma Psicológica or Revista Colombiana de Psicología, which could suggest that indexing is more of a reflection of administrative editorial work and does not have so much to do with impact, quality, or visibility of the journals. This finding could be related to the number of citations generated in different databases, which would suggest that indexation has an impact on regional ranking.

Although an analysis of the countries of affiliation and institutions of these collaborations was not performed, Figure 2 shows that Colombian journals are mainly a regional communication channel because the affiliation is from Ibero-American countries. In this sense, it is worth highlighting Redalyc's effort in promoting collaboration indicators both between institutions and between countries.

Impact

One of the objectives of this work was to assess the chosen publications and consider their impact. Traditionally, the number of citations and the indexes related to this measurement have helped to identify the impact that a publication may have on a given academic community. However, citation is only one form of journal impact, and more detailed analyses with more inputs may provide a broader assessment. We agree with most of the criticisms of citation information as an indicator of quality and impact and, as such, we decided to combine several citation systems to control for single-source bias. We also included indices provided by Scopus that have been developed as supplementary sources of information, namely CiteScore, SJR, and SNIP.

We pooled citation information from three main sources (see Figures 3, 4, and 5): Scopus, an international indexing database that covers seven psychology journals in Colombia (Universitas Psychologica, Revista Latinoamericana de Psicología, Acta Colombiana de Psicología, Avances en Psicología Latinoamericana, Revista Colombiana de Psicología, Suma Psicológica, and International Journal of Psychological Research); Journal Scholar Metrics, an international indicator based on data extracted from Google Scholar Metrics that compiles information beyond Scopus-indexed journals (it covers 13 Colombian journals); and the Scielo citation indices, which offer regional information and cover 6 Colombian journals.

Figure 3 Impact indicators of Colombian journals in Scopus. Panel (a) number of citations per year; panel (b) Scopus's CiteScore, which takes into account the number of citations in a year with respect to the documents published in the three immediately preceding years; panel (c) SCImago Journal Rank (SJR) of each journal, which evaluates its impact in relation to other journals; panel (d) Source Normalized Impact per Paper (SNIP) from Scopus, which measures the potential citation impact for psychology journals relative to the area in which the journal has been classified. In this case, values close to 1 mean that the impact of the journal is consistent with the development of the area to which the journal belongs. 

Figure 4 Impact indicators of Colombian journals using Journal Scholar Metrics. Panel (a) h5-index for the period 2010-2014, i.e., the largest number h such that h of the journal's papers have received at least h citations each; panel (b) H-Citation, the sum of the citations received by the papers that make up the journal's h5-index; panel (c) H-Median, the median number of citations of the papers that make up the h5-index; panel (d) quartile of the journal in the Journal Scholar Metrics classification, where journals located in the first quartiles have higher prestige in the social sciences assessment. 
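A brief sketch of the h-index calculation described in panel (a) (an illustrative toy example; Journal Scholar Metrics computes it over the journal's papers from the 2010-2014 window):

    def h_index(citations):
        """Largest h such that h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h


    # Toy example: citation counts of a journal's papers in a five-year window
    cites = [25, 8, 5, 3, 3, 1, 0]
    h = h_index(cites)
    print(h)                                     # 3
    print(sum(sorted(cites, reverse=True)[:h]))  # H-Citation of the core papers: 25 + 8 + 5 = 38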

Figure 5 Number of citations of journals indexed in Scielo. 

Figure 3a shows the annual number of citations per journal registered by Scopus. These data indicate that Revista Latinoamericana de Psicología and Universitas Psychologica clearly differentiate themselves from the other analyzed journals, especially during the last period of observation (2010-2016), in terms of the accelerated increase in the number of yearly citations, which reached nearly 400 in 2016. The fact that this increasing pattern is steeper for Universitas Psychologica than for Revista Latinoamericana de Psicología is largely due to its more recent foundation in 2007, which suggests that Universitas' growth has been faster. The behavior of the five remaining journals is very similar: they show yearly growth in the number of citations, but do not exceed 80 citations at the end of the observation period (2016). Figure 3b shows the CiteScore information for each journal across the period of observation; yearly citations are compared with 3-year blocks of published documents. These data show that the two most cited journals (Revista Latinoamericana de Psicología and Universitas Psychologica) behave differently. Revista Latinoamericana has progressively increased its CiteScore, which was close to 1.0 by the end of the observation period. This finding suggests that the journal receives about as many citations in a year as the number of documents it published in the previous three years. This would entail either a balance between citations and the number of articles published or a reduction in the number of articles published per year, since a lower number of articles affects the citation-to-article ratio; the journal's more than 50-year history also plays a role. The opposite pattern is observed in Universitas Psychologica, which has decreasing CiteScore values across the last years of the observation period. This is the result of a steady increase in articles each year, which is related to the strategy of growing to become a broader communication channel.
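Following the description in the Figure 3 caption, the CiteScore for a given year can be sketched as the ratio below (this mirrors Scopus's published definition for the period analyzed; the exact document types counted may differ):

\[ \mathrm{CiteScore}_y = \frac{C_y}{D_{y-1} + D_{y-2} + D_{y-3}} \]

where \(C_y\) is the number of citations received in year \(y\) by documents published in the three preceding years and \(D_{y-i}\) is the number of documents published in year \(y-i\). A value near 1.0, as reached by Revista Latinoamericana de Psicología, therefore means roughly one citation per recently published document.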

Data regarding the SJR index are shown in Figure 3c. Information on the last year of observation (2016) shows six journals in a similar range (between .1 and .3: Avances en Psicología Latinoamericana, Acta Colombiana de Psicología, Universitas Psychologica, Suma Psicológica, International Journal of Psychological Research, and Revista Colombiana de Psicología); only Revista Latinoamericana de Psicología is clearly on a higher level (between .4 and .5). Contrasting this finding with the previous indicators (Citation Index and CiteScore) suggests that Scopus-indexed journals have larger citation networks.

Finally, data on the potential citation impact index is shown in Figure 3d. This information shows a separation between most journals and Revista Latinoamericana, and is similar to how leading journals in the region operate (Salas, Ponce, Méndez-Bustos, Vega-Arce, Pérez, López-López, & Cárcamo-Vásquez, 2017); four are mid-level journals (Avances en Psicología Latinoamericana, Universitas Psychologica, Suma Psicológica, International Journal of Psychological Research), and Acta Colombiana de Psicología and Revista Colombiana de Psicología are low-level journals. Overall, these data on SNIP show how the Revista Latinoamericana de Psicología has grown over recent years, whereas several indicators for other journals have stayed nearly unchanged, except for growth in the number of citations.

Figure 4 shows the Journal Scholar Metrics indicators. As Figure 4a indicates, Universitas Psychologica has the highest 5-year h-index, suggesting an important position outside Scopus. The median number of citations associated with this indicator is shown in Figure 4c. As shown in Figure 4b, the International Journal of Psychological Research has the highest h-citation index, followed by Universitas Psychologica and Revista Latinoamericana de Psicología. This result suggests that the ranking of some journals can be measured outside the Scopus system. It is also striking that, setting aside these three journals, there is no difference in impact between the journals that are in Scopus and those that are not. This again suggests that indexing does not ensure a greater impact; rather, it reflects a regional placement, which in turn reflects international placement.

Data from the Journal Scholar Metrics for Social Sciences (Figure 4d) show that only Universitas Psychologica is located in a high quartile (Q2). Four other journals are located in Q3 (International Journal of Psychological Research, Acta Colombiana de Psicología, Pensamiento Psicológico, and Revista Latinoamericana de Psicología) and the eight remaining journals in Q4 (Tesis Psicológica, Revista CES de Psicología, Suma Psicológica, Psychologia: Avances de la Disciplina, Revista Colombiana de Psicología, Psicología desde el Caribe, Avances en Psicología Latinoamericana, and Diversitas). This measurement differs from the quartiles established by Scopus and Web of Science, which in turn illustrates how indicators differ depending on the source. However, the similarity between the two quartile indicators is related to the number of documents, the citations, and the contribution made by each publication to the growth of a given area. The discrepancy indicates the need to gather several measures to overcome database biases. An important correction that appears in these indicators is the exclusion of self-citations (Pérez-Acosta & Parra Alfonso, 2018), and although it could raise some misgivings, it does not represent a major change among the journals (Figures 4a and 4b).

Lastly, regional impact was evaluated through Scielo and the number of citations reported there (see Figure 5). Although Universitas Psychologica and Revista Latinoamericana de Psicología showed high citation levels, with a 100-citation difference between them in 2016, the remaining journals show very similar citation levels, both those included and those not included in Scopus. This result could reflect the journals' similar regional and international impact. The fact that this relationship is not reflected in the journals indexed in Scopus suggests that Colombian journals' visibility level in Scopus is determined by the position of Latin American journals, and there is no relative difference between those indexed in Scopus and those that are not.

The contribution and editorial effort index that Redalyc calculates supplements the regional information that has just been analyzed. This index represents the contribution of a journal to its area, and thus can be taken as an impact indicator. Several of the regional indicators suggest that journals in Colombia lag behind, but two have an above-average level, which indicates a leading role in the development of psychology at the regional level and supports the use of these indicators to generate growth strategies (Garfield, 2003).

Discussion

We analyzed the development and growth of Colombian Psychology journals between 2000 and 2016. Overall, we found increased author collaboration rates and visibility during the period of observation, together with a growing impact and positioning of these journals in Latin American and international psychology communities. The comparison between the chosen databases shows that the behavior of the reviewed journals is similar in Scopus and Scielo, but not in Google Scholar Metrics.

With respect to the documents published within the last 16 years, in addition to the observed growth, it is possible to infer an improvement in Colombian journals' editorial policies. Their presence in international databases suggests an overall implementation of better editorial practices and processes leading to publication. As previously noted, all the analyzed journals, except Revista Latinoamericana de Psicología, are open access (OA), and their publishing costs are covered by the institutions that own them. Changes in the journals' output throughout the examined period may be explained by changes in editorial practices linked to OA and/or a transition to digital publication. Another consequence of this economic model is an increase in papers written in languages other than Spanish, which can be indirectly gauged from the authors' countries of origin.

In terms of collaboration, the patterns of the authors-per-paper indicator and the scores in the Lawani and Subramanyam indices are similar to those reported in other studies; namely, collaboration tended to increase over time (Lawani, 1986; Polanco-Carrasco, Gallegos, Salas, & López-López, 2017; Salas et al., 2017; Subramanyam, 1983). Kliegl and Bates (2011) indicate that the levels of collaboration in the top psychology journals reached their peak in the 1990s. Although the present observation window began in the 2000s, there is a similar increasing trend in collaboration, which has been pointed out in other studies and is supported by previous reports on cooperation in psychology journals in Colombia (Ávila-Toscano, Marenco-Escuderos, & Madariaga Orozco, 2013; Garcia et al., 2016, 2017; Guerrero & Jaraba, 2009; López-López, de Moya Anegón, Acevedo-Triana, Garcia, & Silva, 2015). Although the data are not presented in this study, this collaboration appears to be shifting from intra-institutional to national and international partnerships, a shift that helps optimize various kinds of resources. This transition in the number of authors may, in turn, reflect policies by academic institutions that promote collaborative research and facilitate access to funding when proposals come from several research groups or institutions. In Colombia, for example, COLCIENCIAS (the state entity that provides the most funding for science and technology projects) scores proposals higher when they are formulated jointly, combining researchers, research groups, and/or participating institutions.

We reported collaboration levels that are consistent with the growth of regional productivity, which are, in turn, aligned with those based on traditional databases (Buela-Casal & López, 2005; Vera-Villarroel, López-López, Lillo, Silva, López-López, & Silva 2011). This observed collaboration is not limited to the co-authorship of papers, which has been the standard bibliometric measure. It could also be the resulting effect of the establishment of collaborative efforts with a more long-term focus (Guerrero Bote, Olmeda-Gómez, & de Moya-Anegón, 2013; López-López, de Moya Anegón, Acevedo-Triana, Garcia, & Silva, 2015).

An aspect that indirectly reflects author collaboration, and which was not considered here, is the publication of special issues, as opposed to regular issues. In a recent study, Salas et al. (2017) showed that the impact of special issues, measured by citations and time of publication, is more efficient than that of regular issues, but the number of authors per article is lower. It seems plausible that, although some Colombian journals have established the publication of special issues as part of their editorial policy, this may have had a double effect; namely, cooperation around the same topic, but a decrease in collaboration rates per article.

Psychology journals in Colombia have emerged under the OA model; thus, authors do not pay publication fees (an exception is Revista Latinoamericana de Psicología). Accordingly, the editorial effort has been greater with limited economic resources, a challenge partially solved by devising new tools aimed at decreasing response times (Neto et al., 2016; Piwowar et al., 2018). Research has demonstrated that the OA model has been successful in terms of dissemination and quality, notwithstanding opposition by large publishers who have hegemonically retained and restricted access to content. However, publishers with international influence have begun to slowly integrate OA models (Alperin & Rozemblum, 2017; Laakso et al., 2011; Mukherjee, 2009; Neto et al., 2016; Piwowar et al., 2018).

There have been substantial changes in the patterns of production, dissemination, and impact of academic publications (Gallegos, Berra, Benito, & López-López, 2014). Most are related to the way in which informal dissemination of publications is conducted today (e.g., dissertations, theses, non-academic publications, or social media; Alperin, 2015). Some studies propose that psychology, because of its nature, requires increasingly broader co-citation systems than other sectors (Chi & Young, 2013), suggesting the need for enhancements in the way impact is measured across disciplines (Krampen, 2010).

Citation in psychology depends heavily on dissemination, and its contemporary relevance is related to efforts to address the problem of replicability and reproducibility of psychological phenomena; as discussed in other forums, citation practices can be more beneficial and less dogmatic (Acevedo-Triana, López-López, & Cardenas, 2014; Begley & Ioannidis, 2015; Open Science Collaboration, 2015). The use of citation as a measure of research dissemination and of academic communities' cohesion is consistent with other measurement strategies such as cooperation (Alperin & Rozemblum, 2017). Furthermore, across academic research and professional systems, citation has been used to establish incentive systems for researchers (Gingras, Larivière, Macaluso, & Robitaille, 2008).

Worth noting is the fact that we did not analyze the so-called top citations or top articles that generate a chain of influence among researchers and determine what is known as mainstream psychology (Bornmann, Wagner, & Leydesdorff, 2017). Some of the debates on citation as an indicator of quality, dissemination, and impact can be reviewed elsewhere, and there are some criticisms that we share but that are beyond the scope of this study (Delgado López-Cózar, Robinson-García, & Torres-Salinas, 2012; Fetscherin & Heinrich, 2015; Garfield, 2007; Garfield et al., 1978; Nicolaisen, 2009; Seglen, 1997).

New efforts are currently being undertaken to increase the quality and visibility of publications via the creation of rankings that follow diverse methodologies. These efforts have resulted from acknowledging the need to review and update measurement and impact systems for publications (Aguado López et al., 2013; López-López, 2014, 2015; Zou & Peterson, 2016), and to optimize indicators that allow different types of journals to be compared using a homogeneous measurement system (Romero-Torres, Acosta-Moreno, & Tejada-Gómez, 2013). Likewise, these efforts are aimed at optimizing research resources, changing a paradigm of knowledge-possessing researchers into one where cooperation and collaborative networks determine the level and quality of research (Duffy, Jadidian, Webster, & Sandell, 2011). In this regard, Alperin and Rozemblum (2017) have pointed out the need to assess Latin American journals in their historical, teleological, and conceptual region-specific contexts as a way to potentiate technological and scientific development.

One of the explicit assumptions of this study was the recognition of the diversity of academic activity in psychology and its subareas (clinical, social, organizational, educational, or neuropsychology). Although our analysis did not intend to homogenize this activity (Krampen, 2008; Krampen, von Eye, & Schui, 2011), the bibliometric data presented here should be used to improve editorial practices that aim to better position and disseminate subdisciplines (Lluch, 2005).

Additionally, the use of Google Journal Scholar Metrics (Ayllón Millán, Ruiz-Pérez, & Delgado López-Cózar, 2013) in the present study aimed to overcome potential biases in data extraction and analyses based on Scopus and Web of Science databases, which is the traditional method (Gorraiz & Schloegl, 2008; Hernández-González, Sans-Rosell, Jové-Deltell, & Reverter-Masia, 2016; Salvador-Oliván & Agustín-Lacruz, 2015).

One possible limitation of this study is that journal inclusion was based on availability of impact indicators from at least two indexing systems. This decision was based on the notion that a lack of standardized impact indicators across all journals would have precluded comparisons across them; instead, new arbitrary categories would have been needed. Although we acknowledge that these indicators are not indispensable, avoiding their use entails the cost of assessing impact with no widely accepted indicators. Accordingly, there is a need to develop alternative approaches to assess impact via emerging academic and social networks, which have been underestimated options (Alperin, 2015).

Conclusions

Psychology journals in Colombia are a reference point for publications across Latin America in terms of dissemination and impact. Coverage in international databases does not ensure greater regional visibility; instead, it optimizes growth in terms of journal networking and co-citation. The increase in productivity has also increased the pressure on the journals' editorial teams, which has led to improvements in response times and quality. The increment in the number of documents per journal seems to be a feature of most of the journals. Analyses of journals' productivity, collaboration, and impact need to draw on different sources to control for partial information. Although some of the reviewed journals seem to have higher scores across the different indicators we analyzed, it is evident that, depending on the indicator or the database utilized, journals may change their positioning relative to the others (de Araújo & Sardinha, 2011).

References

Acevedo-Triana, C. A., López-López, W., & Cardenas, F. P. (2014). Recomendaciones en el diseño, la ejecución y la publicación de investigaciones en Psicología y ciencias del comportamiento. Revista Costarricense de Psicología, 33(2), 155-177. [ Links ]

Aguado-López, E., Becerril-García, A., & Aguilar Bustamante, M. C. (2016). Universitas Psychologica: un camino hacia la internacionalización. Universitas Psychologica, 15(2), 321-338. http://doi.org/10.11144/Javeriana.upsy15-2.upci. [ Links ]

Aguado López, E., Becerril García, A., Rogel Salazar, R., Garduño Oropeza, G., Zúñiga Roca, M. F., Babini, D., ... Melero, R. (2013). Una métrica alternativa y comprehensiva para el análisis de la actividad científica: la metodología redalyc-fractal. Cápsulas de Investigación, (2). Retrieved from http://ri.uaemex.mx/handle/123456789/242. [ Links ]

Allik, J. (2013). Personality Psychology in the First Decade of the New Millennium: A Bibliometric Portrait. European Journal of Personality, 27(1), 5-14. http://doi.org/10.1002/per.1843. [ Links ]

Alperin, J. P. (2015). Geographic variation in social media metrics: an analysis of Latin American journal articles. Aslib Journal of Information Management, 67(3), 289-304. http://doi.org/10.1108/AJIM-12-2014-0176. [ Links ]

Alperin, J. P., Babini, D., Chan, L., Gray, E., Guédon, J.-C., Joseph, H., ... Vessuri, H. (2015). Open Access in Latin America: a Paragon for the Rest of the World. The Winnower. http://doi.org/10.15200/winn.143982.27959. [ Links ]

Alperin, J. P., & Rozemblum, C. (2017). La reinterpretación de visibilidad y calidad en las nuevas políticas de evaluación de revistas científicas. Revista Interamericana de Bibliotecología, 40(3), 231-241. http://doi.org/10.17533/udea.rib.v40n3a04. [ Links ]

Ávila-Toscano, J., Marenco-Escuderos, A., & Madariaga Orozco, C. (2013). Bibliometric indicators, coauthorship networks and institutional collaboration in Colombian psychology journals. Avances En Psicología Latinoamericana, 32(1), 167-182. http://doi.org/10.12804/apl32.1.2014.12. [ Links ]

Ayllón Millán, J. M., Ruiz-Pérez, R., & Delgado López-Cózar, E. (2013). Índice H de las revistas científicas españolas según Google Scholar Metrics (2008-2012). Retrieved from http://digibug.ugr.es/handle/10481/29348#.WpQwdoPOXIU. [ Links ]

Bar-Ilan, J. (2008). Informetrics at the beginning of the 21st century-A review. Journal of Informetrics, 2(1), 1-52. http://doi.org/10.1016/j.joi.2007.11.001. [ Links ]

Begley, C. G., & Ioannidis, J. P. A. (2015). Reproducibility in Science: Improving the Standard for Basic and Preclinical Research. Circulation Research, 116(1), 116-126. http://doi.org/10.1161/CIRCRESAHA.114.303819. [ Links ]

Björk, B.-C. (2015). Have the "mega-journals" reached the limits to growth? PeerJ, 3, e981. http://doi.org/10.7717/peerj.981. [ Links ]

Bornmann, L., Wagner, C., & Leydesdorff, L. (2017). The geography of references in elite articles: What countries contribute to the archives of knowledge. In Press, 1-27. Retrieved from http://arxiv.org/abs/1709.06479. [ Links ]

Buela-Casal, G., & López, W. L. (2005). Evaluación de las revistas científicas iberoamericanas de psicología, iniciativas y estado actual. Revista Latinoamericana de Psicología, 37(1), 211-217. [ Links ]

Buela-Casal, G., Medina, A., Viedma, M. I., Godoy, V., Lozano, S., & Torres, G. (2004). Factor de impacto de tres revistas españolas de Psicología. Psicothema, 16(4), 680-688. http://doi.org/10.1157/13068849. [ Links ]

Butler, L. (2008). Using a balanced approach to bibliometrics: quantitative performance measures in the Australian Research Quality Framework. Ethics in Science and Environmental Politics, 8(1), 83-92. Retrieved from http://www.int-res.com/abstracts/esep/v8/n1/p83-92/. [ Links ]

Chi, R., & Young, J. (2013). The interdisciplinary structure of research on intercultural relations: A co-citation network analysis study. Scientometrics, 96(1), 147-171. http://doi.org/10.1007/s11192-012-0894-3. [ Links ]

Davidson, P., Newton, P., Ferguson, C., Daly, J., Elliott, D., Homer, C., ... Jackson, D. (2014). Rating and Ranking the Role of Bibliometrics and Webometrics in Nursing and Midwifery. The Scientific World Journal, 2014, 1-6. http://doi.org/10.1155/2014/135812. [ Links ]

de Araújo, C. G. S., & Sardinha, A. (2011). H-Index of the citing articles: A contribution to the evaluation of scientific production of experienced researchers. Revista Brasileira de Medicina Do Esporte, 17(5), 358-362. http://doi.org/10.1590/S1517-86922011000500013. [ Links ]

Delgado López-Cózar, E., Orduña Malea, E., Marcos Cartagena, D., Jiménez Contreras, E., & Ruiz Pérez, R. (2012). JOURNAL SCHOLAR: Una alternativa internacional, gratuita y de libre acceso para medir el impacto de las revistas de Arte, Humanidades y Ciencias Sociales (12 May 2012 No. 5). Retrieved from http://hdl.handle.net/10481/20375. [ Links ]

Delgado López-Cózar, E., Robinson-García, N., & Torres-Salinas, D. (2012). Manipular Google Scholar Citations y Google Scholar Metrics: simple, sencillo y tentador. Retrieved from http://digibug.ugr.es/handle/10481/20469#.WpQwZIPOXIU. [ Links ]

Duffy, R. D., Jadidian, A., Webster, G. D., & Sandell, K. J. (2011). The research productivity of academic psychologists: Assessment, trends, and best practice recommendations. Scientometrics, 89(1), 207-227. http://doi.org/10.1007/s11192-011-0452-4. [ Links ]

Fetscherin, M., & Heinrich, D. (2015). Consumer brand relationships research: A bibliometric citation meta-analysis. Journal of Business Research, 68(2), 380-390. http://doi.org/10.1016/j.jbusres.2014.06.010. [ Links ]

Gallegos, M., Berra, M., Benito, E., & López-López, W. (2014). Las nuevas dinámicas del conocimiento científico y su impacto en la Psicología Latinoamericana. Psicoperspectivas, 13(3), 106-117. http://doi.org/10.5027/PSICOPERSPECTIVAS-VOL13-ISSUE3-FULLTEXT-377. [ Links ]

Garcia, A., Acevedo-Triana, C. A., & López-López, W. (2014). Cooperación en las Ciencias del Comportamiento Latinoamericanas: una Investigación Documental. Terapia Psicológica, 32(2), 165-174. [ Links ]

Garcia, A., López-López, W., Acevedo-Triana, C. A., & Bucher-Maluschke, J. S. N. F. (2016). Cooperation in the Latin American behavioral sciences: Motivation, evaluation and difficulties. Suma Psicológica, 23(2), 125-132. http://doi.org/10.1016/j.sumpsi.2016.08.002. [ Links ]

Garcia, A., López-López, W., Acevedo-Triana, C., & Nogueira Pereira, F. (2017). Cooperation in Latin America: the Scientific Psychology Network. Diversitas, 13(1), 113-123. http://dx.doi.org/10.15332/s1794-9998.2017.0001.09. [ Links ]

Garfield, E. (2003). The meaning of the Impact Factor. International Journal of Clinical and Health Psychology, 3(2), 363-369. http://doi.org/10.1080/09515080020007599. [ Links ]

Garfield, E. (2007). The evolution of the science citation index. International Microbiology, 10(1), 65-69. http://doi.org/10.2436/20.1501.01.10. [ Links ]

Garfield, E., Malin, M., & Small, H. (1978). Citation Data as Science Indicators. In Y. Elkana, J. Lederberg, R. K. Merton, A. Thackray, & H. Zuckerman (Eds.), Toward a metric of science: The advent of science indicators (pp. 179-207). New York: Wiley. [ Links ]

Gingras, Y., Larivière, V., Macaluso, B., & Robitaille, J.-P. (2008). The effects of aging on researchers' publication and citation patterns. PloS One, 3(12), e4048. http://doi.org/10.1371/journal.pone.0004048. [ Links ]

Gorraiz, J., & Schloegl, C. (2008). A bibliometric analysis of pharmacology and pharmacy journals: Scopus versus Web of Science. Journal of Information Science, 34(5), 715-725. http://doi.org/10.1177/0165551507086991. [ Links ]

Guerrero Bote, V. P., Olmeda-Gómez, C., & de Moya-Anegón, F. (2013). Quantifying the benefits of international scientific collaboration. Journal of the American Society for Information Science and Technology, 64(2), 392-404. http://doi.org/10.1002/asi.22754. [ Links ]

Guerrero, J., & Jaraba, B. (2009). La producción científica de la psicología colombiana: un análisis bibliométrico de las revistas académicas, 1949-2008. Bogotá. Retrieved from http://www.ascofapsi.org.co/observatorio/documentos/2011/Estudio_Bibliometria.pdf. [ Links ]

Heberger, A. E., Christie, C. A., & Alkin, M. C. (2010). A bibliometric analysis of the academic influences of and on evaluation theorists' published works. American Journal of Evaluation, 31(1), 24-44. http://doi.org/10.1177/1098214009354120. [ Links ]

Hernández-González, V., Sans-Rosell, N., Jové-Deltell, M. C., & Reverter-Masia, J. (2016). Comparación entre Web of Science y Scopus, Estudio Bibliométrico de las Revistas de Anatomía y Morfología. International Journal of Morphology, 34(4), 1369-1377. http://doi.org/10.4067/S0717-95022016000400032. [ Links ]

Hicks, D. (1999). The difficulty of achieving full coverage of international social science literature and the bibliometric consequences. Scientometrics, 44(2), 193-215. http://doi.org/10.1007/BF02457380. [ Links ]

Kliegl, R., & Bates, D. (2011). International cooperation in psychology is on the rise. Scientometrics, 87(1), 149-158. [ Links ]

Koch, S. (1992). The Nature and Limits of Psychological Knowledge. Lessons of a century qua "science." In S. Koch (Ed.), A century of Psychology as Science (pp. 75-97). Washington: American Psychological Association. [ Links ]

Krampen, G. (2008). The evaluation of university departments and their scientists: Some general considerations with reference to exemplary bibliometric publication and citation analyses for a Department of psychology. Scientometrics, 76(1), 3-21. http://doi.org/10.1007/s11192-007-1900-z. [ Links ]

Krampen, G. (2010). Acceleration of citing behavior after the millennium? Exemplary bibliometric reference analyses for psychology journals. Scientometrics, 83(2), 507-513. http://doi.org/10.1007/s11192-009-0093-z. [ Links ]

Krampen, G., von Eye, A., & Schui, G. (2011). Forecasting trends of development of psychology from a bibliometric perspective. Scientometrics, 87(3), 687-694. http://doi.org/10.1007/s11192-011-0357-2. [ Links ]

Laakso, M., Welling, P., Bukvova, H., Nyman, L., Björk, B.-C., & Hedlund, T. (2011). The development of open access journal publishing from 1993 to 2009. PLoS One, 6(6), e20961. http://doi.org/10.1371/journal.pone.0020961. [ Links ]

Lawani, S. M. (1986). Some bibliometric correlates of quality in scientific research. Scientometrics, 9(1-2), 13-25. http://doi.org/10.1007/BF02016604. [ Links ]

Leydesdorff, L. (2004). Top-down decomposition of the Journal Citation Report of the Social Science Citation Index: Graph- and factor-analytical approaches. Scientometrics, 60(2), 159-180. http://doi.org/10.1023/B:SCIE.0000027678.31097.e0. [ Links ]

Lluch, J. O. (2005). Some considerations on the use of the impact factor of scientific journals as a tool to evaluate research in psychology. Scientometrics, 65(2), 189-197. http://doi.org/10.1007/s11192-005-0267-2. [ Links ]

Long, H., Plucker, J. A., Yu, Q., Ding, Y., & Kaufman, J. C. (2014). Research Productivity and Performance of Journals in the Creativity Sciences: A Bibliometric Analysis. Creativity Research Journal, 26(3), 353-360. http://doi.org/10.1080/10400419.2014.929425. [ Links ]

López-López, W. (2014). The measurement of scientific production: Myths and Complexities. Universitas Psychologica, 13(1). [ Links ]

López-López, W. (2015). La psicología iberoamericana: una realidad en expansión. Informacio Psicológica, 109(2), 2. [ Links ]

López-López, W., de Moya Anegón, F., Acevedo-Triana, C., Garcia, A., & Silva, L. M. (2015). Visibility and Cooperation in Iberoamerican Psychology. Psicologia: Reflexão e Crítica, 28(S), 72-81. [ Links ]

López-López, W., García-Cepero, M. C., Aguilar-Bustamante, M. C., Silva, L. M., & Aguado López, E. (2010). Panorama general de la producción académica en la psicología iberoamericana, 2005-2007. Papeles del Psicólogo, 31(3), 296-309. [ Links ]

López-López, W., Silva, L. M., García-Cepero, M. C., Aguilar-Bustamante, M. C., & Aguado, E. (2010). Panorama general de la producción académica en la psicología colombiana indexada en psicoredalyc. Acta Colombiana de Psicología, 13(2), 35-46. [ Links ]

Moed, H. F. (2008). UK Research Assessment Exercises: Informed judgments on research quality or quantity? Scientometrics, 74 (1), 153-161. http://doi.org/10.1007/s11192-008-0108-1. [ Links ]

Moed, H. F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics, 4(3), 265-277. http://doi.org/https://doi.org/10.1016/j.joi.2010.01.002. [ Links ]

Morales, Y. J. G., Jaraba-Barrios, B., Guerrero-Castro, J., & López-López, W. (2012). Entre internacionalización y consolidación de comunidades académicas locales. sobre la revista latinoamericana de psicología. Revista Colombiana de Psicologia, 21(1), 97-110. [ Links ]

Mori, H., & Nakayama, T. (2013). Academic Impact of Qualitative Studies in Healthcare: Bibliometric Analysis. PLoS One, 8(3). http://doi.org/10.1371/journal.pone.0057371. [ Links ]

Mukherjee, B. (2009). Do open-access journals in library and information science have any scholarly impact? A bibliometric study of selected open-access journals using google scholar. Journal of the American Society for Information Science and Technology, 60(3), 581-594. http://doi.org/10.1002/asi.21003. [ Links ]

Navarrete-Cortés, J., Fernández-López, J. A., López-Baena, A., Quevedo-Blasco, R., Buela-Casal, G., Navarrete-Cortes, J., ... Buela-Casal, G. (2010). Global psychology: A bibliometric analysis of web of science publications. Universitas Psychologica, 9(2), 553-567. [ Links ]

Nederhof, A. J. (2006). Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review. Scientometrics, 66(1), 81-100. http://doi.org/10.1007/s11192-006-0007-2. [ Links ]

Nederhof, A. J., Zwaan, R. A., De Bruin, R. E., & Dekker, P. J. (1989). Assessing the usefulness of bibliometric indicators for the humanities and the social and behavioural sciences: A comparative study. Scientometrics, 15(5-6), 423-435. http://doi.org/10.1007/BF02017063. [ Links ]

Neto, S. C., Willinsky, J., & Alperin, J. P. (2016). Measuring, Rating, Supporting, and Strengthening Open Access Scholarly Publishing in Brazil. Education Policy Analysis Archives, 24(54), 1-25. http://doi.org/http://dx.doi.org/10.14507/epaa.24.2391. [ Links ]

Nicolaisen, J. (2009). Bibliometrics and Citation Analysis: From the Science Citation Index to Cybermetrics. Journal of the American Society for Information Science and Technology. http://doi.org/10.1002/asi.21181. [ Links ]

Open Science Collaboration. (2015). Estimating the reproducibili-ty of psychological science. Science, 349(6251), 4716-1-4716-8. http://doi.org/10.1126/science.aac4716. [ Links ]

Pérez-Acosta, A., & Parra Alfonso, G. (2018). Autoindexación o autoindización de autores: definición e impacto en la difusión del conocimiento publicado. Revista CES Psicología, 11(1), 1-4. [ Links ]

Piwowar, H. (2013). Value all research products. Nature, 493, 159. [ Links ]

Piwowar, H., Priem, J., Larivière, V., Alperin, J. P., Matthias, L., Norlander, B., ... Haustein, S. (2018). The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. Peer J, 6, e4375. http://doi.org/10.7717/peerj.4375. [ Links ]

Polanco-Carrasco, R., Gallegos, M., Salas, G., & López- López, W. (2017). Las revistas de psicología en Chile: historia y situación actual. Terapia Psicológica, 35(1), 81-93. http://doi.org/10.4067/S0718-48082017000100008. [ Links ]

Quevedo-Blasco, R., López-López, W., & Quevedo-blasco, R. & López-López, W. (2011). Situación de las revistas iberoamericanas de Psicología en el Journal Citation Reports de 2010. Universitas Psychologica, 10(3), 937-947. [ Links ]

Robayo-Castro, B., Rico, J. L., Hurtado-Parrado, C., & Ortega, L. A. (2016). Impacto y calidad de la productividad académica de los investigadores en Colombia en neurociencia comportamental utilizando modelos animales. Universitas Psychologica, 15(5). http://doi.org/10.11144/Javeriana.upsy15-5.icpa. [ Links ]

Romero-Torres, M., Acosta-Moreno, L. A., & Tejada-Gómez, M.-A. (2013). Ranking de revistas científicas en Latinoamérica mediante el índice h: estudio de caso Colombia. Revista Española de Documentación Científica, 36(1), e003. http://doi.org/10.3989/redc.2013.1.876. [ Links ]

Sala, F. G., Lluch, J. O., Gil, F. T., & Ortega, M. P. (2017). Characteristics of monographic special issues in Ibero-American psychology journals: visibility and relevance for authors and publishers. Scientometrics, 112(2), 1069-1077. http://doi.org/10.1007/s11192-017-2372-4. [ Links ]

Salas, G., Ponce, F. P., Méndez-Bustos, P., Vega-Arce, M., Pérez, M. de los Á., López-López, W., & Cárcamo-Vásquez, H. (2017). 25 Años de Psykhe: Un Análisis Bibliométrico. Psykhe, 26(1), 1-17. http://doi.org/10.7764/psykhe.26.L1205. [ Links ]

Salazar-Acosta, M., Lucio-Arias, D., López-López, W., & Agua-do-López, E. (2013). Informe sobre la producción científica de Colombia en revistas iberoamericanas de acceso abierto en redalyc.org, 2005 - 2011 (1st ed.). Toluca: Universidad Autónoma del Estado de México. [ Links ]

Salvador-Oliván, J. A., & Agustín-Lacruz, C. (2015). Correlation between bibliometric indicators in web of science y scopus journals. Revista General de Información y Documentación, 25(2), 341-359. http://doi.org/10.5209/rev_RGID.2015.v25.n2.51241. [ Links ]

Schui, G., & Krampen, G. (2010). Thirty years of International Journal of Behavioral Development: Scope, internation-ality, and impact since its inception. International Journal of Behavioral Development, 34(4), 289-291. http://doi.org/10.1177/0165025409344828. [ Links ]

SCImago Research Group. (2007). Description of SCImago Journal Rank indicator. Granada, España. Retrieved from http://www.scimagojr.com/SCImagoJournalRank.pdf. [ Links ]

Seglen, P. O. (1997). Citations and journal impact factors: questionable indicators of research quality. Allergy, 52(11), 10501056. http://doi.org/10.1111/j.1398-9995.1997.tb00175.x. [ Links ]

Sistema de Información Científica Redalyc. (2018). Colecciones de revistas en Psicología de la Red de Revistas Científicas de América Latina (Redalyc). Retrieved from http://www.redalyc.org/pais.oa?id=30&tipo=coleccion. [ Links ]

Suber, P. (2015). Acceso Abierto. Universidad Autónoma del Estado de México. Retrieved from http://ri.uaemex.mx/handle/123456789/21710. [ Links ]

Subramanyam, K. (1983). Bibliometric studies of research collaboration: A review. Journal of Information Science, 6(1), 33-38. http://doi.org/10.1177/016555158300600105. [ Links ]

Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PloS One, 8(5), e64841. http://doi.org/10.1371/journal.pone.0064841. [ Links ]

Van Noorden, R. (2012a). Europe joins UK open-access bid. Nature, 487(7407), 285. http://doi.org/10.1038/487285a. [ Links ]

Van Noorden, R. (2012b). Journal offers flat fee for "all you can publish." Nature, 486, 166. [ Links ]

Vargas-Quesada, B., & Moya-Anegón, F. De. (2007). Visualizing the Structure of Science. Visualizing the Structure of Science. http://doi.org/10.1007/3-540-69728-4. [ Links ]

Vera-Villarroel, P., López-López, W., Lillo, S., Silva, L. M., López-López, W., & Silva, L. M. (2011). La producción científica en psicología latinoamericana : Un análisis de la investigación por países. Revista Latinoamericana de Psicologia, 43(1), 95-104. [ Links ]

Yeung, A. W. K., Goto, T. K., & Leung, W. K. (2017). The changing landscape of neuroscience research, 2006-2015: A bibliometric study. Frontiers in Neuroscience, 11(MAR). http://doi.org/10.3389/fnins.2017.00120. [ Links ]

Zou, C., & Peterson, J. B. (2016). Quantifying the scientific output of new researchers using the zp-index. Scientometrics, 106(3), 901-916. http://doi.org/10.1007/s11192-015-1807-z. [ Links ]

Zurita, G., Merigó, J. M., & Lobos-Ossandón, V. (2016). A biblio-metric analysis of journals in educational research. In Lecture Notes in Engineering and Computer Science (Vol. 2223, pp. 403-408). [ Links ]

Received: May 8, 2018; Accepted: August 10, 2018

* Corresponding author. E-mail address: lopezw@javeriana.edu.co

This is an open-access article published under a Creative Commons license.