Revista INDEX DE ENFERMERIA (digital edition) ISSN: 1699-5988

 

 

 

EDITORIALS


Controversies in the Assessment of Knowledge: Allegations concerning an applied science

Manuel Amezcua
Facultad de Ciencias de la Salud, Universidad de Granada, España

Index de Enfermería [Index Enferm] 2011; 20(1-2): 7-11

 

 

 

 

 

 

 

How to cite this document

 

 

Amezcua, Manuel. Controversies in the Assessment of Knowledge: Allegations concerning an applied science. Index de Enfermería [Index Enferm] (digital edition) 2011; 20(1-2). In </index-enfermeria/v20n1-2/0710e.php> Consulted by

 

 

 

    More and more nursing authors, especially from academia, come to the Documentary Studies Group (GED) of the Index Foundation seeking help with the frustration produced by negative evaluations from the Spanish governmental agencies, evaluations inconsistent with the spirit awakened in academia by our country's incorporation into the European Higher Education Area.1 The disappointment stems from the fact that these authors believe they have made a significant contribution to scientific production, with articles that have had a large impact, have been widely consulted and, consequently, often cited. The evaluators, however, do not consider their quality and discard them because they are not published in journals included in a particular repertoire. This repertoire is the Science Citation Index (SCI) of Thomson Reuters (formerly Thomson ISI and, before that, Thomson Scientific), the well-known and often criticized U.S. company that, at least in Spain, has established itself as the exclusive provider of scientific information in the government sector.2 Consequently, its instruments are used by governmental agencies as the only valid tools to assess the publications of all existing academic disciplines.
    From these consultations there seems to emerge a certain hope that mediation at our institutional level could solve the problem. But the GED can do little besides understand the frustration of those affected and empathize with their plight. And survive, enduring the threats of Thomson Reuters (the company owns the intellectual property of the Impact Factor concept and therefore the exclusive right to produce one),3 the ignorance of certain rating agencies about our instruments, and even direct attacks from one of the few institutional structures of nursing research, which ought to be the most interested in promoting the GED.4 In reality, though, we must recognize that the GED is doing much to cushion the effects of drastic government assessments, such as producing alternative instruments to make visible what the SCI conceals about Iberoamerican nursing.5 All of this is done in the hope that, if one day the position of the Spanish administration changes and becomes more flexible, as would be desirable, there will be a valid and sufficient alternative allowing fairer evaluations, consistent with the expectations of nursing researchers.
    In the meantime, some of those affected ask: what can we do? The answer is obvious, what people do in the face of injustice: resist, express their disagreement with the assessment using reasoned arguments, and tirelessly propose more suitable alternatives. They can appeal to fundamental principles: that nursing should be evaluated in its own disciplinary context, as an applied science whose fundamental goal is to transform clinical practice for the benefit of patients; and that the full diversity of techniques and instruments available and accepted by the scientific community should be used to determine the quality of publications. Not to mention the freedom to decide where and how to publish, and which instruments may report on it (be it a database, citation index, directory or repository).
    In the present case, the problem started in 2006, when the National Commission for the Evaluation of Research Activity (Ministry of Education and Science of Spain) decreed that "preference will be given to contributions that are articles in journals of recognized standing, accepting those that occupy relevant positions in the scientific area listings of the Subject Category Listing of the Journal Citation Reports of the Science Citation Index (Institute for Scientific Information-ISI, Philadelphia, PA, USA). Electronic journals will be considered when they appear in the ISI listings."6 It insists: "As a general criterion, for articles in the Biomedical Sciences, both basic and clinical, the presence of at least two publications in high-impact journals among those contained in any of the headings of the Science Citation Index shall be considered necessary to pass the assessment."6 The decree offers no rationale for this highly restrictive criterion other than the assumption that the SCI is the only valid source for discerning the value of periodicals, supposing that the best will be those occupying the highest positions in its repertoires. This move invalidates any other source of knowledge or possible procedure, so that other resources are no longer used for the purpose for which they were created. Thus, henceforth, whatever Thomson Reuters (ISI, according to the decree) does will be correct, by the mere fact of having created this business. Meanwhile, the multiple attempts that have arisen, or will arise in the future, to propose valid alternatives for evaluation will be futile, even though they rest on processes gestated with scientific logic. And only because they are not ISI projects, knowing that if one day they were purchased or transferred they would regain the de facto value that they are now denied. Moreover, the criterion imposes a relationship of dependence between the evaluating entity and the company providing the information, sustained by blind faith in the product, since property rights prevent its reproduction in independent audits.7 Of course, in the secular world in which we live, that faith takes the form of the Spanish government's acquisition of a licence to use ISI products, at a price that is dizzying for researchers accustomed to managing limited resources.
    At a time when most academic policies are more concerned with rankings than with the social usefulness of knowledge, this corruption may make sense. But it should be clear that renouncing the intrinsic value of publications in favour of indirect evaluation systems has costs. The effect that the 2006 decree has produced in practice is that responsibility for the evaluation of scientific activity has been transferred from a public body to a private company. Put another way, in Spain it is Thomson Reuters, not the rating agencies, that determines the value of the publications of candidates for academic positions, research funds or professional incentives. It does so through a policy of inclusion and exclusion of journals in a repertoire that a broad critical sector of the scientific community has described as opaque and questionable.7,8 For this reason, in 2009 the Index Foundation sent the Ministry of Science and Education a report, commissioned from the Documentary Studies Group, then led by Dr. Gálvez Toro, with allegations against the decree. This document has had some circulation as grey literature and is still shared by researchers and academics who, having been evaluated unjustly, use it to substantiate their claims. We do not know what result it produces, but supposing that it will at least do no major harm, we insert it as an annex in the hope that it may be useful.
    The Spanish Ministry's restrictive criteria are liable to cause multiple problems and, on ample evidence, can be questioned in their ability to determine what should be measured: the weight of publications from the point of view of their quality.4,8-10
    Consider a few:
    -The quality of an article should not be determined indirectly by the publication in which it appears (articles of unequal quality are usually published in the same journal) but by the relevance and rigor of the research it contains, criteria that should be taken into account when evaluating it but are not.
    -The scientific impact of journals measures the effect they have in terms of citations received,4,8 and it is misused when evaluating the quality of a particular article or of a researcher, professional or academic. For this reason, relevant scientific bodies, such as the European Association of Science Editors (EASE) in 2007 and the International Council for Science (ICSU) the following year, have recommended cautious use of the Impact Factor, warning against the effects it produces on the behaviour of scientists.12
    -The Impact Factor counts the citations a journal receives in the two years following publication, while a given work, depending on its location and the area of knowledge in which it is situated,13 will have a variable, but more extended, citation period.
    -The value of a citation is not determined by the instrument through which it is identified but by its very existence (the scientific criterion to be respected is that a larger number of citations means a greater impact).14
    -Authors themselves are the most interested in knowing the impact of their work, and therefore they should be given the responsibility of providing the evidence necessary for evaluation: author impact (the h-index, or Hirsch index) has proven to be a solid indicator of performance in terms of citations received, and it allows verification through audit.15,16
    -The evaluation of scientific contributions should adhere to scientific logic and stimulate procedures of its own, better still if these emanate from a consensus among the diverse disciplinary groups and areas of knowledge that make up the scientific community.
    -There exists a diversity of instruments used by the scientific community to manage knowledge and, unless proven otherwise, all are useful for the purpose for which the user approaches them. In principle, no single definitive tool is capable of assessing global scientific knowledge reliably and infallibly. Possibly the combination of the available instruments provides the best guarantee of accuracy.
    -Knowledge management instruments should be used as mere tools; they should never replace the responsibility that governments and administrations have to establish evaluation procedures in accordance with the particularities of the context in which they operate.
    For all the allegations we may build, there is little comfort for those who must face the here and now of the evaluation systems. And the worst is that, unless the criteria change, the chances that a Spanish nurse will be able to produce a publication that counts for evaluation are very slim. The latest SCI Nursing repertoire contains just eighty publications, none of them Spanish, and the only one in Spanish occupies the second-to-last position in the repertoire. Spanish nurses produce around 4,500 publications a year, according to the CUIDEN database, more than double the total number of articles produced by the SCI-Nursing journals. Something is wrong. We do not believe the Spanish Ministry knows these data, but it should know that its ignorance and intransigence are forcing Spanish nurses (and other professionals) to write in other languages17 and to publish in journals of other countries or other disciplines in order to be valued in our own. The criterion is producing a knowledge drain: knowledge that has been funded here, with patients from here, and that is needed here, where it is produced. These are the rules of the game in an applied discipline like nursing.
    Another question is whether the Spanish model responds to a global model, as many people think, and the resounding answer is no. Each country tends to manage its own knowledge with its own criteria and according to its own scientific policies. In the Latin American context, Brazil stands out as a case that, though it matches the Spanish model in the importance given to bibliometrics in evaluating publications, builds that value on flexible, consensus-based criteria that emphasize making quality knowledge visible.18,19 In the European and Anglo-Saxon context, too, we find more and more alternatives to the exclusive assessment of publications through scientific impact, of which we give some examples here, in advance of the report that the GED is preparing for later publication.
    The United Kingdom has designed a new system for assessing the quality of research in higher education institutions, the REF (Research Excellence Framework), which proposes different evaluation guidelines whose results are used for the selective allocation of research funds to those institutions. In clinical medicine, the evaluation model uses comprehensive case studies to emphasize the effect that research (or a publication) has had in terms of social, institutional and scholarly impact.20 Under the slogan "Quality not Quantity", the German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) proposed in February 2010 a new evaluation system that emphasizes publications of excellence, with the stress placed on the actual description of the research, reducing the importance of large volumes of publications, of positions in journal rankings and of other numerical indices ("It's the content that matters", in the words of the president of the DFG, Prof. Matthias Kleiner).21 The German proposal follows the British model and that adopted by the National Science Foundation (US), all of which open a near horizon of hope for scientists who must deal with models as irrational as the Spanish one.
    For all this we are convinced that, if only through the influence of the countries with greater weight on the international scientific scene, the evaluation criteria will be relaxed. When this happens, we will find that those who vehemently defend the evaluation system that is damaging science in general, and applied disciplines such as nursing in particular, will go into hiding, and we will never know who fed them, just as we could never put a face to the intransigent censors of Franco's Spain. Well, this may seem a bit excessive. The truth is that, lit by hope, the Foundation will continue to produce tools such as CUIDEN, CANTARIDA and CUIDEN INDEX, improving them and refining the management systems of nursing information in the Latin American scientific space, in order to promote assessments that are fairer and more consistent with the realities of nurses who speak languages other than English.22,23 And in the good hope that more and more countries, Spain among them, will find some positive value in using them. Achieving this goal will depend largely on Latin American and Spanish nurses themselves,24 who with their words and their legitimate demands, overcoming the conformity of a past culture of obedience, will be the only ones able to give value to the criteria and tools that most benefit them. We hope these will be the ones that identify and prioritize what nurses produce, in their own context.

References

1. Granero Molina, José; Fernández Sola, Cayetano; Aguilera Manrique, Gabriel. Evaluación frente a calificación en el nuevo Espacio Europeo de Educación Superior (EEES), una reflexión ético-crítica. Index de Enfermería 2010; 19(1): 37-41.
2. Ruiz-Pérez, Rafael; Delgado López-Cózar, Emilio; Jiménez-Contreras, Evaristo. Criterios del Institute for Scientific Information para la selección de revistas científicas. Su aplicación a las revistas españolas: metodología e indicadores. International Journal of Clinical and Health Psychology 2006; 6(2): 401-424.
3. Amezcua, Manuel. Index y la construcción de una Comunidad de Conocimiento Abierta. Index de Enfermería 2007; XVI(58): 7-10.
4. Amezcua, Manuel. ¿Para qué sirve el Indice de Impacto de una revista? Index de Enfermería 2010; 19(2-3): 83-7.
5. Gálvez Toro A, Amezcua M, Hueso Montoro C. CUIDEN Citación y la valoración de las publicaciones científicas enfermeras. Index de Enfermería 2005; 51: 7-9.
6. Ministerio de Educación y Ciencia de España. Resolución 20404 de 17 de noviembre de 2006, de la Presidencia de la Comisión Nacional Evaluadora de la Actividad Investigadora, por la que se establecen los criterios específicos en cada uno de los campos de evaluación. Boletín Oficial del Estado, 23.11.2006; 280: 41071-8.
7. Rossner, Mike; Van Epps, Heather; Hill, Emma. Show me the data. The Journal of Cell Biology, 2007; 179(6):1091-1092.
8. Quispe Gerónimo, C. ¿Es el Factor de Impacto un buen indicador para medir la calidad de las revistas científicas?: análisis de algunos problemas generados por su uso. Infobib 2004; 3. Available at https://eprints.rclis.org/bitstream/10760/5002/1/articulo1.pdf [Accessed 16.06.2011].
9. Amin M, Mabe M. Impact factors: use and abuse. Medicina 2003; 63: 347-54.
10. Camí J. Impactolatría: diagnóstico y tratamiento. Med Clin 1997; 109(13): 515-524.
11. Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ 1997; 314 (7079): 498-502.
12. Impact factor. Wikipedia. Available at https://en.wikipedia.org/wiki/Impact_factor [Accessed 26.06.2011].
13. Van Nierop, Erjen. Why do statistics journals have low impact factors? Statistica Neerlandica. 2009; 63(1): 52-62.
14. Gálvez Toro, Alberto. El poder de una cita. Desarrollo Científ Enferm 2006; 14(7): 243-244.
15. Gálvez Toro, Alberto; Amezcua, Manuel. El factor h de Hirsch: the h-index. Una actualización sobre los métodos de evaluación de los autores y sus aportaciones en publicaciones científicas. Index de Enfermería, 2006; 55: 38-43.
16. Gálvez Toro A, Amezcua M, Salido Moreno MP, Hueso Montoro C. Impacto de Autor CUIDEN Citación. Trayectorias científicas relevantes y excelencia a través del Factor h (h-index) de Hirsch en el espacio científico iberoamericano. Index de Enfermería 2006; XV(55): 76-82.
17. Gálvez Toro A. El español proscrito. Evidentia. 2007 may-jun; 4(15). Available at /evidentia/n15/343articulo.php [Accessed 16.06.2011].
18. Souza EP, Paula MCS. Qualis: a base de qualificação dos periódicos científicos utilizada na avaliação CAPES. InfoCAPES 2002; 10(2):7-25.
19. Erdmann AL, Marziale MHP, Pedreira MLG, Lana FCF, Pagliuca LMF, Padilha MI, Fernandes JD. Evaluation of scientific periodicals and the brazilian production of nursing articles. Rev Latino-am Enfermagem 2009 maio-junho; 17(3): 403-9.
20. Research Excellence Framework: Impact pilot exercise. Example case studies from Clinical Medicine. Higher Education Funding Council for England (HEFCE). November 2010. Available at https://www.hefce.ac.uk/research/ref/impact/ClinicalMedicine.pdf [Accessed 26.06.2011].
21. "Quality not Quantity" - DFG Adopts Rules to Counter the Flood of Publications in Research. Press Release No. 7 | 23 February 2010. Deutsche Forschungsgemeinschaft. Available at https://www.dfg.de/en/service/press/press_releases/2010/pressemitteilung_nr_07/index.html [Accessed 26.06.2011].
22. Gálvez Toro, Alberto; Hueso Montoro, César; Amezcua, Manuel. Indicadores CUIDEN de repercusión de las revistas de enfermería del área lingüística del español y del portugués. Index de Enfermería 2004; XIII(46): 76-80.
23. Jerez Fonseca, Lady Catalina; Montoya Sanabria, Sandra Milena. Impacto y repercusión de 37 revistas de Enfermería del Espacio Científico Iberoamericano. Año 2008. Index de Enfermería 2010; 19(2-3): 221-225.
24. Rizo, MM. Debemos defender nuestras revistas científicas. Index de Enfermería 2010; 19(1): 69-70.

Annex 1
Index Foundation 11/07/2009

Notes on the Resolution 20404 of 17.11.2006 of the President of the National Evaluating Commission of Research Activity, with the goal of establishing specific criteria in each field of evaluation.

    Point 1. There is a tendency to consider "more valuable" those contributions published in journals included in the Journal Citation Reports repertoire of the Science Citation Index. This implicitly discriminates against contributions made in other journals.
    Point 2. The Journal Citation Reports repertoire of the Science Citation Index essentially measures citations. It therefore makes little sense to value the mere act of publishing, or not, in the journals of this repertoire; what should be valued are the citations received.
    Point 3. The Health Sciences have a clear distinguishing feature: they tend to cite locally, because they encompass clinical and applied science which, strictly speaking, defines an area of knowledge that is more regional than global. Anglo-Saxon professionals most frequently cite work that is close to them in context and application. The British tend to cite themselves more than the English speakers of the USA. Spanish nurses tend to cite more material in Spanish than in English. The now retired and paradigmatic editor of the BMJ, Richard Smith, affirmed that in this sense doctors, and health professionals in general, are not scientists but clinicians.
    Point 4. Regional and local contributions, published by clinical professionals, are fundamental to the development of the health of a region. In Spain, the clinical research of nurses and doctors, published in the journals of this environment, has an effect on clinical practice, demonstrating its development and its level of application. And on its beneficiaries, the patients.
    Point 5. Any policy that disparages or undervalues local or regional contributions will harm the social standing of the scientific research activity that health professionals and nurses conduct. Moreover, it could lead to the weakening and even the disappearance of the scientific-professional press since, it being given no value, professionals would stop publishing in it. The case of Spanish nurses is a paradigmatic model: we could not speak of the existence of a nursing profession in charge of its own scientific and professional domain until bibliographic databases such as CUIDEN were developed and dozens of journals became available for professionals to communicate in.
    Point 6. Current technological development shows that it is inadequate to reduce the evaluation of an author and their contributions to a single assessment method (the Journal Citation Reports of the Science Citation Index, in the case of the decree under discussion). Since what matters is that an author be cited and used in the future by others, what should be assessed is the effect of the author's scientific production. Otherwise, the Ministry itself could be accused of overstepping its authority by obliging authors to be evaluated by a private business (Thomson ISI) and only by the criteria that business has defined. At this level, it would make sense to ask the author being evaluated to provide data on their work, i.e. the number of citations or the Hirsch h-index they have attained. There are many ways to establish the citations an author has received: the products of Thomson ISI can be consulted, as can Google Scholar (which counts the number of citations a work receives online), the database CUIDEN CITACIÓN (which contains thousands of citations, in every area of knowledge, from the Iberoamerican nursing journals), or any other system (aggregators also allow these calculations).
    The criteria set out by the decree would thus be reduced to defining which citation index the author should appear in, and what their Hirsch index should be in each discipline. This has already been studied in some disciplines; we know, for example, what value of these indexes is needed to receive a Nobel Prize in physics. The calculation of the index should be made by the author being evaluated, who vouches for the veracity of the citations. What is evaluated, therefore, is not one or more specific published contributions, but the trajectory of an author at a given moment. A professor with three six-year terms would have to show a specific Hirsch h-value, or a given number of citations, always higher than a professor with two six-year terms. The rest, published in one form or another, may matter less. In any case, an article that has been cited a dozen times has shown its importance. And if an author has an h of 24, even having published only one article and one monograph in the six-year period, this may indicate that they are of value within the knowledge system.
    The only problem this method poses is that it requires, first of all, the support and guidance of specialized national teams in documentation studies and bibliometrics for each discipline. To some extent these groups are well identified in our setting: the CSIC for the humanities, the IHCD López Piñero for the clinical medicine areas, the Index Foundation for nursing and related areas, and others.
    Point 7. It is hard to understand why electronic journals, if we are speaking of innovation and the diffusion of information, are not well valued in this decree. In a few years, paper may have all but disappeared from scientific communication. In fact, most health sciences collections are now contracted in full-text online versions rather than as print journals.

 

 
