<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>1657-0790</journal-id>
<journal-title><![CDATA[Profile: Issues in Teachers' Professional Development.]]></journal-title>
<abbrev-journal-title><![CDATA[profile]]></abbrev-journal-title>
<issn>1657-0790</issn>
<publisher>
<publisher-name><![CDATA[Departamento de Lenguas Extranjeras, Universidad Nacional de Colombia.]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S1657-07902017000200006</article-id>
<article-id pub-id-type="doi">10.15446/profile.v19n2.57178</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[Making Sense of Alternative Assessment in a Qualitative Evaluation System]]></article-title>
<article-title xml:lang="es"><![CDATA[Entendiendo los procedimientos alternativos de valoración de desempeño en un sistema de evaluación cualitativo]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Rojas Serrano]]></surname>
<given-names><![CDATA[Javier]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Centro Colombo Americano]]></institution>
<addr-line><![CDATA[Bogotá ]]></addr-line>
<country>Colombia</country>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>12</month>
<year>2017</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>12</month>
<year>2017</year>
</pub-date>
<volume>19</volume>
<numero>2</numero>
<fpage>73</fpage>
<lpage>85</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_arttext&amp;pid=S1657-07902017000200006&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_abstract&amp;pid=S1657-07902017000200006&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.org.co/scielo.php?script=sci_pdf&amp;pid=S1657-07902017000200006&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[In a Colombian private English institution, a qualitative evaluation system has been incorporated. This type of evaluation poses challenges to students who have never been evaluated through a system that eliminates exams or quizzes and, as a consequence, these students have to start making sense of it. This study explores the way students face the new qualitative evaluation system and their views on alternative assessment as a way to help them make headway with their English learning process]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[En una institución educativa privada dedicada a la lengua y la cultura inglesas en Colombia, se incorporó un sistema de evaluación cualitativo para ayudar al estudiante a alcanzar sus metas de aprendizaje. Este sistema plantea desafíos a los estudiantes que nunca han sido evaluados con un modelo sin exámenes escritos y, como consecuencia, tienen que desarrollar una serie de habilidades para entenderlo. Este estudio explora la manera como los estudiantes se enfrentan al nuevo sistema de evaluación y su opinión sobre este respecto a su proceso de aprendizaje]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[Alternative assessment]]></kwd>
<kwd lng="en"><![CDATA[qualitative evaluation]]></kwd>
<kwd lng="en"><![CDATA[quantitative evaluation]]></kwd>
<kwd lng="es"><![CDATA[evaluación cualitativa]]></kwd>
<kwd lng="es"><![CDATA[evaluación cuantitativa]]></kwd>
<kwd lng="es"><![CDATA[métodos alternativos de valoración]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[  <font face="verdana" size="2">      <p><a href="http://dx.doi.org/10.15446/profile.v19n2.57178" target="_blank">http://dx.doi.org/10.15446/profile.v19n2.57178</a></p>      <p align="center"><font size="4"><b>Making Sense of  Alternative Assessment in a Qualitative Evaluation System</b></font></p>      <p align="center"><font size="3">Entendiendo  los procedimientos alternativos de valoraci&oacute;n de desempe&ntilde;o en un sistema de  evaluaci&oacute;n cualitativo</font></p>      <p align="center"><b>Javier Rojas Serrano</b><sup>*</sup>    <br>  Centro Colombo Americano, Bogot&aacute;, Colombia</p>      <p align="center"><sup>*</sup><a href="mailto:jrojas@colombobogota.edu.co">jrojas@colombobogota.edu.co</a></p>      <p align="center">This article was received  on April 4, 2016, and accepted on October 20, 2016.</p>          <p>How to cite this article (APA, 6th ed.):    <br>       Rojas Serrano, J. (2017). Making sense  of alternative assessment in a qualitative evaluation system. <i>PROFILE Issues in Teachers' Professional  Development, 19</i>(2), 73-85.  <a href="http://dx.doi.org/10.15446/profile.v19n2.57178" target="_blank">http://dx.doi.org/10.15446/profile.v19n2.57178</a>.</p> 	               ]]></body>
<body><![CDATA[<p>This is an Open Access article distributed under the  terms of the Creative Commons license Attribution-NonCommercial-NoDerivatives  4.0 International License. Consultation is possible at  <a href="http://creativecommons.org/licenses/by-nc-nd/4.0/" target="_blank">http://creativecommons.org/licenses/by-nc-nd/4.0/</a>. </p> <hr>         <p>In a Colombian private English  institution, a qualitative evaluation system has been incorporated. This type  of evaluation poses challenges to students who have never been evaluated  through a system that eliminates exams or quizzes and, as a consequence, these  students have to start making sense of it. This study explores the way students  face the new qualitative evaluation system and their views on alternative  assessment as a way to help them make headway with their English learning  process.</p>     <p><i>Key  words:</i> Alternative assessment, qualitative  evaluation, quantitative evaluation.</p> <hr>     <p>En una instituci&oacute;n educativa privada dedicada  a la lengua y la cultura inglesas en Colombia, se incorpor&oacute; un sistema de  evaluaci&oacute;n cualitativo para ayudar al estudiante a alcanzar sus metas de  aprendizaje. Este sistema plantea desaf&iacute;os a los estudiantes que nunca han sido  evaluados con un modelo sin ex&aacute;menes escritos y, como consecuencia, tienen que  desarrollar una serie de habilidades para entenderlo. Este estudio explora la  manera como los estudiantes se enfrentan al nuevo sistema de evaluaci&oacute;n y su  opini&oacute;n sobre este respecto a su proceso de aprendizaje. 
</p>     <p><i>Palabras clave:</i> evaluaci&oacute;n cualitativa, evaluaci&oacute;n cuantitativa, m&eacute;todos alternativos de valoraci&oacute;n.</p> <hr>     <p><font size="3"><b>Introduction</b></font></p>     <p>One of the main difficulties of a qualitative evaluation system, like the one adopted by the Adult English Program in the Centro Colombo Americano - Bogota, is facing a student who fails a course or who passes with some difficulties and still does not totally accept that he or she needs to keep improving. Convincing a failing student as to why he or she must repeat a course is a hard task because in a qualitative approach to evaluation, sometimes clear evidence is not recorded, stored, or managed, and evaluation ends up looking like just the teacher's gut feelings on paper. The same happens with students who pass with the minimum performance required and, as they pass, do not feel they need to keep working hard in certain areas. On many occasions, they might even feel that they did not deserve to pass, contrary to what their teacher decided. Most of these concerns usually come from students who have always been evaluated by specific numbers or through products like final outcomes, exams, or quizzes and would rather see this type of evaluation in the Colombo as well.</p>     <p>However, this end-of-the-cycle dilemma need not be a nightmare provided that students who are not familiar with the qualitative evaluation system understand how it works and what practices are behind the teacher's decision to fail or pass a student.
This research project intends to understand how students make sense of and try to adapt to a qualitative evaluation system, which would eventually prevent traumatic experiences for both teachers and students in light of the different assessment tools offered by the program.</p>     <p>Even though this type of dilemma may not apply to most English teaching departments and institutions, in the sense that most places continue approaching evaluation from a quantitative system, which offers less controversy and more practicality at the moment of grading students, at the Centro Colombo Americano - Bogota, the qualitative system of evaluation is being constantly revisited and discussed in order to help the faculty and the students get a grip on it. The fact that most students and teachers in the program come from quantitative backgrounds&mdash;from our schools and majors to our latest work experience&mdash;poses specific challenges and threats that make this type of evaluation difficult to follow because it regards attitude, culture, task accomplishment, and the whole process and does not consider quizzes and exams.</p>     <p>After several feedback sessions with students, the same questions keep popping up. As a matter of fact, these questions have eventually become the research questions leading this study:</p> <ul>       
<body><![CDATA[<li>How do students coming from a quantitative evaluation background respond to a qualitative evaluation system? </li>       <li>How do students make sense of the alternative assessment activities that teachers plan in order to draw conclusions about their own performance?</li>     </ul>     <p>At first glance, these objectives lead us to think that possible research constructs are the <i>quantitative evaluation system </i>most students are familiar with, the <i>new qualitative evaluation system</i> used by the Colombo, and the students' <i>reflection, analysis, and final reaction towards the new system.</i></p>     <p><font size="3"><b>Literature Review</b></font></p>     <p><font size="3">Quantitative and Qualitative Approaches to Evaluation</font></p>     <p>Although this issue may seem new or even restricted to the Centro Colombo Americano scenario, a considerable body of related literature and theory has been developed and discussed in the education field for some years now.</p>     <p>Teachers at the Centro Colombo Americano have gradually become more and more confident evaluating students through a qualitative system, but it is also true that for many of us at the beginning, it was a real challenge to get used to it. In fact, it is still hard in some cases to make up one's mind as to who is passing or failing a course, given that there is still an inner struggle between the teacher's quantitative background and the current qualitative practice.
This inner struggle may come from the benefits we see in both systems, which are summarized by Brown (2004), who points out that while traditional-quantitative evaluation provides higher practicality and reliability, alternative-qualitative systems provide better washback and authenticity.</p>     <p>Nevertheless, an aspect that may add to the difficulties in implementing a qualitative approach to evaluation is explained by Cohen (1994), who suggests that "people have reached a spoken or unspoken agreement that traditional methods look like the right way to assess. In fact, teachers may choose methods that reflect the way they were assessed as students" (p. 29).</p>     <p>This situation, however, has not only been faced by the Centro Colombo Americano. Brown (2004) argues that approaches to evaluation in different institutions worldwide have shifted focus onto a more alternative view of evaluation and have become distanced from traditional evaluation systems because alternative ways of evaluation foster assessment tools that can be extended to real life, are more meaningful, and regard the products as well as the process. In addition, non-traditional assessment, which might not look like testing (with all the stress and anxiety that testing carries), provides space for students to develop their creativity, increases critical thinking skills, and enables more multicultural connections (Brown, 2004). This view is evident when we regard the products, tasks, and projects that students carry out in our institution. However, the same author insists that non-traditional and qualitative assessment requires considerable time and effort on the part of teachers and might look less convincing to students.</p>     
<body><![CDATA[<p>Another aspect to consider when defining traditional-quantitative and alternative-qualitative approaches to evaluation has to do with what is done with the information obtained. The discussion, then, as proposed by Areiza Restrepo (2013), focuses on formative as opposed to summative assessment if we consider its <i>purpose</i>. Traditional evaluation tends to render summative assessment in the sense that the information collected is used to decide who passes or fails and, based on the quiz or exam results, the student should draw conclusions about what language aspect to review and reinforce. A qualitative assessment system, on the other hand, goes further and produces feedback that will help students identify strengths or weaknesses through a conference with the teacher in which all aspects of the learning process are discussed; apart from the use of language, these aspects also include the areas that cannot be covered in an exam and are connected to the ability to implement learning strategies throughout the process, teamwork abilities, use of resources, punctuality, and so on. This type of assessment is therefore more realistic and authentic in the sense that to be successful in real life the mere knowledge of a language is not enough; a wider set of social and organizational skills is needed along with it.</p>     <p><a href="#tab1">Table 1</a>, taken from Brown (2001), summarizes the most important characteristics of the quantitative and qualitative approaches.</p>     <p align="center"><a name="tab1"><img src="img/revistas/prf/v19n2/v19n2a05t01.jpg"></a></p>     <p><font size="3">Students' Awareness and Reflection on Alternative Assessment</font></p>     <p>I have mentioned that the formative and summative assessment discussion is at the core of the analysis of quantitative and qualitative evaluation approaches.
Areiza Restrepo (2013) points out that whereas summative assessment is designed to determine whether students have achieved the goals of the program by the end of a cycle or the course, formative assessment is, conversely, designed to be diagnostic, remedial, regulatory, and ongoing. Therefore, at first glance, we may infer that summative assessment provides a kind of reflection in which students, instead of planning actions to improve, may end up just showing regrets about the things they did wrong or did not do at all, whereas formative assessment gives students the opportunity to come up with action plans to keep their strengths and to tackle their weaknesses in a timely manner. This conclusion was actually drawn from his study, in which participants, after being exposed to formative assessment actions, showed their ability to identify and understand their strengths and weaknesses. This situation eventually led students to enhance their learning and have a sense of success.</p>     <p>Similarly, a study carried out by Baleghizadeh and Zarghami (2012), in which Iranian students were introduced to conferencing as a formative assessment tool, showed how these students got better results on a grammar test than the ones who were not exposed to teacher-student conferences during their process.
The researchers explain that the difference in results between the two groups of students has to do with the fact that the students who were involved in conferencing as an alternative way of assessment were encouraged to take more responsibility for their process by self-assessing, reflecting, monitoring, and setting goals to improve their learning.</p>     <p>However, in spite of the deep understanding of formative assessment and practices shown by the aforementioned authors, the programs in which their studies are framed still evaluate students through final exams, which are the ultimate tools that teachers use to let students pass a course, no matter how much reflection, awareness, and action planning the formative practices they implemented fostered.</p>     <p>When exams and quizzes are eliminated, as is the case of the program in which the present study takes place, teachers and students have to rely entirely on formative assessment tools, and this is what makes this paper singular and novel and what gives relevance to the research questions that this study proposes.</p>     <p><font size="3"><b>Research Context</b></font></p>     <p>The program in which this study was carried out is made up of 18 levels from basic to advanced English. Each level lasts approximately one month and covers between three and four content units. At the end of each unit, students develop a task in which they show their understanding and mastery of the vocabulary, grammar, and strategies learned throughout the unit; these tasks are designed to establish connections between what has been learned and students' reality. They are usually filed in a portfolio or uploaded to an online group together with reflections, peer-assessment, and self-assessment notes.
Every unit or two, teachers usually have conferences with students to brief them about their progress in the course in aspects as varied as communication, teamwork, punctuality, use of strategies, class performance, and homework development. At the end of each cycle, a final student-teacher conference takes place in which the student's progress is discussed. With this evaluation system, more often than not, students know if they are passing or failing before this final conference, and this moment becomes a way to offer students suggestions and advice on what to focus on and how to keep improving. However, sometimes students also object to the teacher's decisions because, from their viewpoint, they made good progress and deserve to pass, and no quantitative evaluation tool has been used to settle such controversy with a clear number or letter and with mistakes highlighted on an exam or quiz.</p>     ]]></body>
<body><![CDATA[<p><font size="3"><b>Research Framework</b></font></p>     <p>This research project may be connected to what scholars have called <i>case study</i>. Even though it would have been desirable to have carried out changes and innovations in pedagogical activities, which is the ultimate goal of action research, in our particular case it was also important to understand the problem in depth before moving on to an intervention.</p>     <p>Case study, as defined by Stake (1999), allows the study of the "peculiarity and complexity of a singular case in order to figure out its activity in important circumstances" (p. 15). This way, the peculiarity and complexity of a restricted group of students from a restricted number of courses may provide us with important data to understand how a bigger number of students from a wider range of courses would react in the face of an evaluation system different from the one they are used to.</p>     <p><font size="3"><b>Methodological Design</b></font></p>     <p>For the sake of this research study, seven people were selected as participants and were asked to fill out the corresponding consent form. This selection was made taking into consideration the level of students and their being newcomers to the Colombo. It was considered that students in intermediate and high intermediate levels could offer more elaborate insights since they may have more academic experience than others and some of them may have even studied abroad. Most participants were studying in either undergraduate or graduate programs at the time of this research and just a few had already graduated.
Thus, participants had enough experience with quantitative evaluation to contrast with the type of evaluation carried out at the Centro Colombo Americano - Bogota.</p>     <p>On the other hand, participants needed to be newcomers to the institution, since the study intended to find out how these students, coming from a quantitative evaluation tradition, adapt to and assimilate a qualitative evaluation system.</p>     <p>Participants were coded as follows: Two intermediate students = Sk1 and Sk2. Five high intermediate students = Ch1, Ch2, Ch3, Ch4, and Ch5.</p>     <p>To avoid confusion, readers just need to know that a code with Sk belongs to an intermediate level (between A2 and B1 in the Common European Framework &#91;CEF&#93;) and a code with Ch belongs to a high intermediate level (about to get a B2 in the CEF).</p>     <p><font size="3"><b>Data Gathering Techniques</b></font></p>     <p>The matrix in <a href="#tab2">Table 2</a> represents the different data gathering techniques and the areas and questions of this research that were addressed with each of them.</p>     
<body><![CDATA[<p align="center"><a name="tab2"><img src="img/revistas/prf/v19n2/v19n2a05t02.jpg"></a></p>     <p>In order to account for these research variables and to eventually tackle the research questions, four different data gathering techniques were implemented in three different groups and applied to the aforementioned population.</p>     <p><b>Data gathering technique 1: The survey. </b>The survey<a href="#pie1" name="spie1" title=""><sup>1</sup></a> intended to explore students' beliefs and opinions about quantitative and qualitative types of evaluation and which ones students were more comfortable with, taking into consideration their academic life experience. These are the survey questions that were answered by two low intermediate and five high intermediate English students:</p> <ul>       <li>Taking into consideration your academic process during your life (high school, college), what type of evaluation are you more familiar with? </li>       <li>Do you think that having a specific grade (letter or number) clearly reflects the knowledge you have acquired in any course? </li>       <li>Apart from having a specific grade (number or letter), what other type of evidence of your progress can be shown? </li>       <li>Would you feel more comfortable if the Colombo evaluated all levels with a final exam, or if homework, activities, and projects were assessed every single class? </li>       <li>For your own academic progress, which type of evaluation would you consider the most beneficial one? </li>       <li>After having finished a level in the Colombo, do you consider that you understand the evaluation system in this institution? </li>     </ul>     
<body><![CDATA[<p><b>Data gathering technique 2: Teacher's journal. </b>A journal in which assessment and evaluation moments were recorded for every class was kept. From day 1 to day 19 (the length of a level in the Colombo), assessment moments that were planned in every single lesson were recorded daily in this journal in order to compare these notes with the assessment moments that students identified and recorded in the self-reflection form that was given to the participating students for them to keep track of these classroom activities. </p>     <p><b>Data gathering technique 3: Artifact (self-reflection form). </b>The participating students were asked to complete a form in which, on a daily basis, they had to write down moments of evaluation that they were able to identify. At the end of the class, students stayed for five more minutes to complete the entry of the day, in which they had to identify the nature of the assessment moment, the type of assessment tool, and the skills or areas of English that were assessed through that activity. <a href="#tab3">Table 3</a> is an example of what students had to do with the form.</p>     <p align="center"><a name="tab3"><img src="img/revistas/prf/v19n2/v19n2a05t03.jpg"></a></p>     <p><b>Data gathering technique 4: The interview. </b>The final data gathering technique implemented in this study was an interview. At the end of the second level with each group, participants were asked to answer a set of questions in which they expanded on their opinion about the evaluation process and its effects on their learning process. The questions posed were:</p> <ul>       <li>You have already finished two courses of English in the Colombo. Have you been able to identify evaluation moments during the class that were guided by the teacher?
What moments in which you have been encouraged to reflect upon your own performance can you mention?</li>       <li>Which of these activities have helped you understand your performance in class?</li>       <li>Have you missed having numbers or letters to get an idea of what you need to improve and what you are doing well? Why?</li>       <li>Without taking into consideration the establishment of partial or final exams, what kind of evidence could be used to improve the type of evaluation in the Colombo?</li>     </ul>     <p><font size="3">Validity and Reliability</font></p>     
<body><![CDATA[<p>The validity and reliability of the study are assured by the synergy among the different data gathering techniques. They account for both the participants' and the teacher-researcher's views, and were also designed in such a way that one technique is backed up by another when considering the different variables or constructs of the study, as shown in the Data Gathering Techniques Matrix (<a href="#tab2">Table 2</a>). This provides the study with reliability. In addition, validity is also present because the techniques that were selected and designed tackled the concerns that arose from the questions and variables of the study.</p>     <p><font size="3"><b>Data Analysis and Results</b></font></p>     <p>After the analysis of the information collected through the different gathering techniques used, three salient aspects emerged. These categories are directly connected to the research variables that were identified at the beginning of this study: qualitative evaluation, quantitative evaluation, and the students' reflections on alternative assessment.</p>     <p><font size="3">Category 1: Generalities</font></p>     <p>In the Colombian context, it is thought that the quantitative approach is the one by which most academic institutions are driven when evaluating students, and this view is actually supported by Areiza Restrepo (2013). In his study, he cites research that unveiled Colombian teachers' lack of knowledge of and instruction in formative assessment; these teachers did not regard assessment as a way to enhance learning but just as a way to decide who passes or fails a course. Among other reasons, he found that this might happen because, in general, only a few universities actually offered instruction in this type of evaluation to language teachers. This perception was confirmed through the answers the participants in this study gave in the survey.
It was noticed in the survey that all participants were given grades, be it with numbers or letters, that provided a concept of a final product, and that just a few of them recognized that their process was also taken into consideration in previous academic experiences.</p>     <p>Also, in students' previous experience, important learning aspects such as attitude, punctuality, commitment, leadership skills, and group work abilities were more often than not disregarded or not seen as important as a final product, even though getting to a product involves a good performance in the areas mentioned above. One student mentioned on the survey that in his previous learning experiences, before getting to the Centro Colombo Americano, "many things are left aside such as participation, the learning process, and the attitude in class" (Ch3).</p>     <p>In some other cases, students recalled having received feedback about their commitment in the activities, but it was not as specific as they found it in their Colombo course:</p>     <blockquote>In the school, our commitment was evaluated, but it was not as specific as it is here...I mean, here, we are reminded of how many times we came to class, how many absences, how our performance in class was, if I could handle the grammar, if I got the listening, and many other things that add up to a more comprehensive evaluation...at school, we were evaluated mostly through exams. (Ch3)</blockquote>     <p><a href="#fig1">Figure 1</a> is a summary of the generalities that were identified through the analysis of the data collected during the research process. </p>     <p align="center"><a name="fig1"><img src="img/revistas/prf/v19n2/v19n2a05f01.jpg"></a></p>     
<body><![CDATA[<p><font size="3">Category 2: Positive Views on Alternative Assessment</font></p>     <p>After being exposed to the qualitative evaluation carried out at the institution through the use of alternative assessment procedures, most students could identify some benefits and positive aspects that this type of evaluation has on students' performance. For example, Ch3 said in the survey that she had not missed exams in the Colombo because "instead of a number, a cold, dry number," she is getting feedback about what aspects to improve in a more specific and meaningful way.</p>     <p>The new evaluation system places new students in a setting that they have to make sense of. Despite this, students seem to adapt to alternative assessment and take to it well. At the end of the day, students liked the fact that they could identify their strengths and weaknesses without having to take an exam, and that actually, they could identify issues that go beyond grammar and vocabulary. In this sense, the tasks and presentations were regarded as the most important activities to help students with their assessment.</p>     <p>In the interview, Ch3 says, for instance, that:</p>     <blockquote>I feel that the activities that made me understand the most, or the ones that helped me the most were the presentations because they were the most complete way to have a look at everything, the grammar, the vocabulary, as well as the expressions, using everyday language, and seeing every single thing...also, the feedback from our partners was really nice.</blockquote>     <p>Ch4 and Sk1 had a similar impression in the interview:</p>     <blockquote>The presentations...for me, it was really difficult to stand in front of different people to present a specific topic, especially in English, but it helped me know how much I really learned.
(Ch4)</blockquote>     <blockquote>Something that was very important for my evaluation was the presentation because it was OK and I got important feedback about it from my teacher and classmates. (Sk1)</blockquote>     <p>In the survey, some students acknowledged some of the strategies they have gradually developed in order to look for ways to improve. To come to this point, class routines and assessment-based activities played an important role. When asked what evidence of progress they had, most participants highlighted the fact that they paid special attention to their teachers' feedback, carried out self-assessment and self-monitoring, and compared themselves to other members of the class; in this way, they drew information about their performance not only from the teacher but also from other students and from self-reflection.</p>     <p>Students also thought that qualitative assessment frees them from anxiety and pressure; this, in their view, is why traditional exams do not provide them with real information about their performance, as Sk1 put it:</p>     ]]></body>
<body><![CDATA[<blockquote>I think that exams generate pressure in the student and, actually, they do not render good evidence about the real class performance. (Sk1)</blockquote>     <p>One of the benefits students most commonly acknowledged regarding the qualitative approach to evaluation is the opportunity to reflect upon their own process; this reflection was encouraged by the practices and routines of teachers who follow a qualitative approach.</p>     <p>In the interview, students identified some of the practices and activities that helped them self-reflect:</p>     <blockquote>Well, during the class we would always do some periodic reviews as the units and lessons went by...also, the tasks we had to do, the presentations...all in all, it helped me reflect upon my own performance. (Ch3)</blockquote>     <blockquote>For instance, activities were as simple as a couple of questions; however, in those two questions, one&mdash;not even the teacher&mdash;knew how much one learned, how well one could handle the grammar, and express it naturally. (Ch4)</blockquote>     <blockquote>I think that in all activities we did, we assessed ourselves and did not simply go on to the next activity, but we shared results, evaluated the results, and each student was perfectly capable of identifying what his/her mistakes had been and how to improve them. (Ch1)</blockquote>     <p>Finally, most of the participating students (5 out of 7 in the survey) agreed that they would prefer a type of evaluation in which everything they do is taken into consideration rather than having only exams and quizzes.</p>     <p><a href="#fig2">Figure 2</a> brings together the positive views students expressed through the data gathered in this study.</p>     <p align="center"><a name="fig2"><img src="img/revistas/prf/v19n2/v19n2a05f02.jpg"></a></p>     <p><font size="3"><b>Category 3: Negative Views on Alternative Assessment</b></font></p>     ]]></body>
<body><![CDATA[<p>A disadvantage some students mentioned about a qualitative approach to evaluation is that in some cases students with mixed language levels may end up in the same class. This may be because a teacher following a qualitative approach, assessing both the process and the product, has to consider the effort students put in, the commitment they show, and the time they devote, and not only the quality of the product. Conversely, a student who did not follow any process or show evidence of ongoing work might eventually fail the task for not following the steps to complete the activity, even if the final product is really good. This situation was perceived by a student who said in the interview:</p>     <blockquote>I think that the students that have had very good teachers have learned a lot, and if they had to take an exam on which their passing or failing depended they would have passed anyway and it would be a waste of time. But also, there are people that, in spite of being very responsible, have some language flaws, and for them it would have been good to repeat at least one course to identify those flaws and work on them, since those gaps in their knowledge may confuse them for the rest of the process. (Ch1)</blockquote>     <p>The analysis of the information obtained suggests an agreement that numbers or letters do not accurately represent how much and how well a student is learning. However, some students still miss having exams and quizzes as a more "accurate" report on their performance. Some students noted in the survey that a qualitative approach to evaluation may be too flexible and lenient and might not offer students concrete evidence. 
Some of the participants thought that giving exams may expose students to more academic challenges, as this student put it in the survey:</p>     <blockquote>I feel that something clearer is necessary to get my process assessed in a more concrete way and to put students in front of a more academic challenge. (Ch2)</blockquote>     <p>In the interview, Ch1 also acknowledged that:</p>     <blockquote>I'm not a friend of exams, being a teacher myself, and I'm not a friend of giving exams all the time to determine if a student is passing o&#91;r&#93; failing, but I think that every now and then it is necessary to give an exam from which evidence is taken to decide whether a student is passing or not...in the two levels I've studied in the Colombo, people have different levels of English.</blockquote>     <p>This situation, of course, makes some students suspicious of other alternative assessment practices, such as peer assessment, in which students have to assess their classmates' performance, because, in their view, it makes no sense to get feedback from a student who may have even more language difficulties than they do.</p>     <p><a href="#fig3">Figure 3</a> shows a summary of the negative views on alternative assessment that some of the students involved in this study identified.</p>     <p align="center"><a name="fig3"><img src="img/revistas/prf/v19n2/v19n2a05f03.jpg"></a></p>     <p><font size="3"><b>Pedagogical Implications</b></font></p>     ]]></body>
<body><![CDATA[<p>Every time a new course starts, students are told what the evaluation is going to be like: there will not be exams or quizzes, and everything they do day by day will be taken into consideration to decide whether or not they pass the course. Some of them show a big smile when they hear about the exam policy and think that the course is going to be a piece of cake. However, when they realize that they have to demonstrate understanding of grammar and vocabulary not only every now and then on a piece of paper but in every single activity, every single day, and that they also have to show communication skills, punctuality, acquisition of daily learning habits, and social and teamwork skills, some of those smiles start to fade away. Some students adapt to the new system easily and naturally, but others have trouble assimilating it. As Tedick and Klee (1998) put it, students need extensive training and preparation to adapt to alternative assessment. A great deal of guidance is required for students to start reflecting upon their own process from a critical standpoint, to think of a clear action plan, and to offer feedback to others as well. However, when students are constantly trained, they become capable of identifying moments of assessment and making the most of them, as was evident in the forms they filled out after each class to identify moments of evaluation throughout the lesson.</p>     <p>In this study, we intended to explore these students' beliefs and feelings towards the qualitative approach to evaluation and to understand how well they adapt to it. 
From the analysis of the data collected in this research study, the following ideas can be drawn:</p>     <p>Students appreciate being given specific, ongoing feedback about their performance in different areas of the learning process.</p>     <p>Some students do not miss the quantitative grades they used to get in other academic programs. A few of them, however, believe that a traditional way of assessing students' performance is sometimes still necessary, particularly to bridge the gaps in grammar and vocabulary among students who are in the same course but differ in their English proficiency.</p>     <p>After proper and ongoing training, students are able to identify and use assessment moments to spot their own strengths and weaknesses.</p>     <p>Students' assimilation of alternative assessment sometimes depends on their cognitive style. Students with an analytical learning style, being more independent, autonomous, and logical, may feel more comfortable with the new system because they can use their classmates' and teacher's feedback to create action plans and strategies to solve problems as they go through the learning process. In contrast, students with an authority-oriented learning style tend to be more structured and traditional; therefore, they may feel more comfortable being assessed through more formal tools and being told exactly what to do to raise their scores.</p>     <p>On the teacher's side, a qualitative approach to assessment and evaluation poses more challenges than a quantitative one. According to Brown (2001), quantitative approaches are meant to be highly practical and time-saving, offering standardized kinds of assessment to every student in a class or a school at the end of each term. 
On the other hand, the same author explains that qualitative assessment and evaluation demand a continuous effort from teachers to assess, evaluate, and give feedback to students on a more regular basis, which is time-consuming and far less practical, although it turns out to be more beneficial to students, who can take action to tackle learning issues as they arise rather than when little can be done about them. Moreover, teachers need to be more organized and responsible in order to give prompt and comprehensive feedback, especially to those students who are struggling to meet the course goals. Since it is teachers&mdash;not grades&mdash;that have to talk to students, they must be assertive and straightforward but, at the same time, supportive and polite. Adapting to a qualitative system is therefore a matter for teachers as much as for students.</p>     <p><a href="#fig4">Figure 4</a> summarizes the pedagogical implications that emerged from the data gathered throughout the research process.</p>     <p align="center"><a name="fig4"><img src="img/revistas/prf/v19n2/v19n2a05f04.jpg"></a></p>     <p><font size="3"><b>Further Research</b></font></p>     ]]></body>
<body><![CDATA[<p>This study has focused on understanding students' beliefs and adaptation processes regarding the qualitative approach to evaluation; it has not considered the actions taken to make such an adaptation easier. Given this, it would be interesting to explore some of those actions and the artifacts teachers create to facilitate newcomers' assimilation of this approach. This way, we could gain a closer view of qualitative assessment tools and their effectiveness; of the actions teachers take before, during, and after feedback conferences with students to make sure they help students improve troublesome areas; or of the use of self- and peer-assessment moments and how to make them more effective. In other words, it is hoped that this reflective and descriptive paper can give rise to a number of action research projects in which senior and junior teachers facilitate the transition of new students from a traditional to an alternative system of evaluation and assessment.</p>     <p><font size="3"><b>Conclusion</b></font></p>     <p>No educational endeavor can be understood without the evaluation process that measures its results. In fact, the evaluation approach has to mirror the curriculum in which it is used. Language learning has undergone great changes throughout the years (from the grammar-translation method to the more communicative approaches) but, apparently, evaluation and testing still rely on exams and quizzes to get an idea of students' progress. Nowadays, language learning demands a wider range of skills that surpass grammar, vocabulary, and communication, and also cover the ability to learn autonomously, to use technology, to select and apply learning strategies, to collaborate with others, and to establish social and cultural bonds, which are areas that cannot be measured and observed by means of traditional evaluation tools. 
It is hoped that this paper has reached its goal of showing what students think and how they react when confronted with an alternative evaluation system that is entirely different from anything they had known before in their academic life but that is thought to be more coherent with the areas that language teachers now need to observe and assess.</p> <hr>     <p><a name="pie1" href="#spie1"><sup>1</sup></a>Survey and interview questions and answers have been translated from Spanish for publication purposes.</p> <hr>     <p><font size="3"><b>References</b></font></p>     <!-- ref --><p>Areiza Restrepo, H. N. (2013). Role of systematic formative assessment on students' views of their learning. <i>PROFILE Issues in Teachers' Development, 15</i>(2), 165-183.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251893&pid=S1657-0790201700020000600001&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Baleghizadeh, S., &amp; Zarghami, Z. (2012). The impact of conferencing assessment on EFL students' grammar learning. <i>PROFILE Issues in Teachers' Development, 14</i>(2), 131-144.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251895&pid=S1657-0790201700020000600002&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Brown, D. H. (2001). <i>Teaching by principles: An interactive approach to language pedagogy </i>(2nd ed.). New York, US: Pearson Longman.    
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251897&pid=S1657-0790201700020000600003&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Brown, D. H. (2004). <i>Language assessment: Principles and practices</i>. New York, US: Pearson.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251899&pid=S1657-0790201700020000600004&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Cohen, A. D. (1994). <i>Assessing language ability in the classroom</i>. Boston, US: Heinle.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251901&pid=S1657-0790201700020000600005&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Stake, R. E. (1999). <i>Investigaci&oacute;n con estudio de casos </i>&#91;Case  study research&#93;. Madrid, ES: Morata.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251903&pid=S1657-0790201700020000600006&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p>     <!-- ref --><p>Tedick, D. J., &amp; Klee, C. A. (1998). <i>Alternative assessment in the language classroom</i>.  Washington D.C.: Center for International Education.    
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=2251905&pid=S1657-0790201700020000600007&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></p> <hr>     <p><font size="3"><b>About the Author</b></font></p>     ]]></body>
<body><![CDATA[<p><b>Javier Rojas Serrano </b>holds a BA in Philology and Languages from Universidad Nacional de Colombia. He has been an English teacher, a supervisor, and an assistant coordinator at Centro Colombo Americano, Bogot&aacute;. He has also published articles related to technology, teacher collaboration, and citizenship in the English classroom.</p> <hr>      ]]></body><back>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Areiza Restrepo]]></surname>
<given-names><![CDATA[H. N]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Role of systematic formative assessment on students' views of their learning]]></article-title>
<source><![CDATA[PROFILE Issues in Teachers' Development]]></source>
<year>2013</year>
<volume>15</volume>
<numero>2</numero>
<issue>2</issue>
<page-range>165-183</page-range></nlm-citation>
</ref>
<ref id="B2">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Baleghizadeh]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Zarghami]]></surname>
<given-names><![CDATA[Z]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[The impact of conferencing assessment on EFL students' grammar learning]]></article-title>
<source><![CDATA[PROFILE Issues in Teachers' Development]]></source>
<year>2012</year>
<volume>14</volume>
<numero>2</numero>
<issue>2</issue>
<page-range>131-144</page-range></nlm-citation>
</ref>
<ref id="B3">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Brown]]></surname>
<given-names><![CDATA[D. H]]></given-names>
</name>
</person-group>
<source><![CDATA[Teaching by principles: An interactive approach to language pedagogy]]></source>
<year>2001</year>
<edition>2nd</edition>
<publisher-loc><![CDATA[New York ]]></publisher-loc>
<publisher-name><![CDATA[Pearson Longman]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B4">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Brown]]></surname>
<given-names><![CDATA[D. H]]></given-names>
</name>
</person-group>
<source><![CDATA[Language assessment: Principles and practices]]></source>
<year>2004</year>
<publisher-loc><![CDATA[New York ]]></publisher-loc>
<publisher-name><![CDATA[Pearson]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B5">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Cohen]]></surname>
<given-names><![CDATA[A. D]]></given-names>
</name>
</person-group>
<source><![CDATA[Assessing language ability in the classroom]]></source>
<year>1994</year>
<publisher-loc><![CDATA[Boston ]]></publisher-loc>
<publisher-name><![CDATA[Heinle]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B6">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Stake]]></surname>
<given-names><![CDATA[R. E]]></given-names>
</name>
</person-group>
<source><![CDATA[Investigación con estudio de casos &#91;Case study research&#93;]]></source>
<year>1999</year>
<publisher-loc><![CDATA[Madrid ]]></publisher-loc>
<publisher-name><![CDATA[Morata]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B7">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Tedick]]></surname>
<given-names><![CDATA[D. J]]></given-names>
</name>
<name>
<surname><![CDATA[Klee]]></surname>
<given-names><![CDATA[C. A]]></given-names>
</name>
</person-group>
<source><![CDATA[Alternative assessment in the language classroom]]></source>
<year>1998</year>
<publisher-loc><![CDATA[Washington D.C ]]></publisher-loc>
<publisher-name><![CDATA[Center for International Education]]></publisher-name>
</nlm-citation>
</ref>
</ref-list>
</back>
</article>
