Introduction
During the 2020–2021 school year, schools around the world faced the challenges of the new wave of the COVID-19 pandemic (Colao et al., 2020). While in the first wave, from February to June 2020, teachers and students were caught unprepared, in the second, from September 2020 to June 2021, there was greater awareness of the risks of prolonged online teaching if not properly designed (Harris and Jones, 2020).
For instance, in Europe, surveys were carried out in Germany and Austria to investigate the most relevant effects of the pandemic and, among the various results, a significant difference between diligent and less diligent students was found when considering their use of online teaching. In particular, home study and the consequent demands from teachers were perceived as particularly challenging. Furthermore, in the transition from face-to-face to online teaching, the most relevant issue was the evaluation of students. Since the students were no longer physically in the classroom, it was indeed more difficult for teachers to ascertain their knowledge: when at home, students are more likely to draw information from sources other than their own knowledge during moments of evaluation (Huber and Helm, 2020). The pandemic lockdown also had a significant impact in Great Britain: on the one hand, the time available for teaching classical languages was reduced; on the other hand, the exams remained substantially unchanged. Postponing the exam dates by a few weeks certainly did not solve the students' problems (Hunt, 2020). In Switzerland it was not possible to take national tests in schools or university entrance exams, which had an impact on admissions to universities in the fall (Wiberg et al., 2021). In America, many schools were not able to adapt the summative assessment system they generally used to tests carried out at home, since the latter had different characteristics from the traditional ones (Wyse et al., 2020). In Africa, too, the need to update the usual teaching practices, inadequate for online teaching, was highlighted (Owolabi, 2020). At the same time, attempts were made to look for solutions: for instance, in the Middle East, in Saudi Arabia, a program based on metacognitive strategies was developed to raise students' awareness of syntactic and semantic errors in order to improve their translation performance (Amin, 2019).
Overall, in these months of the pandemic, the importance of combining summative evaluation, which checks the amount of knowledge learned, with formative evaluation has emerged in all its urgency. As Taras (2010) argues, all evaluation begins with a summative one, which is a judgement, whereas formative assessment is actually a summative assessment plus feedback that the students use. Nowadays, in professional refresher courses for teachers, the effort is to make them increasingly capable of using this latter type of assessment, both in face-to-face and in virtual mode (Badii and Lorenzo, 2018; Huertas-Bustos et al., 2018; Martinez-Borreguero et al., 2017; Postholm, 2012; Verschaffel et al., 2019; Wade-Jaimes et al., 2018). When doing so, it is possible to observe the growth of the students' metacognitive skills, which concern the entire learning process, and thus monitor how they assimilate knowledge (Judge, 2021; Nieto, 2017; Wong and Zhang, 2020).
Assessment based on competences
The European Council (2018) has underlined the need for an assessment based on competences characterised by: a vision of knowledge opposed to an academic encyclopedism focused only on content; a more practical approach, aimed at solving problems; and active learning that develops skills, abilities, values and transversal competences applicable in more than one context (Díaz-Barriga, 2011; Vidal Ledo et al., 2015).
When the assessment is based on competences, it is crucial that students receive formative feedback on the competences they acquired through their study activities. In this way, self-regulated learning is facilitated, and students can reflect on the mistakes they made in an atmosphere of trust rather than judgement, which is strictly connected to the grade (Karabenick and Zusho, 2015; Ya-Hui et al., 2012). Furthermore, when teachers ask students appropriate questions about why they did their homework in a certain way, they are progressively making them active citizens, able to reflect on the ‘Why’ of things (Pellerey, 2016; Pring, 2016; Rodríguez et al., 2018; Taysum, 2012). On the other hand, according to Ciappei and Cinque (2014), the effort to acquire these competences is not limited to the execution of a task but, with adequate assessment and self-assessment tools, it helps the student to realise that these skills are useful outside of school and in daily life too. Hence, these competences have a lot to do with soft skills (see Table 1). From an interesting ethical perspective, Ciappei identifies the relationship between soft skills and virtue, in an Aristotelian sense: ‘Virtue is a habitual ability (habit) to do good well, while soft skill can be defined as a habitual ability to do good tout court’ (Ciappei and Cinque, 2014, p. 13). Transversal competences therefore connect the cognitive and emotional spheres with ethical and organisational skills, the spirit of initiative and communication skills. Investing in them means not losing sight of a more complete view of the person, so as not to be absorbed by the spiral of hyper-specialisation. Also, meta-skills need to be practised more than understood (La Marca and Gülbay, 2018).
Among these competences, ‘personal, social and learning to learn competence’ is strategic not only at school, but in any other professional sphere. The definition in the Recommendation of the European Council reads:
Personal, social and learning to learn competence consists in the ability to reflect on oneself, to manage time and information effectively, to work with others in a constructive way, to remain resilient and to manage one's own learning and career. (European Council, 2018).
Table 1 shows the similarities between key competences and soft skills. This can become relevant to the student, as it makes evident how his/her behaviour at school will have a lot to do with his/her behaviour in daily and professional life.
Furthermore, since these behaviours are observable, it is important to monitor their development over time (Heckman and Kautz, 2016). For this purpose, adequate observation tools are needed, which make students aware that the effort made to learn also improves them as a person, making them more reflective and responsible for their learning process.
Evaluation for competences in the field of classical languages
This is particularly necessary for students of classical languages, who are usually assessed on the ability to translate an ancient text, from the source language to the target language, with morpho-syntactic and lexical correctness.
In 2007 the University of Oxford developed an interesting piece of software to move from a traditional paper evaluation to an electronic one, so as to speed up the quantitative analysis of mistakes (Ashdowne, 2009; Salema, 2017). Sometimes, though, the risk is that students may not be aware that, while carrying out translation activities, they are also developing key and metacognitive skills. To avoid this, test correction and the related evaluation turn out to be an important moment for reflection on the mistakes made. Often, however, teachers complain that students just look at the grade obtained, without paying particular attention to why they made certain mistakes.
Purpose of the study and research question
The purposes of this study were:
a) to demonstrate that the active involvement of teachers in the creation of metacognitive assessment tools facilitates the use of competence assessment at school;
b) to verify to what extent the use of a metacognitive form, administered to students after the tests' correction, makes them more aware that reflecting on the mistakes they made increases their key competences and metacognitive skills.
Therefore, this study's research questions were:
1) to what extent does the active involvement of teachers in the creation of metacognitive assessment tools facilitate the use of competence assessment in school?
2) how can students be made aware that reflecting on mistakes made in teaching activities increases their learning-to-learn competence?
Methodology
Study design
Given the aforementioned aims, the action-research method was chosen to carry out this study. Action-research allows researchers to both study a specific situation or problem and try out the actions directed to solve it, in order to improve their practice (Elliott, 1991; Latorre, 2004; Lewin, 1946).
Thus, when the action-research method is used in an educational environment, the teacher can directly address the issues of his/her instructional practice and deal with them in the specific environment where he/she works (Baldacci, 2013). Moreover, this method was chosen to have the chance to engage the teachers first-hand and thus start from their instructional practice to make an actual change in their environment.
The research took place between December 2020 and July 2021 and participants were selected on a voluntary basis among Italian teachers from the areas of Cuneo and Vicenza (both in northern Italy) and Ragusa (in southern Italy) who were participating in the Teachers' Training National Plan. Eventually, 12 teachers decided to participate in the study. Thus, the sample consisted of:
eight high school teachers (seven female, one male)
one female middle school teacher
three female primary school teachers (one of them a teacher of special needs students)
All the participants, after being informed about the research aims, the chosen methodology and how data would be treated, gave their formal consent to participate, as established by the British Educational Research Association (2018).
Tools and materials
A metacognitive form, the result of the studies following the research documented by the author (Canfarotta, 2021, p. 84), was shared with participants.
The original form (Figure 1) consisted of five major columns:
(1) Mistake transcription;
(2) Mistake typology: here students had to choose if the mistake they made was caused by a) distraction, b) no identification or c) ignorance;
(3) Mistake nature: here students had to choose if it was a) a morphological, b) a syntactical or c) a lexical mistake;
(4) Mistake correction;
(5) Comments, doubts, questions for the teacher.
To stimulate students' metacognitive reflection on their mistakes, the filling-in process was structured so that, after each test, they would write in the form: (a) the mistakes they made, (b) the mistakes' typology, (c) their nature, and (d) the correct answer.
The last column of the table was used to write comments, doubts, and questions addressed to the teacher. While doing so, students had the chance to verbalise the process that led them to make the mistake and, after seeing it, they were able to better understand why they went wrong. Also, students had to assign the value 1 to every mistake, writing the number in the corresponding typology and nature columns. This allowed the system to automatically create two pie charts where both the student and the teacher could see the distribution of mistakes by typology (chart 1) and by nature (chart 2).
Moreover, in the last sheet of the file the teachers and students were using, there was an overall form (Figure 2) where the pie charts automatically updated every time the student or the teacher changed some data. Thus, it was possible to have an up-to-date overview of the students' performance. Therefore, the tables also served as a tool to collect quantitative data (cf. Figures 1 and 2).
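To make the aggregation mechanism behind the two charts explicit, the following minimal sketch reproduces it outside the spreadsheet. It is written in Python purely for illustration, with hypothetical example rows rather than the original Google Sheets file and its formulas: each recorded mistake counts as 1, and the tallies by typology and by nature feed the two pie charts described above.

```python
# Minimal sketch of the form's aggregation logic (illustrative only; the study
# itself used Google Sheets). Example rows and counts are hypothetical.
from collections import Counter
import matplotlib.pyplot as plt

# One (typology, nature) pair per recorded mistake, as in the form's rows.
mistakes = [
    ("distraction", "morphological"),
    ("ignorance", "lexical"),
    ("no identification", "syntactical"),
    ("distraction", "morphological"),
]

typology_counts = Counter(t for t, _ in mistakes)  # basis of chart 1
nature_counts = Counter(n for _, n in mistakes)    # basis of chart 2

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.pie(list(typology_counts.values()), labels=list(typology_counts.keys()), autopct="%1.0f%%")
ax1.set_title("Mistake typology")
ax2.pie(list(nature_counts.values()), labels=list(nature_counts.keys()), autopct="%1.0f%%")
ax2.set_title("Mistake nature")
plt.tight_layout()
plt.show()
```

As in the spreadsheet, any change to the recorded mistakes immediately changes the two distributions, which is what gives students and teachers up-to-date feedback.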
Students had to fill in a form for each test they took from February to May 2021.
The metacognitive table was first shared with participants through Google Drive. Every participant agreed on the structure, even though afterwards everyone adapted the table to his/her school subject and students. For instance, when used for Mathematics in a middle school program (when students are 11–13 years old), the options among which students had to choose when considering the mistake nature were: (a) numbers, (b) space and figures, and (c) correlations. Also, an easier language was used to describe the mistake typologies among which pupils had to choose, e.g.: ‘I got distracted’; ‘I didn't understand the text’; ‘I didn't know how to start/to which topic I had to relate’ (Figure 3).
Similarly, classical language students in upper secondary school (when students are 14–18 years old) filled out the form after taking a translation test.
Figure 4 shows, in the first column, the error made by the student; in the second, the type of error; in the third, the nature of the error; in the fourth, the self-correction made by the student; and in the fifth, the student's reflection on the process that generated the error, e.g.: ‘I confused the dative with the verb in the first person singular’.
The student entered the number 1 for each error, specifying its type and its nature. The sum of these numbers generated the graphs below. In this way he/she had immediate feedback on the causes of the errors and on the topics that he/she needed to study in more depth.
During the action-research, qualitative and quantitative data were collected using: (a) online meetings, (b) a research journal, and (c) two semi-structured online questionnaires, one for students and one for teachers.
The research journal consisted of a shared Google Sheet that teachers and researchers could fill in at every step of the process. Every teacher had some dedicated rows where he/she could take notes about: (a) how he/she adapted the metacognitive table to his/her context; (b) the first results he/she observed; (c) the next steps to take; (d) the March and April phase; and (e) conclusions.
The semi-structured online questionnaire for students collected some socio-demographic data, such as gender, area of residence and school level, and then investigated: (a) their degree of appreciation for the metacognitive form; (b) whether the table had been useful and, if so, why; (c) whether they had difficulties in filling in the form and, if so, which ones; and (d) whether they would recommend the use of the form to their friends.
Besides gathering data about the number of students per class, their gender, the school level and their degree of appreciation for the table, the semi-structured online questionnaire for teachers collected data about: (a) how teachers encouraged their students in using the form; (b) the difficulties students encountered when filling in the form; (c) ideas regarding how to solve these difficulties; (d) the form's strengths; and (e) the form's flaws.
Both questionnaires were constructed and submitted online using Google Forms. Qualitative data were analysed using thematic analysis (Green et al., 2007), whereas quantitative data (percentages and statistics) were extracted from the Google Form itself.
Results
The analysis of both quantitative and qualitative data showed that:
a) Teachers
The teachers' questionnaire was filled in by nine participants out of 12; seven of them thought their students had a medium degree of appreciation for the table, whereas two of them stated that pupils were highly engaged with it.
Data analysis also showed the strategies teachers used to encourage their students to use the form. All of them tried to show the expected benefits of its use, namely:
the chance for students to reflect on their cognitive process and become more conscious of themselves;
the opportunity to learn from their mistakes;
the possibility to improve their study method.
Teachers were also asked which difficulties students encountered when filling in the form and how these difficulties could be overcome. The detected difficulties were:
Technical: some students struggled to get access to the online form and then had issues in sharing it with the teacher because they were not used to this kind of tool;
Linguistic: some students had problems in understanding the table because Italian was not their native language;
About recognition: some students were not able to recognise the mistake itself and thus its typology and nature. This meant they could not identify its seriousness either;
Self-critical: some students struggled to recognise which difficulties they had in their study method;
Organisational: pupils had difficulties in dealing with the new assessment system and new challenges arose when the teachers had to explain the table in a distance learning setting.
To overcome these issues, teachers basically used two strategies:
At the beginning, they did the filling-in process together with the students, with many examples and close tutoring;
Then, they repeatedly assigned the table, in order to make it a recurring tool.
Finally, teachers' data analysis highlighted the strengths and weaknesses of the form.
Among the first, participants particularly pointed out:
The pie charts that were automatically created below the table, because they gave immediate, visual feedback;
The opportunity to identify strategies to overcome difficulties;
The chance for every student, both ‘good’ and ‘bad’ ones, to reflect on their learning process and monitor their performance;
The possibility to cooperate among peers;
The fact that the table can be used as an online tool, thus in a lockdown situation, too;
The motivation enhancement they saw in their students.
Teachers also had to point out the weaknesses of the table. Most of all, they highlighted the struggle to implement a new evaluation process and tool: this meant that more time than expected was needed to explain the new method to students and to correct the tests.
Another weakness was related to the technical side: in some situations the teachers' digital skills were not sufficient and thus became an obstacle.
b) Students
The students' questionnaire was filled in by 154 subjects and, when asked about their degree of appreciation for the metacognitive form, 65% of students said it was medium, 24% high and 11% low. Also, 83% of the students would recommend the form to a friend.
Students were asked to point out whether the form had been useful to them and, if so, why. They reported that, by using it, they had become more conscious of the reasons why they made some mistakes and had developed some strategies they could use in order to improve their study method. Moreover, they stated they got better at self-evaluation and at understanding what they needed to review to limit their mistakes.
Data analysis also showed which difficulties students had encountered when using the form. Although most of them (115) stated they did not have any difficulties, some issues emerged, namely:
The difficulty of actually recognising the mistakes and understanding their typology and nature;
Technical troubles (e.g. sharing the table with the teacher or actually getting access to the online form);
The difficulty of adapting to a new assessment method.
Lastly, a final evaluation of the action-research took place in July 2021, and showed that:
The action-research was an opportunity for discussion among colleagues on a delicate and never fully resolved issue, that is to say reflecting on the nature of errors when translating;
It was an engaging and interesting experience, which gave the teachers ideas to work on in the future;
Many of the participants expressed their wish to be able to continue discussing metacognitive teaching.
Discussion
Overall, a reflection on the data of teachers and students showed that:
1. The degree of interest in the use of the metacognitive form is medium-high: what the students say is in line with what the teachers highlighted, even though none of the teachers stated that their students had a low degree of interest in the table. In particular, it seemed very significant to us that 83% of students would recommend this form to their friends. Numerous benefits actually emerged: a correspondence can be observed when the expected benefits of the table are compared with the responses the students gave when asked why the table had been useful. The students confirmed the usefulness of the form, because:
‘By personally correcting mistakes I can better understand what I need to study more’ (student, 15 years old).
‘I don't just focus on the grade but I learn from mistakes’ (student, 14 years old).
‘It helps me to improve human qualities in my studies’ (student, 14 years old).
‘It also helps me in everyday life’ (student, 15 years old).
‘It enhances my strengths and those I still have to work on’ (student, 15 years old).
An increase in key and metacognitive skills is therefore evident (reflection on error, attention to the learning process, better competence in planning this process with greater rigour, awareness of one's own strengths and growth, even beyond study). This highlights that the students recognise a certain usefulness in the use of this tool in ordinary teaching. Above all, they emphasise their growing awareness of the reasons for their mistakes.
2. On the other hand, the teachers stressed that with this tool it was possible to personalise the teaching more, because they were able to ‘meet’ each student by correcting their personal forms and thus were able to better understand their learning process. In this way, the time spent in correcting the form was offset by the increase in students' motivation to study.
3. Among the difficulties during the initial phase of using the form, the most fragile pupils highlighted some problems in identifying the nature and severity of the error, if not guided by the teacher. This highlights the fundamental guiding role that the teacher assumes in similar activities. In these cases, it is suggested to explain the types of errors to students first, in order to facilitate their identification when filling in the form.
4. The results that emerged from the qualitative data of the action-research participants showed their clear awareness of the importance of planning educational online activities during the time of the pandemic (Colao et al., 2020; Harris and Jones, 2020; Huber and Helm, 2020; Wyse et al., 2020).
5. Also, the chosen research methodology, based on the full involvement of the participants, fostered in them the acquisition of: a) scientific research skills because, through learning by doing, they learnt how the various phases of the action-research are carried out (Taysum, 2012); b) an attitude of reflection upon the data obtained by observing the behaviour of the students (Judge, 2021); c) a more conscious judgement about their knowledge gaps regarding some aspects of the learning process, related in particular to the self-assessment of students (Wong and Zhang, 2020).
6. Teachers stressed that this type of intervention, encouraging students to reflect on errors, makes them better understand that the way in which the disciplinary contents are studied can also make them better people (Pellerey, 2016). One of the participants expressed this concept vividly by sending the researchers a comment made by an American high school headmaster in a letter to his teachers:
Dear professor, I am a survivor of a concentration camp. My eyes have seen things that no human being should ever see: gas chambers built by educated engineers; children killed with poison by well-trained doctors; infants killed by test-tube nurses; women and children killed and burned by high school and university graduates. Therefore, I distrust education. My request is: help your pupils to become human beings. Your efforts must never produce polite monsters, qualified psychopaths, educated Eichmanns. Reading, writing, arithmetic are of no importance unless they serve to make our children more human (Cojean, 1995).
7. This shows that non-bureaucratic work on skills, animated by the desire to make the human qualities of students flourish through study, reliably gives teachers and students the opportunity to grow in awareness and self-regulation (Pring, 2016).
8. What is more, the research motivated teachers to get involved in experimenting with metacognitive tools and in learning new ways of teaching that are useful and close to students.
9. On the other hand, the online version of the sheet, with immediately viewable pie charts, gave students and teachers a quick view of the learning processes, so that they could start to remedy the gaps.
10. Furthermore, through this way of carrying out the teaching function, knowledge is humanised: it is not just a matter of informing and making known the different epistemologies of the subjects, but also a matter of raising questions about the person who is learning (Bergen, 2009; Mellinger, 2019; Pietrzak, 2018).
In conclusion, the challenges generated by COVID-19 have highlighted the need for the whole school staff to update its competences to create a distributed leadership (Harris, 2020): in fact, challenges and problems will be successfully faced only if the entire educating community is able to reflect more and more deeply on the mistakes made.
Action-research results showed an interesting change in teaching and learning: the reflection that pupils can make upon the mistakes made at school will be an initial preparation for facing life's bigger problems. Never as in this case has it been so clear that it is the human dimension of effective teaching that makes the difference (Harris and Jones, 2020).
Limitations of the study
Due to some students' low level of digital literacy, two teachers reported an initial difficulty when students had to fill in and send the metacognitive form through Google Drive.
Another difficulty was the lower involvement of foreign pupils (three Chinese students with an A1 level of Italian and one very unmotivated Moroccan student).
Also, a pupil with disabilities did not fill in the form, although he was interested in the activity. Unfortunately, due to the progress of the pandemic, the continuous alternation of distance and face-to-face learning made the activity more difficult for teachers and students to follow.
Conclusions
The purposes of this study were: (a) to verify to what extent the use of a metacognitive form, administered after the tests' correction, makes students more aware that reflecting on errors made in teaching activities increases their key and metacognitive skills; (b) to demonstrate that the active involvement of teachers in the creation of metacognitive assessment tools facilitates the use of competence assessment at school.
The results showed that the students of primary and secondary school appreciated the use of the metacognitive form. In particular, the qualitative analysis carried out showed that: a group of pupils, in addition to identifying the nature of the errors, understood more clearly what they had to review to limit the errors and how to act to correct them; another group pointed out that their awareness of their strengths and of their growth had increased; finally, a last group highlighted how the skills practised during the compilation of the form were also useful in other areas of daily life, not only at school.
A high degree of appreciation for the online version of the form emerged from the involved teachers: in fact, the pie charts with percentages about the typology and nature of the mistakes provided students with immediate feedback.
Among the limitations of the research, we point out: the students' initial difficulty in filling out the form, because they had not yet learnt to recognise the nature of the mistakes; and the greater amount of time teachers had to dedicate to correcting the pupils' sheets. On the other hand, both students and teachers pointed out that this method allows a more personal relationship between teacher and learner and therefore becomes an element of motivation to study.
In conclusion, the use of the metacognitive form helps students to reflect on errors more easily and immediately. In this way pupils can become more aware of the fact that they can grow in key and metacognitive competences.
Acknowledgements
We thank the School Managers, the teachers and the students of the schools involved for the time and commitment dedicated to this study. Daniela Canfarotta, PhD in Theory & Practice of Education for Teacher Training, is a Latin and Greek teacher in a higher secondary school in Bagheria (Palermo, Sicily, Italy). Her research focused on the development of key competences and metacognition through Latin and Greek study, exploring how different didactics may increase them. She undertook study periods abroad (University of Burgos, Spain, and University of Leicester, UK). Her current research themes are: metacognition; Latin and Greek languages; key competences; life skills; didactics. She is the author of several scientific papers. Carla Lojacono holds a PhD in teachers' training, with a thesis about the Flipped Classroom in Higher Education. She currently works as an educator with high school students and teachers, helping them develop soft skills and reflective skills. She is also a trainer in schools within civic education programs.