Student Feedback

Of the four lenses available to the teaching practitioner for reflection (Brookfield 2017), it is the ‘Students’ Eyes’ that I find immensely valuable. Even before I was formally introduced to the concept of reflection and the importance of developing a culture of reflective practice, I had been prompted to pay attention to the student experience. Indeed, my students provide real-time feedback on the strategies that I implement. Collectively, the feedback I receive from my students propels me to innovate and do better.

 

Presently, there are several avenues via which students provide feedback on my teaching. The formal avenues are (i) Student Staff Liaison Meetings and (ii) End of Semester Feedback. Student Staff Liaison Meetings occur at least twice per semester: Class Representatives for each course collate feedback from their peers and present it to a committee. Based on my experience, the student feedback (broadly categorized into issues and commendations) presented at those meetings does not reach lecturers in a timely manner. Therefore, in instances where negative issues outweigh commendations, there is either no opportunity or only a very small window within which a lecturer can take action that will create a positive impact. The second formal mechanism is the end-of-semester course evaluation. Again, based on my experience, the vast majority of students do not complete these surveys, as evidenced by the low response rates. Only a snapshot of student feedback is therefore captured from a cohort, and any action the educator takes will be targeted at the next cohort of students.

There are also several informal mechanisms via which educators may receive student feedback on their teaching. For example, educators can design brief student check-ins so that they receive timely feedback and can take action when necessary. In the 2021/2022 academic year, I used Google Forms to check in with students in two undergraduate courses (ESST 1004 and BIOL 3063). The basic questions I asked were (i) What is working well? and (ii) What do you think can be improved upon? The response rate on these informal check-ins was significantly higher than on the end-of-semester evaluations. Factors that may have contributed to this include that the check-ins were done at the beginning of a lecture, that they were brief (approximately five minutes), and that they were anonymous.
Student emails are another informal means of receiving feedback, although they are not an anonymous mechanism.

 

Here is a sample of the mechanisms via which I have received feedback from students:

 

Despite the challenges associated with the various mechanisms via which student feedback is received, I have paid attention to what the students have to say. For example, my first post-PhD teaching stint commenced at The UWI in January 2019. For that first semester, I maintained the ‘status quo’ by delivering the courses as they had been designed. However, the student feedback that I received that semester provided evidence that innovation was necessary. Within the ESST 3102 (Environmental Impact Assessment) course, students were tasked with submitting an Environmental Impact Assessment (EIA) report, worth 30%, in week 12 of the semester. This was in addition to two in-course quizzes (worth 20%). Several students indicated that the effort they put into generating that EIA report did not match its weighting. I shared similar sentiments and believed that we could provide the same training by redesigning that aspect of the course. For the next cohort of students (January 2020 – May 2020), I designed a simulated case study which students worked on over a 12-week period. Instead of one EIA report, students now submitted four shorter reports (each worth 7.5%) based on intentionally designed practical exercises.

Another instance where student feedback was invaluable occurred in the 2022/2023 academic year. In semester 1 of the CUTL program, we were introduced to the BOPPPS Model as a tool that could be used to improve lesson planning. As an educator, I found it revolutionary: it allowed me to systematically plan all the key elements of a lesson (Bridge-In, Learning Objectives, Pre-Assessment, Participatory Learning Activities, Post-Assessment) and to deliver that lesson seamlessly. This experience prompted me to use the BOPPPS Lesson Plan in semester 2 of the 2022/2023 academic year in all the undergraduate courses that I taught.
In addition to preparing a BOPPPS Lesson Plan, I also handed out a printed copy to the students on lecture days. Students had never heard of the BOPPPS Lesson Plan before but were keen to have it, and they began expecting one for each lecture. When students missed lectures, they approached me to collect their hard copies. In one course (BIOL 3466), one of the commendations I received via the Student Staff Liaison meeting related to the use of the BOPPPS Lesson Plan.

 

Overall, I strive to maintain a dynamic teaching practice that is continuously improving. For this reason, I pay particular attention to student feedback that calls for improvement. However, it is the positive feedback I receive from students that undoubtedly provides me with sprinkles of hope.

 

Reference

Brookfield, S. (2017). Becoming a Critically Reflective Teacher. Jossey-Bass.