Pain and Gain: Tips for making practical sense of student ratings
By Jake Glover
I’ve been working with faculty to improve teaching and learning for nearly a decade now, and my role with IDEA allows me to speak with literally hundreds of faculty across the world every year. Conversations about a plan of action from a student ratings report are consistently challenging, especially when the ratings are lower than anticipated. I can empathize because I’ve walked in their shoes.
I understand the physical reaction when, after a long semester, the ratings show students did not “get it” and did not connect with the course. Even though I know better and have coached faculty to do otherwise, my gut reactions were still “the students don’t know what they’re talking about” and “wait until they get into the ‘real world’ and they’ll see what I was trying to teach them.” I know that for well-designed systems (like IDEA) this is demonstrably false, and even though I had been the one talking colleagues off the proverbial ledge with these exact same statements, it didn’t keep my emotions from getting stirred up, at least for a while.
Eventually I was able to take a step back and look with more objective eyes at the evidence before me. Here are some of the steps I took (and use in working with others) to build on the feedback for the next time I enter the classroom, literally or metaphorically, as I’ve also taught online. This is my mental checklist as I look at the reports:
- Brace for impact. Okay, this is somewhat facetious, but the principle stands: it’s always best to have an objective, relaxed mindset before approaching criticism of any kind (positive or negative, constructive or otherwise).
- One of the reasons I felt blindsided by the results is that I hadn’t done a good job soliciting input and feedback during the course. Many of the things students rated low I probably could have corrected or better explained if I had made “spot checks” every couple of weeks. (IDEA now has a great tool called Instant Feedback just for this purpose, but even an occasional note card passed out in class, a Google form, or a survey in the learning management system can do the trick.)
- I don’t have to start over from scratch. The IDEA reports link to resources for targeting two or, at most, three specific teaching strategies; applied consistently over the semester, they can produce remarkably different results the next time around. This is especially true because I’m targeting teaching methods linked to the learning that is important and essential for my course.
- Swallow my pride and talk through my report and my plan of action for next time with a colleague. I didn’t want to look at these scores myself, let alone share them with someone else, but I’ve learned it’s usually a much quicker road to improvement if I don’t try to reinvent the wheel and instead learn from others’ mistakes and successes. See Penny and Coe (2004) for more on the benefits of consultation with student ratings.
I’ve been really fortunate to have a career where I get to learn from some great examples, and I truly admire people who engage well with the craft of teaching. It’s also nice to have a solid system like IDEA as a means to make informed decisions about how I can improve my own craft. If you find yourself “on an island” regarding your teaching, then please reach out to us or take a stroll through our website. Our mission is to improve student learning. We truly are here to help.
Penny, A. R., & Coe, R. (2004). Effectiveness of consultation on student ratings feedback: A meta-analysis. Review of Educational Research, 74(2), 215–253.