Penn State, like most universities, has a system in which instructors are evaluated by the students in their classes. In the last two weeks of each semester, students are encouraged to offer feedback through a survey, affectionately called the SRTE, that combines a rating scale with narrative comments.
The rating scale asks the students to evaluate the instructor on things like knowledge of course materials, timeliness of return of graded materials, availability outside of class, etc. In the narrative comments, students are asked to share what they liked about the course and what they believe could be improved. The results are made available to instructors after grades have been submitted and before the start of the next semester, presumably to allow for course improvements without fear that the instructor will somehow retaliate against students if the responses are negative.
SRTE results are not for the faint of heart.
When I first started teaching at Penn State, the student evaluations were handwritten. Each semester, a grad student or another faculty member would show up at one of the last classes of the semester to conduct the evaluations. Instructors were sent out into the hallway while the evaluations were being completed (I assume so that we couldn’t track who was there, or who did or didn’t complete the evaluation). Before the beginning of the next semester, we would get a packet of the evaluations after the department head had reviewed them. The SRTEs are now online.
In opening the results of the SRTEs, I have found that it’s good to be sitting down.
Over the years, I have had many positive comments about my teaching and about the courses that I teach. Students like the interactive aspect of most of the courses in our major. They often report that my real-life experience in the field and the stories I share help to bring home the point I’m trying to make in class. Students usually indicate that I know the subject matter and that the course was important in their professional preparation. They sometimes like my humor and often report that they appreciate my no-nonsense classroom policies about things like attendance and late papers. The majority of the comments that I receive as well as my overall scores fall on the positive side. Reading positive comments on the SRTEs can be a feel-good experience on a bad day.
As for the not-so-positive comments? Sometimes one needs to have thick skin.
Under ways to improve, students sometimes make comments about aspects of the course that they didn’t like. They didn’t like an out-of-class assignment. They didn’t like the group project. They didn’t like the exams. They prefer essay tests over multiple choice – or vice versa. They suggest putting notes online or holding review sessions before exams. Without fail, every semester someone suggests easier tests, no homework, and grading policies that lean in the direction of As.
Sometimes, they get personal.
Over the years, I have had feedback from students that had little to do with my teaching.
“I really don’t like her hair.”
“She wears old lady clothes.”
“I know you are trying to teach us about the real world, but Penn State isn’t the real world. Lighten up and have a beer.”
“I’ll bet she votes Democrat.”
Sometimes the negative comments are humorous. Sometimes they are downright mean. While it’s not surprising that the occasional young adult would take advantage of the anonymity of the SRTE process to be unkind, I’m sometimes surprised at the level of disrespect.
“I hate her and think you should fire her.”
Ouch. Thankfully, the majority of my SRTEs don’t fall in that category. When they get really nasty, I take a deep breath and then look for constructive comments. I try to read through the personal stuff and look for ways to improve my teaching and to keep the course and the assignments relevant.
Students sometimes roll their eyes when we encourage them to fill out the SRTEs. They ask, “Why should we bother?” They often don’t realize that teacher evaluations play an important role in quality assurance in the university community. Problems identified in the SRTE often lead to a tweaking of course materials and/or course delivery. More important, however, is that our mean scores for the quality of each course and for the students’ perceived quality of our instruction are included on our performance evaluations. That’s right – the feedback from our students is part of our annual review.
SRTEs are of particular importance for a new instructor or a new course, as well as throughout the tenure process. Faculty sometimes talk about the double bind of SRTEs. It could be suggested that easy grading and relaxed course policies would be the way to guarantee positive SRTEs. Holding students to high standards while helping them learn can potentially mean lower scores on the popularity scale and therefore more questions at one’s annual review.
When the SRTEs went from paper to online, the response rate for some courses fell below the level needed for the results to be considered valid. One of the advantages of the online system is that we can see the percentage of students who have responded while the SRTEs are still open (without seeing the actual responses). This past semester, I offered an incentive for the whole class if the course response rate hit 80 percent. I was happy to see that 90 percent of my students participated.
I’ve used the SRTEs to make adjustments and change assignments. The key is to look for patterns and trends and to make changes, not based on popularity or personality, but in the service of improving the instruction.
Now, about those old lady clothes . . .