
Analyzing The Feedback

  • Writer: Mark Seymour
  • Nov 8, 2020
  • 2 min read

In my previous blog, I explored the world of assessment through a specific Course Management System. In my case, I chose Canvas, the current CMS for my district. While I have slowly begun to unlock the potential of assessment through this platform, my tendency to "play it safe" led me to use a Google Form embedded in Canvas. However, not all digital assessment tools are created equal, and not all are as streamlined as a Google Form. There are also other considerations to weigh when choosing a format or tool.

One topic we explored this week in our course, #CEP813 at MSU, was the ethical considerations of digital assessment. My perspective on this is conflicted. On the one hand, students love a good Kahoot or Quizizz battle. But there are always going to be losers in those contests, and these are not ordinary recreational games; they are assessments of academic skills, played out in front of the entire class. Yet they are engaging and, in most cases, just plain fun for students. There are ways to hide the lowest scores, allowing some students to save face. But I have also heard concerns from parents that this is a violation of their child's privacy. I agree, to the point where I no longer even let students pass out the graded work of others, a practice that has been common in many classrooms for many years.

But how can we best use these tools without succumbing to the dangers of violating FERPA, unwittingly sharing student data, or even having student information used by platforms, hosts, or other entities for financial gain? The answer is a strict vetting of any tool we bring into our classrooms, digital or otherwise.

After creating my CMS assessment, I received feedback from both my professor and a peer, and I, in turn, offered my own feedback to another peer. The feedback in both cases was thorough, though the approaches varied. My peer's feedback offered detailed information that hit all the points I provided, but it was written in prose. The professor's feedback was more bullet-pointed, as was the feedback I offered to my assigned peer. The feedback I received was constructive and addressed the guiding questions for my design. At this point it only confirms my initial inclinations, and I don't believe I would need to adjust this particular CMS-based assessment.

This is one important takeaway. We don't always need to be in complete agreement when providing feedback, but it should always be constructive. And as teachers providing feedback to students, it should be instructive as well.

