Friday, February 8, 2013

The 100% Response Rate Myth (part 1)


A question posed by an attendee of last week’s webinar prompted the following exchange, which is worth repeating:

The Chair of the Physics Department at a large research university was not convinced of the benefits of moving to a web-based course evaluation system.  He said, “We get a 100% response rate using paper and pencil, and I don’t want to disturb that.”  I replied that I understood he did not want to make changes, but I asked for a favor: after the evaluations were collected, scanned, and reported at the end of the term, examine the final response rate and then deduct those student responses where the comment boxes were left blank or contained only a one-word or vague comment such as “none” or “all was ok”.

To his great credit, I received a call some months later.  “62%,” he said.  “Only 62% handed in the forms and wrote at least one useful comment about their instructor or the course.  That number would be even lower if I excluded the students who mostly filled in the N/A choices or circled all 5s without apparently giving it much thought.”
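The adjustment the Chair made by hand is straightforward to reproduce if your evaluation system can export responses.  Below is a minimal sketch in Python; the field name, the list of “vague” phrases, and the enrollment figures are hypothetical examples, and any real analysis would need a more careful definition of what counts as a useful comment.

# Minimal sketch of a "useful response rate" calculation.
# The "comment" field, the VAGUE set, and the enrollment numbers below
# are hypothetical examples, not data from the actual study.

VAGUE = {"", "none", "n/a", "all was ok", "ok", "good"}

def is_useful(comment: str) -> bool:
    """Treat a comment as useful only if it is more than a blank or stock phrase."""
    text = comment.strip().lower()
    return text not in VAGUE and len(text.split()) > 1

def useful_response_rate(responses: list[dict], enrolled: int) -> float:
    """Share of enrolled students who submitted at least one useful comment."""
    useful = sum(1 for r in responses if is_useful(r.get("comment", "")))
    return useful / enrolled if enrolled else 0.0

# Example: 100 enrolled students all hand in a form, but only 62 write something substantive.
responses = ([{"comment": "none"}] * 38 +
             [{"comment": "More worked examples in lecture, please."}] * 62)
print(f"{useful_response_rate(responses, enrolled=100):.0%}")  # prints 62%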

Nearly 15 years ago, my colleague Dr. Robert Wisher and I established that students provided 400% more comments about their instructor and course when using a computer than when using paper.  Blind raters also judged the online comments to be more honest, specific, and informative than the paper-based comments.  This held across a variety of learning environments and types of students, including military training, corporate training, and college classrooms (Champagne & Wisher, 1999, 2000, 2001).  The finding has been replicated many times, with each comparison showing between 200% and 700% more comments in favor of web-based evaluation (as summarized by Donovan, 2007).

A more recent examination of 336,000 student responses across 80 campuses over a one-year period yielded similar results for mobile devices.  Students submitting course evaluations on a tablet provided nearly 300% more comments than those using paper, and students using mobile phones provided 250% more comments than paper (Champagne, 2012).  Allowing students to complete course evaluations by computer, cell phone, and tablet generated response rates of 72% to 86% per term across this sample.

Have you heard administrators at your institution speak of the mythical 100%?  Has your institution closely examined the ratings and comments submitted by students to determine whether 100% of them were actually useful for improving the course or the instructor’s delivery?  Do you have stories to tell from your experience?  Please share here or write me at matt@DocChampagne.com.
