Friday, February 19, 2016

Cheat-Proofing Exams IV: Student Feedback

As I've written about before, one of the great benefits of teaching a class in which students are expected to use internet-accessible devices is that it enables new types of exercises and opportunities. These include, for example (in genetics), accessing online sequence databases and having students perform analyses on DNA sequences using web-based tools.

Philosophy of the Cheat-Proof Exam

By writing exams focused less on the lower Bloom's levels and more on analysis, evaluation, and creation, my hope is that I'm getting closer and closer to crafting a truly cheat-proof exam. At the heart of the cheat-proof exam are two core aspects of course philosophy. First, what practicing professionals in the discipline do is called collaboration, not cheating. In fact, I actively promote collaboration during some exams; my only requirement is that students cite all sources (including personal notes, internet sites, and fellow students) when recording answers to questions. Otherwise, I tell them, copying is cheating. Second, when testing at higher Bloom's levels, it is easier to write cheat-proof exam questions, particularly those that admit only unique answers (such as essay-style response questions). In a genetics class, for example, I like to start questions with something like, "Access the NCBI database and find the sequence of any gene." Given the vast number of DNA sequences in this database, the probability that two students happen to choose to write an answer involving the same gene is infinitesimal (or smaller), so if this actually occurred, I'd have a strong case for proving uncited collaboration, if I were inclined to pursue it.

The Hitch

Of course, you've already spotted the trade-off. It is relatively easy to write a cheat-proof exam; it takes considerably more effort to grade one. As such, whether you adopt the approach of writing cheat-proof exams depends almost entirely on your willingness to spend time grading answers to questions that have no single correct answer. My genetics classes usually have about 95 students in them, but I still include cheat-proof questions.

Solutions to the Problem
  1. Writing exams that incorporate higher Bloom's levels will, without question, take more effort to grade. However, a few approaches can make grading more efficient. First, write multiple-choice questions that still require calculation, interpretation, or analysis to arrive at the correct answer. Multiple choice doesn't have to be limited to factual recall questions.
  2. I should also disclose that I adjusted my grading scheme to account for the increased difficulty of exams designed to incorporate all levels of Bloom's taxonomy. There are six Bloom's levels, and I try to distribute point values evenly among them (I combine "evaluate" and "create" at the top of the tree). Thus, I align letter grades with Bloom's levels. Students who score 0-20% of possible points earn an F; they tend only to be able to perform factual recall. Scores of 20-40% earn a D; 40-60% a C; 60-80% a B; and 80-100% an A. To earn an A, then, students have to capture the uppermost 20% of points, which means they must successfully answer the questions I write that require creating knowledge or critically evaluating information. In other words, I use Bloom's taxonomy to define my expectation of what students have to achieve to earn each letter grade.
  3. Use group exams. Groups can distribute the workload, or at least discuss options for solving a problem before selecting one, so I feel free to write much more difficult questions. This has also been (as expected) much more efficient for grading: with groups of four students working on an exam, I grade one fourth the number of responses.
  4. Leverage the tablet for collecting drawn responses. I don't know if it is true, or just my perception, but I feel like grading written responses takes longer than graphic (drawn) responses. If a picture is worth a thousand words, and if it is faster to assess a student-created picture than it is to read a thousand of their words (guaranteed!), then it is much more efficient to grade a visual answer to a question. In the sciences, one of the easiest ways to incorporate such a question is to ask a student to draw a plot (or diagram of results) of what they would expect in the situation that:________.
  5. Now to the title of the post! Last term (the first time I attempted the "open-note, open-internet" tablet-based exam), I only gave students limited feedback. The way I collect student exams from their tablets is that they submit them (via Google Classroom) as PDF files to me. This has so many benefits, including my ability to efficiently store student records and to easily transport them from office to home for grading (because we all know that grading often happens at home in the wee hours of the morning, right?). However, I have yet to find the best way to give students feedback on their exams…
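The Bloom's-aligned grade bands described in item 2 amount to a simple threshold lookup. Here is a minimal sketch of that mapping; the function and variable names are my own (not from the post), and I'm assuming a score exactly on a boundary rounds up to the higher grade:

```python
def bloom_letter_grade(percent: float) -> str:
    """Map an exam score (0-100%) to a letter grade using 20-point
    bands aligned with Bloom's levels: each higher band reflects
    mastery of roughly one more level of the taxonomy."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    # Bands from highest to lowest; anything below 20% is an F.
    bands = [(80, "A"), (60, "B"), (40, "C"), (20, "D")]
    for cutoff, letter in bands:
        if percent >= cutoff:
            return letter
    return "F"
```

For example, a student scoring 82% earns an A, since only students who capture the uppermost 20% of points (the "evaluate"/"create" questions) reach that band.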

Digital Student Feedback on PDF Exams

Currently, I add "sticky note" text fields to each student's PDF and then send the annotated PDF back to that student. This is a very time-intensive process, and the only information those notes contain is the score earned on any question for which the student received less than full points. That was all of the feedback students received last term. Because I video-record and upload video exam keys, and because it is more useful for student learning for them to do the work of finding out where they missed points on a question, I deliberately do not provide extensive feedback to students.

However, this semester, I'm making a slight change: I'm not even providing this much feedback. I'm finally, after reading about them for years, implementing "exam wrappers." The core idea here is that you provide a small assignment that requires students to consciously reflect on their performance on an exam. Often this takes the form of asking students, at the end of an exam (or just after they learn how they scored), what they think they should do differently to prepare for the next exam. Today, this is what I told my students about how they get feedback on their exam performance:

  • I want to motivate students to reflect on where they didn't meet my expectations
  • I post exam keys (both a static PDF as well as the video key, in which I narrate and write/draw out answers to questions)
  • Students have a digital copy of the exam they submitted
  • Students only know their total score on the exam
  • I show the entire class the letter grade distribution for the exam
  • I also show data on the percent of students earning all of the points available for each question
  • I expect students to compare their answers to the answer key and try to assess which question(s) they think they lost the most points on
  • Each student has the opportunity to write/draw a response to me in which they choose a single question on the exam (the one they believe cost them the most points) and describe how/why they think they did not earn all of the points. I said that, if the explanation is accurate, the student can earn back up to half of the missed points on that question.
Yes, more grading work - but I'm willing to do this to help the students help themselves master material that they initially didn't fully grasp.
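The half-back regrade above is straightforward arithmetic. This is a minimal sketch (the names are mine, and I'm assuming the reflection has already been judged accurate):

```python
def regraded_score(exam_total: float, question_max: float,
                   question_earned: float) -> float:
    """Return the updated exam total after an accurate exam-wrapper
    reflection: the student recovers half of the points they missed
    on the single question they chose to analyze."""
    missed = question_max - question_earned
    if missed < 0:
        raise ValueError("earned points cannot exceed the question maximum")
    return exam_total + 0.5 * missed
```

For example, a student with 70/100 overall who earned 2 of 10 points on their chosen question would recover half of the 8 missed points, ending at 74.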

The Bottom Line

This is a new approach for me, so I'll post updates here on whether students take advantage of this opportunity, and also whether it winds up helping them master the material. As I said in class today, my main concern is that students understand critical concepts (and can demonstrate that understanding) by the final exam. My philosophy is that, for the questions where most of the class doesn't perform well, I'll bring a related question back on the final exam. I want to foster student growth over the term, and I tell the students this. So they should know, at this point in time, which topics will be showing up on the final exam. And if they're willing to devote the work necessary to master those concepts, I think those are the students who have truly earned their A grades.

In sum, although you might not agree with this teaching/grading philosophy, I hope you will at least agree that there is value in three aspects of my design for cheat-proof exams:

  1. Students should be assessed in authentic situations (which, for many disciplines, is going to involve accessing online materials and/or collaborating with others)
  2. Digital workflows for distributing and collecting exams can improve efficiency
  3. Our goal as educators is to make information relevant to students and to help motivate them to learn the material by the end of the course

Although my tablet exam approach is certainly a work in progress, it is meant to achieve these three goals. Time will tell!

Friday, February 12, 2016

Collaborative academic rap

I've written before about how to use Google Apps (e.g. Sheets) for real-time multi-student collaborative editing. In class this term, I've asked students to form their own groups in order to summarize each chapter's worth of content by collaboratively writing a verse using Google Docs.

Rhymes
(the pedagogical philosophy)

For example, one student group penned (digitized?) the following after our first chapter, on the concept of natural selection:

"It's all about change across generations' time
It's not in a single life, that ain't our rhyme
That theory came from our man Lamarck
But evolution's DNA, so that didn't work.
It's not a perfect system, the pick's a random roulette
The better go on to proliferate, that you can bet
We don't choose our traits, Mother (Nature) knows best
And now that you know that, here's the rest…"
- de Guzman, McDonald and Olvera

One benefit of using Google Apps and internet-accessible mobile devices in such an exercise is that students not only form groups to help each other learn, reflect on which topics are most important, and distill complex concepts into simpler forms (all things that could be done without tablets), but can also collaborate asynchronously and from any location. More simply put,

"Recognize that one of our limitations
is our limited time together.
Fifty minutes at a time is never enough;
a passionate teacher won't settle for less than forever

Group work is easy when you're in the same space,
but when your group members leave campus, you're all apart
Each take a tablet with a cup of Google apps
And you've got the time to build lyrics a'la carte"

If nothing else, the above should demonstrate why I leave the incorporation of art into a science class to the students: they come up with many more clever rhymes than I could ever hope to design. More importantly, though, tablets allow everybody to provide input to reach a common goal at times (and places) convenient to each group member.

Tablet Use
(boost the digital intensity)

Here are the technical details of the approach. Assuming each student has the Google Docs app (available for free on most, if not all, tablet and smartphone platforms), have students form small groups. Each group chooses one member to create a Google Doc and grant editing access to the rest of the group by collecting their e-mail addresses and adding them via the "Share" function in Google Docs. Make sure each group adds you, the instructor, as well. Also, be clear that each group's document should list all of the group members at the top.

Then, set a deadline. After the deadline, you can navigate to the header of each Google Doc, where there will be a link to the edit history (e.g., "Last edit was made 2 hours ago by User1"). Clicking that statement takes you to a list of all contributions by each group member (assuming each has signed in with a Google account). This is how I check whether each group member at least contributed to the content.

Moving Forward
(collaborative propensity)

At this point, we are about to embark on our second bout of lyric production, and we're going to continue this for the entire term. I'm especially excited because, just after we wrote our first set of lyrics, I noticed that acclaimed rapper Baba Brinkman (famous in scientific circles for his academic rap works, most notably the "Rap Guide to Evolution," which he has performed for us in the past at Fresno State) is crowdfunding his newest project, "The Rap Guide to Climate Change," on indiegogo.com.

My Evolution (BIOL 105) class, presented with the opportunity to help fund this album, in exchange for Baba writing and recording a custom rap song (to potentially include some of my students' lyrics!), raised the $1,000 necessary to make this a reality! I sent the funds to Baba's campaign today (February 12), in celebration of Darwin's birthday.



I'd like to thank Baba for being willing to collaborate on this project and thank the students for seeing the long-term benefit of rewriting evolutionary concepts as rap verses (which we'll also use by the end of the term to compile a study guide for the final exam). I'm always looking for ways to integrate the arts into science courses, and so far this approach (especially enhanced by our tablets) has met with great enthusiasm and support!


Wednesday, February 10, 2016

Digital Scabs

No, not those scabs! I'm talking about picket-line-crossing. And this entire post won't be a rant about policies and unions, although it starts off sounding like it. Please read on - there's digital pedagogy relevance down there somewhere:

If you haven't heard, the union of faculty in the California State University system (the California Faculty Association, CFA) has approved strike action for five days in mid-April if the CSU administration doesn't return to the bargaining table to negotiate between their offer of a 2% salary increase and the CFA request of a 5% increase.

According to some reports, this would be a historic (in the USA) strike, partly because the CSU system is the largest university system in the country.

I'm not going to announce here which side of the argument I come down on. However, I will say that, as a DISCOVERe (tablet instruction) faculty fellow at CSU Fresno, I will certainly admit that my first thought, when I heard about the impending strike, was: would a strike impact digital instruction?

My primary concern about the strike is that both sides of the disagreement seem to have made similar statements about its potential effect on students. In various media outlets, reports have stated that the CSU administration has indicated that a strike should have no effect on student graduation or timely degree progress (I'll note that they were quoted as using the word "should" and not "would"…). Likewise, CFA representatives apparently have made the same claim, leaving me to wonder what incentive a strike offers to motivate the CSU administration if student success isn't placed in the crosshairs. This approach doesn't make sense to me. But, never mind that…

A number of my colleagues, in opinions expressed over years past, have consistently stated that they feel taken advantage of: administrators know that we pour time and effort into our passion, teaching and mentoring students, because we love what we do. Part of our compensation, in some way, is the good feeling we get from the job we do. That's worth something to us, but it is hard to put a dollar figure on. And that may be why a strike could be necessary: to remind administrators that we won't always be pushovers when it comes to increased course loads and enrollments without increases in compensation.

However, I'm dangling in the balance. On one hand, I understand (intellectually) the importance of solidarity in union affairs. I'm particularly sensitive to this as an untenured faculty member. I suspect I'd be more likely to consider crossing a picket line if I had tenure and were relatively more immune to what fellow faculty members would think of me afterwards.

All of this leads back to the same point: how might a strike impact digital instruction? I'd love to hear some of your comments, especially from any of you who might have been through this before. Would I be a "digital scab" if I established, in advance of the dates of planned strike action, all of the reading assignments, exercises, pre-recorded lectures to watch, exams, etc.? Sure, I might not do any active faculty work (reading/responding to e-mails, committee work, class prep, grading, teaching, etc.) during the strike, but am I weakening the faculty cause if I use the digital pedagogy tools my institution has helped me develop to help students continue learning on their own in my potential absence? So far this term, I've done my best to set my students up to succeed as self-learners. Maybe a faculty strike is in their best interests? A not-so-gentle nudge into the deep end of the pool. Maybe the students don't really need me at this point. Maybe this is how we'll find out…

What's a professor to do?