Sunday, December 13, 2015

Bowen & Bloom: Skepticism & Surfing

I attended my second José Bowen (http://teachingnaked.com) teaching workshop at Fresno State on Friday. Consultation days - just after the end of instruction and just before finals - might have been a nontraditional time to hold such an event, but as I'm already thinking about New Year's Resolutions related to my spring courses, I figured I'd attend to get more inspiration. Here are my top three take-home messages from the workshop:

Teach students how to:

  • Deal with failure (view it as an opportunity to grow, not as a setback)
  • Change one's mind
  • Be skeptical
Although I've used ed tech to address all of these, today I'll focus on skepticism, which others (e.g. Bloom) might refer to as critically analyzing, evaluating, or critiquing. Part of Bowen's workshop involved the participants viewing class from the student point of view (which becomes increasingly important to actively attempt the older I get…).

Step one: create a learning objective. Mine was for genetics:
"Explain the difference(s) between sister chromatids and chromosomes" (a perennial source of trepidation among the undergrad biologist set).

Step two: Bowen then suggested that each of us employ our devices (laptops, phones, tablets) as a student would: to search for key terms. The goal: find one example of good content and bad content. I chose to do a YouTube search of "chromatid vs chromosome." Having just been informed that students won't watch more than five minutes of video (if that), I was pleased to note that the top hit was only 5:03 long, so I watched the video. I'll use it as an example of the not-so-good content (predominantly because of some misleading definitions provided in the video):


As Bowen has told us (repeatedly), one of our (faculty) most important jobs in the digital era is to help students assess the quality of free content available on the internet. To be successful, we must be able to learn new things and to adapt quickly in the marketplace. Thus, our protégés must be comfortable not only navigating the sheer volume of data available but also discerning the quality of the information (hence Bowen's new term to replace "professor": "cognitive coach").

Step three: Realize that, regardless of the discipline, your students will search for and encounter material that is just as poor on their own while studying for your class.

So, now that I'm armed with a video that:

  • is the top YouTube hit on a likely search phrase for this objective
  • is relatively brief
  • attempts (but fails) to explain a critical and basic concept in my field

how will I incorporate this realization & content into a web-enabled class? I've done this before (but not with this specific video):

Step four: Ask students (either in class or before class, if you don't have internet access in the classroom for all students) to search for pertinent content (either a web page or a video) and share the URL (perhaps via a collaborative Google Doc, or via a class poll using Socrative or todaysmeet.com, for example). Read/watch a few content entries with the class, asking them to try to catch errors (this works even better if you're looking at content that you, the instructor, created!). Foster class discussion on such discipline-independent topics as:

  • What was missing?
  • What was blatantly incorrect?
  • Where could an alternative explanation/point of view have been added?

End with a brief summary of the purpose of the exercise. Hopefully you can conclude that, among other things:

Not everything you read/watch on the internet is true

Being skeptical is incredibly important! (thus, of course, I should also admit: don't just take my word for it!)

Finally, I should continue to give credit to Bowen (and to Bloom) for inspiring this post. As Bowen alluded to, there's a great benefit in this approach for faculty as well: if you get students to go onto the internet to find relevant digital content, and the class then evaluates that content, they're helping you find some potentially outstanding material that you can leverage the next time you teach the course!

Friday, November 27, 2015

BYOD^2: Building Your Own Digital BYOD Course

I just jogged past Best Buy, and it was closed! But I'm sure that in a few hours it will see the same hustle and bustle as other retailers. In particular, I suspect that a lot of purchases today will involve mobile technology. For this reason only, I give thanks for Black Friday!

Some of you who follow this blog, I know, have been wondering how easily you can adopt the mobile tech classroom approaches that I enjoy, and give thanks for, because of my institution's priorities (institutional buy-in on providing the infrastructure, professional development, and funding for tablet pedagogy). In the past, I've suggested that my renewed focus would be on making it clearer how I think similar approaches could be deployed at other institutions. I'll start today with the basic elements of Building Your Own Digital "Bring Your Own Device" (BYOD) course (that's right: BYOD BYOD!)

Difficult decisions

I believe that the most difficult balance for an institution to strike in making academic IT decisions regarding student programs is inclusiveness vs. cost. My school ponied up some serious cash to help students purchase one of three tablets (with three operating system choices) and also allowed students who already owned a tablet to use it, as long as it met specified minimum requirements. We might have received some incentives had we elected to work with a single tablet vendor. Ultimately, it was more inclusive to let students have a hand in deciding what tablet brand they would purchase. The cost? Sustainability. We are faced with a difficult decision now: how/whether to keep subsidizing student tablet purchases? If we have a student population where many students don't own a tablet (much less a laptop, or perhaps even a smartphone), what's the bigger picture about the future of our program?

Have we yet achieved BYOD parity?
To understand whether our students have smartphones, our institutions should certainly ask them! Meanwhile, we can turn to the Pew Research Center "U.S. Smartphone Use in 2015" report (all screen shots below from this report) for three general insights into recent smartphone trends that we need to keep in mind:

1. A majority of adults own smartphones, and a notable share depend on their smartphone for internet access

2. Faculty should be careful about demands on students' data usage if the campus doesn't provide free wi-fi access


3. Although more minorities than whites own smartphones, numbers dip for low-earners and for those in rural areas


Moving forward

If I moved to a new institution and wanted to keep using Tablet Pedagogy, what would I do before classes start?
Step one: before deciding, survey your class to find out how many students have regular access to mobile tech (laptops, tablets, smartphones). Do some soul-searching and decide whether you can get non-tech students some loaner devices, or whether you're comfortable forming student groups around the students who do bring mobile tech to class. Check with your institution to see if you have "loaner" devices (our library has a laptop loan program, for example).
Step two: assess institutional infrastructure. If your institution has wireless internet, ask your IT staff whether the network in your classroom can support simultaneous connections from the number of students you have.
Step three: decide whether you will be device-agnostic (e.g. also allow laptops and smartphones)
My opinion? Go device-agnostic. Those students who are out today picking up new tech on Black Friday? Let them use it in class. This is how we can develop and support BYOD (Bring Your Own Device) digital pedagogy and bring new methods of engaging students, connecting them to course material, to each other, and to society.
Step four: acquiring your own tech. If your classroom doesn't have one, get your campus to invest in at least one portable digital projector (and any necessary cables/adapters to connect to your device). Ideally? A projector for each student group (if you use student groups, want them to present content they create/curate with other groups, and if you have enough projectable surfaces in your classroom). Think also about investing in technology that your students will be bringing. Are you an iOS lover? Pick up an Android tablet, too, so that you can check out the student experience on that OS.
Step five: if the students and the campus are ready for it, then prepare your syllabus - paying particular attention in this case to establishing any minimum system requirements (e.g. screen size, operating system version, amount of memory) and apps that students should have installed before class. My biggest piece of advice here? Make all of these elements "Suggested" (not "Required") materials, unless you know you have strong backing from your administration. Related: decide whether you're going all-in with mobile tech, or if you'll also allow some students to opt out and go old school (hand-raising, paper, poster boards, etc.)
Step six: build in extra time the first several days of class for deliberately practicing workflows and for doing tech support for students.
Step seven: find a friend. Identify a colleague who shares the same willingness to try new things in the classroom (and ideally also who is tech-savvy!) Book some time to practice: one of you plays teacher, the other student. This is critical for seeing both sides of the digital interaction using whatever apps you employ (e.g. Socrative, Google Classroom).
Step eight: make a backup plan for dealing with unanticipated tech issues!

What are your main concerns about going solo with digital pedagogy at your school? Please leave a comment and share your thoughts!

Sunday, October 25, 2015

Google Classroom for anonymous student feedback

As I've mentioned, one of the reasons I feel that having computers in the classroom is exceptionally useful is for collecting student feedback in real time during class (e.g. using Socrative, Twitter). However, both of these tools have a shortcoming: they excel only at collecting typed responses (True/False, multiple-choice, written response). Because my discipline also involves diagrams and visual representations, I have been dreaming of a way to rapidly collect drawn responses from students during class as a method of formative assessment: to help me understand which concepts need more practice time.

Past Failure
I've written previously on how I've started to use Google Classroom to assign and distribute exercises to students in my tablet course on genetics. My first use failed at what I was trying to accomplish: to send one exercise requiring students to draw annotations and then return them to me anonymously. My plan was to then open a few representative images and display them on the projector, in class on the same day, to engage in some group critique and analysis. However, I discovered that setting an Assignment using Google Classroom, in which I provide a PDF and have Google Classroom make a copy of that PDF and deliver it to each student registered in my Google Class, doesn't work anonymously. When the PDF containing the image I want students to manipulate is copied to each student, Google Classroom adds student names to each filename.

Goal
This is not ideal for my goal, because I'd like to be able to open my Google Classroom folder on my computer, browse through the previews to find student PDF submissions representing useful points to discuss, and then open them for projection to the class. However, if I do this in front of the class, everybody sees the student names in the filenames, and so I've lost the anonymity. Sure, I could turn off the projector (or switch inputs) while I look through student submissions, but this seems like a cumbersome process. Fortunately, a relatively easy solution is at hand: don't add the PDF activity to the Google Classroom Assignment!
  1. Set up a new Assignment, but do not attach the file here
  2. Perhaps distribute that file using another method
  3. Ask each student to attach his/her final annotated version to the Google Assignment when they Turn it In
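
If you ever wanted to script this setup rather than click through the web interface, a minimal sketch using the Google Classroom API might look like the lines below. To be clear, this is not the workflow I describe in this post (I use the web UI), and the token file, course ID, and assignment text are placeholders you would replace with your own:

    # Hypothetical sketch only: create an attachment-free Assignment with the
    # Google Classroom API. The token file, course ID, and assignment text are
    # placeholders - adjust everything to your own setup.
    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/classroom.coursework.students"]
    COURSE_ID = "123456789"  # placeholder course ID

    creds = Credentials.from_authorized_user_file("token.json", SCOPES)
    service = build("classroom", "v1", credentials=creds)

    assignment = {
        "title": "Pedigree of your immediate family",
        "description": ("Draw your pedigree in any app you like, then attach your "
                        "drawing when you Turn It In. Do not put your name in the file."),
        "workType": "ASSIGNMENT",
        "state": "PUBLISHED",
        # No "materials" key, so nothing is copied (and renamed) per student.
    }

    created = service.courses().courseWork().create(
        courseId=COURSE_ID, body=assignment
    ).execute()
    print("Created assignment:", created.get("id"))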


Present Success
For example, tomorrow in class we start our section on Pedigree Analysis. To ensure that students have read the part of the textbook on the format of a pedigree (and to make the topic relevant), I will ask each student to draw the pedigree of his/her immediate family and send it to me. Now, clearly, this is not a situation in which I want any identifying information from a student to be delivered to me with their pedigree drawing (I don't want to break HIPAA or other rules, of course). So, this is what I am doing:



I used the "+" button in the lower-right to add an Assignment, and here is the window with the assignment details. I'm not providing any document (there are no attachments, which I would add with the paperclip button). Instead, I'm just asking students to use whatever drawing program they have installed on their tablets to draw the pedigree and attach it to this assignment. Then, students attach their drawings to this Assignment and submit it via Google Classroom. When I open my Google Drive folder for this Classroom Assignment, I'll see something like this:


where the file names contain no identifying information. I can easily browse through this folder (while projected to the class) and open student responses anonymously to look for points to praise and points to critique.

If I want to know which student submitted each file, I can still do that, by looking at each student's record on the Google Classroom website, but I do not need to project this view during class simply to look at the attached submissions.

Conclusion
Google Classroom is now an integral part of the suite of apps I use (including todaysmeet.com, Twitter, and Socrative) to foster an engaging, active-learning, medium-enrollment course (75 students at present) in which students hopefully feel that they can obtain personalized feedback on their understanding and also engage in peer evaluation.

Monday, October 19, 2015

Flipped Classroom: Switched On (v2)

Take blended learning, add computers, and you get my next-generation (v2) flipped classroom. This means an enhanced ability to:

  • interact more efficiently with students in large courses
  • engage students in authentic practices in the discipline (assuming your discipline, like mine, relies heavily on the internet these days)
  • practice information literacy skills
  • collaborate with peers


Today, I share how the typical structure of my DISCOVERe (tablet-based instruction) courses has evolved into an experience that integrates the above points into a class where the tablets are not the only things that are switched on - so are the students!

0. Before students come to class
A reading assignment from a textbook is due to be completed by the start of class; I often also post links to videos I've recorded (or I've found) that introduce the topic for the day. This is the "lecture" - where factual information and approaches to solving problems are conveyed to the students.

1. (5 m.) While you're waiting…



While students are assembling in the classroom (and during the first five minutes of class), I have a projected slide that contains a problem to solve or a question to think about, as well as (often) the link to a Socrative quiz that the students should take. This entry quiz is always based on the pre-class video/reading assignment, to ensure compliance. I use the on-the-fly Socrative results to determine which concepts I need to spend more class time discussing with them. Never having used this approach previously, I'm routinely surprised at which relatively complex concepts the students seem to already understand, and which relatively simple (to me) concepts the majority of the class struggles with!

I like "While you're waiting…" questions, because it gives students who desire more practice with material an optional opportunity to do so. As the term has progressed, I've seen more and more students actually taking the time to work on these practice questions as they wait for class to start.

2. (10 m.) Anonymous review of student exercise submissions (from previous class)
In the latter half of the previous class, I introduced a topic and set the students a rather difficult (at the time) problem to solve - one that they should be able to get at least part way into before encountering some issues that they haven't had to deal with before. The students work solo, then in small groups, and then send me their work (digitally) by the end of that previous class period. This approach does two things for me:

  1. It provides students context for the reading (and/or video-watching) assignment due for the following class (i.e. today's class). This is the typical blended learning approach: ask the students to attempt something with little essential background; they fail at it. Then, when they encounter the reading assignment relevant to that problem, they are (hopefully) a bit more invested and have that mental "hook" to identify where they went wrong during their initial attempt. This, ultimately, leads to better-formulated questions in the following class meeting (today) when we discuss that topic.
  2. This gives me time to look over the work and assess, before today's class, where the class stands in terms of common themes: where they succeeded and where they had issues.

During this portion of today's class, I show some anonymous examples of student submissions and ask the students to perform some peer evaluation: "Does anybody see something in this approach that you like? That you would change?" I also use this time to point out, with praise, patterns in thinking that I've observed that show strong understanding of difficult concepts, even if the answer ultimately was not obtained (or was wrong).

Part of the reason I like group peer evaluation is that it helps me "grade" the student answers to the questions in class, so that all students get some hopefully-pertinent feedback on how to accomplish the exercise, even if it is not their work that is being specifically critiqued. This also potentially means that the instructor does not have to formally grade and return all of the student submissions - the students gain the benefit of having worked on the problem, and then they benefit from time in class to discuss any issues they encountered.

After we work through the solution to the question or problem and address questions, we move on to:

3. (10 m.) More practice; Q&A
in which I provide another, often just marginally related problem, for students to work on solo and then in small groups to practice the concepts just discussed. We also usually discuss the answer to the "While you're waiting…" question from the start of class at this point.

Other types of activities we do in class together involve learning to access and use online resources. For example, last Friday, as I was discussing types of mutations, we discussed how mutations in BRCA1 predispose to the development of breast cancer. A really fabulous question from a student was, "If the BRCA1 gene is involved in DNA repair in all of our cells, why don't BRCA1 mutations predispose to other types of cancer - like skin cancer, for example - as well?" So, today in class, I spent some time online showing students how I would go about learning how to answer that question. We went to the Online Mendelian Inheritance in Man website (omim.org) to learn what's known about the genetic basis of breast cancer, and found some primary literature references indicating, for example, that male carriers of BRCA1 mutations might have an increased risk of developing prostate cancer. This led to a great (but brief) discussion of the extent to which we don't yet know enough about the BRCA1 gene and all of its potential effects! (Tangentially, this also raises the important point that gene names are not always accurate, nor do they always (ever?) reflect the true complexity of biology.)

4. (5 m.) New topic
Having now completed the instruction on the previous topic, it is time to introduce the next topic (which builds on the previous one), often with a story or scenario that sets it up. I try to do this by being provocative, or by posing a puzzle that has no obvious immediate solution.

5. (15 m.) Try, and fail
Now, with five minutes' worth of "mini-lecture" from me on the new topic, students attempt to solve a problem alone, and then in small groups. Then, they electronically submit their work to me (usually as a screen shot of hand-written work on their tablets). For example:


Again, the idea behind this approach is to have students partly invested in figuring out how to solve this problem before they first encounter the relevant material (on their own, through the reading/video assignment for next class). The work they submit is what we will spend the first portion of our next class meeting anonymously evaluating and discussing whether the reading/video assignments led to any new insights about how to address the question.

6. (5 m.) Exit Survey
At the end of most classes, I provide the opportunity for students to reflect on the class period. This is often a Socrative quiz that asks questions like, "What did you learn today that you didn't know before?" or "What topic should we spend more time discussing next class?" or "How does what you learned today apply to your life?"

Summary (and a word of caution)

It takes quite a bit of mental effort to keep up this sort of course design, where, essentially, the first half of each class meeting is about material initially encountered during the previous class, and the second half of each class period is spent introducing students to concepts to be continued in the following class! Staying organized, and planning activities and pre-lecture work (e.g. which reading assignments and/or videos to watch) in advance, are key to success with this approach. To many, this is an unfamiliar structure, and it can take some significant mental gymnastics to appreciate this approach at first. The rewards will come!

Ultimately, the knowledge that all students in class have mobile devices that they can use to annotate PDFs, draw images, complete Socrative quizzes, seek information on the internet, and send and share their work with others (peers and the instructor) can create a classroom that is much more active and "switched on" – an environment where learning from one another takes place! In conclusion, in my version of the tablet classroom, the main benefits of the tablet are twofold:

  1. I am assured that every student has access to a computer (and to the internet) - this facilitates actual student involvement in analyses that I would traditionally just describe while projecting screen shots, for example
  2. It improves the efficiency of submitting/exchanging/evaluating ideas and assessing performance. It simultaneously provides anonymity to those who wish it, and thus grants a voice to those who might not otherwise speak up in class

Wednesday, October 14, 2015

Cheat-Proofing Exams III: Post-exam analysis

Now with two open-internet, tablet-based exams under our belts in Genetics class this term (each combining an individual exam and a group exam), it is time to briefly address concerns about holding exams where students have access to all data, everywhere. How do the students perform - and what do they actually do?

Let's look at my most recent grade distribution (individual and group exam scores combined):


My students seem to be doing outstandingly well (but too well?), and we could (but won't here) enter into the discussion about whether an ideal grade distribution (like the bell curve) exists and should be sought.

There are three points to keep in mind about the student outcomes thus far:

First, I should point out that my letter grade schema is different from most:

80-100% = A
60-80% = B
40-60% = C
20-40% = D
0-20% = F

If I had used a more traditional 10% letter grade division (as I have in every previous class), I'd have a much more bell-shaped curve.
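
For concreteness, here's that schema as a tiny sketch of a function. Treating exact boundaries the way I do below (a score of exactly 80% counts as a B, not an A) follows from a perfect individual exam, worth 80% of the points, topping out at a B:

    def letter_grade(percent):
        """Map a percentage score to a letter under my 20-point-band schema.

        A score of exactly 80% counts as a B (not an A), consistent with a
        perfect individual exam - worth 80% of the points - topping out at a B.
        """
        if percent > 80:
            return "A"
        if percent > 60:
            return "B"
        if percent > 40:
            return "C"
        if percent > 20:
            return "D"
        return "F"

    # Examples: letter_grade(75) -> "B"; letter_grade(81) -> "A"; letter_grade(55) -> "C"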

Second: of course the instructor can craft the exams to be as difficult as s/he wishes. After both exams so far this term, the vast majority of students have reported that they accessed electronic resources (notes, the internet) less or much less than they had expected, mostly because they didn't feel that they had time during the exam. From this perspective, possible conclusions might be that access to electronic resources:
1) might only be an advantage to the top students, who already excel at test-taking and perhaps the course material, offering only them the ability to double-check answers they're unsure of
2) doesn't impact grades, because few (if any) students use the available resource

In either case, I feel like I'm winning: either I'm reinforcing the concept of double-checking one's work (a good practice for anybody) and improving metacognition (students aren't rechecking answers they're already reasonably sure of), or my tests are long and/or hard enough that students generally realize there is no benefit to spending time looking up notes.

I can also report that, from my own observations during the tests, I rarely saw students using their tablets to access resources. When I did, most of the students were looking back at lecture slides and notes.

Third: because I'm a scientist and need something to measure, I looked at whether there is a statistical association between student test grades and the degree to which students self-reported accessing notes during the exam (as expected vs. less than expected). If "cheating" were an issue, then I might predict that students who used digital resources as expected would have higher scores than those who felt like they had no time to use those resources. However, a t-test produces p = 0.95. That is, there is no detectable difference in grades between the group that used digital resources as expected and the group that used them less than they thought they would. My conclusion? Written thoughtfully (specifically: incorporating high-level Bloom's taxonomy questions), open-internet exams don't necessarily lead to cheating.
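
If you want to run the same sanity check on your own exam data, the comparison is just a two-sample t-test between the scores of the two self-report groups. Here's a minimal sketch using SciPy; the score lists are made-up placeholders (not my students' actual grades), and I've used the Welch variant, a safe default when group sizes differ:

    # Minimal sketch of the comparison described above, using SciPy. The score
    # lists are made-up placeholders, not my students' actual grades.
    from scipy import stats

    scores_used_as_expected = [72, 88, 65, 91, 78, 84, 70]   # "used resources as expected"
    scores_used_less = [75, 86, 68, 90, 80, 79, 73, 82]      # "used resources less than expected"

    t_stat, p_value = stats.ttest_ind(
        scores_used_as_expected, scores_used_less,
        equal_var=False,  # Welch's variant: a safe default when group sizes differ
    )
    print(f"t = {t_stat:.2f}, p = {p_value:.2f}")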

In sum, my current take on developing cheat-proof exams is:

  1. Give students the opportunity to use the resources they are used to using (textbook, notes, the internet, whatever) - otherwise, the exam situation is totally unlike what they'll face anywhere else (in grad school, as an employee, as a citizen…)
  2. Incorporate higher-level Bloom's questions that inherently can't be cheated on (or, if they were, it would be obvious to the instructor - I'm thinking of plagiarism on written-response answers here)
  3. Make the exams deliberately long, and grade consistently but harshly.
  4. Embrace collaborative group work. I feel that I can ask much more difficult questions of a group than I would of an individual student, and I have been pleasantly surprised at the results.
As an example of this fourth point: on my recent test, I wrote a group exam where the group had to select two of three questions to answer. One of the questions was something that we had never practiced in class, requiring the group to use PubMed to access the abstract of a published journal article and analyze some data found in it. To my great surprise, most of the groups chose to answer the novel question over either of the other two (involving question formats we had more directly practiced in class), and most (if not all) groups arrived at the correct interpretations.

If you need more convincing about the qualities of group exams, check out this video from one of my exams, comparing the class during an individual portion and a group portion. From what I've read about the quality of learning based on peer instruction, I am comfortable concluding that students are much more engaged (and probably learning more!) during the group exam than during the individual exam. A much better use of class time!


Finally, I'll reiterate an important point about digital exams: the workflow is more efficient in some regards and less so in others. By distributing the PDF exam using Google Classroom, each student has their own version of the PDF file, which automatically includes the student's name in the filename. All of the PDFs are returned to me in a single folder on Google Drive, meaning I don't have to download and rename e-mail attachments from 75 students. Most importantly to me, digital exams mean I always have a copy of each student's exam, even after I return a scored copy to each student. This can be useful in all sorts of ways (including programmatic assessment; detecting patterns that might reveal cheating…assuming one is worried about such things; and formative assessment for the instructor).

Now, if I could only figure out how to more efficiently (and still privately) return PDF exams with my annotations (scores, notes) to students without writing 75 individual e-mails…
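
(One possibility I'm pondering, purely as an illustration and not something I've actually put into practice: a short script that e-mails each graded PDF to its owner. The roster file, column names, file-naming pattern, mail server, and credentials below are all hypothetical.)

    # Hypothetical sketch only: e-mail each student their own annotated exam PDF.
    # The roster file, column names, file-naming pattern, server, and credentials
    # below are all placeholders, not my actual course setup.
    import csv
    import smtplib
    from email.message import EmailMessage
    from pathlib import Path

    SMTP_HOST, SMTP_PORT = "smtp.example.edu", 465   # placeholder mail server
    SENDER = "instructor@example.edu"                # placeholder address

    with open("roster.csv", newline="") as f:        # columns: name,email
        roster = list(csv.DictReader(f))

    with smtplib.SMTP_SSL(SMTP_HOST, SMTP_PORT) as server:
        server.login(SENDER, "app-password-here")    # placeholder credential
        for row in roster:
            pdf = Path("graded") / f"{row['name']}_exam2_graded.pdf"
            if not pdf.exists():
                continue                             # no graded file for this student
            msg = EmailMessage()
            msg["From"], msg["To"] = SENDER, row["email"]
            msg["Subject"] = "Your graded exam"
            msg.set_content("Your annotated exam is attached. Reply with any questions.")
            msg.add_attachment(pdf.read_bytes(), maintype="application",
                               subtype="pdf", filename=pdf.name)
            server.send_message(msg)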

Saturday, October 3, 2015

Be Bold: Tell the Truth

A fairly frequent comment I hear when I tell colleagues that I record and distribute my lectures to my students is, "Oh, I'd never do that - I don't want students to have recordings of any mistakes I make."

As a tenure-track professor, I suppose I'm pretty relaxed (more than I should be?) about making mistakes. I've considered myself an honest person my entire life - a claim that is probably much like the statistic that a majority of those polled report being above-average drivers. But the origin of my fanaticism about telling the truth in the classroom is much more recent than my own origin. I was caught off guard in my first semester as a grad student when the professor (Dr. Mark Roth - I remember it clearly) told us that he believed in truth in teaching, and that, at the very least, if he ever lied to us, he'd tell us that he was lying. This was something I had never heard, nor ever expected to hear, a professor say. And therein lies the power.

It is important for students to know why we're the professor. On the first day of class, I always (now) tell my students why I'm the one standing at the front of the room. It is definitely not because I am an expert in genetics - I have some expertise in particular aspects of genetics, yes, but I'm still learning. In fact, my credentials (a Ph.D.) suggest that I'm sufficiently accomplished at the art of learning that I should be able to lead others down the path of learning. In other words, I'm the mentor because I'm farther down the path of understanding the practice of genetics. This is critical to say, because it sets the stage for what should come next: admitting, openly and honestly, that you don't know everything about your discipline.

This facilitates (at least) three critical aspects of learning:


  1. It provides the opportunity to model how to learn new things (or to change one's mind)
  2. It helps establish an atmosphere where students might be less intimidated by the all-knowing professor
  3. It makes it much easier to make the point (which I regularly do in biology courses) that there is not usually one correct answer. Even in the sciences, there are nuances and exceptions to rules, and one can only very rarely (if ever) make statements such as "this is always, in every case, how this works."

Today, I'll focus on the first, as it pertains to having one's mistakes recorded and posted online. Now, why is verbal deceit (lying) an issue in the academic classroom? The painter Pablo Picasso had it nailed: "You mustn't always believe what I say. Questions tempt you to tell lies, particularly when there is no answer." Now, I teach in a Socratic fashion (my father, Bob Ross - no, not the painter - put me on to a goal of mine: to someday teach my classes by only asking questions). And I strongly encourage my students to think creatively and to practice science by being inquisitive and skeptical. So, I'm routinely barraged with excellent and relevant questions in class; I rarely know the answers. Being the holder of a doctoral degree and the instructor of the course, sometimes I'm tempted to tell half-truths so that we can move on with the orders of the day and not be sidetracked. And there's some inherent pressure to maintain alpha status in the classroom and maintain control by always knowing the answer. But we're (students AND faculty) all adults, and we deserve the truth. So, when I'm asked a question I don't know the answer to, one of two things will happen:


  1. I say that I don't know, I give (and justify) my best guess, and I tell the student I'll look into the answer and report back to the class (and I actually do so, because telling the truth about not knowing the answer, and then lying about looking into it, really isn't good practice)
  2. I demonstrate, then and there, how I would find an answer (rather than the first approach, where my learning takes place in secret, behind closed doors)
I prefer the second approach, especially because it is the approach preferred in a blended learning setting.

So, what does this have to do with tablet pedagogy? As I said, all of my lectures are recorded and posted online almost as soon as we leave the classroom. I mentioned in an earlier post this semester that I'm also (now) regularly reviewing the videos to produce Tables of Contents to accompany the videos on YouTube. And that process of regularly reviewing my lectures means that I occasionally discover mistakes I made during class.

How to correct mistakes that have been distributed online


Here's the three-step plan:

  1. Post an immediate correction to your students (on your Learning Management System, via e-mail, however you like). If a student notified me of the error, I make sure to mention that one of the students from class caught it.
  2. Add a similar comment on your lecture video webpage. Don't take down the video - you have nothing to hide other than not being perfect.
  3. Mention this again at the start of the next class, solicit questions, apologize for any confusion your error might have added, and (most importantly of all): seize the teaching moment, if you have figured out how to explain how you went astray and what the correct approach/answer/strategy/etc. is - and why.


What happens next? Some combination of the following:


  1. Students respect that you tell the truth and realize that you're just human
  2. Students lose trust in your content knowledge
  3. Students have observed an expert modeling how one's mind can be changed (thanks José Bowen for pointing out this important need) and how to respond to criticism
  4. Students become horribly confused about whether the first polar body divides during oogenesis, and forever after will have no idea whether three or four haploid cells are produced (as a random example…of course…)
  5. Students learn that you respect them and care about the quality of their education


Hopefully, if you rank the relative importance of those five possibilities, the likelihood of doing good by telling the truth in the classroom vastly outweighs the potential costs. Although, in the back of my mind, I'm sure I'll always be wishing I had produced more students who really, deeply, truly appreciated mastering #4, it certainly doesn't keep me up at night like it would if I hadn't taken the opportunity to do #3 and #5.

So, keep recording and posting those lectures without trepidation, review them regularly, and own up to any errors. Model being a lifelong learner and a decent and bold human being: how to identify mistakes and how to properly address them.

Sunday, September 27, 2015

Cheat-Proofing Exams II: Pros and Cons of Group Work

In the past week, I have finished grading my tablet students' individual and group exams. As in previous semesters, I returned annotated PDFs to the students individually by e-mail - with comments left in the PDFs to show the points earned only on those questions where a student did not earn all of the points. Here is one benefit of group exams: although they add a document to return to each student (I attached the graded individual exam plus the graded group exam to each e-mail), grading is faster because I grade the group exam once and assign the same point values to each member of the group.


Student Attitudes and Perceptions

Although I have yet to fully analyze the post-exam attitude survey data, there are overall positive (or, at worst, neutral) findings about the effect of group exams on students. Although almost two-thirds of the class reported feeling more stressed than normal at the start of the exam and during the exam, the bulk of the class reported feeling normal (or more relaxed than normal) following the exam!

As expected, the distribution of how well students self-assessed their performance on the exam (before receiving scores from me) was normal (very bell-shaped). However, the responses that most surprised me were to my inquiry about how frequently students accessed notes and other digital resources (notes, textbook, the web, etc.) during the exam. Less than ten percent of students reported using digital resources more frequently than they had imagined they would; two-thirds of the class used digital resources less frequently than they thought they would. Informal discussions with a couple of students suggest that the reason for this was that they felt rushed during the exam.

At this point, I feel good: although I don't like having students more stressed than normal at the beginning of and during the exam, this might be entangled with this being their first exam with me. I might expect to see this effect dissipate as we complete more of this style of test during the semester. Further, if students are not accessing digital notes and content frequently during the exam, then perhaps this approach is, indeed, at least cheat-resistant. In fact, I was even quite frustrated to find that one student had solved one problem (on the digital annotation of a PDF) using long division, instead of using the calculator built in to the tablet!


Student Performance

But how did the students actually perform on the exam? Individual exams were heavily skewed towards A and B grades, which is fine with me: I like to interpret this as an indication that I was successful at helping the students understand the expectations and study and practice the appropriate content for the exam. The class average on the group exam was exactly the same as the average for the individual exam: 75%. Most importantly, although the addition of the group exam did negatively impact the overall (individual + group) letter grade of four students, it also improved the letter grade of three students.

Although there are additional post-test survey data to analyze, my broad impression from the quick look I took at the data was that, given the option to write a dissenting opinion on whether they agreed with the group answer to each group exam question, students generally supported the group answer that was submitted.


Summary

My overall impression is that the group exams have broadly accomplished my initial goals:
1) group exams allow me to assess material and conduct analyses that would be difficult (or unfair) to ask individual students to perform during a one-hour exam, including higher-level Bloom's questions
2) although the novelty of the exam format might have been more stressful to students, the ability to collaborate with peers and/or access digital resources during the exam might have improved their post-exam attitudes
3) group work did not appear to cause a net increase in grades, but also did not have a net negative effect. Longer-term, if the groups maintain a stable membership (which I plan to assess), having established groups early in the term might have a long-term benefit to student performance if those groups study and strategize together.
4) group work, and post-test surveys on peer evaluation of group member contributions to the exam, provide me with additional data that will help me assess students' performance. These could, for example, be particularly useful data to return to when students ask me for letters of recommendation for medical school two years hence. I will have quantitative data to suggest to admissions committees whether students were generally found by peers to be engaged in group discussions, able to help resolve discrepancies, able to distill common themes from potentially disparate individual responses, and so on.

Monday, September 21, 2015

Cheat-Proofing Exams I

Today I was as nervous as my students for our first Genetics test of the semester: after two semesters of teaching DISCOVERe (tablet computer-based instruction) courses at California State University, Fresno, I gave my first all-digital exam. I learned a lot (which, as you teachers already suspect, means that a lot went wrong). And, as you might also suspect, because I'm writing about it, there are some best practices to share!

The Background
As a geneticist, providing students with authentic experiences (even in the classroom) is one of my top priorities. One of the reasons I joined DISCOVERe is to ensure that all of my students would have computers in the classroom so that we could all practice analyzing and interpreting data. The first time I taught a genetics tablet class (which was my first tablet course), I hadn't done all of the things I should have done (like providing lots of deliberate exercises using digital workflows) to ensure that everybody was comfortable with taking digital exams. So, that class used almost entirely paper-based exams. There was no talking, no notes, and no tablets allowed during the exams in the tablet class.

The Theory
My goal is to have students demonstrate to me their understanding of the material by engaging in mid-level (and up) Bloom's: interpreting data, making predictions, applying knowledge in new situations, and being creative. I also favor group exams for at least two reasons: 1) they help students collaborate and teach each other, which is extremely valuable, and 2) they also help me observe students working in group settings (which helps me write letters of recommendation for those who ask!) To facilitate group exams, and to facilitate exams with tablets, I need to develop exams that are cheat-proof - and I also have to change my attitude: what we call "cheating" in the classroom is called "collaboration" in practice. I need to incorporate that into the classroom.

The Concerns
How many colleagues have asked me how I would prevent cheating on exams if I let the students have tablet computers? Plenty. And how did I respond? I told my students on the second day of class this term: I embrace collaboration. We know that employers want to hire people who work well in groups and who have excellent communication skills. The practice of scientific research has been making deliberate moves over the last decade (and more) to foster interdisciplinary collaboration - and I definitely subscribe to Woodrow Wilson's philosophy: "I not only use all the brains that I have, but all that I can borrow." So, how can I have group exams, and incorporate tablets into exams (to access online DNA sequence databases and web-based analysis tools, for example), without facilitating cheating?

The Approach

  1. Individual Exam (80% of points, 15 minutes, no talking; open note/internet/textbook)
  2. Arranging into groups (5 minutes)
  3. Group Exam (20% of points, 25 minutes)
  4. Exit Survey (5 minutes)

Following my attitude adjustment, "It isn't cheating - it is collaboration," what do my tests look like now? We have a two-part exam (like I have seen at http://blogs.ubc.ca/wpvc/two-stage-exams/), in which students first complete an individual exam. They download a PDF of the exam, annotate it, and return it to me while we're all in the classroom. There is no talking, although it is open-digital-textbook and open-internet (no print materials allowed). As at UBC, I made the individual exam worth most of the test points (~80% in my case). This helps assuage fears about how group performance might impact individual performance. Additionally, the individual exams incorporate most Bloom's levels. The goal of this 15-minute portion of the exam is to distinguish students who have not prepared at all (the D and F students, say) from the other students. Students who have not studied might spend quite a bit of time looking up simple factual information and not earn many points in this relatively quick individual exam. Although the questions are fast to answer (often matching or multiple choice, for example), they do require reflection and application. Although this is an open-digital-resource exam, this phase contains no talking and also no audio. I don't want students distracting each other by playing movies, for example (say, of my lecture captures). I don't even let students bring earphones, because I can't guarantee that every student would have that resource. The individual exam should be first, so that any students who happen to arrive late don't miss out on the group exam time.

After the individual exam, students form their own groups of three or four. I don't choose the groups; I hope that students (if they don't already have study groups) will establish long-lasting relationships in the class by realizing that there will be additional group exams throughout the semester. The group exam is 25 minutes long. Here I differ from the UBC example. I don't give the same exam to the groups. Instead, I write a short exam that is worth ~20% of the total exam points and that builds on the individual exam. For example, I might ask the group to describe why they chose the answer(s) they did on the individual exam, or to perform group brainstorming (one of my favorite questions from today was: come up with three hypotheses to explain why this unexpected experimental result was observed). These higher-level Bloom's questions, making up a small portion of the points, are designed to help me distinguish the A, B, and C students. I adhere to the UBC example by asking each group to submit only one exam per group. So, the first thing each group does, after introductions, is to select one member to be the "scribe," who will write all of the group member names on the exam, annotate it, and submit the PDF to me.

Active learning at its best: during a group exam


After the group exam, I distribute an exit survey in which students rank each others' contributions and state whether they agree with the group consensus answers. This is where those who dissent can explain why. All of this serves a simple goal: to help me collect evidence of understanding the critical concepts.

With my unorthodox scheme of assigning letter grades (0-20% of points = F, 20-40% = D, 40-60% = C, 60-80% = B, 80-100% = A), this exam structure lets the individual student earn up to a B individually. Any additional points on the group section can push any student into the A grade range. Thus, one letter grade's worth of points is reserved for students who are creative, work well in groups, and know the material so well that they can apply it in novel circumstances (such as making predictions and formulating hypotheses).

The Execution
Here's where today was a mixed bag. On one hand, I (as always) had a backup plan. It turns out, today was the day when I had to implement the backup plan. And the only reason that we (the class and I) were successful today was because I had specifically had my students practice digital workflows on relatively trivial exercises, every day, in class.

My plan was this: I had the individual and group exams prepared as PDFs. They were to be distributed, one copy per student, via Google Classroom. Before class, I had uploaded and saved the PDFs as attachments to draft Assignments. The plan was to press the "Assign" button a minute (or so) before I wanted every student to have access to each exam. I designed the exit survey in Socrative. I also, just in case, posted both PDFs to Blackboard; they were to become visible to students at designated times during the exam.

It turned out that, for whatever reason, although the students generally had no internet connectivity issues (every student in attendance did, actually, successfully submit their exam to me by the end of class), none of my devices (laptop, two iPads) were able to connect to wireless networks on campus.

So, prior to the exam, I was not able to launch the Google Classroom assignments. Even with my Thunderbolt-to-ethernet adapter and the ethernet cord provided in the classroom, I still wasn't able to access the web (still working on why!). Apparently, Google Classroom assignments can't be launched from the Google Classroom mobile apps (my iPhone wasn't able to launch the exams either!). So, I was at least able to tell all of the students to download the exam PDFs from Blackboard, which worked smoothly. The main drawback was that the students then submitted their exams to me directly, by e-mail attachment of annotated PDFs. That meant I spent about 1.5 hours after the exam downloading attachments and renaming files in a consistent format with student names. With Google Classroom, all of this would have happened on the fly during assignment submission.

Alas, the network let me down today. But, being prepared won the day! Now I need to go grade some exams.

The Bottom Line

  • As always, have a backup plan.
  • As always, be progressive and be willing to try new (to you) things that others have proven can succeed!
  • As always, use practices in the classroom that will help students develop skills and proficiencies that are both universal and also relevant to your discipline!

Until joining DISCOVERe, I never thought I would see the day (today) when I would be able to ask every student in the classroom, during an exam, to analyze a 280-nucleotide DNA sequence by searching online databases to tell me which species and gene that sequence likely comes from! Today, I was transformed - and my classes were transformed forever. And, that DNA sequence came from the tra-2 (transformer 2) gene of Caenorhabditis elegans.

Tuesday, September 8, 2015

Hindsight (I): lecture capture best practice

Despite the title of this post, I actually saw this issue coming and yet did nothing. As Metallica sing in "No Leaf Clover," "The soothing light at the end of your tunnel was just a freight train coming your way."

One of my long-term goals of lecture capturing was to be able to re-use material I recorded (in office hours, in class, as exam and exercise keys, as pre-class videos, etc.). Had I really and truly appreciated how much video I would create in one semester (over 102 hours), I might have worked harder, as I was generating those videos, at creating two ancillary resources that I am sure would improve their value for enhancing student learning:


  1. captions (as I've been writing about so far this academic year here and here)
  2. tables of contents for the videos


Producing a Table of Contents for each video is incredibly important for you and for your students.

For you

When you want to incorporate a nugget of information (like that time you recorded the perfect explanation of why genetic linkage negates the expectation of Mendel's second law - independent assortment), it will be much easier to find if you have written (and computer-searchable!) contents of all of your videos (see the sketch after the list below). My Excel spreadsheet contains, at its core, three pieces of information:

  1. Topic (phrases, keywords, search terms, etc.)
  2. Video filename
  3. Time reference (e.g. 2:01 into the movie file)
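
As a sketch of the payoff: if you export that spreadsheet as a CSV (the filename and column names below are placeholders for however you actually export it), then finding that linkage explanation becomes a quick search:

    # Minimal sketch: search a CSV export of the video catalog for a keyword.
    # The filename and column names ("topic", "filename", "time") are placeholders;
    # match them to however you export your own spreadsheet.
    import csv

    def find_clips(catalog_path, keyword):
        with open(catalog_path, newline="") as f:
            for row in csv.DictReader(f):
                if keyword.lower() in row["topic"].lower():
                    print(f"{row['filename']} @ {row['time']} - {row['topic']}")

    find_clips("video_catalog.csv", "linkage")
    # e.g. genetics_lecture_12.mp4 @ 2:01 - linkage negates independent assortment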


For the students

After you create the ToC, add it to your videos on YouTube! This will help students quickly navigate to topics within that hour-long lecture video that they really want to watch again. Here's how:

1. In your YouTube Video Manager, select the "Info and Settings" button below the video at left:

2. In the text box for "Basic info," paste in your text Table of Contents. Mine, as you can see (below), is the video timepoint followed by a brief description of what is happening in the video starting at that time (a hypothetical example appears after step 3)


3. After saving that text entry, when anybody views your video, they will see something like the below: YouTube automatically detects the time stamps you pasted into the Info box and hyperlinks to those time points in the movie. When one of my students wants to review the core concepts and vocabulary I presented in this lecture, they click on the "17:04" link in the Table of Contents and are whisked to that precise point in the video.
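
If it helps to visualize, the description text pasted in for step 2 might look something like the lines below (these timepoints and topics are invented for illustration, apart from the 17:04 entry mentioned above):

    0:00 Course announcements and agenda
    2:01 Why genetic linkage violates independent assortment
    17:04 Core concepts and vocabulary for this lecture
    43:30 Practice problem walk-through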



So, now that I've spent much of my Labor Day weekend "free time" cataloging video from previous terms, I'm advocating for being more deliberate at doing these mundane tasks as we go! Don't let it slide - if you want to capitalize on the digital resources you're curating, this is critical!

"Sucker for that quick reward," indeed. Nicely played, Metallica.

Saturday, September 5, 2015

STEM, Google Classroom, & fostering creativity

Yesterday, we had a fantastic time in Genetics. I believe that this partly stems from my response to a long-standing concern of mine about being a scientist: the feeling that there are few opportunities for creativity in the sciences. If I've bent your ear on this topic before, you've heard me argue that the three times you get to be creative in biology are:

  1. when you title your manuscript
  2. when designing an oral or poster presentation (to an extent), and
  3. if you are so lucky, when you discover and name a gene (or, likewise, if you develop a clever acronym for some new technique)


Of course, those of us in the daily practice of science know that there are indeed more opportunities for creativity. This term, in class, I've been trying to address this potential misconception in a number of ways. The obvious one I pursued first is simply telling the students that creativity is valuable in STEM (and in genetics, specifically). Almost daily, I've been urging students to be creative, especially in:

  1. thinking about how to solve a given problem (I don't think a day has passed when I didn't say "there is not always one correct answer to this question")
  2. coming up with alternate interpretations to explain an observation. Critical thinking is a critical skill, and the first step to developing critical thinking can be thinking creatively and not blindly agreeing with the interpretation that somebody else has developed, simply because it sounds plausible.


But it is easy to talk about this; it is much more difficult to put it into practice. Enter the tablets. I decided to make one of our earliest forays into creativity in genetics (not an oxymoron, as I've tried to establish above) explicitly creative. My class is just starting to discuss molecular genetics, which is entirely about understanding the properties of the molecule upon which genetics is based: DNA. I've found in the past that a difficult alternative conception to overcome is how the structure of the DNA double helix is related to what we commonly see in figures and micrographs as a chromosome.

The assignment

The assignment was straightforward: for class, each student was to identify a free drawing app that would let them import and annotate photographs that they found elsewhere. They were then to find a micrograph of a chromosome, and to draw their interpretation of how the double helix relates to a chromosome.

The use of tablets

My integration of tablets here scores higher than stage one (substitution of technology for another method) in the SAMR Model, but the way that instruction is enhanced by tablets might be subtle here.

Before tablets, I probably would have brought a printout of a chromosome micrograph to class and had each student draw on the paper. This would be simple Substitution (the "S" – the lowest stage of tech incorporation – in the SAMR model).

However, I wanted students to develop some information literacy skills by having to practice finding a photograph of a chromosome (I had suggested a Google Images search; some students used other methods). Plus, having each student identify his/her own image to annotate might have the additional benefit of making them more invested in the outcome, because they're involved in decision-making and in customizing their experience from the very beginning of the assignment.

At this point, students having either a laptop or a tablet in class could achieve this goal. But my belief is that one of the most important benefits of the tablet is the ability to draw more naturally on the touchscreen surface than one might using a mouse or trackpad on a laptop. So, after students chose their photograph of a chromosome, they had the opportunity to annotate that image to show me their understanding of what a chromosome is.

Google Classroom

Here's where the tablet can really shine (and also where I goofed on this first attempt at using Google Classroom to collect and share student-generated content). I'll post soon on using Google Classroom, but the upshot, for now, is that it is a platform for disseminating digital content to students and collecting it from them - much like Google Drive, or Box, or any other cloud-based drive, but with a very useful wrapper.

So, I created an empty assignment: attach your chromosome micrograph annotation. Each student, upon entering the web-based Google Classroom, sees a post about this assignment, and can attach an image. When they "Turn In" that assignment, the attachment is copied into my Google Drive, in a subfolder named after that assignment "Double Helix & Chromosome," which is in a folder named for my classroom "Fall 2015 BIOL 102." So, I get all of the attached work in one place, on my laptop, on the fly, during class, and I can quickly scan through the images and assess student understanding. This is where the power of every student having a computer (tablet or not) in class can really benefit the educational process! This is the same argument for using clickers, or Socrative, or other methods to poll students during class to get feedback (ideally anonymously) on the state of the class' understanding.
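
If you ever want to script this collection step instead of browsing the Drive folder by hand, the same turned-in attachments are also exposed through the Google Classroom API. Below is a minimal sketch in Python, not something I used in class: it assumes you have already set up OAuth credentials with read-only Classroom scopes and installed the google-api-python-client library, and the course and assignment IDs are placeholders you would look up first.

```python
# Minimal sketch (not my in-class workflow): list the Drive files students
# attached to one Google Classroom assignment they have turned in.
# Assumes `creds` is an authorized OAuth credential object with read-only
# Classroom scopes, and that google-api-python-client is installed.
from googleapiclient.discovery import build

def list_turned_in_attachments(creds, course_id, coursework_id):
    classroom = build("classroom", "v1", credentials=creds)
    response = classroom.courses().courseWork().studentSubmissions().list(
        courseId=course_id,          # e.g. the ID of the "Fall 2015 BIOL 102" course
        courseWorkId=coursework_id,  # e.g. the ID of the "Double Helix & Chromosome" assignment
        states="TURNED_IN",          # only submissions students have turned in
    ).execute()

    for submission in response.get("studentSubmissions", []):
        attachments = submission.get("assignmentSubmission", {}).get("attachments", [])
        for attachment in attachments:
            drive_file = attachment.get("driveFile", {})
            print(drive_file.get("title"), drive_file.get("alternateLink"))
```

To be clear, no code was needed for what I describe above - the auto-created Drive folder did all the work - but it's nice to know that the same per-assignment collection can be reached programmatically if you ever want to batch-process submissions.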

To make things more interactive during class, my instructions to the students were slightly different from this workflow. I asked the students to form groups of 3-5 and to show each other their images. I thought the groups served two purposes. First, if half of the class hadn't come prepared with their chromosome annotations, at least those students would still be able to be involved in evaluating others' images. Also, group discussion can be very valuable in resolving differences between the annotations, which can raise good questions about the topic at hand (how the double helix is related to a picture of a chromosome).

They had about five minutes to debate within groups which of the images might be the best response to the assignment, and the group-elected image would be the one that was uploaded to our Google Classroom. It was during the group work that I was pretty sure I had success on my hands. As I later Tweeted (from @rossbiology), "You've got learning when students are standing up & facing backward in class! Nice job #rossgenetics."



After the students uploaded their annotations, my intention was then to display student responses to the class via our video projector, and engage in some larger group analysis of benefits and shortcomings of some of the submissions. This time, this didn't happen, for two reasons. The first is trivial: I ran out of time. The second is critical:

Anonymity, Safety, and Google Classroom - a best practice

As I mentioned ever-so-briefly above, I strongly value the idea of having anonymous feedback. This is mostly because I like to create a safe environment for sharing opinions. In a class of 75, sometimes some students just aren't willing to talk and to share, but I want their input, too. So, one of the benefits of Google Classroom is that the file attachments that students submit are named whatever they like (so the file names don't have to include the student's name or ID number, for example). This is true if you access the files through your own Google Drive.

However, what I hadn't thought about in advance was that I was projecting in class from my tablet, and I opened the Google Classroom app on my tablet. In this view, the names of all of the students appear in the app (which probably isn't the best way to adhere to FERPA). So, I quickly closed the app, and we went on with the rest of the class. What I should have done was to have my laptop also connected to the video projector and to switch to projecting from it; there, I could simply have opened the Google Classroom folder in my Google Drive and opened all of the attachments directly.

In sum, although this incorporation of tablets might not score high on the SAMR model (let's debate! Leave a comment!), I really feel that the ability to collect student work digitally, in class, and to give students opportunities to participate more deeply in critical thinking (e.g., by finding their own image of a chromosome to annotate) and to demonstrate their creativity using a tablet computer made this exercise a true success.

To close, I'd like to show off some of the creative work from this exercise. The below images, from students L. Farshidpour and D. Whittington (who agreed to let me highlight their work here), are great examples of how invested and creative students can be in the classroom with their tablets!





Wednesday, September 2, 2015

Accessibility and lecture capture II: video captioning

We're now a week into fall semester, and I have learned so much already! I'm teaching my third DISCOVERe (tablet-based instruction) course in three semesters; this is the second time I've taught genetics with tablets.

As I mentioned in my previous post, I have been feeling conflicted about whether to keep performing lecture capture, because of the huge amount of additional work it would take to caption those videos. I don't seem to have any hearing-impaired students who have requested disability accommodations, so perhaps that buys me a bit of time to find out whether I can access additional campus resources to support captioning 150 minutes of lecture capture per week.

However, I'm still optimistic about captioning, and that's the topic for this post. I mentioned that I attended a Captioning workshop held by our University Communications group before the start of the semester. I'm here to share a few hard-earned best practices on incorporating accessible videos into tablet-based instruction.

Best practice 1: make short videos
Although I still intend to produce and post all of my office hours and class sessions, I'm starting small. The idea here is supported by pedagogy and practicality. For a flipped classroom, it is good practice to have students access material before coming to class. In my case, sometimes this includes screencast content I've recorded in advance. When I first started this process a year ago, my videos would occasionally run to thirty minutes, which is way too long! As I've been struggling to release myself from the burden of teaching content (replacing that with more authentic practice during class), it seems that the necessary pre-lecture content has shrunk as well! So far this term (four class meetings), I've created and captioned five short videos: two screencast videos teaching methods (one on using Google Sheets; one for PDF annotation) and three quiz keys. Baby steps! It is so much easier to caption a few short videos than one big one, so I certainly suggest starting with short videos for both reasons: it is easier on you for captioning, and it is better for your pedagogy and your students because it forces you to focus on the most important concept to get across.

Best practice 2: YouTube captioning is only efficient using this one weird old trick!
Clickbait aside, my first two captioning sessions were not efficient! Briefly, here's the outline for YouTube captioning:

  1. Upload a video.
  2. In the Video Manager screen, click the Edit button (the one with the black arrowhead).
  3. Select "Subtitles and CC" from the drop-down menu.
  4. Click the button "Add new subtitles or CC" and select a language.
  5. Click the button "Create New Subtitles or CC" (unless your video was produced from a script, in which case you can upload the script and use it to generate the captions).

Now you can start playing the video and typing the captions. Note that what I type in the text box on the right shows up as a caption in the video.

Here's where keystrokes are key. The captioning process works like this: you start your movie playing, and then you type in captions as you hear words being spoken. Hit return (or the giant plus "+" sign) to close that caption and start the next one, and then keep typing. It is really useful to use the tool YouTube provides to pause the video each time you start typing a caption (which is the default setting). So, I tend first to listen to the audio, about a sentence at a time, and then type that into the caption field and hit return. Then, I transcribe another sentence, and so on.

The slowest (sloooooowest) part of this process is re-starting the video once I'm done transcribing a phrase or sentence. I initially thought that this required mousing to the play button in the video screen, and then quickly (using the mouse) placing the cursor back in the text field to frantically resume typing. It turns out, at least on my computer, that while YouTube does pause the video when I start typing, it doesn't automatically resume play when I stop typing (which would be a GREAT feature).

Then, I happened to discover (just by the way I mash keys when I type, I suppose) that the keystroke shift-space lets you pause and play the video while leaving the cursor in the text field for captioning. Use this keystroke - it is a huge time saver! Then, after I discovered that keystroke, I Googled "YouTube caption shift space" and found the YouTube help page I had tried (unsuccessfully) to find earlier. It turns out there is one other very useful keystroke: shift-left arrow rewinds the video in five-second increments, which lets you easily replay the part of the video you just watched and transcribed without your fingers leaving the keyboard.

Best (mandatory) Practice 3
At the very end of captioning, go back and rewatch the entire video, making sure the captions are synchronized appropriately. You can click-and-drag the caption text in the movie timeline (in the image below: the text box in the movie timeline beneath the video window) so that the caption text only appears during the times in the video when that phrase or sentence is being spoken. Hovering the mouse over the left or right side of each caption text box that appears in the movie timeline makes a handle appear. You can click and drag the handle to increase or decrease the horizontal size of the caption box (i.e. how long the caption appears in the video).


Finally, when everything is accurately transcribed and synchronized to the video, Publish the captions.
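
One aside before the next practice: if your video really was produced from a script (step 5 in the list above), you don't necessarily have to type captions in the web interface at all. A finished caption file can, at least in principle, be attached to a video through the YouTube Data API. Here is a hedged Python sketch, not the workflow I describe above; the credentials, video ID, and file name are placeholders, and it assumes OAuth authorization for the youtube.force-ssl scope plus the google-api-python-client library.

```python
# Hypothetical sketch (not the manual workflow described above): attach an
# existing .srt caption track to a YouTube video via the YouTube Data API v3.
# Assumes `creds` is an OAuth credential authorized for the
# https://www.googleapis.com/auth/youtube.force-ssl scope.
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def upload_srt_captions(creds, video_id, srt_path, language="en", name="English"):
    youtube = build("youtube", "v3", credentials=creds)
    request = youtube.captions().insert(
        part="snippet",
        body={"snippet": {"videoId": video_id, "language": language, "name": name}},
        media_body=MediaFileUpload(srt_path),  # e.g. "captions.srt" (placeholder path)
    )
    return request.execute()
```

For videos captioned by ear, like mine, the manual process above is still the way to go; the sketch is only to note that scripted videos can skip the typing entirely.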

Best Practice 4
As a backup, I go back to Video Manager and download the text file of the captions I just entered (including their timings – a .srt file, which is automatically generated by YouTube after you've done all of the captioning work) to my computer.
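
If you've never opened one, an .srt file is just plain text: numbered captions, each with a start and end timestamp. Here is a made-up two-caption example (the times and text are hypothetical, not from one of my videos):

```
1
00:00:00,000 --> 00:00:04,500
Welcome back! This quiz key covers sister chromatids and chromosomes.

2
00:00:04,500 --> 00:00:09,000
Remember: after S phase, each chromosome consists of two sister chromatids.
```

Because the timings live in the same file as the text, restoring from this backup (through YouTube's caption upload option, or a script like the sketch above) brings back both the words and their synchronization.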

The Bottom Line
The upshot of putting all of this work into captioning is two-fold:

1. For the audience members who need it, you're vastly improving their ability to access your resource. Even better, captioning can benefit those of us with reasonably good hearing, too! In my discipline (and with me as a professor), I might occasionally have a tendency to speak quickly, especially when throwing around disciplinary jargon. My students will certainly benefit from hearing me say the words and seeing the spellings of those words at the same time.

2. For us (faculty), we can leverage these accessible resources in the future! I already have two short video "lectures" on using tablets in my discipline (one for PDF annotation and one on Google Sheets for some genetics-related data analysis). Now I can easily use these videos again in future terms; I can incorporate them into a digital course packet; I might even wind up publishing them as scholarly/creative work!

If you have any questions about the details of the process of video captioning, please feel free to leave a comment or send me an e-mail. I'm always happy to talk!

-JR